spark [2024/09/27 09:17] (current) straka [Basic Information]
====== Spark: Framework for Distributed Computations ======

[[http://spark.apache.org|{{:spark:spark-logo.png?150 }}]] [[http://spark.apache.org|Spark]] is a framework for distributed computations. It natively supports Python, Scala, and Java, and can be used to a limited extent from Perl via pipes.
All Python, Scala, and Java bindings work well in the UFAL environment. The examples shown here are in Python and Scala. We do not cover the Java binding, because it has the same API as the Scala one (so if you prefer Java or know it substantially better than Scala, you will be able to use it on your own).

Currently (Oct 2024), Spark 3.5.3 is available.
| |
===== Getting Started =====
  * Official [[http://spark.apache.org/docs/latest/programming-guide.html|Spark Programming Guide]]
  * Official [[http://spark.apache.org/docs/latest/mllib-guide.html|MLlib Programming Guide]] (Spark's scalable machine learning library of common learning algorithms and utilities, including classification, regression, clustering, collaborative filtering, and dimensionality reduction, as well as underlying optimization primitives)
  * Official [[http://spark.apache.org/docs/latest/api/python/index.html|Python API Reference]]/[[https://spark.apache.org/docs/latest/api/scala/org/apache/spark/index.html|Scala API Reference]]

===== Using Spark in UFAL Environment =====

The latest supported version of Spark is available in ''/net/projects/spark''. To use it, add
  export PATH="/net/projects/spark/bin:/net/projects/spark/slurm:$PATH"
to your ''.bashrc'' (or to your favourite shell config file). If you want to use Scala and do not have ''sbt'' installed (or you do not know what ''sbt'' is), also add
  export PATH="/net/projects/spark/sbt/bin:$PATH"
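If you build a Scala Spark application with ''sbt'', a minimal ''build.sbt'' could look like the following sketch (the Scala and Spark version numbers are illustrative; match them to the installed Spark, and mark Spark as ''provided'' since the cluster supplies it at run time):

```scala
// build.sbt -- minimal sketch; version numbers are illustrative
scalaVersion := "2.12.18"
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.5.3" % "provided"
```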