
Institute of Formal and Applied Linguistics Wiki


spark [2014/11/11 08:40] straka
  * Official [[http://spark.apache.org/docs/latest/quick-start.html|Quick Start]]
  * Official [[http://spark.apache.org/docs/latest/programming-guide.html|Spark Programming Guide]]
  * Official [[http://spark.apache.org/docs/latest/mllib-guide.html|MLlib Programming Guide]] (Spark's scalable machine learning library of common learning algorithms and utilities, including classification, regression, clustering, collaborative filtering, and dimensionality reduction, as well as the underlying optimization primitives)
  * Official [[http://spark.apache.org/docs/latest/api/python/index.html|Python API Reference]]/[[http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.package|Scala API Reference]]
  
The latest supported version of Spark is available in ''/net/projects/spark''. To use it, add
  export PATH="/net/projects/spark/bin:/net/projects/spark/sge:$PATH"
to your ''.bashrc'' (or to your favourite shell config file) and log in again. If you want to use Scala and do not have ''sbt'' already installed (or do not know what ''sbt'' is), also add
  export PATH="/net/projects/spark/sbt/bin:$PATH"
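As a quick sanity check, the ''export'' lines can be tried in an interactive shell before editing ''.bashrc''. This minimal sketch only verifies that the Spark directories now lead your search path; it does not check that the binaries themselves are reachable from your machine:

```shell
# Prepend the Spark directories to PATH, exactly as described above.
export PATH="/net/projects/spark/bin:/net/projects/spark/sge:$PATH"

# Show the first two PATH entries -- they should be the Spark directories.
echo "$PATH" | cut -d: -f1-2
```

After logging in again with the updated config, ''spark-submit'' and the helper scripts from ''/net/projects/spark/sge'' can be run without typing full paths.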
  
  * [[spark:Running Spark on Single Machine or on Cluster]]
  * [[spark:Using Python]]
  * [[spark:Using Scala]]
