====== Spark: Framework for Distributed Computations (Under Construction) ======
  
[[http://spark.apache.org|{{:spark:spark-logo.png?150 }}]] [[http://spark.apache.org|Spark]] is a framework for distributed computations. Natively it works in Python, Scala and Java; it can also be used from Perl in a limited way, via pipes.
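
For a first taste, a minimal sketch of a Spark job in Python follows. The file name ''input.txt'' and the application name are only illustrative assumptions; running it requires a working Spark installation (see the pages linked below).

  # Minimal PySpark word count -- a sketch, assuming input.txt exists.
  from pyspark import SparkContext
  
  sc = SparkContext(appName="WordCount")
  
  # Distribute the file, split it into words and count the occurrences.
  counts = (sc.textFile("input.txt")
              .flatMap(lambda line: line.split())
              .map(lambda word: (word, 1))
              .reduceByKey(lambda a, b: a + b))
  
  print(counts.take(10))
  sc.stop()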
  
Apart from embarrassingly parallel computations, the Spark framework is suitable for //in-memory// and/or //iterative// computations, making it usable even for machine learning and complex data processing. (The Spark framework shares some underlying implementation with [[http://hadoop.apache.org/|Hadoop]], but it is quite different -- the Hadoop framework does not offer in-memory computations and has only limited support for iterative computations.)
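
To illustrate the //in-memory// and //iterative// style, here is a hedged Python sketch of gradient descent for a one-dimensional least-squares fit. The file ''points.txt'' (lines of the form ''x y'') is an assumption made up for the example. The input is parsed once and cached, so every iteration reads it from cluster memory instead of from disk -- exactly the access pattern where Spark outperforms a chain of MapReduce jobs.

  # Iterative computation over a cached RDD -- a sketch, not UFAL-specific code.
  from pyspark import SparkContext
  
  sc = SparkContext(appName="IterativeRegression")
  
  # Parse "x y" pairs once and keep them cached in memory.
  points = (sc.textFile("points.txt")
              .map(lambda line: tuple(map(float, line.split())))
              .cache())
  n = points.count()
  
  w = 0.0
  for _ in range(20):
      # Each pass reuses the cached RDD; only the gradient is recomputed.
      gradient = points.map(lambda p: 2 * (w * p[0] - p[1]) * p[0]).sum()
      w -= 0.01 * gradient / n
  
  print("fitted slope:", w)
  sc.stop()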
  export PATH="/net/projects/spark/sbt/bin:$PATH"
  
  * [[spark:Using Spark on Single Machine]]
  * [[spark:Starting Spark Cluster in UFAL Environment]]
  * [[spark:Using Python]]
