====== Spark Introduction ======
  
This introduction shows several simple examples to give you an idea of what programming in Spark is like. See the official [[http://spark.apache.org/docs/latest/quick-start.html|Quick Start]], the [[http://spark.apache.org/docs/latest/programming-guide.html|Spark Programming Guide]], or the [[http://spark.apache.org/docs/latest/api/python/index.html|Python API Reference]]/[[https://spark.apache.org/docs/latest/api/scala/org/apache/spark/index.html|Scala API Reference]] for more information.
  
===== Running Spark Shell in Python =====
  
To run an interactive Python shell in local Spark mode, run (on your local workstation or on the cluster using ''qrsh'' from ''lrc1''):
  PYSPARK_DRIVER_PYTHON=ipython3 pyspark
The ''PYSPARK_DRIVER_PYTHON=ipython3'' environment variable instructs Spark to use ''ipython3'' (an enhanced interactive shell) instead of ''python3''.
  
After a local Spark executor is started, the Python shell starts. Several lines above the prompt line, the Spark UI address is listed in the following format:
  Spark context Web UI available at http://hyperion7.ufal.hide.ms.mff.cuni.cz:4040
The Spark UI is an HTML interface which displays the state of the application -- whether a distributed computation is taking place, how many workers are part of it, how many tasks remain to be processed, and any error logs; cached datasets and their properties (cached on disk or in memory, and their size) are also displayed.
  
===== Running Spark Shell in Scala =====
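  // Continuation of the word-count example; the RDD "words" is assumed to have been
  // created earlier, e.g. by splitting the lines of an input file into individual words.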
  val counts = words.map(word => (word,1)).reduceByKey((c1,c2) => c1+c2)
  val sorted = counts.sortBy({case (word, count) => count}, ascending=false)
  sorted.saveAsTextFile("output")
  
  // Alternatively without variables and using placeholders in lambda parameters:
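  // A possible equivalent one-liner (a sketch only, assuming the same "words" RDD):
  words.map((_,1)).reduceByKey(_+_).sortBy(_._2, ascending=false).saveAsTextFile("output")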
