Spark Introduction
This introduction shows several simple examples to give you an idea of what programming in Spark is like. See the official Quick Start, the Spark Programming Guide, or the Python API Reference/Scala API Reference for more information.
Running Spark Shell in Python
To run an interactive Python shell in local Spark mode, run (on your local workstation or on the cluster):
IPYSPARK=1 pyspark
The IPYSPARK=1 environment variable instructs Spark to use ipython instead of python (ipython is an enhanced interactive Python shell). If you do not want ipython, or you do not have it installed (it is installed everywhere on the cluster, but maybe not on your local workstation – ask Milan if you want it), leave out the IPYSPARK=1.
After a local Spark executor is started, the Python shell starts. The SparkUI address is listed several lines above the shell prompt line:
14/10/03 10:54:35 INFO SparkUI: Started SparkUI at http://tauri4.ufal.hide.ms.mff.cuni.cz:4040
Running Spark Shell in Scala
To run an interactive Scala shell in local Spark mode, run (on your local workstation or on the cluster):
scala-shell
Once again, the SparkUI address is listed several lines above the shell prompt line.
Word Count Example
The central object of Spark is the RDD (resilient distributed dataset). It contains an ordered sequence of items.