Running Spark on Single Machine or on Cluster
In order to use Spark, the environment has to be set up according to Using Spark in UFAL Environment.
When Spark computation starts, it uses environment variable MASTER
to determine the mode of computation. The following values are possible:
local: Run locally using a single thread.
local[N] (e.g., local[2] or local[4]): Run locally using N threads.
local[*] (default if the MASTER variable is not set): Run locally using as many threads as there are processor cores.
spark://master_address:master_port: Run in a distributed fashion using the specified master.
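As a sketch, the mode is selected simply by setting MASTER before launching a Spark application (the application script name is an example; the master address is a placeholder):

```shell
# Run locally using 4 threads
MASTER=local[4] spark-submit my_script.py

# Run against an already running cluster
MASTER=spark://master_address:master_port spark-submit my_script.py
```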
Running Spark on Single Machine
Spark computations can be started both on desktop machines and on cluster machines, either by setting MASTER to one of the local modes, or by not setting MASTER at all (local[*] is then used).
Note that when you use qrsh or qsub, your job is usually expected to use a single core, so you should specify MASTER=local. If you do not, Spark will use all cores on the machine, even though SGE allocated only one to you.
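For example, inside an interactive qrsh session (or in a job script submitted via qsub), the single-thread mode can be forced like this; the application script name is hypothetical:

```shell
# Restrict Spark to the single core SGE granted to this job
MASTER=local spark-submit my_script.py
```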
Starting Spark Cluster
A Spark cluster can be started using SGE. The cluster is user-specific, but it can be used for multiple Spark computations.
The Spark cluster can be started using one of the following two commands:
spark_qsub: start a Spark cluster and perform qsub
  [SGE_OPTS=additional_SGE_args] spark_qsub workers memory command [arguments]
spark_qrsh: start a Spark cluster and perform qrsh
  [SGE_OPTS=additional_SGE_args] spark_qrsh workers memory [command arguments]
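As an illustration (the worker count, memory, queue name and application script are all example values), a cluster can be started and an application run on it in a single step:

```shell
# Start a 10-worker cluster with 1G of memory per worker, passing an extra
# SGE argument, and run an application on the cluster via qsub
SGE_OPTS='-q cpu.q' spark_qsub 10 1G spark-submit my_script.py

# Start the same cluster interactively via qrsh; with no command given,
# an interactive shell with MASTER already set is opened
spark_qrsh 10 1G
```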
Both the spark_qsub and spark_qrsh commands start a Spark cluster with the specified number of workers, each with the given amount of memory. They then set MASTER and SPARK_ADDRESS to the address of the Spark master and SPARK_WEBUI to the HTTP address of the master web interface. Both these values are also written to standard output and added to the SGE job metadata. Lastly, the specified command is started using either qsub or qrsh. Note that when spark_qrsh is used, the command may be omitted, in which case an interactive shell is opened.
To use the cluster for a long period of time, run spark_qsub with the command sleep infinity and then stop the cluster using qdel.
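A minimal sketch of this long-running-cluster workflow (the worker count, memory and job id are illustrative):

```shell
# Start a 4-worker cluster (2G per worker) that stays up until explicitly stopped
spark_qsub 4 2G sleep infinity

# ... run any number of Spark computations against the printed MASTER address ...

# When finished, stop the cluster by deleting its SGE job
qdel <job_id_of_the_cluster>
```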
Memory Specification
Memory for each worker must be specified. The memory can be specified either in bytes, or using a k/K, m/M or g/G suffix. A reasonable default value is 512M or 1G.