
Institute of Formal and Applied Linguistics Wiki


spark:running-spark-on-single-machine-or-on-cluster (revision 2014/11/11 09:21 by straka)
  * ''spark-qrsh'': start Spark cluster and perform qrsh <file>[SGE_OPTS=additional_SGE_args] spark-qrsh workers memory [command arguments]</file>
  
Both ''spark-qsub'' and ''spark-qrsh'' start a Spark cluster with the specified number of workers, each with the given amount of memory. They then set ''MASTER'' and ''SPARK_ADDRESS'' to the address of the Spark master and ''SPARK_WEBUI'' to the HTTP address of the master web interface. These values are also written to standard output and added to the SGE job metadata. Finally, the specified command is started using ''qsub'' or ''qrsh'', respectively. Note that when ''spark-qrsh'' is used, the command may be empty, in which case an interactive shell is opened.
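As an illustration, a submitted job script can read these variables directly. The following is a minimal sketch, not part of the tooling itself: in a real job ''MASTER'' and ''SPARK_WEBUI'' are already exported by ''spark-qsub'', and the placeholder host name and ports below are hypothetical defaults so the sketch runs standalone.

```shell
#!/bin/sh
# Hypothetical job script submitted via spark-qsub. In a real job, MASTER,
# SPARK_ADDRESS and SPARK_WEBUI are exported by spark-qsub before this script
# runs; the defaults below are placeholders only.
MASTER=${MASTER:-spark://placeholder-node:7077}
SPARK_WEBUI=${SPARK_WEBUI:-http://placeholder-node:8080}

echo "Spark master: $MASTER"
echo "Web UI:       $SPARK_WEBUI"
```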
  
==== Memory Specification ====
  
Start a Spark cluster with 10 machines with 1GB RAM each and then run an interactive shell. The cluster stops after the shell exits.
<file>spark-qrsh 10 1G</file>
  
Start a Spark cluster with 20 machines with 512MB RAM each. The cluster has to be stopped manually using ''qdel''.
<file>spark-qsub 20 512m sleep infinity</file>

Note that a running Spark cluster can currently be used only from other cluster machines (connections to a running SGE Spark cluster from my workstation end with a timeout).
  
==== Additional SGE Options ====
  
Additional ''qrsh'' or ''qsub'' options can be specified in the ''SGE_OPTS'' environment variable (not as ''spark-qsub'' or ''spark-qrsh'' arguments), as in the following example, which schedules the Spark master and workers to machines other than ''hyperion*'' and ''pandora*'':
<file>SGE_OPTS='-q *@!(hyperion*|pandora*)' spark-qrsh 10 1G</file>
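Note that the whole value must be single-quoted, since the queue pattern contains characters the shell would otherwise interpret. A minimal sketch (nothing is submitted here; it only demonstrates that the pattern survives quoting intact):

```shell
#!/bin/sh
# Single quotes keep the SGE queue pattern from being glob-expanded or
# otherwise mangled by the shell before spark-qrsh sees it.
SGE_OPTS='-q *@!(hyperion*|pandora*)'
echo "$SGE_OPTS"
# prints: -q *@!(hyperion*|pandora*)
```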
  