* run interactive shell using an existing Spark cluster (i.e., inside ''spark-srun''), or start a local Spark cluster using as many threads as there are cores if there is none (a short example session is sketched below):
<file>spark-shell</file>
* run interactive shell with a local Spark cluster using one thread:
<file>MASTER=local spark-shell</file>
* start a Spark cluster (10 machines, 2GB RAM each) via Slurm and run interactive shell:
<file>spark-srun 10 2G spark-shell</file>
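For illustration, once the shell starts you can type Spark commands interactively; the following word-count session is only a sketch, using the example dataset ''/net/projects/spark-example-data/wiki-cs'' that also appears later on this page:
<file>
// sc (the SparkContext) is predefined inside spark-shell
val lines = sc.textFile("/net/projects/spark-example-data/wiki-cs")
val counts = lines.flatMap(_.split("\\s+")).filter(_.nonEmpty).map((_, 1)).reduceByKey(_ + _)
counts.take(10).foreach(println)
</file>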

- copy ''/net/projects/spark/sbt/spark-template.sbt'' to your project directory and rename it to your project name (e.g., ''my-best-project.sbt'')
- replace ''spark-template'' with your project name on the first line (e.g., ''name := "my-best-project"'')
- run ''sbt package'' to create the JAR (note that the first run of ''sbt'' will take several minutes); an example of the whole sequence is sketched below
The resulting JAR can be found in the ''target/scala-2.12'' subdirectory, named after your project.
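For example, assuming the project directory and project name are ''my-best-project'' (the illustrative name used above), the whole sequence could look like:
<file>
mkdir my-best-project && cd my-best-project
cp /net/projects/spark/sbt/spark-template.sbt my-best-project.sbt
# change the first line of my-best-project.sbt to: name := "my-best-project"
sbt package
ls target/scala-2.12/   # the JAR is named after the project and version, e.g. my-best-project_2.12-1.0.jar
</file>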

==== Usage Examples ====
The ''sbt'' project file ''word_count.sbt'':
<file>
name := "word_count"

version := "1.0"

scalaVersion := "2.12.20"

libraryDependencies += "org.apache.spark" %% "spark-core" % "3.5.3"
</file>
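The ''word_count'' source itself is not shown in this excerpt; a minimal sketch of such an application (assuming it takes the input path and the output directory as its two command-line arguments, matching the ''spark-submit'' invocations below, and that its main class is picked up from the JAR manifest) could look like:
<file>
import org.apache.spark.{SparkConf, SparkContext}

object Main {
  def main(args: Array[String]): Unit = {
    if (args.length != 2) sys.error("Usage: word_count INPUT OUTPUT")
    val Array(input, output) = args

    // The master URL is supplied by the environment (the MASTER variable
    // or the cluster started by spark-sbatch/spark-srun), so it is not set here.
    val sc = new SparkContext(new SparkConf().setAppName("word_count"))

    sc.textFile(input)             // one element per input line
      .flatMap(_.split("\\s+"))    // split lines into tokens
      .filter(_.nonEmpty)
      .map((_, 1))
      .reduceByKey(_ + _)          // count occurrences of each token
      .saveAsTextFile(output)      // write part-* files into the output directory

    sc.stop()
  }
}
</file>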

Compile the application with:
<file>sbt package</file>

* run the ''word_count'' application inside an existing Spark cluster (i.e., inside ''spark-sbatch'' or ''spark-srun''), or start a local Spark cluster using as many threads as there are cores if there is none:
<file>spark-submit target/scala-2.12/word_count_2.12-1.0.jar /net/projects/spark-example-data/wiki-cs outdir</file>
* run the ''word_count'' application with a local Spark cluster using one thread:
<file>MASTER=local spark-submit target/scala-2.12/word_count_2.12-1.0.jar /net/projects/spark-example-data/wiki-cs outdir</file>
* start a Spark cluster (10 machines, 2GB RAM each) on Slurm and run the ''word_count'' application:
<file>spark-sbatch 10 2G spark-submit target/scala-2.12/word_count_2.12-1.0.jar /net/projects/spark-example-data/wiki-cs outdir</file>
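If the application writes its results with ''saveAsTextFile'' (as in the sketch above), the output directory contains one ''part-*'' file per partition, so the results can be inspected for example with:
<file>
cat outdir/part-* | head
</file>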
| |