====== MapReduce Tutorial : Step 1 ======
  
The tutorial expects you to be logged in to a computer in the UFAL cluster. In this environment, Hadoop is installed in ''/SGE/HADOOP/active''.
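To check that this installation is reachable from your machine, you can for example print the installed Hadoop version (a quick sanity check only; the path is the one mentioned above, and the ''bin'' subdirectory is the standard Hadoop layout):

<code>
ls /SGE/HADOOP/active/bin                 # the Hadoop launcher scripts
/SGE/HADOOP/active/bin/hadoop version     # prints the Hadoop version in use
</code>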
  
===== The Perl API =====
To use the Perl API, add its directory to your Perl library path, e.g. by extending your ''~/.bashrc'':
  echo 'export PERLLIB="$PERLLIB:/net/projects/hadoop/perl/"' >> ~/.bashrc
  echo 'export PERL5LIB="$PERL5LIB:/net/projects/hadoop/perl"' >> ~/.bashrc
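After opening a new shell (or re-reading ''~/.bashrc''), you can verify that the directory really appears on Perl's module search path. This is a minimal check, not part of the original setup:

<code>
source ~/.bashrc                                        # pick up the new variables
perl -e 'print join("\n", @INC), "\n"' | grep hadoop    # the directory should be listed
</code>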

===== When not logged in to the UFAL cluster =====

**If you are not logged in to the UFAL cluster, you will need:**
  * a local Hadoop installation (a shell sketch of these steps follows below):
    - download ''http://www.apache.org/dist/hadoop/common/hadoop-1.0.0/hadoop-1.0.0.tar.gz''
    - unpack it
    - edit the ''conf/hadoop-env.sh'' file and make sure it contains a valid line <code>export JAVA_HOME=/path/to/your/jdk</code>
  * the ''hadoop'' repository containing the Perl API and Java extensions
  * when using the Perl API, set ''hadoop_prefix'' to point to your Hadoop installation
  * when using the Java API, one of the ''Makefile''s contains an absolute path to the ''hadoop'' repository -- please correct it
When using a local Hadoop installation, you must either run all jobs locally in a single thread, or start a local cluster and pass ''-jt'' to the jobs so that they use it (see [[.:step-7#using-a-running-cluster]]).
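The installation steps above can be carried out with the following shell commands (a sketch only; replace ''/path/to/your/jdk'' with the real location of your JDK):

<code>
wget http://www.apache.org/dist/hadoop/common/hadoop-1.0.0/hadoop-1.0.0.tar.gz
tar xzf hadoop-1.0.0.tar.gz
cd hadoop-1.0.0
# make sure conf/hadoop-env.sh exports a valid JAVA_HOME:
echo 'export JAVA_HOME=/path/to/your/jdk' >> conf/hadoop-env.sh
bin/hadoop version        # verify that Hadoop starts
</code>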
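Purely for illustration, a locally started cluster can be targeted with Hadoop's standard generic option ''-jt''. Everything in this sketch is hypothetical (the jar name, the class name, and the jobtracker address); see [[.:step-7#using-a-running-cluster]] for how the tutorial jobs actually accept ''-jt'':

<code>
# submit a job to a local cluster whose jobtracker listens on localhost:9001;
# -jt is honored only by jobs that parse generic options (ToolRunner)
bin/hadoop jar wordcount.jar WordCount -jt localhost:9001 input output
</code>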
  
----
