MapReduce Tutorial: Counters and job configuration
Counters
As in the Perl API, a mapper or a reducer can increment various counters by using context.getCounter("Group", "Name").increment(value):
public void map(Text key, Text value, Context context) throws IOException, InterruptedException {
    ...
    context.getCounter("Group", "Name").increment(value);
    ...
}
The getCounter method returns a Counter object, so if a counter is incremented frequently, getCounter can be called just once and the returned object reused:
public void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException, InterruptedException {
    ...
    Counter valueCounter = context.getCounter("Reducer", "Number of values");
    for (IntWritable value : values) {
        ...
        valueCounter.increment(1);
    }
}
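Counters incremented by tasks are aggregated by the framework and can be read in the driver once the job finishes. The following is a minimal sketch of such a driver snippet; it assumes an already-configured org.apache.hadoop.mapreduce.Job object named job, and reuses the "Reducer" / "Number of values" counter from the example above:

```java
import org.apache.hadoop.mapreduce.Counter;
import org.apache.hadoop.mapreduce.Job;

// ... job is an already-configured Job (assumption for this sketch) ...
if (job.waitForCompletion(true)) {
    // findCounter retrieves the aggregated counter by group and name.
    Counter counter = job.getCounters().findCounter("Reducer", "Number of values");
    System.out.println(counter.getDisplayName() + ": " + counter.getValue());
}
```

Hadoop also prints all counters (framework and user-defined) to the job log after completion, so explicit retrieval is only needed when the driver acts on the values.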
Job configuration
The job properties can be set:
- on the command line – the ToolRunner parses options in the format -Dname=value. See the syntax of the hadoop script.
- in the Java source – Job.getConfiguration() returns a Configuration object, which provides the following methods:
  - String get(String name) – get the value of the name property, or null if it does not exist,
  - String get(String name, String defaultValue) – get the value of the name property, or defaultValue if it does not exist,
  - getBoolean, getClass, getFile, getFloat, getInt, getLong, getStrings – return a typed value of the name property (i.e., number, file name, class name, …),
  - set(String name, String value) – set the value of the name property to value,
  - setBoolean, setClass, setFile, setFloat, setInt, setLong, setStrings – set a typed value of the name property (i.e., number, file name, class name, …).
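The get/set methods above can be combined in a driver as follows. This is a sketch only: the property names (tutorial.*) are illustrative, not standard Hadoop keys, and Job.getInstance() from the new MapReduce API is assumed:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class ConfigurationExample {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance();
        Configuration conf = job.getConfiguration();

        // Set properties (hypothetical keys, for illustration only).
        conf.set("tutorial.output.separator", "\t");
        conf.setInt("tutorial.max.records", 1000);

        // Read them back, supplying defaults used when a key is missing.
        String separator = conf.get("tutorial.output.separator", ",");
        int maxRecords = conf.getInt("tutorial.max.records", 100);
    }
}
```

Mappers and reducers can read the same properties via context.getConfiguration(), which is the usual way to pass per-job parameters to tasks.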
 
Apart from the brief list of Hadoop properties already mentioned, there is one important Java-specific property:
