Monday, April 29, 2013

Measuring HBase table size, and suppressing output in Hadoop MapReduce


Shut down HBase first (a clean shutdown flushes in-memory data, so the size on HDFS is accurate), then run:

hadoop fs -dus /hbase/TableName

(On newer Hadoop versions the equivalent command is hadoop fs -du -s.)


Suppressing Hadoop output in MapReduce

To discard a job's output entirely, set the output format to NullOutputFormat (old MR API shown):

conf.setOutputFormat(NullOutputFormat.class);
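For context, a minimal old-API driver sketch that discards all job output this way; the class names and mapper are hypothetical placeholders, and the input path is taken from the command line:

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.lib.NullOutputFormat;

public class NoOutputJob {                                // hypothetical driver class
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(NoOutputJob.class);
        conf.setJobName("no-output-job");
        FileInputFormat.setInputPaths(conf, new Path(args[0]));   // input path from the command line
        conf.setMapperClass(SideEffectMapper.class);      // hypothetical mapper that writes elsewhere (e.g. HBase)
        conf.setNumReduceTasks(0);                        // map-only job
        conf.setOutputFormat(NullOutputFormat.class);     // every collected record is silently dropped
        JobClient.runJob(conf);
    }
}

This is useful for jobs that exist purely for their side effects (writing to HBase, counters, etc.), where an empty output directory on HDFS would just be clutter.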

Friday, April 19, 2013


Passing parameters to Mappers and Reducers

There might be a requirement to pass additional parameters to the mappers and reducers, besides the inputs which they process. Let's say we are interested in matrix multiplication and there are multiple ways/algorithms of doing it. We could send an input parameter to the mappers and reducers, based on which the appropriate way/algorithm is picked. There are multiple ways of doing this.

Setting the parameter:

1. Use the -D command-line option to set the parameter when running the job. This requires the driver to parse generic options, e.g. via ToolRunner, as sketched below.
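For -D to be picked up, the driver has to run through GenericOptionsParser, which is easiest via Tool and ToolRunner. A minimal sketch, assuming a hypothetical driver class invoked e.g. as hadoop jar myjob.jar MyDriver -D test=123 <args>:

import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class MyDriver extends Configured implements Tool {   // hypothetical driver class
    @Override
    public int run(String[] args) throws Exception {
        // getConf() already contains any -D key=value pairs from the command line
        String param = getConf().get("test");
        // ... set up and submit the job here ...
        return 0;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new MyDriver(), args));
    }
}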

2. Before launching the job, using the old MR API:

JobConf job = new JobConf(getConf(), MyJob.class);  // MyJob: hypothetical driver class; avoids an unsafe cast of getConf()
job.set("test", "123");

3. Before launching the job, using the new MR API:

Configuration conf = new Configuration();
conf.set("test", "123");
Job job = new Job(conf);
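Note that Job makes its own copy of the Configuration it is given, so the parameter must be set before the Job object is created; conf.set() calls made afterwards will not be seen by the tasks (use job.getConfiguration().set(...) in that case).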

Getting the parameter:

1. Using the old API in the Mapper and Reducer. The JobConfigurable#configure method has to be implemented in the Mapper and Reducer classes.

private static Long N;
public void configure(JobConf job) {
    N = Long.parseLong(job.get("test"));
}

The variable N can then be used within the map and reduce functions.
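Put together, the old-API mapper would look roughly like this; the class name and the LongWritable/Text input and output types are assumptions for illustration:

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class MyMapper extends MapReduceBase
        implements Mapper<LongWritable, Text, Text, LongWritable> {

    private static Long N;

    @Override
    public void configure(JobConf job) {
        // called once per task, before any map() calls
        N = Long.parseLong(job.get("test"));
    }

    @Override
    public void map(LongWritable key, Text value,
                    OutputCollector<Text, LongWritable> output, Reporter reporter)
            throws IOException {
        // N is available for every record processed by this task
        output.collect(value, new LongWritable(N));
    }
}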

2. Using the new API in the Mapper and Reducer. The Context object is passed to the setup, map, reduce, and cleanup methods.

Configuration conf = context.getConfiguration();
String param = conf.get("test");
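In full, a new-API mapper would typically read the parameter once in setup(); a sketch, with the class name and input/output types again being assumptions:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MyNewMapper extends Mapper<LongWritable, Text, Text, LongWritable> {

    private String param;

    @Override
    protected void setup(Context context) {
        // runs once per task, before the first map() call
        Configuration conf = context.getConfiguration();
        param = conf.get("test");
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // param is available for every record
        context.write(new Text(param), key);
    }
}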