java - How to get Filename/File Contents as key/value input for MAP when running a Hadoop MapReduce Job?

I am creating a program to analyze PDF, DOC and DOCX files. These files are stored in HDFS.

When I start my MapReduce job, I want the map function to receive the filename as the key and the file's binary contents as the value. I then want to create a stream reader that I can pass to the PDF parser library. How can I arrange for the map phase's key/value pair to be filename/file-contents?

I am using Hadoop 0.20.2.

This is the (older-API) code that currently starts the job:

public static void main(String[] args) throws Exception {
    JobConf conf = new JobConf(PdfReader.class);
    conf.setJobName("pdfreader");

    conf.setOutputKeyClass(Text.class);
    conf.setOutputValueClass(IntWritable.class);

    conf.setMapperClass(Map.class);
    conf.setReducerClass(Reduce.class);

    conf.setInputFormat(TextInputFormat.class);
    conf.setOutputFormat(TextOutputFormat.class);

    FileInputFormat.setInputPaths(conf, new Path(args[0]));
    FileOutputFormat.setOutputPath(conf, new Path(args[1]));

    JobClient.runJob(conf);
}

I know there are other InputFormat types. But is there one that does exactly what I want? I find the documentation quite vague. If one is available, what should the map function's input types look like?

Thanks in advance!


1 Reply


The solution is to create your own FileInputFormat class. You have access to the name of the input file through the FileSplit that this FileInputFormat receives (getPath). Be sure to override isSplitable in your FileInputFormat so that it always returns false, so each file reaches a single mapper in one piece.
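Here is a minimal sketch of such an InputFormat against the old mapred API (matching your 0.20.2 / JobConf code). WholeFileInputFormat is an illustrative name, not a stock Hadoop class; it assumes the key is the file path as a Text and the value is the raw bytes as a BytesWritable:

import java.io.IOException;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileSplit;
import org.apache.hadoop.mapred.InputSplit;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.RecordReader;
import org.apache.hadoop.mapred.Reporter;

// Illustrative sketch: emits one record per file, keyed by the file path.
public class WholeFileInputFormat extends FileInputFormat<Text, BytesWritable> {

    // Never split a file, so each map task sees one whole file.
    @Override
    protected boolean isSplitable(FileSystem fs, Path filename) {
        return false;
    }

    @Override
    public RecordReader<Text, BytesWritable> getRecordReader(
            InputSplit split, JobConf job, Reporter reporter) throws IOException {
        return new WholeFileRecordReader((FileSplit) split, job);
    }
}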

You will also need a custom RecordReader that returns the entire file as a single "Record" value.
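A matching sketch of that RecordReader (again illustrative, paired with the WholeFileInputFormat above): it reads the entire file into memory on the first call to next() and then reports end of input:

import java.io.IOException;

import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileSplit;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.RecordReader;

public class WholeFileRecordReader implements RecordReader<Text, BytesWritable> {

    private final FileSplit split;
    private final JobConf conf;
    private boolean processed = false;

    public WholeFileRecordReader(FileSplit split, JobConf conf) {
        this.split = split;
        this.conf = conf;
    }

    public Text createKey() {
        return new Text();
    }

    public BytesWritable createValue() {
        return new BytesWritable();
    }

    // Emits exactly one record: key = file path, value = full file contents.
    public boolean next(Text key, BytesWritable value) throws IOException {
        if (processed) {
            return false;
        }
        Path file = split.getPath();
        key.set(file.toString());

        // Since isSplitable is false, the split length is the file length.
        byte[] contents = new byte[(int) split.getLength()];
        FileSystem fs = file.getFileSystem(conf);
        FSDataInputStream in = null;
        try {
            in = fs.open(file);
            IOUtils.readFully(in, contents, 0, contents.length);
            value.set(contents, 0, contents.length);
        } finally {
            IOUtils.closeStream(in);
        }
        processed = true;
        return true;
    }

    public long getPos() throws IOException {
        return processed ? split.getLength() : 0;
    }

    public float getProgress() throws IOException {
        return processed ? 1.0f : 0.0f;
    }

    public void close() throws IOException {
        // Nothing to do: the stream is closed in next().
    }
}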

Be careful with files that are too big. This approach effectively loads the entire file into RAM, and the default heap size for a Hadoop task JVM is only 200 MB (mapred.child.java.opts defaults to -Xmx200m).
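With those two classes, the map input types are Text (the filename) and BytesWritable (the file contents), which answers the question about the map function's signature. Here is a sketch of how the mapper and job setup could then look; PdfMapper is a hypothetical name, and the actual parsing is left to your PDF/DOC library:

import java.io.ByteArrayInputStream;
import java.io.IOException;

import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class PdfMapper extends MapReduceBase
        implements Mapper<Text, BytesWritable, Text, IntWritable> {

    public void map(Text filename, BytesWritable contents,
            OutputCollector<Text, IntWritable> output, Reporter reporter)
            throws IOException {
        // Wrap the raw bytes in a stream for the parser library.
        // Use getLength(): the backing byte array may be longer than the data.
        ByteArrayInputStream stream = new ByteArrayInputStream(
                contents.getBytes(), 0, contents.getLength());
        // ... hand 'stream' to the PDF/DOC/DOCX parser here,
        // then output.collect(...) whatever the analysis produces ...
    }
}

In your driver, swap in the new input format and mapper:

    conf.setMapperClass(PdfMapper.class);
    conf.setInputFormat(WholeFileInputFormat.class);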

