
WordCount Analysis


An analysis of the Map and Reduce phases of the WordCount program.

Map phase:

(1) File splitting. The test files are small, so each file becomes one split, and the framework breaks each split into <key, value> pairs line by line (the key is the line's byte offset, the value is the line's text). This step is performed entirely by the framework, as illustrated below:
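A minimal worked example, assuming a two-file sample input of two lines each (the file contents are an assumption, chosen to match the four map and four reduce invocations described below):

file1: Hello World          file2: Hello Hadoop
       Bye World                   Bye Hadoop

split of file1: <0, "Hello World">, <12, "Bye World">
split of file2: <0, "Hello Hadoop">, <13, "Bye Hadoop">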

(2) The split <key, value> pairs are handed to the user-defined map method, which produces new <key, value> pairs, as illustrated below. This step is carried out by the user's map function; each line corresponds to one map invocation, four in total:
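Continuing the assumed sample, each map call tokenizes its line and emits a <word, 1> pair per word:

map(<0, "Hello World">)   -> <Hello, 1>, <World, 1>
map(<12, "Bye World">)    -> <Bye, 1>, <World, 1>
map(<0, "Hello Hadoop">)  -> <Hello, 1>, <Hadoop, 1>
map(<13, "Bye Hadoop">)   -> <Bye, 1>, <Hadoop, 1>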

(3) Once the map method's <key, value> output is available, the framework sorts it by key and runs the Combine step, which sums the values of identical keys. This step is done by the framework, as illustrated below:
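Under the same assumption, sorting and combining each split's map output yields the pairs below. Note that the code later in this post never calls job.setCombinerClass, so the combine step would in fact be skipped there; adding job.setCombinerClass(MyReducer.class) would enable it:

split of file1: <Bye, 1>, <Hello, 1>, <World, 2>
split of file2: <Bye, 1>, <Hadoop, 2>, <Hello, 1>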

Reduce phase:

The Reducer first sorts and merges the data received from the Mappers, then hands it to the user-defined reduce method, which produces the new <key, value> pairs, as illustrated below; here this takes four reduce invocations, one per distinct word:
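Continuing the assumed sample, each reduce call receives a word together with the list of its partial counts and sums them:

reduce(<Bye, [1, 1]>)   -> <Bye, 2>
reduce(<Hadoop, [2]>)   -> <Hadoop, 2>
reduce(<Hello, [1, 1]>) -> <Hello, 2>
reduce(<World, [2]>)    -> <World, 2>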

import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    /**
     * @param args
     * @author nwpulisz
     * @date 2016.3.29
     */
    static final String INPUT_PATH = "hdfs://192.168.255.132:9000/INPUT";
    static final String OUTPUT_PATH = "hdfs://192.168.255.132:9000/OUTPUT";

    public static void main(String[] args) throws Throwable {
        Configuration conf = new Configuration();
        Path outputPath = new Path(OUTPUT_PATH);
        Job job = new Job(conf, "WordCount");

        // Delete the output path in advance if it already exists.
        FileSystem fileSystem = FileSystem.get(new URI(OUTPUT_PATH), conf);
        if (fileSystem.exists(outputPath)) {
            fileSystem.delete(outputPath, true);
        }

        FileInputFormat.setInputPaths(job, INPUT_PATH);
        FileOutputFormat.setOutputPath(job, outputPath);
        job.setMapperClass(MyMapper.class);
        job.setReducerClass(MyReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        job.waitForCompletion(true);
    }

    static class MyMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Split the line on runs of non-word characters and emit <word, 1>.
            String[] splits = value.toString().split("\\W+");
            for (String word : splits) {
                context.write(new Text(word), new LongWritable(1));
            }
        }
    }

    static class MyReducer extends Reducer<Text, LongWritable, Text, LongWritable> {
        @Override
        protected void reduce(Text key, Iterable<LongWritable> values, Context context)
                throws IOException, InterruptedException {
            // Sum the partial counts for this word.
            long times = 0L;
            for (LongWritable value : values) {
                times += value.get();
            }
            context.write(key, new LongWritable(times));
        }
    }
}
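A sketch of how one might load the sample files and launch the job on Hadoop 1.x (the local file names and the jar name wordcount.jar are assumptions; the HDFS paths are the ones hardcoded above):

hadoop fs -mkdir /INPUT
hadoop fs -put file1.txt file2.txt /INPUT
hadoop jar wordcount.jar WordCount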

  

Input: the sample files under /INPUT, as described in the walkthrough above.

Output:
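With the assumed sample input, the result file under /OUTPUT (typically part-r-00000) would read as follows, TextOutputFormat separating key and value with a tab:

Bye	2
Hadoop	2
Hello	2
World	2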

Permission error:

The first run failed with a permission error: java.io.IOException: Failed to set permissions... The fix:

1. Locate FileUtil.java in the Hadoop source tree; on my machine the path is E:\hadoop-1.1.2\hadoop-1.1.2\src\core\org\apache\hadoop\fs.
2. Find the checkReturnValue method and comment out its body (a sketch of the result follows this list).

3. Recompile the file (the relevant Hadoop jars must be on the classpath), which produces two .class files.

4. Overwrite the two original class files in hadoop-core-1.1.2.jar\org\apache\hadoop\fs with the newly generated ones, and the error is resolved.
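A sketch of the patched method, based on my recollection of the Hadoop 1.x sources (verify the exact signature and body against your own copy of FileUtil.java):

private static void checkReturnValue(boolean rv, File p,
                                     FsPermission permission
                                     ) throws IOException {
    // Original body commented out so that permission failures on
    // Windows no longer abort the job:
    // if (!rv) {
    //     throw new IOException("Failed to set permissions of path: " + p +
    //                           " to " + String.format("%04o", permission.toShort()));
    // }
}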
