Hadoop question

飞来飞去1, posted 2018/10/26 16:36
Views: 74
Favorites: 1
The error log:

```
[WARN ] 2018-10-26 16:12:30,488 method:org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:587) job_local169316463_0001
java.lang.Exception: java.lang.RuntimeException: java.io.EOFException
	at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:489)
	at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:556)
Caused by: java.lang.RuntimeException: java.io.EOFException
	at org.apache.hadoop.io.WritableComparator.compare(WritableComparator.java:165)
	at org.apache.hadoop.mapreduce.task.ReduceContextImpl.nextKeyValue(ReduceContextImpl.java:158)
	at org.apache.hadoop.mapreduce.task.ReduceContextImpl.nextKey(ReduceContextImpl.java:121)
	at org.apache.hadoop.mapreduce.lib.reduce.WrappedReducer$Context.nextKey(WrappedReducer.java:302)
	at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:170)
	at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:627)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389)
	at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:346)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.EOFException
	at java.io.DataInputStream.readFully(DataInputStream.java:197)
	at java.io.DataInputStream.readUTF(DataInputStream.java:609)
	at java.io.DataInputStream.readUTF(DataInputStream.java:564)
	at cn.edu360.sy.order.OrderBean.readFields(OrderBean.java:91)
	at org.apache.hadoop.io.WritableComparator.compare(WritableComparator.java:158)
	... 12 more
[INFO ] 2018-10-26 16:12:31,133 method:org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1411) Job job_local169316463_0001 running in uber mode : false
[INFO ] 2018-10-26 16:12:31,134 method:org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1418) map 100% reduce 0%
[INFO ] 2018-10-26 16:12:31,135 method:org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1431) Job job_local169316463_0001 failed with state FAILED due to: NA
[INFO ] 2018-10-26 16:12:31,140 method:org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1436) Counters: 30
	File System Counters
		FILE: Number of bytes read=646
		FILE: Number of bytes written=282249
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
	Map-Reduce Framework
		Map input records=14
		Map output records=14
		Map output bytes=692
		Map output materialized bytes=732
		Input split bytes=109
		Combine input records=0
		Combine output records=0
		Reduce input groups=0
		Reduce shuffle bytes=732
		Reduce input records=0
		Reduce output records=0
		Spilled Records=14
		Shuffled Maps =2
		Failed Shuffles=0
		Merged Map outputs=2
		GC time elapsed (ms)=0
		Total committed heap usage (bytes)=257425408
	Shuffle Errors
		BAD_ID=0
		CONNECTION=0
		IO_ERROR=0
		WRONG_LENGTH=0
		WRONG_MAP=0
		WRONG_REDUCE=0
	File Input Format Counters
		Bytes Read=481
	File Output Format Counters
		Bytes Written=0
```

That is the error output. Searching Baidu suggests it means the class's serialization and deserialization don't match, but I don't think that's the cause in my case:

```java
public class OrderBean implements WritableComparable<OrderBean> { // typed so compareTo(OrderBean) overrides the interface method
    private String order_id;
    private String user_id;
    private String product_name;
    private float product_price;
    private int product_sum;
    private float product_sum_price;

    public OrderBean() {
    }

    public void set(String order_id, String user_id, String product_name,
                    float product_price, int product_sum, float product_sum_price) {
        this.order_id = order_id;
        this.user_id = user_id;
        this.product_name = product_name;
        this.product_price = product_price;
        this.product_sum = product_sum;
        this.product_sum_price = product_sum_price;
    }

    // ...... (getters and setters elided in the original post)

    @Override
    public void write(DataOutput paramDataOutput) throws IOException {
        paramDataOutput.writeUTF(this.getOrder_id());
        paramDataOutput.writeUTF(this.getUser_id());
        paramDataOutput.writeUTF(this.getProduct_name());
        paramDataOutput.writeFloat(this.getProduct_price());
        paramDataOutput.writeInt(this.getProduct_sum());
        paramDataOutput.writeFloat(this.getProduct_sum_price());
    }

    @Override
    public void readFields(DataInput paramDataInput) throws IOException {
        // Fields are read back in exactly the order write() emits them.
        this.setOrder_id(paramDataInput.readUTF());
        this.setUser_id(paramDataInput.readUTF());
        this.setProduct_name(paramDataInput.readUTF());
        this.setProduct_price(paramDataInput.readFloat());
        this.setProduct_sum(paramDataInput.readInt());
        this.setProduct_sum_price(paramDataInput.readFloat());
    }

    @Override
    public int compareTo(OrderBean o) {
        // Descending by order_id, then descending by total price.
        int i = o.getOrder_id().compareTo(this.order_id);
        int code = Float.compare(o.getProduct_sum_price(), this.getProduct_sum_price());
        return i == 0 ? code : i;
    }
}
```

Above is my class implementing the WritableComparable interface; could someone experienced please take a look?
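For what it's worth, the `write`/`readFields` pair above is internally consistent: the same fields, in the same order, with the same types. A plain-Java round trip (no Hadoop dependency; the sample field values below are made up for illustration) shows that this pattern by itself reads back cleanly:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class WritableRoundTrip {

    // Serialize fields in the same order OrderBean.write() does.
    static byte[] write(String orderId, String userId, String productName,
                        float price, int sum, float sumPrice) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bos);
        out.writeUTF(orderId);
        out.writeUTF(userId);
        out.writeUTF(productName);
        out.writeFloat(price);
        out.writeInt(sum);
        out.writeFloat(sumPrice);
        return bos.toByteArray();
    }

    // Read them back in the identical order, as OrderBean.readFields() does.
    static String readBack(byte[] bytes) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(bytes));
        String orderId = in.readUTF();
        String userId = in.readUTF();
        String productName = in.readUTF();
        float price = in.readFloat();
        int sum = in.readInt();
        float sumPrice = in.readFloat();
        return orderId + "," + userId + "," + productName + "," + price + "," + sum + "," + sumPrice;
    }

    public static void main(String[] args) throws IOException {
        byte[] bytes = write("o1", "u1", "apple", 2.5f, 4, 10.0f);
        System.out.println(readBack(bytes));  // prints "o1,u1,apple,2.5,4,10.0"
    }
}
```

So the EOFException in the trace is not caused by a write/read mismatch inside OrderBean itself; the bytes being deserialized must have been produced by something other than `OrderBean.write()`.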
飞来飞去1 (the original poster) replied:

I've found the problem. The mapper signature

```java
public static class OrderStep2Mapper extends Mapper<LongWritable, Text, Text, OrderBean> { ... }
```

needs to be changed to:

```java
public static class OrderStep2Mapper extends Mapper<LongWritable, Text, OrderBean, NullWritable> { ... }
```

The reason is that the MapTask's sort calls the key's compareTo() method, so OrderBean has to go in the key position rather than the value position. I haven't fully worked out why that mistake produces this particular error, though.
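One plausible mechanism, given the stack trace (an assumption, since the job's driver code isn't shown here): `WritableComparator.compare` deserializes both serialized keys with the key class's `readFields()` before comparing them. If a comparator or grouping comparator built for OrderBean is applied while the actual map output key is a short `Text`, `readFields()` tries to pull three UTF strings, a float, an int, and another float out of a buffer that only ever held one string, and `readUTF` runs off the end of the buffer. A dependency-free sketch of that failure (using `writeUTF` as a stand-in for the Text key bytes; real `Text` uses a vint length prefix, but the effect, too few bytes, is the same):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.EOFException;
import java.io.IOException;

public class EofDemo {

    // Bytes a single-string key would produce: one UTF-encoded string.
    static byte[] writeSingleString(String s) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bos);
        out.writeUTF(s);
        return bos.toByteArray();
    }

    // Read those bytes as if they were an OrderBean: far more fields
    // than were ever written. Returns true if EOFException is thrown.
    static boolean readAsOrderBean(byte[] bytes) {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(bytes));
        try {
            in.readUTF();    // order_id: consumes the entire buffer
            in.readUTF();    // user_id: nothing left to read -> EOFException
            in.readUTF();    // product_name
            in.readFloat();  // product_price
            in.readInt();    // product_sum
            in.readFloat();  // product_sum_price
            return false;
        } catch (EOFException e) {
            return true;     // same java.io.EOFException as in the job log
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(readAsOrderBean(writeSingleString("some-key")));  // prints "true"
    }
}
```

With OrderBean moved into the key position, the bytes the comparator deserializes really were written by `OrderBean.write()`, so `readFields()` finds every field it expects.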
