Hadoop: "Protocol message was too large. May be malicious. Use CodedInputStream.setSizeLimit() to increase the size limit"


I am seeing the following in the datanodes' logs. It is probably because I copied 5 million files into HDFS:

java.lang.IllegalStateException: com.google.protobuf.InvalidProtocolBufferException: Protocol message was too large.  May be malicious.  Use CodedInputStream.setSizeLimit() to increase the size limit.
at org.apache.hadoop.hdfs.protocol.BlockListAsLongs$BufferDecoder$1.next(BlockListAsLongs.java:332)
at org.apache.hadoop.hdfs.protocol.BlockListAsLongs$BufferDecoder$1.next(BlockListAsLongs.java:310)
at org.apache.hadoop.hdfs.protocol.BlockListAsLongs$BufferDecoder.getBlockListAsLongs(BlockListAsLongs.java:288)
at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.blockReport(DatanodeProtocolClientSideTranslatorPB.java:190)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.blockReport(BPServiceActor.java:507)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:738)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:874)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol message was too large.  May be malicious.  Use CodedInputStream.setSizeLimit() to increase the size limit.
at com.google.protobuf.InvalidProtocolBufferException.sizeLimitExceeded(InvalidProtocolBufferException.java:110)
at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:755)
at com.google.protobuf.CodedInputStream.readRawByte(CodedInputStream.java:769)
at com.google.protobuf.CodedInputStream.readRawVarint64(CodedInputStream.java:462)
at com.google.protobuf.CodedInputStream.readSInt64(CodedInputStream.java:363)
at org.apache.hadoop.hdfs.protocol.BlockListAsLongs$BufferDecoder$1.next(BlockListAsLongs.java:326)
... 7 more

I was simply copying files into HDFS with hadoop fs -put …. Recently I started getting these kinds of messages on the client side:

15/06/30 15:00:58 INFO hdfs.DFSClient: Could not complete /pdf-nxml/file1.nxml.COPYING retrying…

15/06/30 15:01:05 INFO hdfs.DFSClient: Could not complete /pdf-nxml/2014-full/file2.nxml.COPYING retrying…

I get a message like the ones above maybe three times a minute, but the exception shows up much more often on the datanodes.

How can I fix this?

----- EDIT -----

I had to restart Hadoop, and now it does not come up properly; each datanode's log file contains:

2015-07-01 06:20:35,748 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Unsuccessfully sent block report 0x2ac82e1cf6e64,  containing 1 storage report(s), of which we sent 0. The reports had 6342936 total blocks and used 0 RPC(s). This took 542 msec to generate and 240 msecs for RPC and NN processing. Got back no commands.
2015-07-01 06:20:35,748 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in BPOfferService for Block pool BP-1043486900-10.0.1.42-1434126972501 (Datanode Uuid d5dcf9a0-c82d-49d8-8162-af5910c3e3fe) service to cruncher02/10.0.1.42:8020
java.lang.IllegalStateException: com.google.protobuf.InvalidProtocolBufferException: Protocol message was too large.  May be malicious.  Use CodedInputStream.setSizeLimit() to increase the size limit.
at org.apache.hadoop.hdfs.protocol.BlockListAsLongs$BufferDecoder$1.next(BlockListAsLongs.java:332)
at org.apache.hadoop.hdfs.protocol.BlockListAsLongs$BufferDecoder$1.next(BlockListAsLongs.java:310)
at org.apache.hadoop.hdfs.protocol.BlockListAsLongs$BufferDecoder.getBlockListAsLongs(BlockListAsLongs.java:288)
at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.blockReport(DatanodeProtocolClientSideTranslatorPB.java:190)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.blockReport(BPServiceActor.java:507)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.offerService(BPServiceActor.java:738)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:874)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol message was too large.  May be malicious.  Use CodedInputStream.setSizeLimit() to increase the size limit.
at com.google.protobuf.InvalidProtocolBufferException.sizeLimitExceeded(InvalidProtocolBufferException.java:110)
at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:755)
at com.google.protobuf.CodedInputStream.readRawByte(CodedInputStream.java:769)
at com.google.protobuf.CodedInputStream.readRawVarint64(CodedInputStream.java:462)
at com.google.protobuf.CodedInputStream.readSInt64(CodedInputStream.java:363)
at org.apache.hadoop.hdfs.protocol.BlockListAsLongs$BufferDecoder$1.next(BlockListAsLongs.java:326)
... 7 more
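
By default, protobuf's CodedInputStream rejects any message larger than 64 MB, and a full block report covering roughly six million blocks (three varint-encoded longs per block, as the readSInt64 calls in the trace suggest) can easily exceed that. A commonly suggested mitigation, sketched below under the assumption of a Hadoop 2.x cluster, is to raise the RPC message size limit via the ipc.maximum.data.length property (default 67108864 bytes) in core-site.xml on the NameNode (and, since the failing decode above runs on the DataNode, plausibly on the DataNodes as well), then restart; the 128 MB value shown is only illustrative.

<property>
  <name>ipc.maximum.data.length</name>
  <!-- default is 67108864 (64 MB); 134217728 (128 MB) is an illustrative value, not a recommendation -->
  <value>134217728</value>
</property>

The underlying cause, though, is the sheer number of blocks reported by a single DataNode; consolidating the millions of small files (for example into SequenceFiles or HAR archives) would shrink the block report itself.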
