Expected timestamp in the Flume event headers, but it was null


Error Scenario:

Expected timestamp in the Flume event headers, but it was null – NullPointerException

This error message appears in the ~/logs/flume.log file when starting a Flume agent whose HDFS sink uses time-based format escape sequences (%Y, %m, %d, %H, %M, %S) in its path.
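For reference, below is a minimal sketch of an HDFS sink configuration that uses these escape sequences; the agent and component names (agent1, hdfs-sink) and the HDFS path are illustrative only.

agent1.sinks.hdfs-sink.type = hdfs
# Escape sequences in the path are resolved from each event's timestamp header
agent1.sinks.hdfs-sink.hdfs.path = hdfs://namenode:8020/flume/events/%Y/%m/%d/%H
agent1.sinks.hdfs-sink.hdfs.filePrefix = events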

If the sink expects a timestamp header but does not find one, events become stuck in the channel and Flume logs NullPointerException and EventDeliveryException errors.

Root Cause:

We receive this error message when the HDFS sink path uses format escape sequences for timestamp fields (year, month, day, hour, minute, second). If the events coming from the channel carry a timestamp in their headers, the HDFS sink resolves these escape sequences correctly and places the events into the corresponding partitions.

But if the events coming from the channel into the HDFS sink have no timestamp in their headers, the sink cannot resolve the escape sequences and we receive this error message.

Resolution:

We can either make sure that the events delivered to the Flume agent carry a timestamp in their headers, or we can use the HDFS sink property shown below to set up the agent to use its own local time to resolve the escape sequences and partition the events.
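A minimal sketch of that setting, assuming the same illustrative agent and sink names as above; the hdfs.useLocalTimeStamp property tells the HDFS sink to use the agent's local time instead of the event's timestamp header.

# Use the agent's local time when resolving %Y, %m, %d, %H, %M, %S
agent1.sinks.hdfs-sink.hdfs.useLocalTimeStamp = true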

One way to add the timestamp header to events automatically is to use the TimestampInterceptor on the source, as in the sketch below.
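A minimal sketch of that interceptor configuration, assuming an illustrative source name src1; Flume's built-in timestamp interceptor inserts a timestamp header into every event it processes.

# Attach the built-in timestamp interceptor to the source
agent1.sources.src1.interceptors = ts
agent1.sources.src1.interceptors.ts.type = timestamp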

After setting this property, restart the Flume agent with the updated flume.conf file. The agent will then resolve the escape sequences from the system time of the machine running the Flume agent process.




