Twitter JSON data cannot be queried in Hive

Time: 2015-06-25 14:46:05

Tags: json hadoop twitter hive

I am trying to do Twitter sentiment analysis using Flume, Hadoop and Hive, following this article. Using Flume, I was able to successfully send tweets to HDFS. Here is my Twitter agent configuration.



#setting properties of agent
Twitter-agent.sources=source1
Twitter-agent.channels=channel1
Twitter-agent.sinks=sink1

#configuring sources
Twitter-agent.sources.source1.type=com.cloudera.flume.source.TwitterSource
Twitter-agent.sources.source1.channels=channel1
Twitter-agent.sources.source1.consumerKey=<consumer-key>
Twitter-agent.sources.source1.consumerSecret=<consumer-secret>
Twitter-agent.sources.source1.accessToken=<access-token>
Twitter-agent.sources.source1.accessTokenSecret=<Access-Token-secret>
Twitter-agent.sources.source1.keywords= morning, night, hadoop, bigdata

#configuring channels
Twitter-agent.channels.channel1.type=memory
Twitter-agent.channels.channel1.capacity=10000
Twitter-agent.channels.channel1.transactionCapacity=100

#configuring sinks
Twitter-agent.sinks.sink1.channel=channel1
Twitter-agent.sinks.sink1.type=hdfs
Twitter-agent.sinks.sink1.hdfs.path=flume/tweets
Twitter-agent.sinks.sink1.rollSize=0
Twitter-agent.sinks.sink1.rollCount=10000
Twitter-agent.sinks.sink1.batchSize=1000
Twitter-agent.sinks.sink1.fileType=DataStream
Twitter-agent.sinks.sink1.writeFormat=Text
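For reference, assuming the file above is saved as twitter.conf, an agent like this is typically launched with the standard flume-ng command shown below (the paths are illustrative assumptions, not from the original post):

```shell
# Start the agent defined above; --name must match the agent name
# used in the property keys (Twitter-agent).
flume-ng agent \
  --conf /usr/local/flume/conf \
  --conf-file /usr/local/flume/conf/twitter.conf \
  --name Twitter-agent \
  -Dflume.root.logger=INFO,console
```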

Then I created a table exactly as it was created in the article. When I query the table, it gives an error like this:

hive> select * from tweets;
OK
Failed with exception java.io.IOException:org.apache.hadoop.hive.serde2.SerDeException: org.codehaus.jackson.JsonParseException: Unexpected character ('S' (code 83)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
 at [Source: java.io.StringReader@31228d83; line: 1, column: 2]
Time taken: 0.914 seconds

I tried other queries such as select count(id) from tweets, but they show many errors as well.

Here is one of the FlumeData files (tweets) present in HDFS.

SEQ!org.apache.hadoop.io.LongWritable"org.apache.hadoop.io.BytesWritable;�@z_�>��<���N ����{"in_reply_to_status_id_str":"613363183034601472","in_reply_to_status_id":613363183034601472,"created_at":"Tue Jun 23 15:09:32 +0000 2015","in_reply_to_user_id_str":"604605328","source":"<a href=\"http://twitter.com/download/iphone\" rel=\"nofollow\">Twitter for iPhone<\/a>","retweet_count":0,"retweeted":false,"geo":null,"filter_level":"low","in_reply_to_screen_name":"AlexiBlue","id_str":"613363262760034304","in_reply_to_user_id":604605328,"favorite_count":0,"id":613363262760034304,"text":"@AlexiBlue good morning ☺️","place":null,"lang":"en","favorited":false,"possibly_sensitive":false,"coordinates":null,"truncated":false,"timestamp_ms":"1435072172237","entities":{"urls":[],"hashtags":[],"user_mentions":[{"indices":[0,10],"screen_name":"AlexiBlue","id_str":"604605328","name":"Alexi Blue ★","id":604605328}],"trends":[],"symbols":[]},"contributors":null,"user":{"utc_offset":null,"friends_count":1175,"profile_image_url_https":"https://pbs.twimg.com/profile_images/604664190763212800/Nmqxn_p5_normal.jpg","listed_count":6,"profile_background_image_url":"http://abs.twimg.com/images/themes/theme1/bg.png","default_profile_image":false,"favourites_count":31695,"description":"PIZZA & TACOS ARE LIFE. 
#flippinfamily #rudunation #ABNation #5quad #7squad #SamCollinsisbaeaf","created_at":"Sun Mar 09 02:40:15 +0000 2014","is_translator":false,"profile_background_image_url_https":"https://abs.twimg.com/images/themes/theme1/bg.png","protected":false,"screen_name":"Sonja_Campbell1","id_str":"2379671544","profile_link_color":"3B94D9","id":2379671544,"geo_enabled":true,"profile_background_color":"C0DEED","lang":"en","profile_sidebar_border_color":"C0DEED","profile_text_color":"333333","verified":false,"profile_image_url":"http://pbs.twimg.com/profile_images/604664190763212800/Nmqxn_p5_normal.jpg","time_zone":null,"url":null,"contributors_enabled":false,"profile_background_tile":false,"profile_banner_url":"https://pbs.twimg.com/profile_banners/2379671544/1434956813","statuses_count":17254,"follow_request_sent":null,"followers_count":871,"profile_use_background_image":true,"default_profile":false,"following":null,"name":"Sonita✨","location":"","profile_sidebar_fill_color":"DDEEF6","notifications":null}}

Can anyone help me with this?

3 answers:

Answer 0: (score: 0)

SerDe is short for Serializer/Deserializer. Hive uses the SerDe interface for IO, and JSON is one of the widely supported formats. I can see a SerDe exception and JSON in your error message, so the problem is related to marshalling and unmarshalling the JSON data in your Hive table columns. Check which columns you are loading the JSON data into. Happy coding.
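Note that the FlumeData file shown in the question starts with "SEQ", the Hadoop SequenceFile magic header, rather than with "{". Any JSON parser rejects such input, which matches the unexpected-character failure Jackson reports in the error message. A quick check (not part of the original answer) reproduces the same kind of failure:

```python
import json

# The file begins with the SequenceFile header, not a JSON value,
# so parsing fails on the very first characters.
sample = 'SEQ!org.apache.hadoop.io.LongWritable"org.apache.hadoop.io.BytesWritable'
try:
    json.loads(sample)
except json.JSONDecodeError as e:
    print("parse failed:", e)
```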

Answer 1: (score: 0)

Download "hive-json-serde.jar" and add it in the Hive shell before querying any table that contains SerDe data such as JSON.

You have to do this every time you open the Hive shell.
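To avoid re-adding the jar by hand in every session, the ADD JAR statement can be placed in a ~/.hiverc file, which the Hive CLI sources automatically at startup (the jar path below is an assumption; adjust it to wherever you saved the jar):

```sql
-- ~/.hiverc: executed automatically when the Hive CLI starts
ADD JAR /usr/local/hive/lib/hive-json-serde.jar;
```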

Answer 2: (score: 0)

You need to download hive-serdes-1.0-SNAPSHOT.jar, which has a JSON SerDe provided by Cloudera, and add it in your Hive shell. Then you need to create the table with the columns you require.

For example:

create external table load_tweets(id BIGINT, text STRING)
ROW FORMAT SERDE 'com.cloudera.hive.serde.JSONSerDe'
LOCATION '/user/flume/tweets'

The tweet_id and tweeted_text are sufficient for performing sentiment analysis. Now, if you run

select * from load_tweets;

you will see the data containing tweet_id and tweet_text in your Hive table.

You can refer to the link below, where sentiment analysis is explained clearly with screenshots.

https://acadgild.com/blog/sentiment-analysis-on-tweets-with-apache-hive-using-afinn-dictionary/
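As a rough illustration of the AFINN-style scoring that the linked article applies to the extracted tweet text, here is a minimal sketch in Python (the word scores below are illustrative stand-ins, not the real AFINN dictionary):

```python
# Tiny stand-in for an AFINN-style word-score dictionary.
AFINN_SAMPLE = {"good": 3, "happy": 3, "bad": -3, "terrible": -3}

def score(tweet_text: str) -> int:
    """Sum the scores of known words; unknown words count as 0."""
    return sum(AFINN_SAMPLE.get(w, 0) for w in tweet_text.lower().split())

# The sample tweet above ("@AlexiBlue good morning ...") scores positive.
print(score("good morning"))  # → 3
```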