First, I set up MHN (Modern Honey Network) on an Ubuntu Server 16.04 machine and deployed the sensor on the MHN server itself. During the MHN setup I chose yes to install the ELK stack (Elasticsearch, Logstash and Kibana) on the MHN server. When I run sudo supervisorctl status, it shows:
geoloc RUNNING pid 1248, uptime 2 days, 22:16:13
honeymap RUNNING pid 1246, uptime 2 days, 22:16:13
hpfeeds-broker RUNNING pid 1253, uptime 2 days, 22:16:13
hpfeeds-logger-json RUNNING pid 1250, uptime 2 days, 22:16:13
kibana RUNNING pid 1244, uptime 2 days, 22:16:13
logstash RUNNING pid 1252, uptime 2 days, 22:16:13
mhn-celery-beat RUNNING pid 1243, uptime 2 days, 22:16:13
mhn-celery-worker RUNNING pid 1247, uptime 2 days, 22:16:13
mhn-collector RUNNING pid 1251, uptime 2 days, 22:16:13
mhn-uwsgi RUNNING pid 1249, uptime 2 days, 22:16:13
mnemosyne RUNNING pid 1242, uptime 2 days, 22:16:13
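Since every service reports RUNNING, this does not look like a crashed process. One way to look a little deeper (assuming the service names from the supervisorctl output above and MHN's default log directory /var/log/mhn/, both taken from my install) is to follow the individual service logs:

sudo supervisorctl tail -f logstash    # follow the stdout that supervisord captures for the logstash service
ls -l /var/log/mhn/                    # MHN's own log files, e.g. mhn.log and mhn-json.log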
However, when I open the Kibana dashboard and try to match the index pattern for MHN, Kibana is unable to fetch the mapping for my mhn index.
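Kibana can only match an index pattern after Logstash has actually created the corresponding index in Elasticsearch, so one thing worth confirming is whether any MHN index exists at all. A minimal check, assuming Elasticsearch listens on localhost:9200 and the MHN dashboards use an index pattern along the lines of mhn-* (both assumptions, adjust to your install):

curl -s 'http://localhost:9200/_cat/indices?v'          # list all indices; look for mhn-... entries
curl -s 'http://localhost:9200/mhn-*/_mapping?pretty'    # an empty or error response here matches Kibana failing to fetch the mapping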
While MHN was counting attacks from the honeypot sensor, I checked the log file
/var/log/mhn/mhn-json.log
and found the error below; meanwhile Kibana still cannot visualize my MHN (Modern Honey Network) data:
{:timestamp=>"2019-10-03T16:00:01.170000+0800", :message=>"Got error to send bulk of actions: Could not find serializer for class org.jruby.RubyObject", :level=>:error}
{:timestamp=>"2019-10-03T16:00:01.170000+0800", :message=>"Failed to flush outgoing items", :outgoing_count=>1, :exception=>"JrJackson::ParseError", :backtrace=>[
"com/jrjackson/JrJacksonBase.java:78:in `generate'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/jrjackson-0.3.7/lib/jrjackson/jrjackson.rb:59:in `dump'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/multi_json-1.11.2/lib/multi_json/adapters/jr_jackson.rb:20:in `dump'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/multi_json-1.11.2/lib/multi_json/adapter.rb:25:in `dump'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/multi_json-1.11.2/lib/multi_json.rb:136:in `dump'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.15/lib/elasticsearch/api/utils.rb:102:in `__bulkify'",
"org/jruby/RubyArray.java:2414:in `map'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.15/lib/elasticsearch/api/utils.rb:102:in `__bulkify'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.15/lib/elasticsearch/api/actions/bulk.rb:82:in `bulk'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-1.1.0-java/lib/logstash/outputs/elasticsearch/protocol.rb:105:in `bulk'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-1.1.0-java/lib/logstash/outputs/elasticsearch.rb:548:in `submit'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-1.1.0-java/lib/logstash/outputs/elasticsearch.rb:547:in `submit'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-1.1.0-java/lib/logstash/outputs/elasticsearch.rb:572:in `flush'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-1.1.0-java/lib/logstash/outputs/elasticsearch.rb:571:in `flush'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.21/lib/stud/buffer.rb:219:in `buffer_flush'",
"org/jruby/RubyHash.java:1342:in `each'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.21/lib/stud/buffer.rb:216:in `buffer_flush'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.21/lib/stud/buffer.rb:159:in `buffer_flush'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.21/lib/stud/buffer.rb:193:in `buffer_receive'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-1.1.0-java/lib/logstash/outputs/elasticsearch.rb:537:in `receive'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.6-java/lib/logstash/outputs/base.rb:88:in `handle'",
"(eval):163:in `output_func'",
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.6-java/lib/logstash/pipeline.rb:244:in `start_outputs'"], :level=>:warn}
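If every bulk flush fails with this JrJackson serializer error, Logstash presumably never gets any event into Elasticsearch, which would also explain why Kibana cannot fetch a mapping for the mhn index. Two quick checks that may help narrow it down (assuming Elasticsearch on localhost:9200 and the JSON feed at /var/log/mhn/mhn-json.log as above):

curl -s 'http://localhost:9200/_cluster/health?pretty'            # confirm Elasticsearch itself is reachable and not red
sudo tail -n 1 /var/log/mhn/mhn-json.log | python -m json.tool    # pretty-print the latest event Logstash is trying to index

If that last event contains a field that is not a plain string or number, it is probably the org.jruby.RubyObject the serializer is complaining about.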