FluentD cannot reach Elasticsearch (K8s, AWS)

Date: 2018-03-15 11:26:31

Tags: elasticsearch kubernetes fluentd

I have a k8s cluster (v1.9) running on AWS. Elasticsearch is up and running and works fine.

For some reason, the fluentd logs are not reaching Elasticsearch.

The fluentd DaemonSet:

{
  "kind": "DaemonSet",
  "apiVersion": "extensions/v1beta1",
  "metadata": {
    "name": "fluentd-es-v2.0.3",
    "namespace": "kube-system",
    "selfLink": "/apis/extensions/v1beta1/namespaces/kube-system/daemonsets/fluentd-es-v2.0.3",
    "uid": "f0a23779-fba7-11e7-a9be-12d5302c43be",
    "resourceVersion": "10549372",
    "generation": 1,
    "creationTimestamp": "2018-01-17T17:00:37Z",
    "labels": {
      "addonmanager.kubernetes.io/mode": "Reconcile",
      "k8s-app": "fluentd-es",
      "kubernetes.io/cluster-service": "true",
      "version": "v2.0.3"
    }
  },
  "spec": {
    "selector": {
      "matchLabels": {
        "k8s-app": "fluentd-es",
        "version": "v2.0.3"
      }
    },
    "template": {
      "metadata": {
        "creationTimestamp": null,
        "labels": {
          "k8s-app": "fluentd-es",
          "kubernetes.io/cluster-service": "true",
          "version": "v2.0.3"
        },
        "annotations": {
          "scheduler.alpha.kubernetes.io/critical-pod": ""
        }
      },
      "spec": {
        "volumes": [
          {
            "name": "varlog",
            "hostPath": {
              "path": "/var/log",
              "type": ""
            }
          },
          {
            "name": "varlibdockercontainers",
            "hostPath": {
              "path": "/var/lib/docker/containers",
              "type": ""
            }
          },
          {
            "name": "libsystemddir",
            "hostPath": {
              "path": "/usr/lib64",
              "type": ""
            }
          },
          {
            "name": "config-volume",
            "configMap": {
              "name": "fluentd-es-config-v0.1.2",
              "defaultMode": 420
            }
          }
        ],
        "containers": [
          {
            "name": "fluentd-es",
            "image": "gcr.io/google-containers/fluentd-elasticsearch:v2.0.3",
            "env": [
              {
                "name": "FLUENTD_ARGS",
                "value": "--no-supervisor -q"
              }
            ],
            "resources": {
              "limits": {
                "memory": "500Mi"
              },
              "requests": {
                "cpu": "100m",
                "memory": "200Mi"
              }
            },
            "volumeMounts": [
              {
                "name": "varlog",
                "mountPath": "/var/log"
              },
              {
                "name": "varlibdockercontainers",
                "readOnly": true,
                "mountPath": "/var/lib/docker/containers"
              },
              {
                "name": "libsystemddir",
                "readOnly": true,
                "mountPath": "/host/lib"
              },
              {
                "name": "config-volume",
                "mountPath": "/etc/fluent/config.d"
              }
            ],
            "livenessProbe": {
              "exec": {
                "command": [
                  "/bin/sh",
                  "-c",
                  "LIVENESS_THRESHOLD_SECONDS=${LIVENESS_THRESHOLD_SECONDS:-300}; STUCK_THRESHOLD_SECONDS=${LIVENESS_THRESHOLD_SECONDS:-900}; if [ ! -e /var/log/fluentd-buffers ]; then\n  exit 1;\nfi; LAST_MODIFIED_DATE=`stat /var/log/fluentd-buffers | grep Modify | sed -r \"s/Modify: (.*)/\\1/\"`; LAST_MODIFIED_TIMESTAMP=`date -d \"$LAST_MODIFIED_DATE\" +%s`; if [ `date +%s` -gt `expr $LAST_MODIFIED_TIMESTAMP + $STUCK_THRESHOLD_SECONDS` ]; then\n  rm -rf /var/log/fluentd-buffers;\n  exit 1;\nfi; if [ `date +%s` -gt `expr $LAST_MODIFIED_TIMESTAMP + $LIVENESS_THRESHOLD_SECONDS` ]; then\n  exit 1;\nfi;\n"
                ]
              },
              "initialDelaySeconds": 600,
              "timeoutSeconds": 1,
              "periodSeconds": 60,
              "successThreshold": 1,
              "failureThreshold": 3
            },
            "terminationMessagePath": "/dev/termination-log",
            "terminationMessagePolicy": "File",
            "imagePullPolicy": "IfNotPresent"
          }
        ],
        "restartPolicy": "Always",
        "terminationGracePeriodSeconds": 30,
        "dnsPolicy": "ClusterFirst",
        "nodeSelector": {
          "beta.kubernetes.io/arch": "amd64"
        },
        "securityContext": {},
        "schedulerName": "default-scheduler"
      }
    },
    "updateStrategy": {
      "type": "RollingUpdate",
      "rollingUpdate": {
        "maxUnavailable": 1
      }
    },
    "templateGeneration": 1,
    "revisionHistoryLimit": 10
  },
  "status": {
    "currentNumberScheduled": 2,
    "numberMisscheduled": 0,
    "desiredNumberScheduled": 2,
    "numberReady": 2,
    "observedGeneration": 1,
    "updatedNumberScheduled": 2,
    "numberAvailable": 2
  }
}

Looking at the pod's logs, there seem to be errors:

  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/elasticsearch-transport-6.0.0/lib/elasticsearch/transport/transport/http/faraday.rb:20:in `perform_request'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/elasticsearch-transport-6.0.0/lib/elasticsearch/transport/client.rb:131:in `perform_request'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/elasticsearch-api-6.0.0/lib/elasticsearch/api/actions/ping.rb:20:in `ping'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/fluent-plugin-elasticsearch-1.9.7/lib/fluent/plugin/out_elasticsearch.rb:163:in `client'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/fluent-plugin-elasticsearch-1.9.7/lib/fluent/plugin/out_elasticsearch.rb:364:in `rescue in send_bulk'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/fluent-plugin-elasticsearch-1.9.7/lib/fluent/plugin/out_elasticsearch.rb:359:in `send_bulk'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/fluent-plugin-elasticsearch-1.9.7/lib/fluent/plugin/out_elasticsearch.rb:346:in `write_objects'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/fluentd-0.12.42/lib/fluent/output.rb:490:in `write'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/fluentd-0.12.42/lib/fluent/buffer.rb:354:in `write_chunk'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/fluentd-0.12.42/lib/fluent/buffer.rb:333:in `pop'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/fluentd-0.12.42/lib/fluent/output.rb:342:in `try_flush'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/fluentd-0.12.42/lib/fluent/output.rb:149:in `run'
2018-03-15 11:20:09 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2018-03-15 11:20:10 +0000 error_class="Elasticsearch::Transport::Transport::Errors::Forbidden" error="[403] " plugin_id="object:3f7e3ab9f4b8"
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/elasticsearch-transport-6.0.0/lib/elasticsearch/transport/transport/base.rb:202:in `__raise_transport_error'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/elasticsearch-transport-6.0.0/lib/elasticsearch/transport/transport/base.rb:319:in `perform_request'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/elasticsearch-transport-6.0.0/lib/elasticsearch/transport/transport/http/faraday.rb:20:in `perform_request'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/elasticsearch-transport-6.0.0/lib/elasticsearch/transport/client.rb:131:in `perform_request'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/elasticsearch-api-6.0.0/lib/elasticsearch/api/actions/ping.rb:20:in `ping'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/fluent-plugin-elasticsearch-1.9.7/lib/fluent/plugin/out_elasticsearch.rb:163:in `client'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/fluent-plugin-elasticsearch-1.9.7/lib/fluent/plugin/out_elasticsearch.rb:364:in `rescue in send_bulk'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/fluent-plugin-elasticsearch-1.9.7/lib/fluent/plugin/out_elasticsearch.rb:359:in `send_bulk'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/fluent-plugin-elasticsearch-1.9.7/lib/fluent/plugin/out_elasticsearch.rb:346:in `write_objects'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/fluentd-0.12.42/lib/fluent/output.rb:490:in `write'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/fluentd-0.12.42/lib/fluent/buffer.rb:354:in `write_chunk'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/fluentd-0.12.42/lib/fluent/buffer.rb:333:in `pop'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/fluentd-0.12.42/lib/fluent/output.rb:342:in `try_flush'
  2018-03-15 11:20:09 +0000 [warn]: /var/lib/gems/2.3.0/gems/fluentd-0.12.42/lib/fluent/output.rb:149:in `run'
2018-03-15 11:20:10 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2018-03-15 11:20:12 +0000 error_class="Elasticsearch::Transport::Transport::Errors::Forbidden" error="[403] " plugin_id="object:3f7e3ab9f4b8"
  2018-03-15 11:20:10 +0000 [warn]: suppressed same stacktrace
2018-03-15 11:20:12 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2018-03-15 11:20:16 +0000 error_class="Elasticsearch::Transport::Transport::Errors::Forbidden" error="[403] " plugin_id="object:3f7e3ab9f4b8"
  2018-03-15 11:20:12 +0000 [warn]: suppressed same stacktrace
2018-03-15 11:20:16 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2018-03-15 11:20:25 +0000 error_class="Elasticsearch::Transport::Transport::Errors::Forbidden" error="[403] " plugin_id="object:3f7e3ab9f4b8"
  2018-03-15 11:20:16 +0000 [warn]: suppressed same stacktrace
2018-03-15 11:20:25 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2018-03-15 11:20:39 +0000 error_class="Elasticsearch::Transport::Transport::Errors::Forbidden" error="[403] " plugin_id="object:3f7e3ab9f4b8"
  2018-03-15 11:20:25 +0000 [warn]: suppressed same stacktrace
2018-03-15 11:20:39 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2018-03-15 11:21:08 +0000 error_class="Elasticsearch::Transport::Transport::Errors::Forbidden" error="[403] " plugin_id="object:3f7e3ab9f4b8"
  2018-03-15 11:20:39 +0000 [warn]: suppressed same stacktrace
2018-03-15 11:21:08 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2018-03-15 11:21:38 +0000 error_class="Elasticsearch::Transport::Transport::Errors::Forbidden" error="[403] " plugin_id="object:3f7e3ab9f4b8"
  2018-03-15 11:21:08 +0000 [warn]: suppressed same stacktrace
2018-03-15 11:21:38 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2018-03-15 11:22:08 +0000 error_class="Elasticsearch::Transport::Transport::Errors::Forbidden" error="[403] " plugin_id="object:3f7e3ab9f4b8"
  2018-03-15 11:21:38 +0000 [warn]: suppressed same stacktrace
2018-03-15 11:22:08 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2018-03-15 11:22:38 +0000 error_class="Elasticsearch::Transport::Transport::Errors::Forbidden" error="[403] " plugin_id="object:3f7e3ab9f4b8"
  2018-03-15 11:22:08 +0000 [warn]: suppressed same stacktrace
2018-03-15 11:22:38 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2018-03-15 11:23:08 +0000 error_class="Elasticsearch::Transport::Transport::Errors::Forbidden" error="[403] " plugin_id="object:3f7e3ab9f4b8"
  2018-03-15 11:22:38 +0000 [warn]: suppressed same stacktrace
2018-03-15 11:23:08 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2018-03-15 11:23:38 +0000 error_class="Elasticsearch::Transport::Transport::Errors::Forbidden" error="[403] " plugin_id="object:3f7e3ab9f4b8"
  2018-03-15 11:23:08 +0000 [warn]: suppressed same stacktrace
2018-03-15 11:23:38 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2018-03-15 11:24:08 +0000 error_class="Elasticsearch::Transport::Transport::Errors::Forbidden" error="[403] " plugin_id="object:3f7e3ab9f4b8"
  2018-03-15 11:23:38 +0000 [warn]: suppressed same stacktrace

I can't figure out what is causing the `temporarily failed to flush the buffer. error_class="Elasticsearch::Transport::Transport::Errors::Forbidden" error="[403]"` errors, and I haven't been able to resolve them.

Any help?

2 Answers:

Answer 0 (score: 0)

It looks like your fluentd daemon is not authorized to write to Elasticsearch. This can happen, for example, when the nodes fluentd runs on have the wrong IAM role.

There are a few ways you can fix it:

  1. You can add the correct role to the nodes your Kubernetes cluster runs on, following this AWS guide.
  2. Alternatively, you can use Kube2iam to assign the correct IAM role to your pods.
  3. Less secure, but you can allow access to ES from the Kubernetes nodes in your VPC by their IP addresses. Here is a guide about ip-based policies.
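For option 3, an IP-based access policy attached to the ES domain might look like the following sketch. The account ID, domain name, region, and CIDR range are placeholders to be replaced with your own values:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "*" },
      "Action": "es:*",
      "Resource": "arn:aws:es:us-east-1:111122223333:domain/my-es-domain/*",
      "Condition": {
        "IpAddress": {
          "aws:SourceIp": ["10.0.0.0/16"]
        }
      }
    }
  ]
}
```

With this in place, unsigned requests from the listed source IPs (e.g. your nodes' VPC range, or their public NAT addresses if traffic leaves the VPC) are allowed; everything else still gets a 403.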

Answer 1 (score: 0)

I had a similar problem. I wrote a detailed response for another StackOverflow question, so I won't repeat it all here.

I am using a different fluentd docker image and daemon configuration than you are. These are the settings used in the <match **> section of the fluentd config file:

  • The host field should include https:// at the beginning of the URL
  • Set the port to 443
  • Set the scheme to https
  • The AWS ES service has no username or password, so those fields should not appear in the config. See this discussion
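Putting the settings above together, a `<match **>` block might look like the following sketch. The endpoint hostname is a placeholder for your own AWS ES domain endpoint, and the remaining options are only illustrative:

```
<match **>
  @type elasticsearch
  # Per the settings above: https:// prefix on the host, port 443, scheme https
  host https://search-my-es-domain-abc123.us-east-1.es.amazonaws.com
  port 443
  scheme https
  logstash_format true
  # No user/password lines: the AWS ES service does not use basic auth
</match>
```

Note that if your domain's access policy requires signed requests (an IAM-based policy rather than an IP-based one), the plain elasticsearch output will still get 403s, and you would need a signing-aware output plugin or an IP-based policy as described in the other answer.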