Is there a way to mount volumes in a Kubernetes deployment generated from Docker Compose?

Asked: 2019-11-05 20:37:37

Tags: docker kubernetes docker-compose docker-volume

I am trying to use kompose convert on my docker-compose.yaml file, but when I run the command:

kompose convert -f docker-compose.yaml

I get this output:

WARN Volume mount on the host "/home/centos/Sprint0Demo/Servers/elasticSearchConnector/etc/kafka-connect" isn't supported - ignoring path on the host
WARN Volume mount on the host "/home/centos/Sprint0Demo/Servers/elasticSearchConnector/etc/kafka-elasticsearch" isn't supported - ignoring path on the host
WARN Volume mount on the host "/home/centos/Sprint0Demo/Servers/elasticSearchConnector/etc/kafak" isn't supported - ignoring path on the host

It also shows similar warnings for the other persistent volumes.

My docker-compose file is:

version: '3'
services:
  es01:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.2.1
    container_name: es01
    environment:
      [env]
    ulimits:
      nproc: 3000
      nofile: 65536
      memlock: -1
    volumes:
      - /home/centos/Sprint0Demo/Servers/elasticsearch:/usr/share/elasticsearch/data
    ports:
      - 9200:9200
    networks:
      - kafka_demo
  zookeeper:
    image: confluentinc/cp-zookeeper
    container_name: zookeeper
    ports:
      - "2181:2181"
    environment:
        ZOOKEEPER_CLIENT_PORT: 2181
    volumes:
      - /home/centos/Sprint0Demo/Servers/zookeeper/zk-data:/var/lib/zookeeper/data
      - /home/centos/Sprint0Demo/Servers/zookeeper/zk-txn-logs:/var/lib/zookeeper/log
    networks:
      kafka_demo:
  kafka0:
    image: confluentinc/cp-kafka
    container_name: kafka0
    environment:
      [env]
    volumes:
      - /home/centos/Sprint0Demo/Servers/kafkaData:/var/lib/kafka/data
    ports:
      - "9092:9092"
    depends_on:
      - zookeeper
      - es01
    networks:
      kafka_demo:
  schema_registry:
    image: confluentinc/cp-schema-registry:latest
    container_name: schema_registry
    environment:
      [env]
    ports:
      - 8081:8081
    networks:
      - kafka_demo
    depends_on:
      - kafka0
      - es01
  elasticSearchConnector:
    image: confluentinc/cp-kafka-connect:latest
    container_name: elasticSearchConnector
    environment:
        [env]
    volumes:
      - /home/centos/Sprint0Demo/Servers/elasticSearchConnector/etc/kafka-connect:/etc/kafka-connect
      - /home/centos/Sprint0Demo/Servers/elasticSearchConnector/etc/kafka-elasticsearch:/etc/kafka-elasticsearch
      - /home/centos/Sprint0Demo/Servers/elasticSearchConnector/etc/kafak:/etc/kafka
    ports:
      - "28082:28082"
    networks:
      - kafka_demo
    depends_on:
      - kafka0
      - es01
networks:
  kafka_demo:
    driver: bridge

Does anyone know how I can fix this? I assume it has to do with the error message, which suggests a difference between a volume mount and a host mount?

1 answer:

Answer 0 (score: 0)

I did some research, and there are three points to make:

  1. kompose does not support volume mounts on the host. You could consider using emptyDir instead.

  2. Kubernetes makes it hard to pass host/root volumes through. You can try hostPath volumes: running kompose convert --volumes hostPath makes kompose generate hostPath volumes for k8s instead of ignoring the host paths.

  3. If you want to run everything on a single machine, you can also check out Compose on Kubernetes.
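To illustrate point 2: a sketch of what the hostPath route produces. After running kompose convert -f docker-compose.yaml --volumes hostPath, the Deployment kompose emits for es01 should contain a volume section roughly like the one below. The field names are standard Kubernetes; the volume name es01-hostpath0 and the exact labels kompose adds are assumptions, so check the generated file.

```yaml
# Sketch of the es01 Deployment fragment that
#   kompose convert -f docker-compose.yaml --volumes hostPath
# should produce. The volume name "es01-hostpath0" is illustrative;
# kompose may pick a different generated name.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: es01
spec:
  template:
    spec:
      containers:
        - name: es01
          image: docker.elastic.co/elasticsearch/elasticsearch:7.2.1
          volumeMounts:
            - name: es01-hostpath0
              mountPath: /usr/share/elasticsearch/data
      volumes:
        - name: es01-hostpath0
          hostPath:
            # Same host directory the compose file bind-mounted
            path: /home/centos/Sprint0Demo/Servers/elasticsearch
```

Note that hostPath ties the pod to a specific node's filesystem, so this mainly suits single-node setups like the one described here.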

Please let me know if this helps.
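As for point 1, if the data does not need to survive the pod being deleted, the same mount can be written by hand with emptyDir. A minimal sketch (the pod and volume names are made up for illustration):

```yaml
# Sketch: replacing the host bind-mount with an emptyDir volume.
# Caveat: emptyDir is wiped when the pod is removed, so this suits
# scratch/cache data, not durable Elasticsearch indices.
apiVersion: v1
kind: Pod
metadata:
  name: es01
spec:
  containers:
    - name: es01
      image: docker.elastic.co/elasticsearch/elasticsearch:7.2.1
      volumeMounts:
        - name: es-data            # hypothetical volume name
          mountPath: /usr/share/elasticsearch/data
  volumes:
    - name: es-data
      emptyDir: {}
```

For data that must persist, a PersistentVolumeClaim (kompose's default --volumes mode) is the more portable choice.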