Dockerfile fails to run cp command to move a file inside the container

Date: 2019-06-26 08:39:40

Tags: docker dockerfile

Hi, I am trying to download a file inside a container and move it to a specific location within the container.

RUN wget https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-latest-hadoop2.jar 
RUN cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/
RUN cp /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf
RUN echo "spark.hadoop.google.cloud.auth.service.account.enable true" > /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf

But this fails with the following error:

Step 44/46 : RUN cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/
 ---> Running in 8c81d9871377
cp: cannot create regular file '/opt/spark-2.2.1-bin-hadoop2.7/jars/': No such file or directory
The command '/bin/sh -c cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/' returned a non-zero code: 1

EDIT-1: Error screenshot

I tried the suggested solution, and now I get the following error:

Removing intermediate container e885431017e8
Step 43/44 : COPY /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf
lstat opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template: no such file or directory

3 Answers:

Answer 0 (score: 1):

Does the path /opt/spark-2.2.1-bin-hadoop2.7/jars/ already exist in your container?

If not, add this before the cp command:

mkdir -p /opt/spark-2.2.1-bin-hadoop2.7/jars/

Then try the copy like this:

cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/gcs-connector-latest-hadoop2.jar
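
Combined, a minimal sketch of the fixed steps (assuming the Spark distribution is actually unpacked at /opt/spark-2.2.1-bin-hadoop2.7 by an earlier step in the Dockerfile):

# download the connector jar into the build's current working directory
RUN wget https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-latest-hadoop2.jar
# create the target directory if it does not exist yet
RUN mkdir -p /opt/spark-2.2.1-bin-hadoop2.7/jars/
# copy with an explicit destination filename
RUN cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/gcs-connector-latest-hadoop2.jar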

After your edit:

You ran mkdir and then tried to copy from that folder; since the folder is empty, that cannot work!
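
In other words, the template can only be copied once Spark itself has been unpacked at that path; mkdir -p only creates an empty directory. A hedged sketch of the intended conf steps (assuming spark-defaults.conf.template is present by then, and using >> so the echo appends to the file instead of overwriting it):

# copy the shipped template into place, then append the extra setting
RUN cp /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf
RUN echo "spark.hadoop.google.cloud.auth.service.account.enable true" >> /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf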

Answer 1 (score: 0):

Why not download straight into that folder? And use COPY to copy the file inside the container:

RUN wget https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-latest-hadoop2.jar -P /opt/spark-2.2.1-bin-hadoop2.7/jars/
COPY /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf

This assumes the /opt/spark-2.2.1-bin-hadoop2.7/jars/ folder exists.
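
Note, though, that COPY resolves its source path against the build context on the host, not the image filesystem, which is consistent with the lstat error shown in EDIT-1. If the goal is simply to fetch the jar straight into place, Docker's ADD instruction can also download a URL directly; a minimal sketch (same assumption that the jars/ directory is the right target):

# ADD with a URL downloads the file at build time; a trailing slash keeps the original filename
ADD https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/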

Answer 2 (score: -3):

Try adding sudo before the cp command.