Hi, I am trying to download a file inside the container and move it to a specific location within the container.
RUN wget https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-latest-hadoop2.jar
RUN cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/
RUN cp /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf
RUN echo "spark.hadoop.google.cloud.auth.service.account.enable true" > /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf
But this fails with the following error:
Step 44/46 : RUN cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/
---> Running in 8c81d9871377
cp: cannot create regular file '/opt/spark-2.2.1-bin-hadoop2.7/jars/': No such file or directory
The command '/bin/sh -c cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/' returned a non-zero code: 1
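The error itself is not Docker-specific: `cp` refuses to create a missing destination directory. A minimal sketch of the failure mode, using a throwaway temp directory instead of the real Spark paths:

```shell
# cp will not create a missing destination directory; the copy fails
# until the directory is created with mkdir -p.
tmp=$(mktemp -d)
touch "$tmp/gcs-connector-latest-hadoop2.jar"

# Fails: the jars/ directory does not exist yet
cp "$tmp/gcs-connector-latest-hadoop2.jar" "$tmp/jars/" 2>/dev/null \
  || echo "cp failed: no such directory"

# Works: create the directory first, then copy
mkdir -p "$tmp/jars"
cp "$tmp/gcs-connector-latest-hadoop2.jar" "$tmp/jars/"
ls "$tmp/jars/"

rm -rf "$tmp"
```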
EDIT 1 (error screenshot):
I tried the suggested solution, and now I get the following error:
Removing intermediate container e885431017e8
Step 43/44 : COPY /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf
lstat opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template: no such file or directory
Answer 0 (score: 1)
Does the path /opt/spark-2.2.1-bin-hadoop2.7/jars/ already exist in your container? If not, add this before the cp command:
mkdir -p /opt/spark-2.2.1-bin-hadoop2.7/jars/
Then try copying like this:
cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/gcs-connector-latest-hadoop2.jar
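Put together, the relevant part of the Dockerfile might look like this (a sketch reusing the paths from the question):

```dockerfile
RUN wget https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-latest-hadoop2.jar
# Create the target directory first (-p: no error if it already exists)
RUN mkdir -p /opt/spark-2.2.1-bin-hadoop2.7/jars/
RUN cp gcs-connector-latest-hadoop2.jar /opt/spark-2.2.1-bin-hadoop2.7/jars/gcs-connector-latest-hadoop2.jar
```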
Regarding your edit:
You ran mkdir and then tried to copy from the newly created folder; since that folder is empty, the copy cannot work!
Answer 1 (score: 0)
Why not download the file directly into that folder, and use COPY to copy the file inside the container:
RUN wget https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-latest-hadoop2.jar -P /opt/spark-2.2.1-bin-hadoop2.7/jars/
COPY /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf
This assumes the folder /opt/spark-2.2.1-bin-hadoop2.7/jars/ already exists.
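One caveat worth noting: in a Dockerfile, COPY resolves its source path against the build context on the host, not against the image's filesystem. So if spark-defaults.conf.template only exists inside the image (as part of the Spark distribution), a RUN cp is the form that works, as a sketch:

```dockerfile
# COPY <src> <dst> reads <src> from the build context on the host machine.
# A file that exists only inside the image must be copied with RUN cp:
RUN cp /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf.template \
       /opt/spark-2.2.1-bin-hadoop2.7/conf/spark-defaults.conf
```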
Answer 2 (score: -3)
Try adding sudo before the cp command.