How to add to PATH on the continuumio/anaconda3 image using a Dockerfile

Asked: 2019-06-06 07:54:22

Tags: docker docker-compose dockerfile

I want to install Spark on Docker using the continuumio/anaconda3 image.

Dockerfile


I build from this Dockerfile, and Spark and sshd run properly.

FROM continuumio/anaconda3
RUN apt update && apt install -y openssh-server curl vim
RUN mkdir /var/run/sshd
RUN curl -O http://ftp.tsukuba.wide.ad.jp/software/apache/spark/spark-2.4.3/spark-2.4.3-bin-hadoop2.7.tgz && \
    tar -zxvf spark-2.4.3-bin-hadoop2.7.tgz && \
    mv spark-2.4.3-bin-hadoop2.7 /usr/local && \
    ln -s /usr/local/spark-2.4.3-bin-hadoop2.7 /usr/local/spark
ENV PATH "/usr/local/spark/bin:${PATH}"
RUN sed -i "s/#PermitRootLogin prohibit-password/PermitRootLogin yes/g" /etc/ssh/sshd_config && \
    echo "root:mypasswd" | chpasswd
CMD ["/usr/sbin/sshd", "-D"]
ENV PATH "/usr/local/spark/bin:${PATH}" 

# spark-shell
-bash: spark-shell: command not found

The same command works fine when called with its full path:

# /usr/local/spark/bin/spark-shell

I also tried adding the export line to /etc/profile or /etc/profile.d/xxx.sh, but the result was the same:

RUN echo "export PATH=${PATH}:/usr/local/spark/bin" >> the files

The problem is caused by SSH access, because the command works fine in the container's own environment.

What should I do?

1 Answer:

Answer 0 (score: 0)

The problem is caused by SSH, which reads neither .bashrc nor /etc/profile.d/ (an SSH login shell starts fresh and does not inherit variables set with Docker's ENV), so I changed this line:

ENV PATH "/usr/local/spark/bin:${PATH}" 

to the following:

# add the Spark path to .bashrc
RUN echo "PATH=/usr/local/spark/bin:${PATH}" >> ~/.bashrc
# create a .bash_profile that sources .bashrc on SSH login
RUN echo "if [ -f ~/.bashrc ]; then . ~/.bashrc; fi" >> ~/.bash_profile

This code works fine.
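Note that ${PATH} in the RUN echo is expanded by the shell at build time, so the build-time value of PATH is written literally into .bashrc; that is harmless here because the Spark path is static. To verify after rebuilding (the image tag and port are arbitrary, matching the sketches in the question):

docker build -t anaconda-spark .
docker run -d --name spark-container -p 2222:22 anaconda-spark
ssh root@localhost -p 2222
# in the SSH session, spark-shell is now found on PATH:
spark-shell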