Is there a way to skip the execution of a Maven plugin that does not support a skip configuration?

Asked: 2017-01-06 17:29:56

Tags: java maven plugins deployment skip

I would like to know whether there is a way to skip a plugin execution when the plugin/goal does not support a `skip` configuration.

I am using the iterator-maven-plugin, and for certain projects I want to skip the execution of the maven-deploy-plugin's `deploy-file` goal, so changing the phase to `none` is not an option.

See the example code below. Essentially, I want the `deploy-file` goal to run only for certain projects.
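For illustration, a minimal sketch of the kind of setup in question, assuming the iterator-maven-plugin's `pluginExecutors` configuration and its default `@item@` placeholder; all item names, coordinates, and file paths below are hypothetical placeholders, not the asker's actual values:

```xml
<plugin>
  <groupId>com.soebes.maven.plugins</groupId>
  <artifactId>iterator-maven-plugin</artifactId>
  <executions>
    <execution>
      <phase>deploy</phase>
      <goals>
        <goal>iterator</goal>
      </goals>
      <configuration>
        <!-- The iterator runs the nested executor once per item. -->
        <items>
          <item>project-a</item>
          <item>project-b</item>
        </items>
        <pluginExecutors>
          <pluginExecutor>
            <plugin>
              <groupId>org.apache.maven.plugins</groupId>
              <artifactId>maven-deploy-plugin</artifactId>
            </plugin>
            <goal>deploy-file</goal>
            <configuration>
              <!-- Placeholder coordinates; @item@ is substituted per iteration. -->
              <file>${project.build.directory}/@item@.jar</file>
              <groupId>com.example</groupId>
              <artifactId>@item@</artifactId>
              <version>${project.version}</version>
              <repositoryId>example-repo</repositoryId>
              <url>https://repo.example.com/releases</url>
            </configuration>
          </pluginExecutor>
        </pluginExecutors>
      </configuration>
    </execution>
  </executions>
</plugin>
```

The question, then, is how to prevent `deploy-file` from running for some of the iterated items, given that the goal itself exposes no skip flag here.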


0 answers:

No answers yet