Artifact not found after mvn install:install-file

Date: 2014-10-24 17:08:23

Tags: java maven

I want to install an artifact into ~/.m2 and use it from another project, but the latter fails.

1) I build apache-spark with the command mvn clean install, which gives me a jar spark-assembly-1.2.0-SNAPSHOT-hadoop1.0.4.jar in the folder spark/assembly/target/scala2.10

$ ls -l assembly/target/scala2.10/
total 12
drwxrwxr-x 2 prayagupd prayagupd 4096 Oct 24 16:52 ./
drwxrwxr-x 6 prayagupd prayagupd 4096 Oct 24 16:52 ../
-rw-rw-r-- 1 prayagupd prayagupd 3117 Oct 24 16:52 spark-assembly-1.2.0-SNAPSHOT-hadoop1.0.4.jar

It appears there are no classes inside the jar,

$ jar tvf spark-assembly-1.2.0-SNAPSHOT-hadoop1.0.4.jar 
     0 Sat Oct 25 03:31:14 NPT 2014 META-INF/
   133 Sat Oct 25 03:31:14 NPT 2014 META-INF/MANIFEST.MF
     0 Sat Oct 25 03:31:14 NPT 2014 org/
     0 Sat Oct 25 03:31:14 NPT 2014 org/apache/
     0 Sat Oct 25 03:31:14 NPT 2014 org/apache/spark/
     0 Sat Oct 25 03:31:14 NPT 2014 org/apache/spark/unused/
   318 Sat Oct 25 03:31:14 NPT 2014 org/apache/spark/unused/UnusedStubClass.class
     0 Sat Oct 25 03:31:14 NPT 2014 META-INF/maven/
     0 Sat Oct 25 03:31:14 NPT 2014 META-INF/maven/org.spark-project.spark/
     0 Sat Oct 25 03:31:14 NPT 2014 META-INF/maven/org.spark-project.spark/unused/
  2356 Sat Oct 25 03:31:14 NPT 2014 META-INF/maven/org.spark-project.spark/unused/pom.xml
   114 Sat Oct 25 03:31:14 NPT 2014 META-INF/maven/org.spark-project.spark/unused/pom.properties
     0 Sat Oct 25 03:31:14 NPT 2014 META-INF/NOTICE

2) Now, I want to use the spark-assembly-1.2.0-SNAPSHOT jar as a mvn dependency in another project. So, I install this file into the local maven repository

$ mvn -X install:install-file -Dfile=spark-assembly-1.2.0-SNAPSHOT-hadoop1.0.4.jar 
                              -DgroupId=org.apache.spark 
                              -DartifactId=spark-assembly 
                              -Dversion=1.2.0-SNAPSHOT 
                              -Dclassifier=javadoc 
                              -Dpackaging=jar 
                              -DgeneratePom=true

which gives me ~/.m2/repository/org/apache/spark/spark-assembly

$ ll ~/.m2/repository/org/apache/spark/spark-assembly/1.2.0-SNAPSHOT/
total 24
drwxrwxr-x 2 prayagupd prayagupd 4096 Oct 24 23:14 ./
drwxrwxr-x 3 prayagupd prayagupd 4096 Oct 24 23:14 ../
-rw-rw-r-- 1 prayagupd prayagupd  757 Oct 24 23:14 maven-metadata-local.xml
-rw-rw-r-- 1 prayagupd prayagupd  206 Oct 24 23:14 _maven.repositories
-rw-rw-r-- 1 prayagupd prayagupd 3117 Oct 24 16:52 spark-assembly-1.2.0-SNAPSHOT-javadoc.jar
-rw-rw-r-- 1 prayagupd prayagupd  483 Oct 24 23:14 spark-assembly-1.2.0-SNAPSHOT.pom

The content of spark-assembly-1.2.0-SNAPSHOT.pom inside spark-assembly/1.2.0-SNAPSHOT/ is

<?xml version="1.0" encoding="UTF-8"?>
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"
    xmlns="http://maven.apache.org/POM/4.0.0"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <modelVersion>4.0.0</modelVersion>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-assembly</artifactId>
  <version>1.2.0-SNAPSHOT</version>
  <description>POM was created from install:install-file</description>
</project>

3) Now, I want to use it as a dependency in another maven project, talk-to-s3-http

<spark.version>1.2.0-SNAPSHOT</spark.version>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-assembly</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
</dependency>

but it throws an error as follows,

$ mvn clean compile -U
[INFO] Scanning for projects...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building talk-to-s3-http 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3.844s
[INFO] Finished at: Fri Oct 24 22:19:52 NPT 2014
[INFO] Final Memory: 7M/97M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project talk-to-s3-http: Could not resolve dependencies for project com.pseudononymous:talk-to-s3-http:jar:1.0-SNAPSHOT: Could not find artifact org.apache.spark:spark-assembly:jar:1.2.0-SNAPSHOT -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException

Deploying the jar instead of installing it did not work either.

mvn -X deploy:deploy-file -Dfile=spark-assembly-1.2.0-SNAPSHOT-hadoop1.0.4.jar 
                          -Durl=file:///home/prayagupd/.m2/repository/ 
                          -DgroupId=org.apache.spark 
                          -DartifactId=spark-assembly 
                          -Dversion=1.2.0-SNAPSHOT 
                          -Dclassifier=javadoc 
                          -Dpackaging=jar 
                          -DgeneratePom=true

1 Answer:

Answer 0: (score: 1)

At least one thing that is wrong is the classifier: it is not javadoc, but hadoop1.0.4. That is also the classifier you need to specify in the dependency in 3): <classifier>hadoop1.0.4</classifier>. When you run install-file, it should tell you that file X was copied to location Y.
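To make this concrete, here is a sketch of the corrected install-file invocation, assuming the same coordinates used in the question. Because the original command passed -Dclassifier=javadoc, the jar was installed as spark-assembly-1.2.0-SNAPSHOT-javadoc.jar, while the dependency asked for the classifier-less spark-assembly-1.2.0-SNAPSHOT.jar, which does not exist in the repository:

```shell
# Re-install with the classifier matching the jar's actual file name
# (hadoop1.0.4) instead of the incorrect "javadoc":
mvn install:install-file -Dfile=spark-assembly-1.2.0-SNAPSHOT-hadoop1.0.4.jar \
                         -DgroupId=org.apache.spark \
                         -DartifactId=spark-assembly \
                         -Dversion=1.2.0-SNAPSHOT \
                         -Dclassifier=hadoop1.0.4 \
                         -Dpackaging=jar \
                         -DgeneratePom=true
```

The consuming pom must then declare the same classifier (a <classifier>hadoop1.0.4</classifier> element next to groupId/artifactId/version). Alternatively, dropping -Dclassifier entirely installs the file as spark-assembly-1.2.0-SNAPSHOT.jar, which the classifier-less dependency in 3) would resolve as-is.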