I think some extra dependency is needed, but I can't work out exactly what. Any help is appreciated.
My pom file is:
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.spark</groupId>
    <artifactId>spark</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>
    <name>M101J</name>
    <url>http://maven.apache.org</url>
    <dependencies>
        <dependency>
            <groupId>com.sparkjava</groupId>
            <artifactId>spark-core</artifactId>
            <version>1.1.1</version>
        </dependency>
        <dependency>
            <groupId>org.freemarker</groupId>
            <artifactId>freemarker</artifactId>
            <version>2.3.19</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-simple</artifactId>
            <version>1.6.4</version>
        </dependency>
    </dependencies>
    <repositories>
        <repository>
            <id>Spark repository</id>
            <url>http://sparkjava.com/nexus/content/repositories/spark/</url>
        </repository>
    </repositories>
    <build>
        <plugins>
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <configuration>
                    <archive>
                        <manifest>
                            <mainClass>com.spark.SparkHomework</mainClass>
                        </manifest>
                    </archive>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
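(As a side note, with the maven-assembly-plugin configured this way the project can also be packaged as a self-contained jar and run with java -jar, by invoking the assembly goal directly; the jar file name below is assumed from the artifactId and version in the pom:

mvn clean package assembly:single
java -jar target/spark-0.0.1-SNAPSHOT-jar-with-dependencies.jar

)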
When I try to run my project with the command:

mvn compile exec:java -Dexec.mainClass=com.spark.SparkHomework

I get this error:

Unknown lifecycle phase ".mainClass=com.spark.SparkHomework". You must specify a valid lifecycle phase or a goal in the format <plugin-prefix>:<goal> or <plugin-group-id>:<plugin-artifact-id>[:<plugin-version>]:<goal>. Available lifecycle phases are: validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy, pre-clean, clean, post-clean, pre-site, site, post-site, site-deploy.
My class SparkHomework (in the package com.spark) is:
package com.spark;

import freemarker.template.Configuration;
import freemarker.template.Template;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import spark.Request;
import spark.Response;
import spark.Route;
import spark.Spark;

import java.io.StringWriter;
import java.net.UnknownHostException;
import java.util.HashMap;
import java.util.Map;

public class SparkHomework {
    private static final Logger logger = LoggerFactory.getLogger("logger");

    public static void main(String[] args) throws UnknownHostException {
        final Configuration configuration = new Configuration();
        configuration.setClassForTemplateLoading(SparkHomework.class, "/");
        Spark.get(new Route("/") {
            @Override
            public Object handle(final Request request, final Response response) {
                StringWriter writer = new StringWriter();
                try {
                    Template helloTemplate = configuration.getTemplate("answer.ftl");
                    Map<String, String> answerMap = new HashMap<String, String>();
                    answerMap.put("answer", createAnswer());
                    helloTemplate.process(answerMap, writer);
                } catch (Exception e) {
                    logger.error("Failed", e);
                    halt(500);
                }
                return writer;
            }
        });
    }

    private static String createAnswer() {
        int i = 0;
        for (int bit = 0; bit < 16; bit++) {
            i |= bit << bit;
        }
        return Integer.toString(i);
    }
}
Answer (score: 1):
Solved it! In PowerShell the property name needs quotes, like -D"exec.mainClass". Without them PowerShell splits the argument at the dot, so Maven receives .mainClass=com.spark.SparkHomework as a separate token and tries to interpret it as a lifecycle phase. In the Command Prompt everything works fine without the quotes. Thanks, everyone!
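For reference, the full invocation that works in PowerShell would then look like this (quoting the entire -D argument should work as well, e.g. "-Dexec.mainClass=com.spark.SparkHomework"):

mvn compile exec:java -D"exec.mainClass"=com.spark.SparkHomework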