I want to upload a file from an HTML form using Spark. Here is my Java function that handles the POST route:
Spark.post("/upload", "multipart/form-data", (request, response) -> {
    String location = "temporary";    // the directory where uploaded files will be stored
    long maxFileSize = 100000000;     // the maximum size allowed for uploaded files
    long maxRequestSize = 100000000;  // the maximum size allowed for multipart/form-data requests
    int fileSizeThreshold = 1024;     // the size threshold after which files will be written to disk

    MultipartConfigElement multipartConfigElement = new MultipartConfigElement(
            location, maxFileSize, maxRequestSize, fileSizeThreshold);
    request.raw().setAttribute("org.eclipse.multipartConfig", multipartConfigElement);

    Collection<Part> parts = request.raw().getParts(); // line 50, where the error occurs
    for (Part part : parts) {
        System.out.println("Name: " + part.getName());
        System.out.println("Size: " + part.getSize());
        System.out.println("Filename: " + part.getSubmittedFileName());
    }

    String fName = request.raw().getPart("xmlfile").getSubmittedFileName();
    System.out.println("Title: " + request.raw().getParameter("title"));
    System.out.println("File: " + fName);

    Part uploadedFile = request.raw().getPart("xmlFile");
    Path out = Paths.get("temporary/" + fName);
    try (final InputStream in = uploadedFile.getInputStream()) {
        Files.copy(in, out);
        uploadedFile.delete();
    }

    // cleanup
    multipartConfigElement = null;
    // parts = null;
    uploadedFile = null;
    return "OK";
});
Here is the HTML form:
<form class="ui fluid action input" id="fileForm" method="post" action="/sparkapp/upload" enctype="multipart/form-data">
    <input type="text" name="filePath" readonly>
    <input type="file" name="xmlFile">
    <button type="submit" value="Submit">
</form>
When I upload a file, I get a 500: Internal Server Error with the following stack trace:
java.lang.IllegalStateException: Unable to process parts as no multi-part configuration has been provided
at org.apache.catalina.connector.Request.parseParts(Request.java:2734)
at org.apache.catalina.connector.Request.getParts(Request.java:2701)
at org.apache.catalina.connector.Request.getPart(Request.java:2885)
at org.apache.catalina.connector.RequestFacade.getPart(RequestFacade.java:1089)
at javax.servlet.http.HttpServletRequestWrapper.getPart(HttpServletRequestWrapper.java:362)
at com.amulya.Application$2.handle(Application.java:50)
at spark.RouteImpl$1.handle(RouteImpl.java:61)
at spark.http.matching.Routes.execute(Routes.java:61)
at spark.http.matching.MatcherFilter.doFilter(MatcherFilter.java:127)
at spark.servlet.SparkFilter.doFilter(SparkFilter.java:173)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:212)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:141)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:616)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:528)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1100)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:687)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1520)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1476)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
I followed this question, but its answer did not work: SparkJava: Upload file did't work in Spark java framework
I am using the Eclipse IDE with a Tomcat server.
Please help me resolve this issue.
Answer 0 (score: 1)
I just found out that when I use a Tomcat server with Spark, I register the filter spark.servlet.SparkFilter.

Through this answer I discovered that I actually need to set

allowCasualMultipartParsing="true"

on the webapp's <Context> element, either in Tomcat/conf/server.xml or in Webapp/META-INF/context.xml, so that Tomcat automatically parses multipart/form-data request bodies whenever HttpServletRequest.getPart* or HttpServletRequest.getParameter* is called, even if the target servlet is not marked with the @MultipartConfig annotation.
See the following links for reference:
http://sparkjava.com/documentation.html#other-webserver
https://stackoverflow.com/a/8050589/2256258
http://tomcat.apache.org/tomcat-7.0-doc/config/context.html
https://examples.javacodegeeks.com/enterprise-java/tomcat/tomcat-context-xml-configuration-example/
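For illustration, a minimal Webapp/META-INF/context.xml along these lines should be enough to enable the behavior described above (this sketch is mine, not from the linked pages; any other attributes your context already has would stay on the same element):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Webapp/META-INF/context.xml -->
<!-- allowCasualMultipartParsing tells Tomcat to parse multipart/form-data
     bodies on getPart*/getParameter* calls even when the target servlet
     has no @MultipartConfig annotation or <multipart-config> element. -->
<Context allowCasualMultipartParsing="true">
</Context>
```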
Answer 1 (score: 0)
You must also provide the multipart configuration settings in the servlet configuration in your web.xml file. I have included an example below:
<servlet>
    <servlet-name>MyServlet</servlet-name>
    <servlet-class>com.example.servlet.MyServlet</servlet-class>
    <multipart-config>
        <max-file-size>xxxxx</max-file-size>
        <max-request-size>yyyyy</max-request-size>
    </multipart-config>
</servlet>
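As a filled-in sketch (the concrete sizes and the /tmp location here are assumptions, chosen to mirror the values in the question's code), a complete <multipart-config> with the optional <location> and <file-size-threshold> elements could look like:

```xml
<servlet>
    <servlet-name>MyServlet</servlet-name>
    <servlet-class>com.example.servlet.MyServlet</servlet-class>
    <multipart-config>
        <!-- directory where parts larger than the threshold are written -->
        <location>/tmp</location>
        <!-- 100 MB per file, 100 MB per request, buffer in memory up to 1 KB -->
        <max-file-size>100000000</max-file-size>
        <max-request-size>100000000</max-request-size>
        <file-size-threshold>1024</file-size-threshold>
    </multipart-config>
</servlet>
```

The same settings can alternatively be declared in code by placing the @MultipartConfig annotation on the servlet class.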