I am trying to parse the data from http://api.randomuser.me/
and incorporate it into my Java application. The JSON data looks like this:
{"results":[{"user":{"gender":"female","name":{"title":"mrs","first":"riley","last":"barnes"},"location":{"street":"6900 washington ave","city":"lousville","state":"new jersey","zip":"82561"},"email":"riley.barnes83@example.com","username":"bluebear21","password":"michaela","salt":"#nH{uwT4","md5":"854a2e76639bc59a8bc08f8f1ceadeb0","sha1":"c84ec59a9137685aec1909e16642e75aecc3221e","sha256":"4dd09f1e230d7a98ebfe2c14588fedc6d6f436eecc72952dd4565d8e51823ac4","registered":"968313341","dob":"316929409","phone":"(288)-381-2384","cell":"(804)-975-5466","SSN":"546-48-9490","picture":"http://api.randomuser.me/0.3/portraits/women/17.jpg"},"seed":"6a35aafac76ff6f","version":"0.3"}]}
I am using http://argo.sourceforge.net/
to parse the data, but I keep getting this error when I try:
Exception in thread "main" argo.jdom.JsonNodeDoesNotMatchPathElementsException: Failed to find a field called ["user"] at ["user"] while resolving ["user"] in [{"results":[{"user":{"gender":"female","name":{"title":"miss","first":"hannah","last":"parker"},"location":{"street":"4708 daisy dr","city":"waxahachie","state":"new hampshire","zip":"32151"},"email":"hannah.parker73@example.com","username":"blackswan57","password":"ellie","salt":"4IkJxhN2","md5":"40ccea4f5670f63af4c3ba26f749d735","sha1":"f9ac6e78079ff34f8bd018884bf658ac2c39d8c3","sha256":"b04fb6337fc7e5e26b096ddd3895a8e6500783fbe026f2cd92febd427df9b505","registered":"1140098652","dob":"492991640","phone":"(847)-802-9539","cell":"(816)-467-2525","SSN":"756-30-7542","picture":"http://api.randomuser.me/0.3/portraits/women/26.jpg"},"seed":"baffebbfa57c353","version":"0.3"}]}].
at argo.jdom.JsonNodeDoesNotMatchPathElementsException.jsonNodeDoesNotMatchPathElementsException(JsonNodeDoesNotMatchPathElementsException.java:23)
at argo.jdom.JsonNode.wrapExceptionsFor(JsonNode.java:359)
at argo.jdom.JsonNode.getStringValue(JsonNode.java:184)
at twitter.ciangallagher.net.RandomUserGenerator.genUser(RandomUserGenerator.java:55)
at twitter.ciangallagher.net.Browser.main(Browser.java:646)
The code I wrote is as follows:
class RandomUserGenerator {
    public String text;

    public void genUser() throws ClientProtocolException, IOException, ParseException, InvalidSyntaxException {
        HttpClient client = HttpClientBuilder.create().build();
        HttpGet getRequest = new HttpGet("http://api.randomuser.me/");
        getRequest.addHeader("accept", "application/json");

        HttpResponse response = client.execute(getRequest);
        if (response.getStatusLine().getStatusCode() != 200) {
            throw new RuntimeException("Failed : HTTP error code : "
                    + response.getStatusLine().getStatusCode());
        }

        BufferedReader br = new BufferedReader(
                new InputStreamReader((response.getEntity().getContent())));
        String output; // API output
        System.out.println("Output from Server .... \n");
        while ((output = br.readLine()) != null) {
            System.out.println(output);
            text = output;
        }
        System.out.println(text);

        String secondSingle = new JdomParser().parse(text).getStringValue("user");
    }
}
Answer 0 (score: 1)
This is the worst library documentation I have seen in ten years.
There are plenty of good libraries for parsing JSON, such as GSON, flexjson, Jackson, etc. (I personally like flexjson; a rough Jackson sketch is included at the end for comparison).
The reason your call fails is that "user" is not a top-level field: it sits inside the first element of the "results" array, so getStringValue("user") cannot resolve it at the root. But if you want to keep going with this library, let me show you an example of how to get at some of the values:
public static void main(String[] args) throws IOException, InvalidSyntaxException {
    StringBuffer text = new StringBuffer();
    // Read the sample response from a local file (json.txt on the classpath)
    BufferedReader br = new BufferedReader(new InputStreamReader(JSON.class.getResourceAsStream("json.txt")));
    String output; // API output
    System.out.println("Output from Server .... \n");
    while ((output = br.readLine()) != null) {
        System.out.println(output);
        text.append(output);
    }
    System.out.println(text);

    JsonRootNode rootNode = new JdomParser().parse(text.toString());
    // "results" is an array; take its first element
    List<JsonNode> results = rootNode.getArrayNode("results");
    JsonNode firstUserMap = results.get(0);
    // field 0 of results[0] is "user"
    JsonField user = firstUserMap.getFieldList().get(0);
    JsonNode userNode = user.getValue();
    // field 1 of "user" is "name"
    JsonField name = userNode.getFieldList().get(1);
    JsonNode nameNode = name.getValue();
    // field 1 of "name" is "first"
    JsonField firstName = nameNode.getFieldList().get(1);
    System.out.println(firstName.getValue().getText());
}
This prints
riley
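As a side note, Argo's getStringValue also takes a path of elements (field names as strings, array indices as integers), which is what the path-resolution wording in your exception refers to. Assuming that varargs overload behaves this way in the Argo version you are on, the same value can be reached in one call; a minimal sketch:
// Resolve results[0] -> user -> name -> first in a single path lookup
// (strings select object fields, integers select array elements).
JsonRootNode rootNode = new JdomParser().parse(text.toString());
String first = rootNode.getStringValue("results", 0, "user", "name", "first");
System.out.println(first); // riley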
PS。 JSON的片段
{
    "results":[
        {
            "user":{
                "gender":"female",
                "name":{
                    "title":"mrs",
                    "first":"riley",
                    "last":"barnes"
                },
                (...)
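For comparison with the GSON/flexjson/Jackson suggestion above, here is roughly what the same extraction looks like with Jackson. This is only a sketch, not code from the question: it assumes jackson-databind is on the classpath, and the JacksonSketch class name and the inlined JSON literal are placeholders for the response body you would fetch from http://api.randomuser.me/.

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JacksonSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder for the body returned by http://api.randomuser.me/
        String json = "{\"results\":[{\"user\":{\"name\":{\"first\":\"riley\"}}}]}";

        JsonNode root = new ObjectMapper().readTree(json);
        // Walk results[0].user.name.first; path() returns a missing node instead of throwing.
        String first = root.path("results").path(0)
                .path("user").path("name").path("first").asText();
        System.out.println(first); // prints: riley
    }
}

Navigating by field name rather than by list index is also less brittle than getFieldList().get(1), since JSON objects make no ordering guarantee.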