while url:
    post = session.post(login, data=payload)
    r = session.get(url)
    parsed = json.loads(r.text)
    # Retrieve json product data
    if parsed['links']['next'] is not 'null':
        url = 'https://testshop.example.com/admin/products' + str(parsed['links']['next'])
        time.sleep(2)
        for product in parsed['products']:
            parsed_result = product['id']
    else:
        print('stop now!')
        break
So I am using the code above to retrieve all the JSON product data and print it in the terminal. Everything works fine until, at the very end, I hit the following error:
raise JSONDecodeError("Expecting value", s, err.value) from None
JSONDecodeError: Expecting value
Does anyone know what is causing this, and how I can fix it?

In case it matters, this is my JSON format:
products: [
    {
        article_code: "123",
        barcode: "456",
        brand_id: 2600822,
        created_at: "2018-05-31T15:15:34+02:00",
        data01: "",
        data02: "",
        data03: "",
        delivery_date_id: null,
        has_custom_fields: false,
        has_discounts: false,
        has_matrix: false,
        hits: 0,
        hs_code: null,
        id: 72660113,
        image_id: null,
        is_visible: false,
        price_excl: 33.0165,
        price_incl: 39.95,
        price_old_excl: 0,
        price_old_incl: 0,
        product_set_id: null,
        product_type_id: null,
        search_context: "123 456 789",
        shop_id: 252449,
        sku: "789",
        supplier_id: 555236,
        updated_at: "2018-05-31T15:15:34+02:00",
        variants_count: 1,
        visibility: "hidden",
        weight: 0,
        nl: {
            content: "",
            fulltitle: "Grid Lifter",
            slug: "grid-lifter",
            title: "Grid Lifter"
        }
    }
],
links: {
    first: ".json",
    last: ".json?page=70",
    prev: null,
    next: ".json?page=2",
    count: 3497,
    limit: 50,
    pages: 70
}
I am using the links object to page through all the pages.

Traceback:
  File "", line 1, in <module>
    runfile('loginlightspeedshop.py', wdir='C:/Users/Solaiman/.spyder-py3/SCRIPTS/Lightspeed scripts')
  File "sitecustomize.py", line 705, in runfile
    execfile(filename, namespace)
  File "sitecustomize.py", line 102, in execfile
    exec(compile(f.read(), filename, 'exec'), namespace)
  File "C:/Users/Solaiman/.spyder-py3/SCRIPTS/Lightspeed scripts/loginshop.py", line 33, in <module>
    parsed = json.loads(r.text)
  File "C:\Users\Solaiman\Anaconda3\lib\json\__init__.py", line 354, in loads
    return _default_decoder.decode(s)
  File "decoder.py", line 339, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "decoder.py", line 357, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
JSONDecodeError: Expecting value
Answer (score: 1):
You are probably getting an empty or non-JSON response here:
r = session.get(url)
Try printing r.text before parsing it to see what is causing the problem, or use a try/except clause:
try:
    parsed = r.json()
except ValueError:
    print(r.text)
    break
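
For context, here is a minimal sketch of how that try/except could slot into the pagination loop from the question. It is not the only possible fix: the endpoint, login and payload values below are hypothetical stand-ins for the variables defined elsewhere in the asker's script. It also compares links.next against Python's None (which is what JSON null becomes after parsing) rather than the string 'null', so the loop can stop cleanly on the last page instead of requesting a malformed URL:

import time
import requests

# Hypothetical stand-ins for the variables used in the question.
login = 'https://testshop.example.com/admin/login'
payload = {'user': 'me', 'password': 'secret'}
base = 'https://testshop.example.com/admin/products'

session = requests.Session()
session.post(login, data=payload)   # log in once, before paging

url = base + '.json'                # first page
while url:
    r = session.get(url)
    try:
        parsed = r.json()           # equivalent to json.loads(r.text)
    except ValueError:
        # Not JSON: print status and body to see what came back instead.
        print(r.status_code, r.text)
        break

    for product in parsed['products']:
        print(product['id'])

    next_page = parsed['links']['next']   # JSON null is parsed as Python None
    if next_page is None:
        break                             # last page reached
    url = base + next_page                # e.g. base + ".json?page=2"
    time.sleep(2)

Either way, printing the raw body together with the HTTP status code should make it obvious whether the shop is returning an empty body, an error page, or something else that json.loads cannot parse.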