I am trying to parse a large JSON document using the JSON_OBJECT_T and JSON_ARRAY_T APIs. It works fine, but I would like expert advice on whether it is efficient. I have included my JSON and parse it with the code below:
SET SERVEROUTPUT ON;
DECLARE
  l_clob         clob;
  l_time         timestamp;
  l_json         json_object_t;
  l_stops_array  json_array_t;
  l_stops_arr    json_array_t;
  routeInfoObj   json_object_t;
  routeStopArr   json_array_t;
BEGIN
  SELECT LOG_CLOB INTO l_clob FROM ITV_DEV_LOGS WHERE LOG_ID = 1435334;

  l_time := systimestamp;
  l_json := json_object_t.parse( l_clob );
  dbms_output.put_line( 'Parsing Time: ' || extract(second from( systimestamp - l_time ) ) );

  l_stops_array := l_json.get_array('data');
  DBMS_OUTPUT.PUT_LINE('Data array: ' || l_stops_array.get_size);

  FOR i in 0 .. l_stops_array.get_size - 1 loop
    l_stops_arr := TREAT(l_stops_array.get(i) AS JSON_OBJECT_T).get_array('routedStops');
    DBMS_OUTPUT.PUT_LINE('stops array: ' || l_stops_arr.get_size);

    FOR j in 0 .. l_stops_arr.get_size - 1 loop
      routeInfoObj := TREAT(l_stops_arr.get(j) AS JSON_OBJECT_T).get_object('routingInfo');
      DBMS_OUTPUT.PUT_LINE('Stop : ' || routeInfoObj.get_number('stop'));
      routeStopArr := TREAT(l_stops_arr.get(j) AS JSON_OBJECT_T).get_array('routedJobs');

      FOR k in 0 .. routeStopArr.get_size - 1 loop
        DBMS_OUTPUT.PUT_LINE('JobRef : ' || TREAT(routeStopArr.get(k) AS JSON_OBJECT_T).get_string('jobRef'));
        -- update query to update stop value to respective jobRef
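        -- A minimal sketch of what that per-row update might look like, executed
        -- once for every routed job. The target table ITV_JOBS and its columns
        -- JOB_REF / STOP_NR are hypothetical placeholders, not actual schema names:
        --   UPDATE itv_jobs
        --      SET stop_nr = routeInfoObj.get_number('stop')
        --    WHERE job_ref = TREAT(routeStopArr.get(k) AS JSON_OBJECT_T).get_string('jobRef');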
      end loop;
    end loop;
  end loop;
END;
/
It works fine, but is there a way to improve this implementation? This is only a sample JSON, and the number of inner objects can reach 2000. Instead of updating the records one by one, is there a way to update them all in a single statement?
Answer (score: 3)
You can use json_table() to convert the JSON value into a relational representation. That in turn can be used in a MERGE statement.
For example, the following query:
select j.*
from itv_dev_logs
     cross join json_table(log_clob, '$.data.routedStops[*]'
                  columns stop_id        integer     path '$.stopId',
                          zone_ltf       integer     path '$.zoneLTF',
                          info_stop_nr   integer     path '$.routingInfo.stop',
                          info_route_ref varchar(20) path '$.routingInfo.routeRef',
                          info_eta       varchar(20) path '$.routingInfo.eta',
                          info_eta_dt    timestamp   path '$.routingInfo.etaDateTime',
                          info_stop_time number      path '$.routingInfo.stopTime'
                ) j
where log_id = 1435334;
returns something like this:
STOP_ID | ZONE_LTF | INFO_STOP_NR | INFO_ROUTE_REF | INFO_ETA | INFO_ETA_DT | INFO_STOP_TIME | INFO_DIST_PREV_STOP | INFO_BREAK_TIME | INFO_BREAK_DURATION
--------------+----------+--------------+----------------+----------+-------------------------+----------------+---------------------+-----------------+--------------------
1554383571432 | 1 | 1 | R119 | 11:01 | 2019-04-16 11:01:00.000 | 0.08 | 0.27 | 00:00 | 00:00
1554383571515 | 1 | 2 | R119 | 11:07 | 2019-04-16 11:07:00.000 | 0.08 | 0.34 | 00:00 | 00:00
1554383571601 | 1 | 3 | R119 | 11:13 | 2019-04-16 11:13:00.000 | 0.08 | 0 | 00:00 | 00:00
1554383571671 | 1 | 4 | R119 | 11:19 | 2019-04-16 11:19:00.000 | 0.08 | 0 | 00:00 | 00:00
1554383571739 | 1 | 5 | R119 | 11:25 | 2019-04-16 11:25:00.000 | 0.08 | 0 | 00:00 | 00:00
That can be used as the source of a MERGE statement to update your target table:
merge into your_target_table tg
using (
  select j.*
  from itv_dev_logs
       cross join json_table(log_clob, '$.data.routedStops[*]'
                    columns stop_id        integer     path '$.stopId',
                            zone_ltf       integer     path '$.zoneLTF',
                            info_stop_nr   integer     path '$.routingInfo.stop',
                            info_route_ref varchar(20) path '$.routingInfo.routeRef',
                            info_eta       varchar(20) path '$.routingInfo.eta',
                            info_eta_dt    timestamp   path '$.routingInfo.etaDateTime',
                            info_stop_time number      path '$.routingInfo.stopTime'
                  ) j
  where log_id = 1435334
) t on (t.stop_id = tg.stop_id and ... more join conditions ...)
when matched then update
  set stop_nr       = t.info_stop_nr,
      eta_timestamp = t.info_eta_dt
      ... more column assignments ...
Since you provided neither the structure of the target table nor information about which JSON keys should be matched to which table columns, all column names are only guesses and you will need to replace them with the correct ones.
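Since the question ultimately needs the stop number per jobRef, the nested routedJobs array can be flattened in the same query using json_table's NESTED PATH clause. The following is only a sketch based on the structure implied by the question's PL/SQL loop; the routedJobs and jobRef key names are taken from that loop, while the varchar2(30) length is a guess:

select j.stop_id, j.info_stop_nr, j.job_ref
from itv_dev_logs
     cross join json_table(log_clob, '$.data.routedStops[*]'
                  columns stop_id      integer path '$.stopId',
                          info_stop_nr integer path '$.routingInfo.stop',
                          nested path '$.routedJobs[*]'
                            columns (job_ref varchar2(30) path '$.jobRef')
                ) j
where log_id = 1435334;

Each routed job then appears as its own row together with its parent stop number, so the MERGE above could join on t.job_ref instead of updating rows one by one from PL/SQL.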