The problem:
To work through this, I decided to start with the Firebase demo dataset and see which custom events were logged over a sample date range, so I could pick two sample conversion events to count at the session level:
# Standard SQL
SELECT
  event.name AS event_name,
  COUNT(event.name) AS event_count
FROM `firebase-analytics-sample-data.ios_dataset.app_events_*`,
  UNNEST(event_dim) AS event
WHERE _TABLE_SUFFIX BETWEEN '20160601' AND '20160605'  -- inclusive of both the start date and end date
GROUP BY event_name
ORDER BY event_count DESC;
I figured "slots_spun" and "user_logged_in" would be a good place to start:
Reusing the code below (written by @Felipe Hoffa), I want to not only aggregate to the session level, but also get a count of each of those two conversion events per session:
#standardSQL
SELECT
  app_instance_id,
  sess_id,
  MIN(min_time) sess_start,
  MAX(max_time) sess_end,
  COUNT(*) records,
  MAX(sess_id) OVER(PARTITION BY app_instance_id) total_sessions,
  (ROUND((MAX(max_time) - MIN(min_time)) / (1000*1000), 1)) sess_length_seconds
FROM (
  # running count of session starts per user = session id
  SELECT *, SUM(session_start) OVER(PARTITION BY app_instance_id ORDER BY min_time) sess_id
  FROM (
    SELECT *, IF(
      previous IS NULL
      OR (min_time - previous) > (20*60*1000*1000),  # sessions broken by this inactivity
      1, 0) session_start
      # https://blog.modeanalytics.com/finding-user-sessions-sql/
    FROM (
      SELECT *, LAG(max_time, 1) OVER(PARTITION BY app_instance_id ORDER BY max_time) previous
      FROM (
        SELECT
          user_dim.app_info.app_instance_id,
          (SELECT MIN(timestamp_micros) FROM UNNEST(event_dim)) min_time,
          (SELECT MAX(timestamp_micros) FROM UNNEST(event_dim)) max_time
        FROM `firebase-analytics-sample-data.ios_dataset.app_events_20160601`
      )
    )
  )
)
GROUP BY 1, 2
ORDER BY 1, 2
The output I'm looking for would look something like this:

I've tested a few sample code snippets, but haven't managed to get it working yet:
SELECT
*,
IF(previous IS null
OR (min_time-previous)>(30*60*1000*1000), # sessions broken by this inactivity
1, 0) session_start,
COUNT(DISTINCT CASE WHEN event.name = "slots_spun" THEN event.name ELSE NULL END) AS slots_spun
FROM (
SELECT *, LAG(max_time, 1) OVER(PARTITION BY app_instance_id ORDER BY max_time) AS previous
FROM (
SELECT
user_dim.app_info.app_instance_id,
(SELECT MIN(timestamp_micros) FROM UNNEST(event_dim)) min_time,
(SELECT MAX(timestamp_micros) FROM UNNEST(event_dim)) max_time
FROM `firebase-analytics-sample-data.ios_dataset.app_events_20160601`, UNNEST(event_dim) AS event
)
)
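One direction I'm considering (untested, so treat it purely as a sketch) is to compute per-record counts of the two conversion events with correlated subqueries over event_dim in the innermost SELECT, and then SUM those counts per session in the outer aggregation, roughly like this:

#standardSQL
# Rough, unverified sketch: same sessionization as above, plus per-session
# counts of the two conversion events.
SELECT
  app_instance_id,
  sess_id,
  MIN(min_time) sess_start,
  MAX(max_time) sess_end,
  COUNT(*) records,
  SUM(slots_spun) slots_spun,           # conversion event 1, summed per session
  SUM(user_logged_in) user_logged_in    # conversion event 2, summed per session
FROM (
  SELECT *, SUM(session_start) OVER(PARTITION BY app_instance_id ORDER BY min_time) sess_id
  FROM (
    SELECT *, IF(
      previous IS NULL
      OR (min_time - previous) > (20*60*1000*1000),  # sessions broken by this inactivity
      1, 0) session_start
    FROM (
      SELECT *, LAG(max_time, 1) OVER(PARTITION BY app_instance_id ORDER BY max_time) previous
      FROM (
        SELECT
          user_dim.app_info.app_instance_id,
          (SELECT MIN(timestamp_micros) FROM UNNEST(event_dim)) min_time,
          (SELECT MAX(timestamp_micros) FROM UNNEST(event_dim)) max_time,
          # per-record counts of the two conversion events
          (SELECT COUNT(*) FROM UNNEST(event_dim) WHERE name = 'slots_spun') slots_spun,
          (SELECT COUNT(*) FROM UNNEST(event_dim) WHERE name = 'user_logged_in') user_logged_in
        FROM `firebase-analytics-sample-data.ios_dataset.app_events_20160601`
      )
    )
  )
)
GROUP BY 1, 2
ORDER BY 1, 2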
I'll keep working on this, but if anyone can point me in the right direction, I'd be very grateful.