How can I find DELETE queries in the BigQuery job list?

Asked: 2019-02-27 00:50:43

Tags: google-bigquery

I have an ingestion-time partitioned table that has no rows for a certain date range:

#standardSQL
SELECT COUNT(*) AS rows_cnt,
       _PARTITIONTIME AS ptime
FROM my_dataset.my_table
WHERE _PARTITIONTIME BETWEEN '2017-01-01' AND '2017-01-30'
GROUP BY ptime
-- result returned zero rows :o
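
One way to tell whether those partitions were deleted or simply never existed is to list the partitions the table currently has. In legacy SQL this can be done with the partitions summary meta-table (a sketch using the table name from the question):

```sql
#legacySQL
SELECT partition_id
FROM [my_dataset.my_table$__PARTITIONS_SUMMARY__]
ORDER BY partition_id
```

If the 2017-01 partitions are absent here but other partitions around them exist, that points at a deletion rather than data that was never loaded.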

I want to investigate: most likely the data in those partitions was removed by some DELETE query. I wrote a script that retrieves "suspicious" jobs — queries matching a pattern like

DELETE ...my_table .... 2017-01 ...

or jobs whose statement type is simply DELETE.

Here is the script:

import re
import requests
import json
from pprint import pprint
from datetime import datetime
import subprocess
import argparse

parser = argparse.ArgumentParser()
# Note: argparse's type=bool is a trap -- bool('False') is True, since any
# non-empty string is truthy -- so parse the string explicitly instead.
parser.add_argument('-all_users', type=lambda s: s.lower() == 'true', default=True)
parser.add_argument('-projection', type=str, default='full')
parser.add_argument('-state_filter', type=str, default='done')
parser.add_argument('-access_token', type=str)
args = parser.parse_args()


def get_query_string(job):
    return job['configuration']['query']['query']

def log_job(job):
    # Append each suspicious job as one JSON object per line.
    with open('suspicious_jobs.txt', 'a') as f:
        f.write(json.dumps(job))
        f.write('\n')

def is_suspicious_query(query_string):
    # re.DOTALL lets '.' span newlines, so DELETE statements that are
    # formatted across several lines still match.
    return re.search('delete.*my_table.*2017-01', query_string.lower(), re.DOTALL) is not None

def is_delete_statement(job):
    return job['statistics']['query']['statementType'] == 'DELETE'

def is_copy_job(job):
    return job['configuration']['jobType'] == 'COPY'

def get_next_token():
    subprocess.check_call('gcloud auth login', shell=True)
    return subprocess.check_output('gcloud auth print-access-token', shell=True).decode('utf-8').strip()


all_users = args.all_users
projection = args.projection
state_filter = args.state_filter
query_url = """https://www.googleapis.com/bigquery/v2/projects/my_project_id/jobs"""
# Use the token passed on the command line if given, otherwise fetch one via gcloud.
access_token = args.access_token or get_next_token()

next_page_token = ''
page = 1
while next_page_token is not None:
    print('######## querying page ', page)
    url_parameters = {
    'allUsers': all_users,
    'pageToken': next_page_token,
    'projection': projection,
    'stateFilter': state_filter
    }
    headers = {
    'Authorization': 'Bearer {}'.format(access_token)
    }
    r = requests.get(query_url, params=url_parameters, headers=headers)

    if r.status_code == 401:
        access_token = get_next_token()
        print(access_token)
        print(r.text)
        continue
    elif r.status_code != 200:
        print(r.text)
        print('###### last_page_token is ', next_page_token)
        break

    next_page_token = r.json().get('nextPageToken', None)
    jobs = r.json().get('jobs', [])
    for j in jobs:
        try:
            if is_copy_job(j):
                continue
            q = get_query_string(j)
            if is_suspicious_query(q) or is_delete_statement(j):
                log_job(j)

        except KeyError:
            # Jobs without a query configuration (e.g. load or extract jobs)
            # land here and are skipped.
            pass

    page = page + 1

The is_suspicious_query function checks whether the query matches the pattern delete.*my_table.*2017-01 (case-insensitively).
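
One subtlety worth checking: in Python regexes, `.` does not match newlines by default, so a DELETE statement formatted across several lines can slip past the pattern even though it targets the right table. A minimal demonstration (the table name follows the question; the sample statements are made up):

```python
import re

# The pattern from the script, compiled with and without re.DOTALL.
plain = re.compile(r'delete.*my_table.*2017-01', re.IGNORECASE)
dotall = re.compile(r'delete.*my_table.*2017-01', re.IGNORECASE | re.DOTALL)

one_line = "DELETE FROM my_dataset.my_table WHERE _PARTITIONTIME = '2017-01-15'"
multi_line = ("DELETE FROM my_dataset.my_table\n"
              "WHERE _PARTITIONTIME = '2017-01-15'")

print(bool(plain.search(one_line)))     # True
print(bool(plain.search(multi_line)))   # False: '.' stops at the newline
print(bool(dotall.search(multi_line)))  # True once re.DOTALL is set
```

So a multi-line DELETE would only be caught via the statementType check — and if that lookup raises a KeyError, the job is skipped entirely.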

I can't find the job that performed the delete. Am I missing jobs in my while loop? (I skip a job whenever a KeyError is raised.)
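
As for the while loop, the paging pattern itself looks sound: keep requesting with the current pageToken until a response omits nextPageToken. A self-contained simulation (the page data and fetch_page helper are fabricated for illustration) shows that no jobs are dropped between pages. Separately, note that jobs.list only covers recent job history (on the order of the last six months), so older deletes would not appear at all.

```python
# Fabricated jobs.list-style pages: every page but the last carries a
# 'nextPageToken'; the final page omits it, which ends the loop.
PAGES = {
    "":   {"jobs": [{"id": "job1"}, {"id": "job2"}], "nextPageToken": "t1"},
    "t1": {"jobs": [{"id": "job3"}], "nextPageToken": "t2"},
    "t2": {"jobs": [{"id": "job4"}]},  # last page: no nextPageToken
}

def fetch_page(token):
    # Stand-in for the requests.get(query_url, ...) call in the script.
    return PAGES[token]

def list_all_jobs():
    collected, token = [], ""
    while token is not None:
        page = fetch_page(token)
        collected.extend(page.get("jobs", []))
        token = page.get("nextPageToken", None)  # None terminates the loop
    return collected

print([job["id"] for job in list_all_jobs()])  # ['job1', 'job2', 'job3', 'job4']
```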

Is it possible to delete a table's partitions without the operation being logged?

1 Answer:

Answer 0 (score: 1):

For this I would recommend using Stackdriver advanced filters with a filter like the following:

resource.type="bigquery_resource"
protoPayload.serviceData.jobQueryRequest.query:"your_dataset.your_table"
protoPayload.serviceData.jobQueryResponse.job.jobConfiguration.query.statementType="DELETE"

This way you will get the DELETE operations performed against the specified table.

Update:

resource.type="bigquery_resource"
(protoPayload.methodName="jobservice.jobcompleted"
OR protoPayload.authorizationInfo.resource:"projects/<your-project>/datasets/<your-dataset>/tables/"
OR (protoPayload.serviceData.jobCompletedEvent.job.jobConfiguration.query.destinationTable.datasetId="<your-dataset>"
AND protoPayload.serviceData.jobCompletedEvent.job.jobConfiguration.query.destinationTable.tableId:"<your-table>"))
protoPayload.methodName!="tabledataservice.list"
"<yourdataset.yourtable>"
severity!="ERROR"

With this filter you can find most operations performed on a table; for example, a query whose WriteDisposition is WRITE_TRUNCATE will truncate the table and rewrite it from scratch. Note that using : instead of = in a filter performs a partial (substring) match.

Regarding your partition question: as far as I know, when a partition is deleted it should be listed as DELETED in the logs. Also, a partition can be deleted with the command-line tool's bq rm and a partition decorator (e.g. `bq rm 'my_dataset.my_table$20170115'` — the date here is illustrative). A partition removed this way is a table deletion, not a query job, so it would never show up in a jobs.list scan for DELETE statements.

I hope this helps.