Backfill Google Ads traffic source data as soon as possible (GA 360)
If you use BigQuery Fresh Daily exports, you might see Data Not Available for some traffic_source fields. This guide helps you automatically backfill most of the missing traffic source data in existing exports as soon as it's available, typically by 5 AM each day.
Here are the steps to automate the backfill:
- Listen for the daily completeness signal from BigQuery.
- Identify the events with missing traffic source data in your BigQuery export (a query sketch follows this list).
- Query the complete data for those events from Google Ads.
- Join the complete event data with your BigQuery export.
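
A minimal sketch of the second step, assuming Python with the BigQuery client library. The project, dataset, and table names are placeholders (Fresh Daily export tables are assumed here to follow an events_fresh_YYYYMMDD pattern), and the exact sentinel marking missing data may differ in your export:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Placeholder names: substitute your project, Analytics export dataset,
    # and the Fresh Daily events table for the date you want to backfill.
    query = """
        SELECT event_timestamp, user_pseudo_id, event_name
        FROM `your-project.analytics_123456.events_fresh_20241218`
        WHERE traffic_source.source IS NULL
           OR traffic_source.source = 'Data Not Available'
    """

    for row in client.query(query).result():
        print(row.event_timestamp, row.user_pseudo_id, row.event_name)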
Create a Pub/Sub topic
- Open Pub/Sub in the left navigation menu of the Google Cloud console. If you don't see Pub/Sub, search for it in the Google Cloud console search bar.
- Click + Create topic in the Topics tab.
- Enter a name in the Topic ID field.
- Select Add a default subscription and leave the other options blank.
- Click Create.
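
If you prefer to script this step instead of using the console, the Pub/Sub client library can create the topic and a subscription; a minimal sketch, assuming placeholder project and topic IDs:

    from google.cloud import pubsub_v1

    project_id = "your-project-id"  # placeholder
    topic_id = "your-topic-id"      # placeholder

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project_id, topic_id)
    publisher.create_topic(request={"name": topic_path})

    # Mirrors the "Add a default subscription" console option with an
    # explicit pull subscription on the new topic.
    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path(project_id, f"{topic_id}-sub")
    subscriber.create_subscription(
        request={"name": subscription_path, "topic": topic_path}
    )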
Create a Log Router sink
- Open Log Router in the Google Cloud console.
- Click Create sink.
- Enter a name and description for your sink, then click Next.
- Choose Cloud Pub/Sub topic as the sink service.
- Choose the topic you created, then click Next.
- Enter the following code in Build inclusion filter:

      logName="projects/YOUR-PROJECT-ID/logs/analyticsdata.googleapis.com%2Ffresh_bigquery_export_status"

  Replace YOUR-PROJECT-ID with the ID of your Google Cloud console project.
- Click Next, then click Create sink. You don't need to filter out any logs.
- Verify that the sink is now listed under Log Router Sinks.
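
The sink can also be created programmatically; a minimal sketch with the Cloud Logging client library, assuming placeholder names. Note that outside the console you must grant the sink's writer identity the Pub/Sub Publisher role on the topic yourself:

    from google.cloud import logging

    project_id = "your-project-id"  # placeholder

    client = logging.Client(project=project_id)

    # The same inclusion filter as in the console steps above.
    log_filter = (
        f'logName="projects/{project_id}/logs/'
        'analyticsdata.googleapis.com%2Ffresh_bigquery_export_status"'
    )
    destination = f"pubsub.googleapis.com/projects/{project_id}/topics/your-topic-id"

    sink = client.sink("fresh-export-signal", filter_=log_filter, destination=destination)
    sink.create()

    # Grant this service account roles/pubsub.publisher on the topic.
    print(sink.writer_identity)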
Join the missing data
Use a Cloud Run function to automatically execute the code to backfill traffic source data when Pub/Sub detects the completeness signal:
- Open Cloud Run functions.
- Click Create function.
- Choose Cloud Run function for the Environment.
- Enter a name for your function.
- Choose Cloud Pub/Sub as the Trigger type, and the topic you created as the Cloud Pub/Sub topic.
- Click Next, then enter your code in the box to join the Google Ads attribution data with your BigQuery export (a hedged sketch follows this list).
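
This guide leaves the join code to you; the following is a sketch of one possible function body, assuming Python with functions-framework, a hypothetical ads_attribution table you populate from Google Ads in the earlier query step, and placeholder project, dataset, and table names:

    import base64
    import json

    import functions_framework
    from google.cloud import bigquery

    @functions_framework.cloud_event
    def backfill_traffic_source(cloud_event):
        """Runs when the completeness signal is routed from Cloud Logging."""
        # Pub/Sub delivers the log entry base64-encoded inside the CloudEvent.
        message = base64.b64decode(cloud_event.data["message"]["data"]).decode("utf-8")
        log_entry = json.loads(message)
        print(f"Completeness signal received: {log_entry}")

        client = bigquery.Client()

        # Hypothetical table names: the export table for the affected date and
        # a table holding the complete attribution rows fetched from Google Ads.
        query = """
            MERGE `your-project.analytics_123456.events_fresh_20241218` AS export
            USING `your-project.ads_data.ads_attribution` AS ads
            ON export.user_pseudo_id = ads.user_pseudo_id
               AND export.event_timestamp = ads.event_timestamp
            WHEN MATCHED AND export.traffic_source.source IS NULL THEN
              UPDATE SET export.traffic_source = ads.traffic_source
        """
        client.query(query).result()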