Get Started
When you work with your sales or support contact to set up access to Data Transfer v2.0, you will be provided with a bucket name. You will need to provide your sales contact a Google Group, which enables you to control access to your data files in Google Cloud Storage.
You can choose to access your data using a utility, or you can write your own code.
Access data using gsutil
The gsutil tool is a command-line application, written in Python, that lets you access your data without having to do any coding. You could, for example, use gsutil as part of a script or batch file instead of creating custom applications.
To get started with gsutil, read the gsutil documentation. The tool will prompt you for your credentials the first time you use it and then store them for later use.
gsutil examples
You can list all of your files using gsutil as follows:
gsutil ls gs://[bucket_name]/[object name/file name]
gsutil uses much of the same syntax as UNIX, including the wildcard asterisk (*), so you can list all NetworkImpression files:
gsutil ls gs://[bucket_name]/dcm_account6837_impression_*
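The wildcard semantics mirror shell globbing; as a local sketch, Python's fnmatch module applies the same pattern (the object names below are hypothetical examples following the dcm_account<id>_<type>_<YYYYMMDDHH> naming convention shown above):

```python
import fnmatch

# Hypothetical object names following the DT naming convention above.
objects = [
    "dcm_account6837_impression_2015120100.log.gz",
    "dcm_account6837_impression_2015120200.log.gz",
    "dcm_account6837_click_2015120100.log.gz",
]

# The same pattern gsutil expands against the bucket listing:
impressions = fnmatch.filter(objects, "dcm_account6837_impression_*")
# impressions contains the two impression files; the click file is excluded
```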
It's also easy to download a file:
gsutil cp gs://[bucket_name]/dcm_account6837_impression_2015120100.log.gz
You can copy your files from the dispersed DT Google buckets to your own Google API GCS bucket using a Unix shell script. There are two options:
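The original page sketches these options as a daily one-liner and a bash file; here is a cleaned-up version of the script (assuming GNU date; dcmhashid_A, dcmhashid_B, and client_bucket are placeholders you must replace with your DT bucket hash IDs and your own destination bucket):

```shell
#!/bin/bash
# Sketch only: dcmhashid_A, dcmhashid_B, and client_bucket are placeholders.

copy_yesterdays_logs() {
  local day b
  day=$(date --date="1 days ago" +"%m-%d-%Y")   # GNU date
  local buckets=(dcmhashid_A dcmhashid_B)       # list all of your hash IDs
  for b in "${buckets[@]}"; do
    # Quote the glob so gsutil, not the local shell, expands the wildcard.
    gsutil -m cp "gs://$b/*${day}*.log.gz" gs://client_bucket/
  done
}
```

The first option in the source is equivalent to a single command using your shell's brace expansion, e.g. `gsutil -m cp gs://{dcmhashid_A,dcmhashid_B}/*$day*.log.gz gs://client_bucket/`; the loop above is easier to extend when you have many buckets.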
Access data programmatically
Google Cloud Storage has APIs and samples for many programming languages that allow you to access your data programmatically. Below are the steps specific to Data Transfer v2.0 that you must take to build a working integration.
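As one illustration (not an official sample), assuming the Python google-cloud-storage client library, listing one hour of impression files might look like this; the bucket name and account ID are hypothetical:

```python
def impression_prefix(account_id: int, date_yyyymmddhh: str) -> str:
    """Build the object-name prefix used by DT v2.0 impression files."""
    return f"dcm_account{account_id}_impression_{date_yyyymmddhh}"

def list_impressions(bucket_name: str, account_id: int, date_yyyymmddhh: str):
    """List impression file names for one hour; needs google-cloud-storage."""
    # Imported here so impression_prefix stays usable without the package.
    from google.cloud import storage
    client = storage.Client()  # picks up credentials from the environment
    prefix = impression_prefix(account_id, date_yyyymmddhh)
    return [blob.name for blob in client.list_blobs(bucket_name, prefix=prefix)]
```

For example, `list_impressions("your-dt-bucket", 6837, "2015120100")` would return the names matching the `dcm_account6837_impression_2015120100*` pattern used in the gsutil examples above.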
Get a service account
To get started using Data Transfer v2.0, you first need to use the setup tool, which guides you through creating a project in the Google API Console, enabling the API, and creating credentials.
To set up a new service account, do the following:
- Click Create credentials > Service account key.
- Choose whether to download the service account's public/private key as a standard P12 file, or as a JSON file that can be loaded by a Google API client library.
Your new public/private key pair is generated and downloaded to your machine; it serves as the only copy of this key. You are responsible for storing it securely.
Note: If you plan to access Google Cloud Storage using the JSON API, you must also verify that the Google Cloud Storage JSON API component is activated.
Be sure to keep this window open; you will need the service account email in the next step.
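If you chose the JSON key format, the google-auth library can load it with read-only scope; a minimal sketch, where "key.json" is a placeholder for the file downloaded above:

```python
# The only scope permitted for DT v2.0 buckets is read-only.
READ_ONLY_SCOPE = "https://www.googleapis.com/auth/devstorage.read_only"

def make_read_only_credentials(key_path: str):
    """Load service-account credentials; needs the google-auth package."""
    # Imported here so the scope constant stays importable without google-auth.
    from google.oauth2 import service_account
    return service_account.Credentials.from_service_account_file(
        key_path, scopes=[READ_ONLY_SCOPE]
    )
```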
Add a service account to your group
- Go to Google Groups
- Click My Groups and select the group you use for managing access to your DT v2.0 Cloud Storage bucket
- Click Manage
- Do not click Invite Members!
- Click Direct add members
- Copy the service account email from the previous step into the members box
- Select No email
- Click the Add button
I accidentally clicked Invite Members
- Don't panic! You can fix it:
- Head back to the Manage screen as before
- Click Outstanding Invitations
- Find the service account and select it
- Click Revoke invitation at the top of the screen
- Click Direct add members and resume the steps above
Scope
Any scopes passed to Cloud Storage must be read-only.
For example, when using the Java client library, the correct scope to use is:
StorageScopes.DEVSTORAGE_READ_ONLY
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated (UTC): 2025-08-31.