Get Started
When you work with your sales or support contact to set up access to Data Transfer v2.0, you will be provided with a bucket name. You will need to give your sales contact a Google Group, which lets you control access to your data files in Google Cloud Storage.
You can choose to access your data using a utility, or you can write your own code.
Access data using gsutil
The gsutil tool is a command-line application, written in Python, that lets you access your data without having to do any coding.
You could, for example, use gsutil as part of a script or batch file instead of creating custom applications.
To get started with gsutil, read the gsutil documentation. The tool will prompt you for your credentials the first time you use it, and then store them for later use.
gsutil examples
You can list all of your files using gsutil as follows:
gsutil ls gs://[bucket_name]/[object name/file name]
gsutil uses much of the same syntax as UNIX, including the wildcard asterisk (*), so you can list all NetworkImpression files:
gsutil ls gs://[bucket_name]/dcm_account6837_impression_*
It's also easy to download a file:
gsutil cp gs://[bucket_name]/dcm_account6837_impression_2015120100.log.gz .
You can copy your files from the dispersed DT Google buckets to your own Google API GCS bucket using a Unix shell script. There are two options:

- In gsutil, if you are using a Unix system, run the following for all your buckets daily:

```bash
day=$(date --date="1 days ago" +"%m-%d-%Y")
gsutil -m cp gs://{<dcmhashid_A>,<dcmhashid_B>,etc.}/*$day*.log.gz gs://<client_bucket>/
```

- Alternatively, a slightly trickier solution is to use a bash file:

```bash
#!/bin/bash

set -x

buckets=(<dcmhashid_A> <dcmhashid_B>) # include all hash ids
day=$(date --date="1 days ago" +"%m-%d-%Y")
for b in "${buckets[@]}"; do
  gsutil -m cp "gs://$b/*$day*.log.gz" gs://<client_bucket>/
done
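The daily copy described above can also be sketched in Python, which avoids shell date quoting. This is a minimal illustration, not part of the official tooling: it only builds the gsutil command string, and the bucket names are placeholders.

```python
from datetime import date, timedelta

def daily_copy_command(dt_buckets, client_bucket, today=None):
    """Build the gsutil command that copies yesterday's DT log files
    from each DT bucket into your own bucket. Bucket names here are
    placeholders, not real DT bucket ids."""
    day = (today or date.today()) - timedelta(days=1)
    stamp = day.strftime("%m-%d-%Y")  # matches the *$day* pattern above
    sources = " ".join(f"gs://{b}/*{stamp}*.log.gz" for b in dt_buckets)
    return f"gsutil -m cp {sources} gs://{client_bucket}/"
```

You could pass the returned string to a scheduler or `subprocess.run` in a daily cron job.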
Access data programmatically
Google Cloud Storage has APIs and samples for many programming languages that allow you to access your data programmatically. Below are the steps specific to Data Transfer v2.0 that you must take to build a working integration.
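As a small illustration of programmatic access, the sketch below builds the Cloud Storage JSON API request URL for listing the objects in a bucket with a given name prefix. The bucket name is a placeholder; in a real integration you would attach an OAuth access token for your service account (set up in the steps that follow) to the request.

```python
from urllib.parse import quote, urlencode

def list_objects_url(bucket: str, prefix: str = "") -> str:
    """Return the Cloud Storage JSON API URL that lists the objects
    in `bucket`, optionally filtered by an object-name prefix."""
    url = f"https://storage.googleapis.com/storage/v1/b/{quote(bucket, safe='')}/o"
    if prefix:
        url += "?" + urlencode({"prefix": prefix})
    return url
```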
Get a service account
To get started using Data Transfer v2.0, first use the setup tool, which guides you through creating a project in the Google API Console, enabling the API, and creating credentials.
To set up a new service account, do the following:
- Click **Create credentials > Service account key**.
- Choose whether to download the service account's public/private key as a standard P12 file, or as a JSON file that can be loaded by a Google API client library.
Your new public/private key pair is generated and downloaded to your machine; it serves as the only copy of this key, and you are responsible for storing it securely.

Note: If you plan to access Google Cloud Storage using the JSON API, you must also verify that the Google Cloud Storage JSON API component is activated.

Be sure to keep this window open; you will need the service account email in the next step.
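The service account email you need in the next step is the `client_email` field of the downloaded JSON key. A minimal stdlib sketch for pulling it out of the key file (the file path is a placeholder):

```python
import json

def service_account_email(key_path: str) -> str:
    """Read the client_email field from a downloaded JSON key file."""
    with open(key_path) as fh:
        return json.load(fh)["client_email"]
```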
Add the service account to your group
- Go to Google Groups
- Click **My Groups** and select the group you use for managing access to your DT v2.0 Cloud Storage bucket
- Click **Manage**
- **Do not click Invite Members!**
- Click **Direct add members**
- Copy the service account email from the previous step into the members box
- Select **No email**
- Click the **Add** button
I accidentally clicked Invite Members
- Don't panic! You can fix it
- Head back to the **Manage** screen as before
- Click **Outstanding invitations**
- Find the service account and select it
- Click **Revoke invitation** at the top of the screen
- Click **Direct add members** and resume the steps above
Scope
**Any scopes passed to Cloud Storage must be read-only.**

For example, when using the Java client library, the correct scope to use is:
StorageScopes.DEVSTORAGE_READ_ONLY
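The Java constant above resolves to the `devstorage.read_only` OAuth scope URL. In Python you would pass the same scope string when loading your service-account key; the google-auth call below is shown as a comment to keep the sketch self-contained, and the key path is a placeholder.

```python
# The read-only OAuth scope for Cloud Storage. Any scope you request
# for DT v2.0 access must be read-only.
DEVSTORAGE_READ_ONLY = "https://www.googleapis.com/auth/devstorage.read_only"

# With the google-auth library installed, you would use it like this:
#
#   from google.oauth2 import service_account
#   creds = service_account.Credentials.from_service_account_file(
#       "key.json", scopes=[DEVSTORAGE_READ_ONLY])
```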
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2025-08-31 UTC.