# Using Media Tracks

A media track can be an audio or video stream object, or a text object (subtitle
or caption).

| **Note:** The [Styled](/cast/docs/styled_receiver) and [Default Media](/cast/docs/web_receiver#default_media_web_receiver) Receivers support only text tracks through this API. To work with audio and video tracks, you must develop a [Custom Receiver](/cast/docs/web_receiver/basic).

A [`GCKMediaTrack`](/cast/docs/reference/ios/interface_g_c_k_media_track)
object represents a track. It consists of a unique numeric identifier and other
attributes such as a content ID and title. A `GCKMediaTrack` instance can be
created as follows:

Swift

```swift
let captionsTrack = GCKMediaTrack.init(identifier: 1,
                                       contentIdentifier: "https://some-url/caption_en.vtt",
                                       contentType: "text/vtt",
                                       type: GCKMediaTrackType.text,
                                       textSubtype: GCKMediaTextTrackSubtype.captions,
                                       name: "English Captions",
                                       languageCode: "en",
                                       customData: nil)
```

Objective-C

```objective-c
GCKMediaTrack *captionsTrack =
    [[GCKMediaTrack alloc] initWithIdentifier:1
                            contentIdentifier:@"https://some-url/caption_en.vtt"
                                  contentType:@"text/vtt"
                                         type:GCKMediaTrackTypeText
                                  textSubtype:GCKMediaTextTrackSubtypeCaptions
                                         name:@"English Captions"
                                 languageCode:@"en"
                                   customData:nil];
```
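
If your app also supplies alternate audio streams (which, as the note above explains, only take effect on a Custom Receiver), the same initializer describes an audio track. The following is a minimal sketch with a hypothetical stream URL and identifier:

Swift

```swift
// Minimal sketch of an alternate audio track. The URL and identifier are
// hypothetical; audio and video tracks only take effect on a Custom Receiver.
let audioTrack = GCKMediaTrack.init(identifier: 2,
                                    contentIdentifier: "https://some-url/audio_es.mp4",
                                    contentType: "audio/mp4",
                                    type: GCKMediaTrackType.audio,
                                    textSubtype: GCKMediaTextTrackSubtype.unknown,
                                    name: "Spanish Audio",
                                    languageCode: "es",
                                    customData: nil)
```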
A media item can have multiple tracks; for example, it can have multiple
subtitles (each for a different language) or multiple alternative audio streams
(for different languages).
[`GCKMediaInformation`](/cast/docs/reference/ios/interface_g_c_k_media_information)
is the class that represents a media item. To associate a collection of
[`GCKMediaTrack`](/cast/docs/reference/ios/interface_g_c_k_media_track) objects
with a media item, your app should update its `mediaTracks` property. Your app
needs to make this association before it loads the media to the receiver, as in
the following code:

Swift

```swift
let tracks = [captionsTrack]

let url = URL.init(string: "https://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4")
guard let mediaURL = url else {
  print("invalid mediaURL")
  return
}

let mediaInfoBuilder = GCKMediaInformationBuilder.init(contentURL: mediaURL)
mediaInfoBuilder.streamType = GCKMediaStreamType.none
mediaInfoBuilder.contentType = "video/mp4"
mediaInfoBuilder.metadata = metadata
mediaInfoBuilder.mediaTracks = tracks
mediaInformation = mediaInfoBuilder.build()
```

Objective-C

```objective-c
NSArray *tracks = @[captionsTrack];

GCKMediaInformationBuilder *mediaInfoBuilder =
    [[GCKMediaInformationBuilder alloc] initWithContentURL:
        [NSURL URLWithString:@"https://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4"]];
mediaInfoBuilder.streamType = GCKMediaStreamTypeNone;
mediaInfoBuilder.contentType = @"video/mp4";
mediaInfoBuilder.metadata = metadata;
mediaInfoBuilder.mediaTracks = tracks;
self.mediaInformation = [mediaInfoBuilder build];
```

Activate one or more tracks that were associated with the media item (after the
media is loaded) by calling `setActiveTrackIDs:` on
[`GCKRemoteMediaClient`](/cast/docs/reference/ios/interface_g_c_k_remote_media_client)
and passing the IDs of the tracks to be activated. For example, the following
code activates the captions track created above.

Swift

```swift
sessionManager.currentSession?.remoteMediaClient?.setActiveTrackIDs([1])
```

Objective-C

```objective-c
[self.sessionManager.currentSession.remoteMediaClient setActiveTrackIDs:@[@1]];
```

To deactivate a track on the current media item, call
`setActiveTrackIDs:` on
[`GCKRemoteMediaClient`](/cast/docs/reference/ios/interface_g_c_k_remote_media_client)
with an empty array or nil. The following code disables the captions track.

Swift

```swift
sessionManager.currentSession?.remoteMediaClient?.setActiveTrackIDs([])
```

Objective-C

```objective-c
[self.sessionManager.currentSession.remoteMediaClient setActiveTrackIDs:@[]];
```
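
In a real app, the IDs passed to `setActiveTrackIDs:` typically come from the track list of the loaded media rather than being hard-coded. The following is a minimal sketch, assuming the media has already been loaded, of a hypothetical helper that activates every text track matching a given language code:

Swift

```swift
// Hypothetical helper: activate all text tracks for a given language code.
// Assumes the media (with its associated tracks) has already been loaded.
func activateTextTracks(matching languageCode: String,
                        using remoteMediaClient: GCKRemoteMediaClient) {
  let tracks = remoteMediaClient.mediaStatus?.mediaInformation?.mediaTracks ?? []
  let trackIDs = tracks
    .filter { $0.type == .text && $0.languageCode == languageCode }
    .map { NSNumber(value: $0.identifier) }
  // An empty array would deactivate all tracks, so only send the request
  // when there is something to activate.
  if !trackIDs.isEmpty {
    remoteMediaClient.setActiveTrackIDs(trackIDs)
  }
}
```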
Style text tracks
-----------------

The
[`GCKMediaTextTrackStyle`](/cast/docs/reference/ios/interface_g_c_k_media_text_track_style)
class encapsulates the style information of a text track. A track style can be
applied to the currently playing media item by calling
[`-[GCKRemoteMediaClient setTextTrackStyle:]`](/cast/docs/reference/ios/interface_g_c_k_remote_media_client).
The track style created in the code below sets the text color to red (`FF0000`)
at 50% opacity (alpha `80`) and uses a serif font.

Swift

```swift
let textTrackStyle = GCKMediaTextTrackStyle.createDefault()
textTrackStyle.foregroundColor = GCKColor.init(cssString: "#FF000080")
textTrackStyle.fontFamily = "serif"
styleChangeRequest = sessionManager.currentSession?.remoteMediaClient?.setTextTrackStyle(textTrackStyle)
styleChangeRequest?.delegate = self
```

Objective-C

```objective-c
GCKMediaTextTrackStyle *textTrackStyle = [GCKMediaTextTrackStyle createDefault];
[textTrackStyle setForegroundColor:[[GCKColor alloc] initWithCSSString:@"#FF000080"]];
[textTrackStyle setFontFamily:@"serif"];
self.styleChangeRequest = [self.sessionManager.currentSession.remoteMediaClient setTextTrackStyle:textTrackStyle];
self.styleChangeRequest.delegate = self;
```

You can use the returned
[`GCKRequest`](/cast/docs/reference/ios/interface_g_c_k_request) object to
track this request.

Swift

```swift
// MARK: - GCKRequestDelegate

func requestDidComplete(_ request: GCKRequest) {
  if request == styleChangeRequest {
    print("Style update completed.")
    styleChangeRequest = nil
  }
}
```

Objective-C

```objective-c
#pragma mark - GCKRequestDelegate

- (void)requestDidComplete:(GCKRequest *)request {
  if (request == self.styleChangeRequest) {
    NSLog(@"Style update completed.");
    self.styleChangeRequest = nil;
  }
}
```

See [Status updates](/cast/docs/ios_sender_advanced#receive_status_updates)
below for more information. Apps should allow users to update the style for text
tracks, either using the settings provided by the system or by the app itself.
There is a default style provided (in iOS 7 and later), which can be retrieved
via the static method [`+[GCKMediaTextTrackStyle
createDefault]`](/cast/docs/reference/ios/interface_g_c_k_media_text_track_style).
The following text track style elements can be changed:

- Foreground (text) color and opacity
- Background color and opacity
- Edge type
- Edge color
- Font scale
- Font family
- Font style

Receive status updates
----------------------

When multiple senders are connected to the same receiver, it is important
for each sender to be aware of changes on the receiver, even if those
changes were initiated by other senders.

| **Note:** This is important for all apps, not only those that explicitly support multiple senders, because some Cast devices have control inputs (remotes, buttons) that behave as virtual senders, affecting the status on the receiver.

To ensure your sender receives status updates from the receiver, your app should
register a
[`GCKRemoteMediaClientListener`](/cast/docs/reference/ios/protocol_g_c_k_remote_media_client_listener-p).
If the
[`GCKMediaTextTrackStyle`](/cast/docs/reference/ios/interface_g_c_k_media_text_track_style)
of the current media changes, then all of the connected senders are notified
through both the `-[remoteMediaClient:didUpdateMediaMetadata:]` and the
`-[remoteMediaClient:didUpdateMediaStatus:]` callbacks. In this case, the
Receiver SDK does not verify whether the new style is different from the
previous one and notifies all the connected senders regardless. If, however,
the list of active tracks is updated, only the
`-[remoteMediaClient:didUpdateMediaStatus:]` callback is invoked on the
connected senders.
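
The following is a minimal sketch of registering for these callbacks from a view controller. The class is illustrative, and the Swift method names assume the standard bridging of the Objective-C selectors named above:

Swift

```swift
import GoogleCast
import UIKit

// Illustrative view controller that listens for remote media updates.
class MediaViewController: UIViewController, GCKRemoteMediaClientListener {

  override func viewDidLoad() {
    super.viewDidLoad()
    // Register as a listener on the current session's remote media client.
    GCKCastContext.sharedInstance().sessionManager.currentSession?
      .remoteMediaClient?.add(self)
  }

  // Invoked when the media status changes, including active-track updates.
  func remoteMediaClient(_ client: GCKRemoteMediaClient,
                         didUpdate mediaStatus: GCKMediaStatus?) {
    let activeTrackIDs = mediaStatus?.activeTrackIDs ?? []
    print("Active track IDs are now \(activeTrackIDs)")
  }

  // Invoked when the media metadata changes, for example after a text track
  // style update.
  func remoteMediaClient(_ client: GCKRemoteMediaClient,
                         didUpdate mediaMetadata: GCKMediaMetadata?) {
    print("Media metadata updated.")
  }
}
```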
| **Note:** The list of tracks associated with the currently loaded media cannot be changed. If needed, you have to update the track information on the [`GCKMediaInformation`](/cast/docs/reference/ios/interface_g_c_k_media_information) object and reload the media, as sketched at the end of this page.

Satisfy CORS requirements
-------------------------

For adaptive media streaming, Google Cast requires the presence of CORS headers,
and even simple mp4 media streams require CORS once they include tracks. If you
want to enable tracks for any media, you must enable CORS for both your track
streams and your media streams. So, if you do not have CORS headers available
for your simple mp4 media on your server and you then add a simple subtitle
track, you will not be able to stream your media unless you update your server
to include the appropriate CORS headers. In addition, you need to allow at least
the following headers: `Content-Type`, `Accept-Encoding`, and `Range`. Note that
the last two are additional headers that you may not have needed previously.
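
As the note above explains, the track list of already loaded media cannot be changed in place; the media has to be reloaded with an updated `GCKMediaInformation`. The following is a minimal sketch of such a reload, assuming `mediaURL`, `metadata`, and the track objects from the earlier examples:

Swift

```swift
// Rebuild the media information with the updated track list and reload it.
// `mediaURL`, `metadata`, `captionsTrack`, and `audioTrack` are assumed to be
// the objects created in the earlier examples.
let mediaInfoBuilder = GCKMediaInformationBuilder.init(contentURL: mediaURL)
mediaInfoBuilder.streamType = GCKMediaStreamType.none
mediaInfoBuilder.contentType = "video/mp4"
mediaInfoBuilder.metadata = metadata
mediaInfoBuilder.mediaTracks = [captionsTrack, audioTrack]

let loadRequestDataBuilder = GCKMediaLoadRequestDataBuilder()
loadRequestDataBuilder.mediaInformation = mediaInfoBuilder.build()

GCKCastContext.sharedInstance().sessionManager.currentSession?
  .remoteMediaClient?.loadMedia(with: loadRequestDataBuilder.build())
```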