In the following example, the webhook handler revealWord includes a custom mark in the response to the user when they've lost the game:

...
app.handle('revealWord', conv => {
  conv.add(new Simple(`<speak>Sorry, you lost.<mark name="REVEAL_WORD"/> The word is ${conv.session.params.word}.</speak>`));
  conv.add(new Canvas());
});
...
The following code snippet then registers the onTtsMark() callback, checks the name of the mark, and executes the revealCorrectWord() function, which updates the web app:

...
setCallbacks() {
  const that = this;
  // declare Assistant Canvas Action callbacks
  const callbacks = {
    onTtsMark(markName) {
      if (markName === 'REVEAL_WORD') {
        // display the correct word to the user
        that.revealCorrectWord();
      }
    },
  };
  // Register the callbacks with the Interactive Canvas web app.
  this.canvas.ready(callbacks);
}
...
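The implementation of revealCorrectWord() is not shown on this page. The following is a minimal sketch of what it might look like on the web-app side, assuming the web app has already stored the word (for example, from an earlier onUpdate() payload) and renders it into a hypothetical correct-word element:

// Minimal sketch (not from the official sample): reveal the word that the
// web app stored earlier, e.g. from an onUpdate() payload.
revealCorrectWord() {
  const wordElement = document.getElementById('correct-word');
  wordElement.textContent = this.word;
  wordElement.classList.add('revealed');
}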
onInputStatusChanged()
Warning: This API is currently in Developer Preview. You can test this API in the simulator, but do not deploy an Action that uses this feature to alpha, beta, or production channels. Actions deployed using these features will not function on end-user devices.

The onInputStatusChanged() callback notifies you when the input status changes in your Interactive Canvas Action. Input status changes indicate when the microphone opens and closes, or when Assistant is processing a query. The following events can cause the input status to change:

- The user speaking to your Action
- The user entering text in the Google Search App on Android (AGSA)
- The web app using the sendTextQuery() API to send a text query to the Action
- The Action writing to home storage and other Assistant events

The primary use case for this callback is synchronizing your Action with the user's voice interactions. For example, if a user is playing an Interactive Canvas game and opens the microphone, you can pause the game while the user speaks (see the sketch at the end of this section). You can also wait until the microphone is open before sending a text query to Assistant, to make sure Assistant receives it.
This API reports the following statuses:

- LISTENING - Indicates that the microphone is open.
- IDLE - Indicates that the microphone is closed.
- PROCESSING - Indicates that Assistant is currently executing a query, and the microphone is closed.

The API reports the input status to your Action each time the status changes. While any transition between states is possible, the following flows are common:

- IDLE > LISTENING > PROCESSING > IDLE - The user says a query, the query is processed, and the microphone closes.
- IDLE > PROCESSING > IDLE - The web app uses the sendTextQuery() API to send a text query to the Action.
- IDLE > LISTENING > IDLE - The user opens the microphone but does not say a query.

To use this feature in your Action, add onInputStatusChanged() to your web app code, as shown in the following snippet:

onInputStatusChanged(inputStatus) {
  console.log("The new input status is: ", inputStatus);
}

The onInputStatusChanged() callback passes back a single enum parameter, inputStatus. You can check this value to determine the current input status. The inputStatus value can be LISTENING, PROCESSING, or IDLE.

Next, add onInputStatusChanged() to the callbacks object to register it, as shown in the following snippet:

/**
 * Register all callbacks used by the Interactive Canvas Action
 * executed during game creation time.
 */
setCallbacks() {
  const that = this;
  // Declare the Interactive Canvas action callbacks.
  const callbacks = {
    onUpdate(data) {
      console.log('Received data', data);
    },
    onInputStatusChanged(inputStatus) {
      console.log("The new input status is: ", inputStatus);
    },
  };
  // Called by the Interactive Canvas web app once the web app has loaded to
  // register callbacks.
  this.canvas.ready(callbacks);
}
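To illustrate the pause-while-speaking use case described earlier, the following is a minimal sketch that extends the callbacks object with status handling. It is not part of the official sample: pauseGame() and resumeGame() are hypothetical methods of the web app, and the sketch assumes the inputStatus enum is delivered as the strings 'LISTENING', 'PROCESSING', and 'IDLE'.

setCallbacks() {
  const that = this;
  const callbacks = {
    onInputStatusChanged(inputStatus) {
      // Assumption: the enum value arrives as a string.
      if (inputStatus === 'LISTENING' || inputStatus === 'PROCESSING') {
        // The microphone is open or Assistant is handling a query,
        // so stop the game loop (hypothetical method).
        that.pauseGame();
      } else if (inputStatus === 'IDLE') {
        // The microphone is closed and no query is in flight,
        // so continue the game loop (hypothetical method).
        that.resumeGame();
      }
    },
  };
  // Register the callbacks once the web app has loaded.
  this.canvas.ready(callbacks);
}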