Configure your web app

A web app is the user interface (UI) for an Action that uses Interactive Canvas. You can use existing web technologies (such as HTML, CSS, JavaScript, and WebAssembly) to design and develop your web app. For the most part, Interactive Canvas can render web content like a browser, but there are a few restrictions enforced for user privacy and security. Before you begin designing your UI, consider the design principles outlined in Design guidelines. We recommend using Firebase Hosting to deploy your web app.

The HTML and JavaScript for your web app define your UI and handle communication between your web app and your Conversational Action.

This page goes over the recommended ways to build your web app, how to enable communication between your Conversational Action and your web app, and general guidelines and restrictions.

Although you can use any method to build your UI, Google recommends using lightweight libraries that are optimized for animation and rendering; the sample on this page uses PixiJS for graphics rendering.

Architecture

Google strongly recommends using a single-page application architecture. This approach allows for optimal performance and supports a continuous conversational user experience. Interactive Canvas can be used in conjunction with front-end frameworks like Vue, Angular, and React, which help with state management.

HTML file

The HTML file defines how your UI looks. This file also loads the Interactive Canvas API, which enables communication between your web app and your Conversational Action.

HTML

<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width,initial-scale=1">
    <title>Interactive Canvas Sample</title>

    <!-- Disable favicon requests -->
    <link rel="shortcut icon" type="image/x-icon" href="data:image/x-icon;,">

    <!-- Load Interactive Canvas JavaScript -->
    <script src="https://www.gstatic.com/assistant/interactivecanvas/api/interactive_canvas.min.js"></script>

    <!-- Load PixiJS for graphics rendering -->
    <script src="https://cdnjs.cloudflare.com/ajax/libs/pixi.js/4.8.7/pixi.min.js"></script>

    <!-- Load Stats.js for fps monitoring -->
    <script src="https://cdnjs.cloudflare.com/ajax/libs/stats.js/r16/Stats.min.js"></script>

    <!-- Load custom CSS -->
    <link rel="stylesheet" href="css/main.css">
  </head>
  <body>
    <div id="view" class="view">
      <div class="debug">
        <div class="stats"></div>
        <div class="logs"></div>
      </div>
    </div>
    <!-- Load custom JavaScript after elements are on page -->
    <script src="js/log.js"></script>
    <script type="module" src="js/main.js"></script>
  </body>
</html>

Communicate between Conversational Action and web app

After you've built your web app and Conversational Action and loaded the Interactive Canvas library in your web app, define how your web app and Conversational Action interact. To do this, modify the files that contain your web app logic.

action.js

This file contains the code to define callbacks and invoke methods through interactiveCanvas. Callbacks allow your web app to respond to information or requests from the Conversational Action, while methods provide a way to send information or requests to the Conversational Action.
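One such method is sendTextQuery(), which sends a text query to your Conversational Action as if the user had said it. The following is a minimal sketch of that idea; the sendGuess() helper and its query text are illustrative and not part of the sample:

JavaScript

/**
 * Illustrative helper that reports a user's on-screen choice back to the
 * Conversational Action as a text query.
 */
export function sendGuess(letter) {
  // interactiveCanvas is available globally once interactive_canvas.min.js has loaded.
  window.interactiveCanvas.sendTextQuery(`My guess is ${letter}`)
      .then((state) => {
        // The returned promise resolves with the state of the request,
        // for example whether the query was accepted or blocked.
        console.log(`sendTextQuery state: ${state}`);
      });
}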

Add interactiveCanvas.ready(callbacks); to your web app to initialize and register callbacks:

JavaScript

/**
 * This class is used as a wrapper for Google Assistant Canvas Action class
 * along with its callbacks.
 */
export class Action {
  /**
   * @param {Phaser.Scene} scene which serves as a container of all visual
   * and audio elements.
   */
  constructor(scene) {
    this.canvas = window.interactiveCanvas;
    this.gameScene = scene;
    const that = this;
    this.intents = {
      GUESS: function(params) {
        that.gameScene.guess(params);
      },
      DEFAULT: function() {
        // do nothing, when no command is found
      },
    };
  }

  /**
   * Register all callbacks used by the Interactive Canvas Action
   * executed during game creation time.
   */
  setCallbacks() {
    const that = this;
    // Declare the Interactive Canvas action callbacks.
    const callbacks = {
      onUpdate(data) {
        const intent = data[0].google.intent;
        // Guard against a missing intent so the DEFAULT handler doesn't throw.
        that.intents[intent ? intent.name.toUpperCase() : 'DEFAULT'](
            intent ? intent.params : undefined);
      },
    };
    // Called by the Interactive Canvas web app once web app has loaded to
    // register callbacks.
    this.canvas.ready(callbacks);
  }
}

main.js

The main.js JavaScript module imports the files action.js and scene.js and creates instances of each of them when the web app loads. This module also registers callbacks for Interactive Canvas.

JavaScript

import {Action} from './action.js';
import {Scene} from './scene.js';

window.addEventListener('load', () => {
  window.scene = new Scene();
  // Set Google Assistant Canvas Action at scene level
  window.scene.action = new Action(scene);
  // Call setCallbacks to register Interactive Canvas
  window.scene.action.setCallbacks();
});

scene.js

The scene.js file constructs the scene for your web app. The following is an excerpt from scene.js:

JavaScript

const view = document.getElementById('view');

// initialize rendering and set correct sizing
this.radio = window.devicePixelRatio;
this.renderer = PIXI.autoDetectRenderer({
  transparent: true,
  antialias: true,
  resolution: this.radio,
  width: view.clientWidth,
  height: view.clientHeight,
});
this.element = this.renderer.view;
this.element.style.width = `${this.renderer.width / this.radio}px`;
this.element.style.height = `${(this.renderer.height / this.radio)}px`;
view.appendChild(this.element);

// center stage and normalize scaling for all resolutions
this.stage = new PIXI.Container();
this.stage.position.set(view.clientWidth / 2, view.clientHeight / 2);
this.stage.scale.set(Math.max(this.renderer.width, this.renderer.height) / 1024);

// load a sprite from a svg file
this.sprite = PIXI.Sprite.from('triangle.svg');
this.sprite.anchor.set(0.5);
this.sprite.tint = 0x00FF00; // green
this.sprite.spin = true;
this.stage.addChild(this.sprite);

// toggle spin on touch events of the triangle
this.sprite.interactive = true;
this.sprite.buttonMode = true;
this.sprite.on('pointerdown', () => {
  this.sprite.spin = !this.sprite.spin;
});

Support touch interactions

Your Interactive Canvas Action can respond to your user's touch as well as their vocal inputs. Per the Interactive Canvas design guidelines, you should develop your Action to be "voice-first"; however, some smart displays also support touch interactions.

Supporting touch is similar to supporting conversational responses; however, instead of a vocal response from the user, your client-side JavaScript looks for touch interactions and uses those to change elements in the web app.

You can see an example of this in the sample, which uses the PixiJS library:

JavaScript

  
this.sprite = PIXI.Sprite.from('triangle.svg');
this.sprite.interactive = true; // Enables interaction events
this.sprite.buttonMode = true; // Changes `cursor` property to `pointer` for PointerEvent
this.sprite.on('pointerdown', () => {
  this.sprite.spin = !this.sprite.spin;
});

Troubleshooting

You can use the simulator in the Actions console to test your Interactive Canvas Action during development. In production, you can also see errors that occur within your Interactive Canvas web app on users' devices by checking your Google Cloud Platform logs.

To see these error messages in your Google Cloud Platform logs, follow these steps:

  1. Open your Actions project in the Actions console .
  2. Click Test in the top navigation.
  3. Click the View logs in Google Cloud Platform link.

Errors from your users' devices appear in chronological order in the logs viewer.

Error types

There are three types of web app errors you can see in the Google Cloud Platform logs:

  • Timeouts that occur when ready is not called within 10 seconds
  • Timeouts that occur when the promise returned by onUpdate() is not fulfilled within 10 seconds
  • JavaScript runtime errors that are not caught within your web app
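To avoid the two timeout errors, call ready() as soon as your web app loads and make sure the promise returned by onUpdate() settles quickly. The following is a minimal sketch of that pattern; renderState() is a hypothetical function standing in for your app's own rendering logic:

JavaScript

window.addEventListener('load', () => {
  // Register callbacks immediately so ready() is called well within 10 seconds.
  window.interactiveCanvas.ready({
    onUpdate(data) {
      // Return a promise that settles quickly; defer any long-running work.
      return new Promise((resolve) => {
        renderState(data[0]); // hypothetical app-specific rendering function
        resolve();
      });
    },
  });
});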

View JavaScript error details

The details of JavaScript runtime errors within your web app aren't available by default. To see the details of JavaScript runtime errors, follow these steps:

  1. Ensure that you've configured the appropriate cross-origin resource sharing (CORS) HTTP response headers in your web app files. For more information, see Cross-origin resource sharing.
  2. Add crossorigin="anonymous" to your imported <script> tags in your HTML file, as shown in the following code snippet:
 <script crossorigin="anonymous" src="<SRC>"></script> 

Guidelines and restrictions

Take the following guidelines and restrictions into consideration as you develop your web app:

  • No cookies
  • No local storage
  • No geolocation
  • No camera usage
  • No audio or video recording
  • No popups
  • Stay under the 200 MB memory limit
  • Take the Action name header into account when rendering content (occupies upper portion of screen)
  • No styles can be applied to videos
  • Only one media element may be used at a time
  • No Web SQL database
  • No support for the SpeechRecognition interface of the Web Speech API
  • Dark mode setting not applicable
  • Video playback is supported on smart displays. For more information on the supported media container formats and codecs, see Google Nest Hub codecs.

Cross-origin resource sharing

Because Interactive Canvas web apps are hosted in an iframe and the origin is set to null, you must enable cross-origin resource sharing (CORS) for your web servers and storage resources. This process allows your assets to accept requests from null origins.
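For example, if you deploy with Firebase Hosting as recommended above, you could serve your assets with a CORS header configured in firebase.json. The following is a minimal sketch, not a complete hosting configuration; adjust the source pattern and allowed origin to your needs:

JSON

{
  "hosting": {
    "public": "public",
    "headers": [
      {
        "source": "**",
        "headers": [
          { "key": "Access-Control-Allow-Origin", "value": "*" }
        ]
      }
    ]
  }
}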

Next steps

To add more features to your web app, see Continue building with client or server-side fulfillment.
