For comprehensive and automated dialog testing, you can switch to the Test View in Alan AI Studio. The Test View lets you create test cases and repeatedly run them against new versions of the dialog script. This way, you can quickly validate your dialog and make sure all requirements for the conversational flow are met at any stage of the development process.
To test a dialog script in the Test View:
In the top right corner of the code editor, click Test.
In the left pane, click Add Case, specify the case name and click OK.
In the dialog box, create a dialog branch you want to test:
Enter the commands that the user will give while following the dialog branch.
To validate Alan’s responses, in the left pane of the code editor, set the Show Expected Responses toggle to the On position. Then hover over a user’s command in the branch and click the Add expected response icon. In the box below, enter one or more phrases with which the assistant must reply.
To specify compound responses, use the OR and AND operators. If a phrase contains an apostrophe, enclose it in double quotes, for example: hello OR (hi AND "what's up").
To set the visual state for the dialog branch or a specific command, click the Set visual state icon and enter the visual state as JSON. For details, see Setting the visual state.
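For example, a visual state entered in the Test View as JSON might describe which screen the app is showing. The sketch below stubs the Alan AI runtime with a plain object and assumes the dialog script reads that state through p.visual, as Alan AI dialog scripts do; the screen name and response phrases are hypothetical:

```javascript
// A hypothetical intent handler that branches on the visual state.
// In a real dialog script this function body would sit inside intent(...).
function intentHandler(p) {
  if (p.visual.screen === 'checkout') {
    p.play('You are on the checkout screen. Ready to pay?');
  } else {
    p.play('How can I help?');
  }
}

// Simulate running a test-case command with the visual state
// {"screen": "checkout"} applied, as set in the Test View.
const replies = [];
intentHandler({
  visual: JSON.parse('{"screen": "checkout"}'),
  play: (r) => replies.push(r),
});
console.log(replies[0]); // "You are on the checkout screen. Ready to pay?"
```

Setting different visual states on individual commands lets one test case cover both branches of the condition above.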
To test the dialog branch in your app or in Alan AI Playground, click Scan QR and scan the code with your device camera. For details, see In-app testing.
To add another dialog branch, click the plus icon to the right of the dialog box and create the branch as described above.
Once your test case is set up, in the left pane of the code editor, click Run all. Alan AI will run the test case against the current version of the dialog script and mark each dialog branch as passed or failed.
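Conceptually, a test run replays each user command against the dialog script and compares the assistant's reply with the expected responses. The sketch below is not the Alan AI runtime: it stubs intent() and p.play() with a minimal registry to show how a command could be matched and its reply validated; the pattern and phrases are hypothetical:

```javascript
// Stub intent registry standing in for the Alan AI runtime.
const intents = [];
function intent(pattern, handler) { intents.push({ pattern, handler }); }

// A dialog script fragment of the kind a test case would exercise.
intent('(hello|hi)', (p) => p.play("Hi! What's up?"));

// Replay a user command and capture the assistant's reply,
// roughly what the Test View does for each command in a branch.
function runCommand(command) {
  let reply = null;
  const match = intents.find(
    (i) => new RegExp(`^${i.pattern}$`).test(command)
  );
  if (match) match.handler({ play: (r) => { reply = r; } });
  return reply;
}

console.log(runCommand('hello')); // "Hi! What's up?"
```

A test case for this fragment could assert the reply with an expected-response expression such as "what's up" OR "how can I help"; the branch fails if the reply matches neither phrase.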
The Alan AI Platform allows you to easily save and share the created test cases. You can:
Export and import test cases to save and copy them between projects
Save test cases to GitHub to share them with other team members
Exporting and importing test cases¶
To save a test case for later or transfer it to another project on your account, use the Import/Export option in Alan AI Studio.
In the left pane of the Test View, click Import/Export.
In the top right corner, click Copy to copy the raw test case data, and save it to a file.
To import a test case to an Alan AI Studio project, in the Test View, click Import/Export, paste the copied data and click Apply Changes.
If a test case with the same name already exists in your project, Alan AI will replace the existing test case with the imported one.
Saving test cases to GitHub¶
You can integrate your assistant project with GitHub and store the created test cases in a GitHub repository, just like dialog script files. This option can be helpful, for example, if you want to share test cases with other team members working on your assistant. For details, see Sharing and keeping scripts in GitHub.
Dialog scripts and test cases are saved to the same target. Once you synchronize with GitHub, Alan AI pulls the content from the selected repository and branch and adds it to your project:
Dialog script files (JS) are added to the code editor
Test case files (TESTCASE) are added to the Test View
When you push the changes to GitHub, Alan AI pushes changes made to both dialog scripts and test cases. In case of a conflict, Alan AI will prompt you to resolve conflicts for dialog scripts and test cases sequentially.