@ibm-watson/food-coach
v1.1.0
A simple Node.js-based web app that shows how to use the Watson Assistant API to recognize user intents.
Demo: http://food-coach.ng.bluemix.net/
For more information on the Watson Assistant service, see the detailed documentation. For more information on the Tone Analyzer service, see the detailed documentation.
Deploying the application
If you want to experiment with the application or use it as a basis for building your own application, you need to deploy it in your own environment. You can then explore the files, make changes, and see how those changes affect the running application. After making modifications, you can deploy your modified version of the application to IBM Cloud.
Prerequisites
- Sign up for an IBM Cloud account.
- Download the IBM Cloud CLI.
- Create an instance of the Watson Assistant service and get your credentials:
  - Go to the Watson Assistant page in the IBM Cloud Catalog.
  - Log in to your IBM Cloud account.
  - Click Create.
  - Click Show to view the service credentials.
  - Copy the `apikey` value, or copy the `username` and `password` values if your service instance doesn't provide an `apikey`.
  - Copy the `url` value.
- Create an instance of the Tone Analyzer service and get your credentials:
  - Go to the Tone Analyzer page in the IBM Cloud Catalog.
  - Log in to your IBM Cloud account.
  - Click Create.
  - Click Show to view the service credentials.
  - Copy the `apikey` value, or copy the `username` and `password` values if your service instance doesn't provide an `apikey`.
  - Copy the `url` value.
Configuring the application
In your IBM Cloud console, open the Watson Assistant service instance.
Click the Import workspace icon in the Watson Assistant tool. Specify the location of the workspace JSON file in your local copy of the app project:
`<project_root>/food-coach/training/food-coach-workspace.json`
Select Everything (Intents, Entities, and Dialog) and then click Import. The Food Coach workspace is created.
Click the menu icon in the upper-right corner of the workspace tile, and then select View details.
Click the icon to copy the workspace ID to the clipboard.
In the application folder, copy the .env.example file and create a file called .env:

```
cp .env.example .env
```
Open the .env file and add the service credentials that you obtained in the previous step.

Example .env file that configures the `apikey` and `url` for a Watson Assistant service instance hosted in the US East region:

```
ASSISTANT_IAM_APIKEY=X4rbi8vwZmKpXfowaS3GAsA7vdy17Qh7km5D6EzKLHL2
ASSISTANT_URL=https://gateway-wdc.watsonplatform.net/assistant/api
```

If your service instance uses `username` and `password` credentials, add the `ASSISTANT_USERNAME` and `ASSISTANT_PASSWORD` variables to the .env file instead. Example .env file that configures the `username`, `password`, and `url` for a Watson Assistant service instance hosted in the US South region:

```
ASSISTANT_USERNAME=522be-7b41-ab44-dec3-g1eab2ha73c6
ASSISTANT_PASSWORD=A4Z5BdGENrwu8
ASSISTANT_URL=https://gateway.watsonplatform.net/assistant/api
```

Add the `WORKSPACE_ID` to the previous properties:

```
WORKSPACE_ID=522be-7b41-ab44-dec3-g1eab2ha73c6
```

Your .env file should look like this:

```
# Environment variables
WORKSPACE_ID=1c464fa0-2b2f-4464-b2fb-af0ffebc3aab
ASSISTANT_IAM_APIKEY=_5iLGHasd86t9NddddrbJPOFDdxrixnOJYvAATKi1
ASSISTANT_URL=https://gateway-syd.watsonplatform.net/assistant/api
TONE_ANALYZER_IAM_APIKEY=UdHqOFLzoOCFD2M50AbsasdYhOnLV6sd_C3ua5zah
TONE_ANALYZER_URL=https://gateway-syd.watsonplatform.net/tone-analyzer/api
```
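The app reads these variables at startup, presumably via the dotenv package given the .env.example convention. As a rough sketch of what that loading does, here is a minimal .env-style parser in plain Node (a hypothetical helper for illustration, not the sample's actual code):

```javascript
// Minimal .env-style parser (illustrative only; real apps should use
// the dotenv package). Handles KEY=VALUE lines and '#' comments.
function parseEnv(text) {
  const vars = {};
  for (const line of text.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith('#')) continue;
    const eq = trimmed.indexOf('=');
    if (eq === -1) continue;
    vars[trimmed.slice(0, eq).trim()] = trimmed.slice(eq + 1).trim();
  }
  return vars;
}

// Example: parse the variables the app expects.
const env = parseEnv('WORKSPACE_ID=abc123\nASSISTANT_URL=https://example.com/assistant/api');
```

With dotenv itself, a single `require('dotenv').config()` near the top of the entry script populates `process.env` the same way.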
Running locally
Install the dependencies:

```
npm install
```

Run the application:

```
npm start
```

View the application in a browser at `localhost:3000`.
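`npm start` simply runs the `start` script defined in package.json; for a Node sample like this one, that script typically looks something like the following (illustrative only; check the repo's actual package.json for the real entry point):

```json
{
  "scripts": {
    "start": "node server.js"
  }
}
```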
Deploying to IBM Cloud as a Cloud Foundry Application
Log in to IBM Cloud with the IBM Cloud CLI:

```
ibmcloud login
```

Target a Cloud Foundry organization and space:

```
ibmcloud target --cf
```

Edit the manifest.yml file. Change the name field to something unique, for example `- name: my-app-name`.

Deploy the application:

```
ibmcloud app push
```

View the application online at the app URL, for example: https://my-app-name.mybluemix.net
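For reference, a minimal Cloud Foundry manifest.yml for the rename step might look like this (the values are illustrative; keep whatever other fields the repo's manifest already defines):

```yaml
applications:
  - name: my-app-name
    memory: 256M
```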
What to do next
After you have the application installed and running, experiment with it to see how it responds to your input.
Modifying the application
After you have the application deployed and running, you can explore the source files and make changes. Try the following:
Modify the .js files to change the application logic.
Modify the .html file to change the appearance of the application page.
Use the Assistant tool to train the service for new intents, or to modify the dialog flow. For more information, see the Assistant service documentation.
What does the Food Coach application do?
The application interface is designed for chatting with a coaching bot. Based on the time of day, it asks you if you've had a particular meal (breakfast, lunch, or dinner) and what you ate for that meal.
The chat interface is in the left panel of the UI, and the JSON response object returned by the Assistant service is displayed in the right panel. Your input is run against a small set of sample data trained with the following intents:
- yes: acknowledgment that the specified meal was eaten
- no: the specified meal was not eaten
- help
- exit

The dialog is also trained on two types of entities:

- food items
- unhealthy food items
These intents and entities help the bot understand variations of your input.
After asking you what you ate (if a meal was consumed), the bot asks you how you feel about it. Depending on your emotional tone, the bot provides different feedback.
Sample interactions are shown as screenshots in the project repository.
In order to integrate the Tone Analyzer with the Assistant service, the following approach was taken:
- Intercept the user's message. Before sending it to the Assistant service, invoke the Tone Analyzer service. See the call to `toneDetection.invokeToneAsync` in the `invokeToneConversation` function in app.js.
- Parse the JSON response object from the Tone Analyzer service, and add appropriate variables to the context object of the JSON payload to be sent to the Assistant service. See the `updateUserTone` function in tone_detection.js.
- Send the user input, along with the updated context object in the payload, to the Assistant service. See the call to `assistant.message` in the `invokeToneConversation` function in app.js.
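The context update in the second step can be sketched as follows. This is a simplified, hypothetical version of what `updateUserTone` does; the real function in tone_detection.js is more elaborate, and the context variable name used here (`emotion`) is illustrative:

```javascript
// Simplified sketch of a tone-to-context update: pick the dominant
// document-level tone from a Tone Analyzer response and record it on
// the Assistant context, so that dialog nodes can branch on it.
function updateUserTone(payload, toneResponse) {
  const tones = (toneResponse.document_tone || {}).tones || [];
  // Highest-scoring tone wins; null if the analyzer returned none.
  const top = tones.reduce(
    (best, t) => (best && best.score >= t.score ? best : t),
    null
  );
  payload.context = payload.context || {};
  payload.context.emotion = top ? top.tone_id : 'neutral';
  return payload;
}
```

The updated payload is then what gets passed to the Assistant service in the third step, so the dialog can condition on the tone value.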
You can see the JSON response object from the Assistant service in the right-hand panel.
In the conversation template, alternative bot responses were encoded based on the user's emotional tone.
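A dialog node that branches on tone might use a condition of the following shape (illustrative only, not the actual workspace content; it assumes a context variable, here called `emotion`, that was set from the Tone Analyzer result):

```json
{
  "conditions": "#yes && $emotion == 'sadness'",
  "output": {
    "text": ["Sorry to hear that. There's always tomorrow!"]
  }
}
```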
License
This sample code is licensed under Apache 2.0. Full license text is available in LICENSE.
Contributing
See CONTRIBUTING.
Open Source @ IBM
Find more open source projects on the IBM GitHub page.