@lighthouseapps/react-native-oda
v1.1.5
An unofficial react-native client for Oracle Digital Assistant
IMPORTANT NOTE: For now, this library only supports Android's client SDK.
iOS support will come in a future update.
Getting started
Install Node dependency
$ npm install @lighthouseapps/react-native-oda --save
or
$ yarn add @lighthouseapps/react-native-oda
Link dependencies
For react-native < 0.60:
$ react-native link @lighthouseapps/react-native-oda
For react-native >= 0.60:
Skip this step; autolinking handles it automatically.
Add native SDKs to native projects
Android
- Replace the following line in settings.gradle:

include ':app'

with

include ':app', ':com.oracle.bots.client.sdk.android.core-21.10'

- Make sure the following dependencies are imported in android/app/build.gradle:
implementation "androidx.swiperefreshlayout:swiperefreshlayout:1.0.0"
implementation 'androidx.preference:preference:1.1.1'
// SDK
implementation project(':com.oracle.bots.client.sdk.android.core-21.10')
// Core dependencies
implementation 'androidx.room:room-runtime:2.2.5'
implementation 'io.socket:socket.io-client:0.8.3'
implementation 'androidx.core:core:1.3.0'
- Copy the native module from node_modules/@lighthouseapps/react-native-oda/android/com.oracle.bots.client.sdk.android.core-21.10 to your project's android directory.
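As a convenience, the copy step above can be done in one command from the react-native project root (the paths assume the default project layout):

```shell
# Run from the react-native project root; copies the bundled native SDK
# module into your app's android directory
cp -r node_modules/@lighthouseapps/react-native-oda/android/com.oracle.bots.client.sdk.android.core-21.10 android/
```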
Usage
Send a text message
import { Button } from 'react-native';
import OracleDigitalAssistant from '@lighthouseapps/react-native-oda';

<Button
  title="Chat"
  onPress={() => {
    OracleDigitalAssistant.sendMessage('Buy some coffee');
  }}
/>
Voice messages
OracleDigitalAssistant.startRecording(); // Voice to text
// NOTE: The first time this function runs, it will ask for recording permission but won't start recording; call it again to capture user speech
OracleDigitalAssistant.stopRecording(); // Stops listening for user speech
OracleDigitalAssistant.setSpeechLocale('en-us'); // Sets the user's expected speaking language
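As a sketch of how these calls might be wired to a push-to-talk button, the snippet below toggles recording on each press. The `oda` object here is a mock stand-in for the real `OracleDigitalAssistant` module (so the logic is self-contained and runnable); in an app you would import the module and call it directly.

```javascript
// `oda` is a MOCK stand-in for OracleDigitalAssistant, used only so this
// sketch runs standalone; replace it with the real imported module.
const oda = {
  recording: false,
  startRecording() { this.recording = true; },
  stopRecording() { this.recording = false; },
};

// Toggle voice capture: start if idle, stop if already recording.
// Returns the new recording state, e.g. to update a mic icon.
function toggleRecording(client) {
  if (client.recording) {
    client.stopRecording(); // stop listening for user speech
  } else {
    client.startRecording(); // begin voice-to-text capture
  }
  return client.recording;
}

console.log(toggleRecording(oda)); // true: now recording
console.log(toggleRecording(oda)); // false: stopped
```

In a component, the returned state could drive the recording icon, which matches how the `onSpeechStartRecording`/`onSpeechStopRecording` listeners below are described.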
Listen for new text messages
import OracleDigitalAssistant from '@lighthouseapps/react-native-oda';
...
useEffect(() => {
  (async () => {
    try {
      const result = await OracleDigitalAssistant.init(
        'userId',
        '2h7s92cv-d4c6-ds93-a069-f1l374932aaL',
        'oda-9928v32csde323asb5o930s1c84751f4-dd4.data.digitalassistant.oci.oraclecloud.com',
      );
      OracleDigitalAssistant.setSpeechLocale('pt-br');
      OracleDigitalAssistant.setupChatListeners({
        onMessage: message => {
          console.log(message);
          /*
          outputs:
          {
            "actions": [
              "1. New Shopping Cart",
              "2. Checkout",
              "3. SignUp",
            ],
            "createdDate": 1639500175642,
            "footerText": "",
            "headerText": "",
            "isRead": false
          }
          */
        },
        onStatusChange: status => {
          console.log(status);
          /*
          "status" can be any of:
          DISCONNECTED
          CONNECTING
          CONNECTED
          */
        },
        onSpeechStopRecording: () => {
          // E.g.: change the recording icon
          // NOTE: This callback also triggers onSpeechSuccess with an empty string
        },
        onSpeechStartRecording: () => {
          // E.g.: change the recording icon
        },
        onSpeechPartialResult: partialMessage => {
          /*
          Logs the partial transcription as the user speaks:
          for example, if the user is saying "test",
          it will log "t", "te", "tes" and so on...
          */
          console.log(partialMessage);
        },
        onSpeechError: error => {
          console.log(error);
        },
        onSpeechSuccess: message => {
          console.log(message);
          /*
          When the user stops speaking, "message" will be the entire phrase
          E.g.: "test"
          */
        },
      });
    } catch (err) {
      console.log(err);
    }
  })();
}, []);
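To render a conversation, the `onMessage` payloads above can be accumulated into component state. The helper below is a hypothetical sketch (not part of the library) that appends an incoming payload to a message list, marking it as read for the UI; it assumes the payload shape shown in the example output.

```javascript
// Hypothetical helper (NOT part of the library): append an incoming ODA
// payload to a message list without mutating either input, marking the
// new entry as read so the UI can clear any "unread" badge.
function appendMessage(messages, payload) {
  return [...messages, { ...payload, isRead: true }];
}

// Sample payload matching the shape logged by onMessage above
const sample = {
  actions: ['1. New Shopping Cart', '2. Checkout', '3. SignUp'],
  createdDate: 1639500175642,
  footerText: '',
  headerText: '',
  isRead: false,
};

const next = appendMessage([], sample);
console.log(next.length);    // 1
console.log(next[0].isRead); // true
```

Inside the `onMessage` listener this would typically be used with a state setter, e.g. `setMessages(prev => appendMessage(prev, message))`, so each incoming message re-renders the chat view.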