teacher-ai v1.0.37
# Teacher

Module for controlling the AI Teacher.
## Install

```sh
npm install teacher-ai
```
Usage
In teacher-ai module you have two main entities
Class with business logic
import { Avatar } from "teacher-ai";
const teacher = new Teacher(token, agentId);
React component with view
import { Avatar } from "teacher-ai";
<Avatar teacher={teacher} />;
## API

### Teacher

```ts
const teacher = new Teacher(token, agentId);
```

A `Teacher` instance also exposes three managers for finer control:

```ts
teacher.audioManager;
teacher.microphoneManager;
teacher.socketManager;
```
Start a session with the selected microphone. If you don't pass a value, the default microphone is used:

```ts
teacher.start({ microphoneId });
teacher.start(); // uses the default microphone
```
### microphoneManager

The class responsible for microphone handling:

```ts
teacher.microphoneManager;
```

Subscribe to microphone events. Be careful: these events fire very frequently, so debounce (or throttle) your handler to avoid performance issues:

```ts
interface MicrophoneEvents {
  audio: Float32Array;
  volume: number; // float in range 0 to 1
}

const event: (data: MicrophoneEvents) => void = (data) => {};
teacher.microphoneManager.subscribeToEvents(event);
```
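Since the event stream fires on every audio frame, one way to keep UI updates cheap is to throttle the handler. A minimal sketch; the `throttle` helper and `updateVolumeMeter` below are illustrative, not part of teacher-ai:

```typescript
// Illustrative throttle helper (not part of teacher-ai): invokes `fn`
// at most once per `intervalMs`. The clock is injectable for testing.
function throttle<T>(
  fn: (data: T) => void,
  intervalMs: number,
  now: () => number = Date.now
): (data: T) => void {
  let last = -Infinity;
  return (data: T) => {
    const t = now();
    if (t - last >= intervalMs) {
      last = t;
      fn(data);
    }
  };
}

// Usage with the microphone event stream:
// teacher.microphoneManager.subscribeToEvents(
//   throttle((data) => updateVolumeMeter(data.volume), 100)
// );
// (updateVolumeMeter is a hypothetical UI function.)
```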
To get a list of available microphones:

```ts
const microphones = await teacher.microphoneManager.getMicrophones();
```
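A returned device can then be matched by label and its id passed to `teacher.start`. The sketch below assumes each entry exposes `deviceId` and `label`, as a browser `MediaDeviceInfo` does; teacher-ai's actual return type may differ:

```typescript
// Minimal shape assumed for a microphone entry (matches the browser's
// MediaDeviceInfo fields used here); teacher-ai's return type may differ.
interface MicrophoneInfo {
  deviceId: string;
  label: string;
}

// Pick a microphone whose label matches `query` (case-insensitive),
// falling back to the first device when nothing matches.
function pickMicrophone(
  microphones: MicrophoneInfo[],
  query: string
): MicrophoneInfo | undefined {
  const q = query.toLowerCase();
  return (
    microphones.find((m) => m.label.toLowerCase().includes(q)) ?? microphones[0]
  );
}

// Usage sketch:
// const microphones = await teacher.microphoneManager.getMicrophones();
// const mic = pickMicrophone(microphones, "usb");
// if (mic) teacher.start({ microphoneId: mic.deviceId });
```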
To mute/unmute the microphone:

```ts
teacher.microphoneManager.mute();
teacher.microphoneManager.unmute();
```
To get the AudioContext for the microphone (available after `teacher.start()`):

```ts
const audioContext = teacher.microphoneManager.audioContext;
```
To stop the current session:

```ts
teacher.stop();
```
### socketManager

The class responsible for the socket connection and its handling:

```ts
teacher.socketManager;
```

To get the socket instance (available after `socketManager.open()`):

```ts
teacher.socketManager.socket;
```

To open/close the socket connection:

```ts
await teacher.socketManager.open();
teacher.socketManager.close(); // teacher.socketManager.socket === undefined
```
Subscribe to messages from the bot:

```ts
type MessageType = {
  text: string;
  type: "bot" | "human";
  id: number;
};

const event: (data: MessageType) => void = (data) => {};
teacher.socketManager.subscribeToEvents(event);
```
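For example, a handler can accumulate incoming messages into a transcript. A minimal sketch that only assumes the `MessageType` shape shown above (restated here so the snippet is self-contained):

```typescript
// MessageType as documented above, restated for a self-contained example.
type MessageType = {
  text: string;
  type: "bot" | "human";
  id: number;
};

// Keep a de-duplicated transcript keyed by message id.
const transcript = new Map<number, MessageType>();

const onMessage = (data: MessageType): void => {
  // Re-delivery of the same id overwrites the earlier entry.
  transcript.set(data.id, data);
};

// teacher.socketManager.subscribeToEvents(onMessage);
```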
### audioManager

The class responsible for audio handling:

```ts
teacher.audioManager;
```

To get the AudioContext for the speaker (available after `teacher.start()`):

```ts
const audioContext = teacher.audioManager.audioContext;
```

To control the volume of the playing audio:

```ts
teacher.audioManager.setVolume(1); // range: -1 to 1
```
Subscribe to audio events:

```ts
enum AudioStatus {
  PLAYING = "playing",
  AUDIO_FINISHED = "audio_finished",
  QUEUE_FINISHED = "stopped",
}

interface AudioEvents {
  status: AudioStatus;
  source?: AudioBufferSourceNode;
}

const event: (data: AudioEvents) => void = (data) => {};
teacher.audioManager.subscribeToEvents(event);
```
## React

For the avatar to work, you need to pass in the `Teacher` instance and a ref that controls the avatar's state:

```tsx
const ref = useRef("idle"); // other states: hello, gesture1, thumbsUP, surprised, laugh

<Avatar controlRef={ref} teacher={teacher} />;
```
With the ref you can control the avatar's emotions and behaviour; the `Teacher` will also drive it. For example, let's make the avatar laugh:

```tsx
const ref = useRef("idle");
const handleSmile = () => (ref.current = "laugh");

<button onClick={handleSmile}>make smile</button>;
<Avatar controlRef={ref} teacher={teacher} />;
```
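Since the ref holds a plain string, typing the known states can catch typos at compile time. The union type and `setAvatarState` helper below are illustrative, not exported by teacher-ai; only the state names come from this README:

```typescript
// The avatar states listed in this README; the type and helper are
// illustrative, not part of teacher-ai's API.
type AvatarState =
  | "idle"
  | "hello"
  | "gesture1"
  | "thumbsUP"
  | "surprised"
  | "laugh";

const AVATAR_STATES: readonly AvatarState[] = [
  "idle", "hello", "gesture1", "thumbsUP", "surprised", "laugh",
];

// A ref-like holder, matching the shape of React's useRef result.
interface ControlRef {
  current: AvatarState;
}

// Set the state only if it is a known value, otherwise fall back to "idle".
function setAvatarState(ref: ControlRef, next: string): AvatarState {
  const state = (AVATAR_STATES as readonly string[]).includes(next)
    ? (next as AvatarState)
    : "idle";
  ref.current = state;
  return state;
}
```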
For more detailed avatar customisation, the following options are also available:

```tsx
const params = { x: 0, y: -2, z: -3, dpr: 2 };

<Avatar controlRef={ref} teacher={teacher} params={params} />;
```