openai_embedding
A package that helps you input and store long reference text, and extract the relevant information when asking a prompt.
Example usage:
// Returns a map of every input text to its embedding, in the shape { text: { embedding: number[], tokensNum: number } }.
let embedMap = await computeDocEmbeddings([
  "hello, hows it going",
  "come on!",
  "long long long text long long long text long long long text long long long text" +
    " long long long text long long long text long long long text long long long text" +
    " long long long text long long long text long long long text long long long text" +
    " long long long text long long long text long long long text long long long text" +
    " long long long text long long long text long long long text long long long text" +
    " long long long text long long long text long long long text long long long text"
]);
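To inspect what was stored, each entry in the returned map carries the embedding vector and its token count. A minimal sketch, assuming only the { text: { embedding, tokensNum } } shape described above:

// Log the start of each stored text together with its embedding size and token count.
for (const [text, entry] of Object.entries(embedMap)) {
  console.log(text.slice(0, 40), "->", entry.embedding.length, "dimensions,", entry.tokensNum, "tokens");
}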
// Pass the question text and the vector map; returns the related context as a string[].
let context = await constructPrompt("How long is the text?", embedMap);
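The returned context sections can then be joined into the final prompt string you send to the model. The surrounding prompt wording below is only an illustration, not part of the package:

// Build a question-answering prompt from the retrieved context (illustrative wording only).
let prompt =
  "Answer the question using only the context below.\n\n" +
  "Context:\n" + context.join("\n---\n") + "\n\n" +
  "Question: How long is the text?";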
// Returns an array of text, split into coherent sections.
let sections = intoParagraphs({ raw: "some paragraphs text", maxtoken: 400 });
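Putting the three calls together, a minimal end-to-end sketch could look like the following; the destructured require assumes the package exports these functions directly, which is an assumption rather than documented behaviour:

// Assumption: openai_embedding exports the three functions used in the examples above.
const { intoParagraphs, computeDocEmbeddings, constructPrompt } = require("openai_embedding");

async function askWithReference(question, rawText) {
  // Split the long reference text into coherent sections.
  let sections = intoParagraphs({ raw: rawText, maxtoken: 400 });
  // Embed every section and keep the map so it can be reused for later questions.
  let embedMap = await computeDocEmbeddings(sections);
  // Return the sections most related to the question.
  return await constructPrompt(question, embedMap);
}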