n8n-nodes-gpt-tokenizer
n8n node for working with BPE Tokens, with OpenAI's GPT models in mind.
Work with BPE Tokens in n8n with the GPT-Tokenizer Node
This community package contains a node for working with BPE Tokens such as those OpenAI's GPT models use under the hood. In fact, this node works just fine together with the OpenAI node.
You can:
- Encode a string into BPE Tokens (may be cool for custom training)
- Decode an array of BPE Tokens back to a string (for funzies?)
- Determine a string's token length before submitting it to the OpenAI API
- Calculate costs before submitting to the OpenAI API (see the sketch after this list)
- Split a text into chunks that exactly match a definable Token Limit
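Cost estimation boils down to counting tokens and multiplying by a price per 1,000 tokens. Here is a minimal sketch of that idea using the underlying gpt-tokenizer library directly; the price constant is a placeholder, not an official OpenAI rate, and this is not the node's actual implementation:

```ts
// Rough cost estimate before calling the OpenAI API.
// encode() is an export of the gpt-tokenizer package; the price is a placeholder.
import { encode } from 'gpt-tokenizer';

const PRICE_PER_1K_TOKENS_USD = 0.002; // placeholder, check OpenAI's pricing page

function estimatePromptCost(prompt: string): number {
  const tokenCount = encode(prompt).length; // number of BPE tokens
  return (tokenCount / 1000) * PRICE_PER_1K_TOKENS_USD;
}

console.log(estimatePromptCost('Summarize this text for me, please.'));
```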
n8n is a fair-code licensed workflow automation platform.
- Supported Operations
- Installation
- Compatibility
- About
- Version History
Supported Operations
| Operation | Description | Options |
| ------------- | ------------- | ------------- |
| Encode | Encode a string into BPE Tokens. Returns an array of tokens. | - |
| Decode | Decode an array of BPE Tokens into a string. Returns a string. | - |
| Count Tokens | Count the tokens a string produces. Returns the number of tokens. | - |
| Check Token Limit | Check whether a given string exceeds a defined Token Limit. Returns a boolean. | Optional: throw an error if the Token Limit is exceeded. |
| Slice to Max Token Limit | Slice the string into blocks that exactly match the provided Token Limit. Returns an array of strings. | - |
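For reference, this is a minimal sketch of what these operations roughly correspond to when calling the underlying gpt-tokenizer library directly. Only `encode` and `decode` are actual library exports; the rest is illustrative glue code, not the node's implementation:

```ts
import { encode, decode } from 'gpt-tokenizer';

const text = 'Hello, how are you today?';

// Encode: string -> array of BPE token IDs
const tokens: number[] = encode(text);

// Decode: token IDs -> string
const roundTripped: string = decode(tokens);

// Count Tokens: the length of the encoded array
const tokenCount = tokens.length;

// Check Token Limit: compare the count against a limit you define
const TOKEN_LIMIT = 4096; // example limit, pick one that fits your model
const withinLimit = tokenCount <= TOKEN_LIMIT;

// Slice to Max Token Limit: chunk the token array and decode each chunk, so
// every chunk (except possibly the last) holds exactly `limit` tokens.
// Note: chunk boundaries follow token boundaries, not character boundaries.
function sliceToTokenLimit(input: string, limit: number): string[] {
  const ids = encode(input);
  const chunks: string[] = [];
  for (let i = 0; i < ids.length; i += limit) {
    chunks.push(decode(ids.slice(i, i + limit)));
  }
  return chunks;
}
```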
Installation
Follow the installation guide in the n8n community nodes documentation. Installing the node should also automatically install its dependency, https://www.npmjs.com/package/gpt-tokenizer, a port of the original Python BPE token library.
Compatibility
The latest version of n8n. If you encounter any problems, feel free to open an issue on GitHub.
About
Version History
0.1.1
- just polishing the npm release
0.1.0
- initial release