[![Build](https://github.com/botisan-ai/gpt3-tokenizer/actions/workflows/main.yml/badge.svg)](https://github.com/botisan-ai/gpt3-tokenizer/actions/workflows/main.yml) [![NPM Version](https://img.shields.io/npm/v/gpt3-tokenizer.svg)](https://www.npmjs.com/package/gpt3-tokenizer)
# GPT3 Tokenizer
This is an isomorphic TypeScript tokenizer for OpenAI's GPT-3 model, with support for both `gpt3` and `codex` tokenization. It should work in both Node.js and browser environments.
## Usage
First, install:
```sh
yarn add gpt3-tokenizer
```
In code:
```typescript
import GPT3Tokenizer from 'gpt3-tokenizer';

const tokenizer = new GPT3Tokenizer({ type: 'gpt3' }); // or 'codex'
const str = "hello 👋 world 🌍";
const encoded: { bpe: number[]; text: string[] } = tokenizer.encode(str);
const decoded = tokenizer.decode(encoded.bpe);
```
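
A common use is counting the tokens in a prompt before sending it to the API. A minimal sketch using the API above (the token count is just the length of the returned `bpe` array; `countTokens` is a hypothetical helper, not part of this library):

```typescript
import GPT3Tokenizer from 'gpt3-tokenizer';

const tokenizer = new GPT3Tokenizer({ type: 'gpt3' });

// Hypothetical helper: count GPT-3 BPE tokens in a piece of text.
function countTokens(text: string): number {
  return tokenizer.encode(text).bpe.length;
}

console.log(countTokens('hello 👋 world 🌍')); // prints the number of BPE tokens
```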
## Reference
This library is based on the following:
The main difference between this library and gpt-3-encoder is that this library supports both `gpt3` and `codex` tokenization (the dictionaries are taken directly from OpenAI, so the tokenization results are on par with the OpenAI Playground). This library also uses the `Map` API instead of plain JavaScript objects, notably for the `bpeRanks` object, which should bring some performance improvement.
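
As an illustration of that design choice (not this library's actual internals), here is a sketch of a `Map`-based merge-rank table with hypothetical keys and ranks:

```typescript
// Illustrative only: a Map-based bpeRanks table keyed by a merge pair such as "h,e".
// The keys and ranks here are made up; the real table comes from OpenAI's vocabulary.
const bpeRanks = new Map<string, number>([
  ['h,e', 0],
  ['l,l', 1],
  ['he,ll', 2],
]);

// Return the candidate pair with the lowest merge rank, i.e. the pair that BPE
// would merge first. Map.get() gives fast lookups without prototype-chain overhead.
function lowestRankPair(pairs: string[]): string | undefined {
  let best: string | undefined;
  let bestRank = Infinity;
  for (const pair of pairs) {
    const rank = bpeRanks.get(pair);
    if (rank !== undefined && rank < bestRank) {
      bestRank = rank;
      best = pair;
    }
  }
  return best;
}

console.log(lowestRankPair(['l,l', 'x,y', 'h,e'])); // 'h,e'
```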