
chatgptreversed

v0.1.2

Free ChatGPT API reversed and simplified

Downloads: 20

Readme

ChatGPTReversed - Educational project

Let's keep it simple: this is an educational project about reverse-engineering complex APIs and understanding how they communicate with the frontend. In this case we take a look at the ChatGPT frontend and reverse engineer the API used to communicate with the LLM.

OpenAI uses several techniques to prevent malicious use of their API, e.g. rate limiting, token expiration, hashing, proof of work, continuous calls, proxies, captchas, etc.

We take the ChatGPT web app as a starting point and use only the Chromium DevTools to understand the process.

Step 1: We start by opening the ChatGPT web app and open the DevTools to watch the network requests.

Step 2: The https://chatgpt.com/backend-api/conversation endpoint is called with a POST request; we can see the payload and the type of response (in this case it's an EventStream).

Steps 3–4: Several identifying headers and cookies are used to prevent abuse, in this case: Authorization (JWT token), csrf-token (CSRF protection), session-token (same as the JWT token), Requirements Token, and Proof Token.

Step 5: The https://chatgpt.com/backend-api/sentinel/chat-requirements endpoint is called before the conversation starts; it passes in token x and returns token y.

{
  "persona": "chatgpt-freeaccount",
  "token": "y",
  "arkose": {},
  "turnstile": {},
  "proofofwork": {
    "required": true,
    "seed": "0.81186133b2821174",
    "difficulty": "073682"
  }
}
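The response above can be consumed with a small helper; this is a sketch assuming the JSON shape shown in the sample (real responses may carry more fields):

```javascript
// Hypothetical helper: pull token y and the proof-of-work parameters
// out of a chat-requirements response body. Field names are taken
// from the sample response above.
function parseRequirements(body) {
  const { token, proofofwork } = body;
  return {
    requiredToken: token,                        // token y
    powRequired: proofofwork?.required ?? false, // is proof of work needed?
    seed: proofofwork?.seed,
    difficulty: proofofwork?.difficulty,
  };
}

const sample = {
  persona: "chatgpt-freeaccount",
  token: "y",
  arkose: {},
  turnstile: {},
  proofofwork: { required: true, seed: "0.81186133b2821174", difficulty: "073682" },
};

console.log(parseRequirements(sample));
```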

To find out how x is retrieved, we need to take a look at the minified source code of the frontend.

Step 6: Token x in this case is the variable e, which is passed as a callback from variable n, which uses the function getRequirementsToken to retrieve the token. Step 7: getRequirementsToken returns token x by checking whether the value is already in a map called answers; if not, it calls the function _generateAnswer, which generates the token using a hash function provided by the hashing library hash-wasm.

Step 8: We place a breakpoint right after the getRequirementsToken function is called and inspect the returned value, which is token x.

So we have token x (the Requirements Token), which we need to pass to the endpoint, and token y (the required requirements token), which the endpoint returns. The last thing we need is token z (the Proof Token), which, as we find out, is also generated with _generateAnswer.

The _generateAnswer function is called with the seed and difficulty returned by the endpoint. It combines the seed with several parameters retrieved by getConfig, such as screen size, timezone, and CPU core count, to generate a hash that satisfies the difficulty condition. If no matching hash is found, it increments the step counter and tries again, falling back to a fixed value after too many steps.

In this case, _generateAnswer is called with the seed and difficulty returned by the endpoint and returns token z.

So we have all the required tokens to call the conversation endpoint and start a conversation with the LLM. To recap:

  • Session Token (JWT): returned by https://chatgpt.com/api/auth/session in the accessToken field

  • CSRF Token: returned by https://chatgpt.com/api/auth/csrf in the csrfToken field

  • Requirements Token (token x): returned by the getRequirementsToken function

  • Required Requirements Token (token y): returned by the https://chatgpt.com/backend-api/sentinel/chat-requirements endpoint

  • Proof Token (token z): returned by the _generateAnswer function, using the seed and difficulty from the chat-requirements endpoint
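Putting the recap together, the headers for the conversation call might be assembled like this. The header names below are an assumption based on the tokens described above; the real frontend may name them differently or send additional ones:

```javascript
// Hypothetical assembly of the request headers for the conversation
// endpoint from the tokens gathered in the recap. Header names are
// guesses based on the walkthrough, not verified wire captures.
function buildConversationHeaders({ sessionToken, requirementsToken, proofToken }) {
  return {
    "Authorization": `Bearer ${sessionToken}`,                     // JWT session token
    "Openai-Sentinel-Chat-Requirements-Token": requirementsToken,  // token y
    "Openai-Sentinel-Proof-Token": proofToken,                     // token z
    "Content-Type": "application/json",
  };
}

console.log(buildConversationHeaders({
  sessionToken: "jwt",
  requirementsToken: "y",
  proofToken: "z",
}));
```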

The rest is basic web communication knowledge.
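One piece of that "basic web communication" worth spelling out is reading the EventStream response from the conversation endpoint. Below is a minimal parser for an SSE body; the `data: <json>` framing and the `[DONE]` terminator follow the Server-Sent Events convention, while the payload shape in the sample is an assumption for illustration:

```javascript
// Minimal parser for an EventStream (SSE) body: collect each
// `data: <json>` frame until the `[DONE]` sentinel appears.
function parseEventStream(raw) {
  const events = [];
  for (const line of raw.split("\n")) {
    if (!line.startsWith("data: ")) continue;
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") break;
    events.push(JSON.parse(payload));
  }
  return events;
}

// Assumed payload shape, for illustration only.
const sampleStream = [
  'data: {"message": {"content": {"parts": ["Hello"]}}}',
  'data: {"message": {"content": {"parts": ["Hello!"]}}}',
  "data: [DONE]",
].join("\n");

console.log(parseEventStream(sampleStream).length); // 2
```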

Documentation

// With top-level await (ESM):
import {ChatGPTReversed} from "chatgptreversed"; // const {ChatGPTReversed} = require("chatgptreversed");

const chatgpt = new ChatGPTReversed({
  sessionToken: "Session Token",
  requirementsToken: "token x",
});

const result = await chatgpt.complete("Hello, how are you?");
console.log(result);

// Output: Hello! I'm here and ready to assist you. How can I help you today?

// Wrapped in an async function (works where top-level await is unavailable):
import {ChatGPTReversed} from "chatgptreversed"; // const {ChatGPTReversed} = require("chatgptreversed");

const chatgpt = new ChatGPTReversed({
  sessionToken: "Session Token",
  requirementsToken: "token x",
});

async function main() {
  const result = await chatgpt.complete("Hello, how are you?");
  console.log(result);
}

main();

// Output: Hello! I'm here and ready to assist you. How can I help you today?

Like this project? Leave a star! 💫⭐