
opensea-scraper v6.6.0

Scraping floor prices from opensea.


Opensea Scraper

DISCLAIMER: You can get accurate real-time floor prices from this official OpenSea API endpoint: https://api.opensea.io/api/v1/collection/{slug}/stats:

const axios = require("axios");

async function getFloorPrice(slug) {
  try {
    // official endpoint: https://api.opensea.io/api/v1/collection/{slug}/stats
    const url = `https://api.opensea.io/api/v1/collection/${slug}/stats`;
    const response = await axios.get(url);
    return response.data.stats.floor_price;
  } catch (err) {
    console.error(err);
    return undefined;
  }
}

// top-level await needs an async context in CommonJS
(async () => {
  console.log(await getFloorPrice("lostpoets"));
  console.log(await getFloorPrice("treeverse"));
  console.log(await getFloorPrice("cool-cats-nft"));
})();

If you need floor prices, please use the official API (see above 👆👆👆). This scraper can still be used to scrape additional information about offers (tokenId, name, tokenContractAddress and offerUrl) as well as the rankings.

Install

npm install opensea-scraper

Usage

slug is the human-readable identifier that OpenSea uses to identify a collection. It can be extracted from the collection URL: https://opensea.io/collection/{slug}
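For example, here is a minimal sketch of pulling the slug out of a collection URL (the helper name slugFromUrl is just for illustration, not part of this package):

// hypothetical helper, not part of opensea-scraper
function slugFromUrl(collectionUrl) {
  // pathname looks like "/collection/cool-cats-nft"
  return new URL(collectionUrl).pathname.split("/")[2];
}

slugFromUrl("https://opensea.io/collection/cool-cats-nft"); // => "cool-cats-nft"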

options is an object with the following keys:

  • debug [Boolean]: launches chromium locally and disables headless mode (default: false)
  • logs [Boolean]: displays logs in the console (default: false)
  • sort [Boolean]: sorts the offers from lowest to highest price (default: true)
  • additionalWait [Number]: time to wait (in milliseconds) after the page has loaded, before scraping starts (default: 0)
  • browserInstance [PuppeteerBrowser]: bring your own browser instance for more control (see "Bring your own puppeteer" below)

const OpenseaScraper = require("opensea-scraper");

// which nft project to scrape?
const slug = "cool-cats-nft";

// options
const options = {
  debug: false,
  logs: false,
  sort: true,
  additionalWait: 0,
  browserInstance: undefined,
}

// get basic info (from the opensea API)
const basicInfo = await OpenseaScraper.basicInfo(slug);

// get offers from opensea. Each offer includes the 
// floor price, tokenName, tokenId, tokenContractAddress
// and offerUrl
// scrapes only the first 20 offers from opensea.
let result = await OpenseaScraper.offers(slug, options);
console.dir(result, {depth: null}); // result object contains keys `stats` and `offers`
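
// NOTE: hypothetical illustration of consuming the result, assuming
// the offer fields described above (tokenName, offerUrl); verify the
// exact shape with the console.dir output before relying on it
result.offers.forEach((offer) => {
  console.log(`${offer.tokenName} -> ${offer.offerUrl}`);
});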

// get offers from opensea using a custom link
// Opensea supports encoding filtering in the URL so 
// this method is helpful for getting a specific asset 
// (for example floor price for a deadfellaz with  
// a purple fur trait)
let url = "https://opensea.io/collection/deadfellaz?search[sortAscending]=true&search[sortBy]=PRICE&search[stringTraits][0][name]=Body&search[stringTraits][0][values][0]=Purple%20Fur&search[toggles][0]=BUY_NOW";
result = await OpenseaScraper.offersByUrl(url, options);
// result object contains keys `stats` and `offers`
console.dir(result, {depth: null}); 

// DISCLAIMER: FUNCTION `offersByScrolling`
// IS CURRENTLY NOT WORKING (!!!) see [issue#36](https://github.com/dcts/opensea-scraper/issues/36)
// get offersByScrolling from opensea. This is an
// alternative method to get the same data as with
// the function `offers`, the only difference being
// that the data is scraped by actively scrolling
// through the page. This method is not as efficient
// as the `offers` method, but it can scrape more
// than 20 offers. You could even scrape a whole
// collection with ~10k spots (though this is not
// recommended).
// IMPORTANT: if you need fewer than 20 offers,
// please use the function `offers()` instead
let resultSize = 40; 
result = await OpenseaScraper.offersByScrolling(slug, resultSize, options);
// result object contains keys `stats` and `offers`
console.dir(result, {depth: null}); 

// DISCLAIMER: FUNCTION `offersByScrollingByUrl`
// IS CURRENTLY NOT WORKING (!!!) see [issue#36](https://github.com/dcts/opensea-scraper/issues/36)
// get offersByScrollingByUrl from opensea using a 
// custom link instead of the slug. the same logic 
// applies as in `offersByScrolling()`
// Opensea supports encoding filtering in the URL so 
// this method is helpful for getting a specific asset 
// (for example floor price for a deadfellaz with  
// a purple fur trait)
// IMPORTANT: if you need fewer than 20 offers,
// please use the function `offersByUrl()` instead
url = "https://opensea.io/collection/deadfellaz?search[sortAscending]=true&search[sortBy]=PRICE&search[stringTraits][0][name]=Body&search[stringTraits][0][values][0]=Purple%20Fur&search[toggles][0]=BUY_NOW";
resultSize = 40;
result = await OpenseaScraper.offersByScrollingByUrl(url, resultSize, options);
// result object contains keys `stats` and `offers`
console.dir(result, {depth: null}); 

// scrape all slugs, names and ranks from the top collections from the rankings page
// "type" is one of the following:
//   "24h": ranking of last 24 hours: https://opensea.io/rankings?sortBy=one_day_volume
//   "7d": ranking of last 7 days: https://opensea.io/rankings?sortBy=seven_day_volume
//   "30d": ranking of last 30 days: https://opensea.io/rankings?sortBy=thirty_day_volume
//   "total": scrapes all time ranking: https://opensea.io/rankings?sortBy=total_volume
// "chain" is one of the following: "ethereum", "matic", "klaytn", "solana"
//    if chain is unset, all chains will be selected by default
const type = "24h"; // possible values: "24h", "7d", "30d", "total"
const chain = "solana";
const ranking = await OpenseaScraper.rankings(type, chain, options);
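
// hypothetical illustration of consuming the ranking; the field
// names (rank, name, slug) are assumed from the description above,
// so check the actual result shape before relying on them
ranking.forEach((collection) => {
  console.log(`#${collection.rank} ${collection.name} (${collection.slug})`);
});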

Debugging

To investigate an issue, turn on logs and debug mode (debug: true and logs: true):

const result = await OpenseaScraper.offers("treeverse", {
  debug: true,
  logs: true
});

Bring your own puppeteer

If you want to customize the settings for your puppeteer instance, you can pass your own puppeteer browser instance in the options. 🚧 IMPORTANT: I recommend using the stealth plugin, as otherwise you most likely won't be able to scrape opensea. If you find a way without using the stealth plugin, please report it in the form of an issue!

const puppeteer = require('puppeteer-extra');
// add stealth plugin and use defaults (all evasion techniques)
const StealthPlugin = require('puppeteer-extra-plugin-stealth');
puppeteer.use(StealthPlugin());

// placeholder: your own puppeteer launch options
const myCustomSettings = {
  headless: true,
};
const myPuppeteerInstance = await puppeteer.launch(myCustomSettings);

const result = await OpenseaScraper.offers("cool-cats-nft", {
  browserInstance: myPuppeteerInstance
});
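
Note (an assumption worth verifying against the package source): the scraper may not close a browser instance you supplied yourself, so shut it down once you're done:

await myPuppeteerInstance.close();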

Demo

npm run demo

Run local console / REPL

To test the functions in a REPL node environment that has the OpenseaScraper service preloaded, simply run:

node --experimental-repl-await -i -e "$(< init-dev-env.js)"

I recommend saving an alias:

alias consl='node --experimental-repl-await -i -e "$(< init-dev-env.js)"';
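
Inside that REPL the OpenseaScraper service is already loaded, so you can call the functions directly, for example:

await OpenseaScraper.basicInfo("cool-cats-nft");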

Contribute

Open a PR or an issue if you would like to have more features added.

Donations 🙏

Thanks for your support!
BTC: bc1qq5qn96ahlqjxfxz2n9l20kem8p9nsz5yzz93f7
ETH: 0x3e4503720Fb8f4559Ecf64BE792b3100722dE940

nftfloorprice.info 🔔

Simple NFT floor price alerts. Easily track all your NFTs and receive real-time email alerts: https://nftfloorprice.info