
google-url-indexing v1.1.0

Script to get your site indexed on Google in less than 24 hours

Downloads: 134

Google URL Indexing

Use this script to get your entire site indexed on Google in less than 24 hours. No tricks, no hacks, just a simple script and a Google API.

You can read more about the motivation behind it and how it works in this blog post.

[!IMPORTANT]

  1. Indexing != Ranking. This will not help your pages rank on Google; it only lets Google know that they exist.
  2. This script uses the Google Indexing API. We do not recommend using it on spam or low-quality content.

Requirements

Preparation

  1. Follow this guide from Google. By the end of it, you should have a project on Google Cloud with the Indexing API enabled and a service account with Owner permission on your sites.
  2. Make sure you enable both the Google Search Console API and the Web Search Indexing API under your Google Cloud project ➤ APIs & Services ➤ Enabled APIs & Services.
  3. Download the JSON file with the credentials of your service account and save it in the same folder as the script. The file should be named service_account.json.

[!IMPORTANT]

To download the JSON file from Google Cloud:

  1. Go to IAM & Admin and click "Service Accounts."
  2. Select your service account and open the "Keys" tab.
  3. Click "Add key," then choose "Create new key."
  4. Set "Key type" to "JSON" and click "Create."
  5. Download the JSON file prompted by your browser.
  6. Remember: it's a one-time download; store the key securely and consider short-lived access for better security.
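The downloaded file is a standard Google Cloud service-account key. Its shape is roughly the following (all values here are placeholders, and some fields are abridged); the script relies on the client_email and private_key fields:

```json
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "abc123",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "indexer@your-project-id.iam.gserviceaccount.com",
  "client_id": "1234567890",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```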

Installation

Using CLI

Install the CLI globally on your machine.

npm i -g google-url-indexing

Using the repository

Clone the repository to your machine.

git clone https://github.com/bytexposure/google-url-indexing.git
cd google-url-indexing

Install and build the project.

npm install
npm run build
npm i -g .

[!NOTE] Ensure you are using an up-to-date Node.js version, with a preference for v20 or later. Check your current version with node -v.

Usage

Create a .gui directory in your home folder and move the service_account.json file there.

mkdir ~/.gui
mv service_account.json ~/.gui

Run the script with the domain or URL you want to index.

gui <domain or url>
# example
gui techjunctions.com

# submit a single URL for indexing
gui techjunctions.com --single-url https://example.com/page-to-index

# force resubmission of URLs even if already submitted
gui techjunctions.com --force-resubmit

# submit multiple URLs from a file
gui techjunctions.com --bulk-urls urls.txt

# submit multiple URLs from command line
gui techjunctions.com --bulk-urls-list "https://example.com/page1,https://example.com/page2"
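The file passed to --bulk-urls is presumably a plain text list with one URL per line (this format is an assumption, not confirmed by the package docs), for example:

```
https://example.com/page1
https://example.com/page2
https://example.com/page3
```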

Sitemap Submission

# Process a specific sitemap URL
gui example.com --sitemap https://example.com/sitemap.xml

This command will:
1. Fetch all URLs from the specified sitemap
2. Check the indexing status of each URL
3. Request indexing for URLs that are not yet indexed or need reindexing

Using the --sitemap option allows you to focus on indexing URLs from a specific sitemap, which can be useful for:
- Prioritizing indexing for a subset of your site's pages
- Handling large sites with multiple sitemaps more efficiently
- Quickly indexing newly added or updated pages listed in a specific sitemap

Note: When using the --sitemap option, the script will only process URLs from the specified sitemap, ignoring other sitemaps that might be associated with the site.
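The first of the three steps above can be sketched as follows. This is only an illustration of extracting URLs from a sitemap, not the package's actual implementation:

```javascript
// Minimal sketch of step 1: pulling page URLs out of a sitemap document.
// A real implementation would fetch the sitemap over HTTP and use a proper
// XML parser; a regex is enough to show the idea on an inline sample.
const sitemapXml = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/page1</loc></url>
  <url><loc>https://example.com/page2</loc></url>
</urlset>`;

// Collect every <loc> entry; these are the URLs the script would then
// check against the Search Console API and submit for indexing.
const urls = [...sitemapXml.matchAll(/<loc>(.*?)<\/loc>/g)].map((m) => m[1]);
console.log(urls);
```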

Here are some other ways to run the script:

# custom path to service_account.json
gui techjunctions.com --path /path/to/service_account.json
# long version command
google-url-indexing techjunctions.com
# cloned repository
npm run index techjunctions.com

Alternatively, open service_account.json, copy the client_email and private_key values, and pass them as environment variables when you run the script:

GUI_CLIENT_EMAIL=your-client-email GUI_PRIVATE_KEY=your-private-key gui techjunctions.com

You can also pass the same values as command-line arguments:

gui techjunctions.com --client-email your-client-email --private-key your-private-key

You can also use the script as an npm module in your own project.

npm i google-url-indexing

import { index } from "google-url-indexing";
import serviceAccount from "./service_account.json";

index("techjunctions.com", {
  client_email: serviceAccount.client_email,
  private_key: serviceAccount.private_key,
})
  .then(console.log)
  .catch(console.error);

Read the API documentation for more details.


[!IMPORTANT]

  • Your site must have one or more sitemaps submitted to Google Search Console. Otherwise, the script will not be able to find the pages to index.
  • You can run the script as many times as you want. It will only index the pages that are not already indexed.
  • Sites with a large number of pages might take a while to index; be patient.

Quota

Depending on your account, several quotas are configured for the API (see the docs). By default, the script exits as soon as a rate limit is exceeded. You can configure a retry mechanism for the read requests, which are limited on a per-minute basis.

export GUI_QUOTA_RPM_RETRY=true

When using the script as a module, the same behavior can be enabled with the quota option:

import { index } from 'google-url-indexing'
import serviceAccount from './service_account.json'

index('seogets.com', {
  client_email: serviceAccount.client_email,
  private_key: serviceAccount.private_key,
  quota: {
    rpmRetry: true
  }
})
  .then(console.log)
  .catch(console.error)

📄 License

MIT License

💖 Sponsor

This project is sponsored by Tech Junctions.