
seo-info
v1.0.2 · 57 downloads

SEO Info

An SEO Analyzer for Single Page Applications (SPAs) that provides comprehensive insights into SEO metrics, accessibility, performance, and best practices.

Features

  • Command-Line Interface (CLI): Easily analyze websites directly from the terminal.
  • API Usage: Integrate the analyzer into Node.js projects and CI/CD pipelines.
  • Configurable Settings: Customize analysis parameters via configuration files or command-line options.
  • Detailed Reports: Generate reports in JSON, HTML, or PDF formats.
  • Accessibility Audit: Leverage Axe-core for in-depth accessibility analysis.
  • Performance Metrics: Use Lighthouse to gather key performance indicators.
  • CSR/SSR Detection: Determine if a site uses Client-Side or Server-Side Rendering.
  • Lazy Loading Detection: Identify lazy-loaded images and content.
  • JavaScript Dependency Analysis: Analyze JS dependencies and their impact on load times.
  • Error Handling: Robust error handling for network issues and missing resources.
  • Summary Report: Displays analysis statistics such as performance scores and accessibility issues.

Installation

Via NPM

Install globally to use the CLI anywhere:

npm install -g seo-info

Using NPX

Run without installing:

npx seo-info <url> [options]

CLI Usage

Analyze a website by providing its URL:

seo-info <url> [options]

Example

seo-info https://example.com --format html --output ./reports/example-report

CLI Options:

  • -c, --config: Path to a configuration file.
  • -o, --output: Output path for the report (without extension).
  • -f, --format: Format of the generated report (json, html, pdf).
  • -h, --help: Display help information.

API Usage

Integrate the analyzer into your Node.js projects:

import { analyzeSEO } from 'seo-info';

(async () => {
  const htmlContent = `
    <!DOCTYPE html>
    <html>
    <head>
      <title>Your SPA</title>
      <meta name="description" content="Description of your SPA">
      <!-- Other meta tags -->
    </head>
    <body>
      <!-- Your SPA content -->
    </body>
    </html>
  `;
  const baseUrl = 'https://example.com'; // Replace with your base URL

  const options = {
    imageSizeLimit: 200 * 1024, // 200KB
    timeout: 60000, // 60 seconds
    reportFormat: 'html',
    outputPath: './reports/seo-report',
  };

  try {
    const seoResults = await analyzeSEO(htmlContent, baseUrl, options);
    console.log('SEO analysis completed successfully:', seoResults);
  } catch (error) {
    console.error('SEO analysis failed:', error.message);
  }
})();

Configurations

You can customize the analyzer with a .seoinforc configuration file. Create a .seoinforc file in JSON format in your project's root directory:

{
  "imageSizeLimit": 150000,
  "timeout": 60000,
  "thresholds": {
    "largeImageSize": 150000,
    "totalJsSize": 750000,
    "ssrContentLengthThreshold": 2000,
    "lazyLoadDelay": 1500
  },
  "reportFormat": "html",
  "outputPath": "./reports/seo-report"
}
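Values from .seoinforc act as per-project defaults that an individual run can override. The sketch below shows a plain shallow merge where inline options (e.g. collected from CLI flags) win over file values; this precedence is an assumption about typical behavior, not something the package documents:

```javascript
// Sketch: combine defaults parsed from .seoinforc with per-run overrides.
// The shallow-merge precedence shown here is an assumption, not the
// package's documented behavior.
const fileConfig = {
  imageSizeLimit: 150000,
  timeout: 60000,
  reportFormat: 'html',
  outputPath: './reports/seo-report',
};

// Overrides, e.g. gathered from CLI flags such as --format pdf.
const cliOverrides = {
  reportFormat: 'pdf',
};

// Later spreads win, so explicit overrides replace file defaults.
const effective = { ...fileConfig, ...cliOverrides };

console.log(effective.reportFormat); // 'pdf'
console.log(effective.timeout); // 60000
```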

Output Statistics

Each analysis returns statistics such as:

  • Title: The title of the page.
  • Description: Meta description content.
  • Keywords: Meta keywords content.
  • Open Graph Data: Extracted Open Graph tags.
  • Headings: Structure of headings (h1 to h6).
  • Images: Details about images, including alt text and file size.
  • Performance Metrics: Key performance indicators from Lighthouse.
  • Accessibility Issues: Detailed accessibility findings from axe-core.
  • CSR/SSR Detection: Indicates whether the site uses Client-Side or Server-Side Rendering.
  • Lazy Loading Issues: Information about lazy-loaded content.
  • JavaScript Dependencies: Analysis of JS files and their sizes.
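These statistics can be post-processed programmatically, for example to flag oversized or alt-less images in a CI step. The field names used below (title, images, accessibilityIssues) are assumptions inferred from the list above, not the package's documented result schema:

```javascript
// Sketch: summarize an analysis result object. Field names are assumed
// from the statistics listed above, not a documented schema.
function summarize(results) {
  // Flag images that are over ~150KB or are missing alt text.
  const largeImages = (results.images || []).filter(
    (img) => img.size > 150000 || !img.alt
  );
  return {
    title: results.title,
    imageWarnings: largeImages.length,
    accessibilityIssueCount: (results.accessibilityIssues || []).length,
  };
}

// Example with a hand-built results object:
const summary = summarize({
  title: 'Your SPA',
  images: [{ src: '/hero.png', size: 400000, alt: '' }],
  accessibilityIssues: [{ id: 'image-alt' }],
});
console.log(summary); // { title: 'Your SPA', imageWarnings: 1, accessibilityIssueCount: 1 }
```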

License

This project is licensed under the MIT License.