# Remix SEO
A fork of https://github.com/balavishnuvj/remix-seo with some added bug fixes and features.
Collection of SEO utilities like sitemap, robots.txt, etc. for a Remix application.
## Features
- Generate Sitemap
- Generate Robots.txt
## Installation
To use it, install it from npm (or yarn):
```sh
npm install @nasa-gcn/remix-seo
```
## Usage
Add a sitemap and a robots.txt file to your site by adding resource routes for them, as explained below.
### Sitemap
Add a route module to your project called `app/routes/sitemap[.]xml.ts` with the following contents:
```ts
import { routes } from "@remix-run/dev/server-build";
import type { LoaderFunctionArgs } from "@remix-run/node";
import { generateSitemap } from "@nasa-gcn/remix-seo";

export function loader({ request }: LoaderFunctionArgs) {
  return generateSitemap(request, routes, {
    siteUrl: "https://balavishnuvj.com",
  });
}
```
`generateSitemap` takes three parameters: `request`, `routes`, and `SEOOptions`.
### Configuration
`SEOOptions` lets you configure the sitemap:
```ts
export type SEOOptions = {
  siteUrl: string; // URL where the site is hosted, e.g. https://balavishnuvj.com
  headers?: HeadersInit; // Additional headers
  /*
    e.g.:
    headers: {
      "Cache-Control": `public, max-age=${60 * 5}`,
    },
  */
};
```
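For example, a sitemap loader that also sets a cache header through `SEOOptions` might look like this (a minimal sketch based on the options above; the five-minute cache lifetime is just an illustration):

```ts
import { routes } from "@remix-run/dev/server-build";
import type { LoaderFunctionArgs } from "@remix-run/node";
import { generateSitemap } from "@nasa-gcn/remix-seo";

export function loader({ request }: LoaderFunctionArgs) {
  return generateSitemap(request, routes, {
    siteUrl: "https://balavishnuvj.com",
    // Cache the generated sitemap for five minutes.
    headers: {
      "Cache-Control": `public, max-age=${60 * 5}`,
    },
  });
}
```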
- To exclude a route from the sitemap:
```ts
// in your routes/url-that-doesnt-need-sitemap
import type { LoaderFunction } from "@remix-run/node";
import { SEOHandle } from "@nasa-gcn/remix-seo";

export let loader: LoaderFunction = ({ request }) => {
  /**/
};

export const handle: SEOHandle = {
  getSitemapEntries: () => null,
};
```
- To generate sitemap entries for dynamic routes:
```ts
// routes/blog/$blogslug.tsx
import { SEOHandle } from "@nasa-gcn/remix-seo";

export const handle: SEOHandle = {
  getSitemapEntries: async (request) => {
    // `db` is your database client; map each blog post to a sitemap entry.
    const blogs = await db.blog.findMany();
    return blogs.map((blog) => {
      return { route: `/blog/${blog.slug}`, priority: 0.7 };
    });
  },
};
```
### Robots
Add a new route module with the filename `app/routes/robots[.txt].ts` and the following contents to generate a `robots.txt`:
```ts
import { generateRobotsTxt } from "@nasa-gcn/remix-seo";

export function loader() {
  return generateRobotsTxt([
    { type: "sitemap", value: "https://balavishnuvj.com/sitemap.xml" },
    { type: "disallow", value: "/admin" },
  ]);
}
```
`generateRobotsTxt` takes two arguments. The first is an array of policies:
```ts
export type RobotsPolicy = {
  type: "allow" | "disallow" | "sitemap" | "crawlDelay" | "userAgent";
  value: string;
};
```
The second parameter, `RobotsConfig`, provides additional configuration:
```ts
export type RobotsConfig = {
  appendOnDefaultPolicies?: boolean; // Whether the default policies should be used
  /*
    Default policies:
    const defaultPolicies: RobotsPolicy[] = [
      {
        type: "userAgent",
        value: "*",
      },
      {
        type: "allow",
        value: "/",
      },
    ];
  */
  headers?: HeadersInit; // Additional headers
  /*
    e.g.:
    headers: {
      "Cache-Control": `public, max-age=${60 * 5}`,
    },
  */
};
```
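For example, to replace the default policies entirely and cache the response, you might pass a `RobotsConfig` as the second argument (a sketch using the options documented above; the paths and cache lifetime are placeholders):

```ts
import { generateRobotsTxt } from "@nasa-gcn/remix-seo";

export function loader() {
  return generateRobotsTxt(
    [
      { type: "userAgent", value: "*" },
      { type: "disallow", value: "/admin" },
      { type: "sitemap", value: "https://balavishnuvj.com/sitemap.xml" },
    ],
    {
      // Skip the default "User-agent: *" / "Allow: /" policies and supply our own.
      appendOnDefaultPolicies: false,
      // Cache the generated robots.txt for five minutes.
      headers: {
        "Cache-Control": `public, max-age=${60 * 5}`,
      },
    }
  );
}
```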