
@ngx-http-cache-control/core

v0.1.1 • Published

Server-Side Angular interceptor for HTTP-level caching based on HTTP Cache-Control headers

Downloads: 31

@ngx-http-cache-control/core

How to use this package?

Install

npm i @ngx-http-cache-control/core

Import Angular module

This package should only be imported as part of the server-side Angular module. Typically that module is placed in a file named app.server.module.ts. universal-starter can serve as a good reference point.

import { NgModule } from '@angular/core';
import {
  ServerModule,
  ServerTransferStateModule
} from '@angular/platform-server';
// ModuleMapLoaderModule supports lazy loading in universal-starter setups
import { ModuleMapLoaderModule } from '@nguniversal/module-map-ngfactory-loader';
import { HttpCacheControlCoreModule } from '@ngx-http-cache-control/core';

import { AppModule } from './app.module';
import { AppComponent } from './app.component';

@NgModule({
  imports: [
    AppModule,
    ServerModule,
    ModuleMapLoaderModule,
    HttpCacheControlCoreModule
  ],
  bootstrap: [AppComponent]
})
export class AppServerModule { }

Optional configuration

Overriding default memory cache config

import {
  MemoryCacheStoreConfig,
  T_MEMORY_CACHE_STORE_CONFIG
} from "@ngx-http-cache-control/core";

@NgModule({
  imports: [
    AppModule,
    ServerModule,
    ModuleMapLoaderModule,
    HttpCacheControlCoreModule
  ],
  providers: [
    {
      provide: T_MEMORY_CACHE_STORE_CONFIG,
      useValue: {
        maxCacheSizeInBytes: 500 * 1024 * 1024 // default is 100MB
      } as MemoryCacheStoreConfig
    }
  ],
  bootstrap: [AppComponent]
})
export class AppServerModule { }

Overriding default cache policy options

Default values should be fine for most use-cases.

import {
  CachePolicyOptions,
  T_CACHE_POLICY_OPTIONS
} from "@ngx-http-cache-control/core";

@NgModule({
  imports: [
    AppModule,
    ServerModule,
    ModuleMapLoaderModule,
    HttpCacheControlCoreModule
  ],
  providers: [
    {
      provide: T_CACHE_POLICY_OPTIONS,
      useValue: {
        trustServerDate: false
      } as CachePolicyOptions
    }
  ],
  bootstrap: [AppComponent]
})
export class AppServerModule { }

Replacing the cache layer

By default the cache is an in-memory LRU cache.

This is a really simple and fast solution (no external services are required, works out of the box).

The main disadvantage: cache will not be shared if you have multiple node processes (each process will have its own memory cache). Depending on your use-case, this might be completely fine or totally unacceptable.

You can easily override the default cache store with your own implementation (there is a plan to officially support Redis later on).

First, create a service that implements the CacheStore interface:
import { Injectable } from "@angular/core";
import { CacheStore } from "@ngx-http-cache-control/core";

@Injectable()
export class MyCacheStore<T> implements CacheStore<T> {

    /**
     * Return the data cached under this key.
     * Don't forget to deserialize if you stored the data serialized.
     */
    get(key: string): Promise<T | undefined> {
        // implementation comes here
        throw new Error("Not implemented");
    }

    /**
     * Store the data under the given key.
     *
     * `item` can be an object and may need to be serialized
     * if you use a store that only accepts string values.
     *
     * Using `maxAge` is optional.
     * It can help keep the cache size smaller.
     * If you don't use it and store the data forever,
     * it will not cause any problems. The library validates max age
     * based on the headers and won't use a cached response if it has expired.
     */
    set(key: string, item: T, maxAge?: number): Promise<void> {
        // implementation comes here
        throw new Error("Not implemented");
    }
}
Then provide your service as the cache store:
import { T_CACHE_STORE } from "@ngx-http-cache-control/core";

import { MyCacheStore } from "./my-cache-store";

@NgModule({
  imports: [
    AppModule,
    ServerModule,
    ModuleMapLoaderModule,
    HttpCacheControlCoreModule
  ],
  providers: [
    {
      provide: T_CACHE_STORE,
      useClass: MyCacheStore
    }
  ],
  bootstrap: [AppComponent]
})
export class AppServerModule { }
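As a concrete illustration, here is a minimal Map-backed store satisfying the same interface. This is a sketch only: the CacheStore interface is redeclared locally (mirroring the README) so the example is self-contained, and the unit of maxAge (seconds) is an assumption to verify against the library's typings.

```typescript
// Illustrative sketch only. In a real app you would import CacheStore
// from @ngx-http-cache-control/core; it is redeclared here so the
// example stands alone.
interface CacheStore<T> {
  get(key: string): Promise<T | undefined>;
  set(key: string, item: T, maxAge?: number): Promise<void>;
}

// A trivial in-process store backed by a Map.
// Assumes maxAge is in seconds (assumption, not confirmed by the README).
class MapCacheStore<T> implements CacheStore<T> {
  private entries = new Map<string, { item: T; expiresAt?: number }>();

  async get(key: string): Promise<T | undefined> {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    // Lazily evict entries whose optional maxAge has elapsed.
    if (entry.expiresAt !== undefined && Date.now() > entry.expiresAt) {
      this.entries.delete(key);
      return undefined;
    }
    return entry.item;
  }

  async set(key: string, item: T, maxAge?: number): Promise<void> {
    this.entries.set(key, {
      item,
      expiresAt: maxAge !== undefined ? Date.now() + maxAge * 1000 : undefined
    });
  }
}
```

Unlike the built-in LRU cache, this sketch never bounds its size; a production store would add eviction or delegate to an external service such as Redis.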

Logging

By default nothing is logged. If you'd like to see console logs for debugging purposes, set the TRACE_NG_HTTP_CACHE environment variable to a truthy value when starting the server.

TRACE_NG_HTTP_CACHE=1 node your-server.js

Monitoring the behavior using events

If you'd like to monitor how the cache behaves (which requests are getting cached, when a response is served from the cache, etc.), you can subscribe to the events emitted by the HTTP interceptor.

Events are strongly typed.

import {
  HttpCacheControlInterceptor,
  ReturnResponseFromCacheEvent
} from "@ngx-http-cache-control/core";


@NgModule({
  imports: [
    AppModule,
    ServerModule,
    ModuleMapLoaderModule,
    HttpCacheControlCoreModule
  ],
  bootstrap: [AppComponent]
})
export class AppServerModule {

  constructor(interceptor: HttpCacheControlInterceptor) {
    // you can inject the interceptor anywhere in your app
    interceptor.events.subscribe(event => {
      if (event instanceof ReturnResponseFromCacheEvent) {
        // do something
      }
    })
  }

}

Why was this package created?

Problem

To render a more complex page in a web application, typically the app needs to make multiple HTTP requests to get the required data for the given view.

With a traditional Single-Page Application (SPA) approach, the server normally serves an application shell, and when the application bootstraps on the client side, it makes these calls. As API responses arrive one by one, the app renders the view progressively. Most of these requests are GET requests that are often cached by the browser (if the user is a returning visitor) or by a CDN / HTTP accelerator (e.g. Varnish).

When Angular runs on the server, all required HTTP calls are made on the server to get the data and render the requested page server-side. Even if these API calls return Cache-Control headers, Angular ignores them, and for every incoming request it makes those HTTP calls all over again. If you have a simple application with a few visitors, this might be acceptable.

For a more complex application with high traffic, making those HTTP calls and waiting for the responses can easily become a bottleneck with server-side rendering.

Simple example without caching API calls:
* 5 concurrent req per second to get the same page
    (it could be the traffic hitting your home page)
* for each request, app needs to make 10 API calls to render the view
    (to get footer, menu, banners etc.)

That's 50 HTTP calls to the API per second,
or 90,000 HTTP calls every 30 minutes.

If all those API calls return Cache-Control headers
and allow caching for 30 minutes, then by using
`@ngx-http-cache-control/core`, it could be 10 HTTP calls per 30 minutes.

A real-life scenario is probably more complex than this, but the performance improvements can be huge.
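The arithmetic in the example above, written out:

```typescript
// Back-of-the-envelope numbers from the example above.
const pageRequestsPerSecond = 5;   // concurrent requests for the same page
const apiCallsPerPage = 10;        // API calls needed to render the view
const windowSeconds = 30 * 60;     // 30-minute Cache-Control window

// Without caching, every page request repeats every API call.
const uncachedCalls = pageRequestsPerSecond * apiCallsPerPage * windowSeconds;
// With caching, each distinct API call is made once per window.
const cachedCalls = apiCallsPerPage;

console.log(uncachedCalls); // 90000
console.log(cachedCalls);   // 10
```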

Solution 1 - Adding a cache server in front of your site

If performance is critical, adding a cache server in front of your site is recommended. Then you don't need to run server-side rendering on the fly for each incoming request. If most of your visitors visit just a handful of pages (or your application doesn't have millions of different pages), this will result in a big performance boost.

When will a cache server not provide a big performance improvement? In general, when there are a lot of different page requests that are not yet cached.

Some examples:

  • most of your pages cannot be cached (example: the view is customer-specific and most of your visitors are logged in)
  • you need to vary the view based on User-Agent or certain cookies, and the cache hit ratio is low
  • there are many popular pages and traffic is more or less evenly distributed (example: products in a webshop available in many different languages)
  • content in your app changes rapidly and new pages appear really often

Solution 2 - Adding a cache server in front of your API

Similar to solution 1. Instead of caching the entire response generated by Angular, you cache some of the API calls.

Solution 3 - Using @ngx-http-cache-control/core

Use this library to cache API responses in memory. Compared to solutions 1 and 2, this is a more lightweight and simpler setup.

How does it work?

HTTP specifies Cache-Control headers to leverage the performance benefits of caching.

A short explanation:

  • https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/http-caching

More detail:

  • https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cache-Control

The spec:

  • https://httpwg.org/specs/rfc7234.html

Under the hood this package uses the http-cache-semantics package to decide when a response can be served from the cache instead of making an HTTP call.

It also supports revalidation of stale responses.
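To make that decision concrete, here is a deliberately simplified freshness check in the spirit of what http-cache-semantics does. This is illustrative only; the real library handles many more directives (no-cache, s-maxage, Vary, revalidation, and so on), and the function names here are made up for the sketch.

```typescript
// Extract the max-age directive (in seconds) from a Cache-Control header.
function maxAgeSeconds(cacheControl: string): number | undefined {
  const match = /(?:^|,)\s*max-age=(\d+)/.exec(cacheControl);
  return match ? Number(match[1]) : undefined;
}

// Decide whether a cached response of the given age can still be served.
// Heavily simplified: a real implementation considers many more directives.
function isFresh(cacheControl: string, ageSeconds: number): boolean {
  if (/(?:^|,)\s*(no-store|no-cache)\b/.test(cacheControl)) return false;
  const maxAge = maxAgeSeconds(cacheControl);
  return maxAge !== undefined && ageSeconds < maxAge;
}

console.log(isFresh("public, max-age=1800", 60));   // true
console.log(isFresh("public, max-age=1800", 1800)); // false
console.log(isFresh("no-store", 0));                // false
```

In the 30-minute example above, this is the check that lets 90,000 potential API calls collapse into 10: as long as a cached response is fresh, the server-side HTTP call is skipped entirely.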