nest-pret

v1.6.10

A generator to bootstrap fully-featured NestJS apps. Includes user registration, email verification, password recovery, claims-based access control, standardized and paginated responses, OpenAPI auto-documentation and more.

A generator to bootstrap fully-featured NestJS apps

Generates code that is tested, documented, and production-ready, with zero-downtime continuous deployment.

The generated app covers much of the functionality required of a modern web app:

  • User registration
  • Password recovery
  • E-mail verification, configurable between:
    • required before login
    • delayed until a route with EmailVerifiedGuard enforces it
    • or off
  • Claims-based access control, including:
    • Restricted access to routes via policies
    • Restricted access to specific documents by ownership or other conditional constraints
    • Serialization of response objects exposing only the fields the user has access to
  • Standardized API responses, including:
    • Automatic wrapping of return objects into a StandardResponse
    • Metadata-based: handlers keep returning classes, so they stay compatible with interceptors
    • Handling of pagination, sorting and filtering
    • Generation of OpenAPI documentation for routes with the proper combined response schema
  • Secure defaults:
    • Sets secure HTTP response headers
    • Global validation of all request inputs
    • Global validation of response values before serialization
    • Rate-limiting across the app with tighter limits for account creation
  • Configurable
    • Config module parses and validates .env variables during bootstrap
    • Config service makes them available app-wide with proper type definitions
  • Deployable
    • Docker Compose environments for dev and e2e testing
    • Docker swarm stack ready for continuous deployment
  • Tested
    • Complete end-to-end testing suites
    • 100% coverage of all user interaction flows

🚀 Getting started

On a machine with npm installed, run:

npx nest-pret@latest new

After the generator has bootstrapped your new app:

cd myapp
npm run dev

This will start:

  • the dev database
  • the NestJS app in watch mode: localhost:3000
  • Mongo-express visual DB admin*: localhost:8081
  • Swagger UI documentation explorer*: localhost:3000/dev-tools/docs
  • MermaidJS App Graph*: localhost:3000/dev-tools/graph

* These features are only started when running in the development environment.

Remember to edit the .env file and add your mailer service information to enable the mailer features.

🚦 Managing the app

|Command|Description|
|----|-----------|
|npm run dev|Use for local development. This will start docker as development and keep the app in watch mode inside of it.|
|npm run dev:stop|Stops all containers created by running the dev command.|
|npm run test|Run tests locally.|
|npm run e2e|Starts docker as production and runs the e2e tests inside of it.|
|npm run deploy|Once you're ready to publish to production, this starts the continuous deployment pipeline. See running in production.|

Some of the scripts can be started in watch mode:

|Command|Description|
|----|-----------|
|npm run test:watch|Run all tests locally and keep watching for changes.|
|npm run e2e:watch|Starts docker as production and runs all e2e tests inside of it. Keeps test containers alive and will re-run changed tests.|
|npm run e2e:stop|Stops all test containers kept alive by running e2e:watch.|

To see how the app behaves in production, you can run the deployment stack on a local docker swarm using the commands:

|Command|Description|
|----|-----------|
|npm run prod|Will start a stack as production in the local machine's docker engine. This requires docker to have the swarm orchestrator active. If not, you will need to run docker swarm init first.|
|npm run prod:stop|Stops all docker swarm services created by running the prod command.|

🐳 Running in production

Prepare the servers:

  1. Start one or more servers or VPSs on your cloud provider of choice and install Docker on them;
  2. Start docker in swarm mode; if running multiple servers, add them to the swarm.

On your local machine:

  1. Make sure to edit the .env file to add the correct production information for your domain, mailer service, SSH key location, and the private container registry where the application container will be published.
  2. Make sure your git working directory is clean. Merge all changes that you want to be included in this release or stash them.
  3. Start the deployment pipeline by running:
npm run deploy

🔥 Done!

The deployment pipeline will:

  • Run all tests and e2e tests;
  • Build the app;
  • Bump the npm version and create a tagged git commit;
  • Build the container image and push it to the registry;
  • SSH into the docker swarm manager node;
  • Update the deployed stack with the new services;

Once the new stack is applied, the swarm will start a zero downtime rolling update of changed containers one at a time.

Deployments default to building a new patch release. To specify a different semver increment, for example a minor release, run:

npm run deploy -- -v minor

To see all options available to the deploy.sh script, run: npm run deploy -- --help.

Rolling back failed updates

If the deployed containers are crashing, docker will stop rolling out any new containers and will reroute traffic to the replicas that are still running the previous image. You can roll back the updated containers by running:

npm run rollback

Reference

App Graph

These are the modules included in the generated app and how they interact with each other.

%%{ init: { 'flowchart': { 'curve': 'monotoneX' }, 'theme':'dark' } }%%
flowchart LR
	subgraph legend[ Legend ]
		direction LR
		subgraph legendLine1 [ ]
			direction TB
			ex1(Module)
			ex2([Global Module]):::globalModule
			ex3{{fa:fa-globe Controller}}:::controller
			ex9([fa:fa-bell-concierge Service]):::service
			ex4([fa:fa-briefcase Provider]):::provider
		end
		subgraph legendLine2 [ ]
			direction TB
			ex6{{fa:fa-fish-fins Global Pipe}}:::pipe
			ex7{{fa:fa-bullseye Global Interceptor}}:::interceptor
			ex8{{fa:fa-shield-halved Global Guard}}:::guard
			ex5([fa:fa-database Model]):::model
		end
	end
	subgraph globalModules[ ]
		ConfigModule([ConfigModule]):::globalModule
		JwtModule([JwtModule]):::globalModule
		ConfigHostModule([ConfigHostModule]):::globalModule
		MongooseCoreModule([MongooseCoreModule]):::globalModule
	end
	subgraph modules[" "]
		direction LR
		subgraph AppModule
			direction LR
			Pipe{{fa:fa-fish-fins ValidationPipe}}:::pipe
			Serializer{{fa:fa-fish-fins RolesSerializerInterceptor}}:::interceptor
			AppService([fa:fa-bell-concierge AppService]):::service
		end
		subgraph ConfigModule[ ]
      subgraph ConfigModulePadding[ConfigModule]
      end
		end
		subgraph ConfigHostModule[ ]
      subgraph ConfigHostModulePadding[ConfigHostModule]
      end
		end
		subgraph MongooseModule
			direction LR
			UserModel([fa:fa-database UserModel]):::model
			EmailVerificationModel([fa:fa-database EmailVerificationModel]):::model
			ForgottenPasswordModel([fa:fa-database ForgottenPasswordModel]):::model
		end
		subgraph MongooseCoreModule[ ]
      subgraph MongooseCoreModulePadding[MongooseCoreModule]
      end
		end
		subgraph StandardResponseModule
			direction LR
			Interceptor{{fa:fa-bullseye StandardResponseInterceptor}}:::interceptor
		end
		subgraph AuthModule
			direction LR
			AuthController{{fa:fa-globe AuthController}}:::controller
			AuthService([fa:fa-bell-concierge AuthService]):::service
			JwtStrategy(["fa:fa-briefcase JwtStrategy"]):::provider
		end
		subgraph UserModule
			direction LR
			UserController{{fa:fa-globe UserController}}:::controller
			UserService([fa:fa-bell-concierge UserService]):::service
		end
		subgraph PoliciesModule
			direction LR
			CaslAbilityFactory(["fa:fa-briefcase CaslAbilityFactory"]):::provider
		end
		subgraph MailerModule
			direction LR
			MailerService([fa:fa-bell-concierge MailerService]):::service
		end
		subgraph JwtModule[ ]
      subgraph JwtModulePadding[JwtModule]
		  end
		end
		
		AppModule===>MongooseModule
		AppModule===>StandardResponseModule
		AppModule===>AuthModule
		AuthModule===>UserModule
		UserModule-.->MongooseModule
		UserModule===>PoliciesModule
		AuthModule===>MailerModule
		AuthModule-.->MongooseModule
		AppModule===>UserModule
		AppModule===>MailerModule
	end
classDef controller fill:darkgreen
classDef provider fill:#1f2020
classDef service fill:#1f2020
classDef pipe fill:#8b0e5d
classDef guard fill:#8b0e5d
classDef interceptor fill:#8b0e5d
classDef model fill:#b83100
classDef moduleSubgraph fill:#1f2020,stroke:#81B1DB,rx:5,ry:5
classDef globalModule fill:indigo,stroke:#81B1DB,rx:5,ry:5
classDef layoutGroup fill:none,stroke:none
classDef groupStyles rx:10,ry:10
class legend groupStyles
class modules,globalModules,legendLine1,legendLine2,JwtModulePadding,MongooseCoreModulePadding,ConfigModulePadding,ConfigHostModulePadding layoutGroup
class AppModule,MongooseModule,StandardResponseModule,AuthModule,UserModule,PoliciesModule,MailerModule moduleSubgraph
style legend stroke-dasharray: 0 1 1,fill:white,fill-opacity:0.02,opacity:0.95

Models as a Single Source of Truth (SSOT)

Model Classes serve as the unified entry point describing the format and all expectations for a given piece of data. They are used as an Interface to create the mongoose schema, but they are also used to create both ingress and egress DTOs using Mapped Types.

This means the information on Model properties defines the input validation rules enforced when the model is expected in requests, and the serialization rules applied when the model is sent in responses.

Finally, model properties can also provide OpenAPI documentation information, like descriptions and usage examples.

Having all this information present in a central Model Class avoids code duplication, since derivative classes only need to pick what properties of the Model they want, without worrying about providing documentation, examples, validation rules, etc.

This means that properties on a Model Class can have up to 4 types of decorators on them:

  1. Schema - @Prop() from '@nestjs/mongoose' to add the property to the schema;
  2. Docs - @ApiProperty() from '@nestjs/swagger' to add documentation and examples;
  3. Serialization - @Exclude(), @Expose(), and @Transform() from 'class-transformer' to define serialization rules;
  4. Validation - @IsString(), @IsEmail(), @Min(), etc... from 'class-validator' to perform input validation;
@Schema() // ⬅ marks a class to be used as the Interface for the mongoose schema
class User {
  @Prop() // ⬅ marks this property to appear in the mongoose schema
  @ApiProperty({ example: 'Mark' }) // ⬅ provides OpenAPI documentation for this property
  name: string;

  @Prop({ index: { unique: true } }) // ⬅ accepts the same options as a 'new Mongoose.Schema()'
  @ApiProperty({ example: 'mark@example.com' })
  @IsEmail() // ⬅ provides validation when this property is required as an input
  email: string;

  // ⬇ will exclude this property on 'output', i.e. from the serialized object sent in responses (but allow it on input)
  @Exclude({ toPlainOnly: true })
  @Prop()
  password: string;
  
  // ⬇ will exclude this property on 'input', i.e. from request DTOs and validation (but allow it in responses)
  @Exclude({ toClassOnly: true })
  @Prop()
  lastSeenAt: Date;

  @Exclude() // ⬅ will exclude this property in both directions
  @Prop()
  chatAccessKey: string;

  // ⬇ only admins will see this property in the serialized response, it's excluded for everyone else
  @Expose({ groups: ['Admin'] })
  @Prop({ type: Date, default: Date.now })
  @IsDateString()
  registeredAt: Date;

  // ⬇ allows you to easily create instances of this model from a document from the DB
  constructor(partial: Partial<User> = {}) {
    Object.assign(this, partial);
  }

  // ⬇ you can add other props and utility methods on the model class
  hasCake() {
    const registeredDaysAgo = (new Date().getTime() - this.registeredAt.getTime()) / 1000 / 60 / 60 / 24;
    return registeredDaysAgo > 365; // 🍰 account is at least one year old!
  }
}

Sending data

When sending data in responses, it's important to always send instances of a Model Class, or instances of DTOs created from it. You can send either a single one or an array of them. But never send documents retrieved from the database directly in responses! The serialization rules (and all other benefits of the model) only apply to instances of the Model or derived classes, not to documents from the DB.

This also means you should not wrap the returned model in any other JavaScript object. If you need to add more data to the response (like pagination, filtering, additional messages, etc.), you should add them using the metadata decorators provided by nest-standard-response.

@Controller('user')
export class UserController {
  @Get()
  @StandardResponse({ // ⬅ setup a StandardResponse wrapper
    isPaginated: true,
  })
  public async findAll(
    // ⬇ injects a StandardParam providing methods to manipulate the wrapper
    @StandardParam() params: StandardParams
  ): Promise<User[]> { // ⬅ route return type must always resolve to Model or Model[]
    const users: UserDocument[] = await this.userModel
      .find()
      .limit(params.paginationInfo.limit) // ⬅ we get pagination query params for free
      .skip(params.paginationInfo.offset) //    by using the isPaginated option above
      .exec();

    params.setMessage('Custom message...') // ⬅ adds a custom message to the response
    params.setExtra('myCustomProperty', { // ⬅ add some extra field in the response
      customObjProp1: 'any serializable value',
      customObjProp2: { nested: true },
    });
    // ⬇ Use the document from the DB to construct a new Model() before returning
    return users.map((userDoc) => new User(userDoc.toJSON()));
  }
}

The response from this route would look like this:

Note the smart serialization in the response! The field registeredAt is only present when an admin is making the request. It would be hidden from other users because of the serialization rules in the model.

{
  success: true,
  message: "Custom message...",
  isArray: true,
  isPaginated: true,
  pagination: {
    limit: 10,
    offset: 0,
    defaultLimit: 10,
  },
  myCustomProperty: {
    customObjProp1: 'any serializable value',
    customObjProp2: { nested: true },
  },
  data: [{
    name: "Mark",
    email: 'mark@example.com',
    registeredAt: '2023-11-09T13:06:37.384Z'
  }, {
    name: "Jane",
    email: 'jane@example.com',
    registeredAt: '2023-11-09T13:06:37.384Z'
  }, {
    name: "Eva",
    email: 'eva@example.com',
    registeredAt: '2023-11-09T13:06:37.384Z'
  }]
}

Receiving data

The same is true for receiving data in the request params or body. Always strongly type the expected data as the Model Class or a DTO derived from it. This way the data is automatically validated by the global ValidationPipe, and the route is automatically documented in OpenAPI.

// CreateUserDto.ts

// ⬇ We choose the properties we want from the model with MappedTypes, so this DTO will inherit all
// the validation and serialization logic we defined there, without having to duplicate anything

const requiredFields = ['email', 'password'] as const;
const optionalFields = ['name', 'familyName', 'phone', 'birthDate'] as const;

export class CreateUserDto extends IntersectionType(
  PartialType(PickType(User, optionalFields)),
  PickType(User, requiredFields),
) {}
@Controller('user')
export class UserController {
  constructor(private readonly userService: UserService) {}

  @Post()
  public async create(
    // ⬇ Setting our DTO as the Type for the request body means it will be automatically validated
    // by the global ValidationPipe.
    @Body() createUserDto: CreateUserDto
  ): Promise<User> {
    // Any request to this route with a body that's missing required fields, or that contains fields
    // with values that fail the model validation rules, will result in an HTTP 400 Bad Request exception,
    // and this handler will never be executed.
    // This means it's safe to use the body here without any further validation.
    return await this.userService.create(createUserDto);
  }
}

🔮 Use concrete JS classes as types, not TypeScript interfaces

TypeScript interfaces are completely removed from the compiled code. Since we want to perform data validation and transformation at runtime, all models and DTOs must be classes instead. TS classes can still be used as types when needed, and they persist as JS classes in the compiled code.
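
As a quick runtime illustration (UserShape and UserModel are hypothetical names, not code from the generated app): the interface is erased during compilation, while the class remains available for class-validator and class-transformer to attach metadata to.

import { IsEmail, validateSync } from 'class-validator';
import { plainToInstance } from 'class-transformer';

// Erased at compile time: nothing exists at runtime to hold validation metadata.
interface UserShape {
  email: string;
}

// Survives compilation as a real JS class, so decorators can register metadata on it.
class UserModel {
  @IsEmail()
  email: string;
}

// The interface only types the plain object; the class is what enables runtime validation:
const plain: UserShape = { email: 'not-an-email' };
const candidate = plainToInstance(UserModel, plain);
console.log(validateSync(candidate).length > 0); // true, validation ran at runtime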

Auth Module 🚪

  • Allows account creation;
  • Sends e-mail verification and keeps track of confirmation status;
  • Sends forgotten password emails and allows password reset;
  • Manages log-in and JWTs;
  • Guards routes against unauthenticated users and injects the logged-in user into the request (see the sketch below).
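
A minimal sketch of guarding a route with the generated JWT setup (ProfileController is a hypothetical example; AuthGuard('jwt') is the same guard used in the Policies section below):

import { Controller, Get, Request, UseGuards } from '@nestjs/common';
import { AuthGuard } from '@nestjs/passport';

@Controller('profile')
export class ProfileController {
  @Get()
  @UseGuards(AuthGuard('jwt')) // rejects requests that don't carry a valid JWT
  getProfile(@Request() req) {
    // the guard injects the logged-in user into the request object
    return req.user;
  }
}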

Policies Module 🏛️

  • Defines policies limiting any individual user to access only resources they can claim;
  • Claims define which Actions (create, read, update, etc...) any user Role can take on each Model;
  • Claims can also define constraint queries, for example allowing a user to read the User model, but only their own user; or to update Articles, but only those they authored;

Note: There is no Articles module provided by this app. This is just an example of how you can define policies for any model you want.

Policies are defined using Casl.

CaslAbilityFactory

The CaslAbilityFactory provider exposes the createForUser function, which is called during a request with the logged-in user information, and should return a casl Ability object constructed using the provided can or cannot methods. This function is free to inspect the user object and define any custom logic it needs to limit individual access to actions taken on models.

Example:

if (user.roles.includes(UserRole.USER)) {
  // users can view and update their own info,
  // view any article, and update articles authored by them
  can([Action.Read, Action.Update], User, { _id: user._id });
  can(Action.Read, Article);
  can(Action.Update, Article, { authorId: user._id });
}
if (user.roles.includes(UserRole.MOD)) {
  // mods can read and update any user or any article
  can([Action.Read, Action.Update], User);
  can([Action.Read, Action.Update], Article);
}
if (user.roles.includes(UserRole.ADMIN)) {
  // admins can do anything. Note that 'manage' in casl means all actions,
  // and the keyword 'all' means all models. Common actions are 'create',
  // 'read', 'update', 'delete' and 'list', but you can extend the Actions enum
  // with any other action you want
  can(Action.Manage, 'all');
}

Protecting routes

Just add the PoliciesGuard to any controller or route. Since policies depend on the user object, using this guard also requires using AuthGuard or another mechanism that guarantees the user is logged in.

@UseGuards(AuthGuard('jwt'), PoliciesGuard)

Once this guard is in place, you can add the @CheckPolicies() decorator to any route, and choose the claims that are required to access this route. @CheckPolicies() expects a simple function that is called with the userAbility object, so you can use can or cannot methods on it to define which Actions this route requires on which Models.

@CheckPolicies((ability: UserAbility) => ability.can(Action.List, User))

Checking policies in this way is very efficient, since requests can be denied at the Guard level, without even executing the route handler. But it is also limited: it cannot check for constraint queries since no document has been retrieved from the DB yet. If the logged-in user has access to at least one document for a given Model, it will be granted access by the guard, and you should check for constraints during the route handling.
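
Putting the guard and the decorator together on a controller might look like this (a sketch: the userService.findAll() call is illustrative, while PoliciesGuard, @CheckPolicies() and UserAbility are the ones described above):

@UseGuards(AuthGuard('jwt'), PoliciesGuard) // policies need a logged-in user
@Controller('user')
export class UserController {
  constructor(private readonly userService: UserService) {}

  @Get()
  // denied at the guard level unless the user's claims allow listing users
  @CheckPolicies((ability: UserAbility) => ability.can(Action.List, User))
  public async findAll(): Promise<User[]> {
    return this.userService.findAll();
  }
}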

Protecting access per-document

  • The userAbility object is also injected in the request object, and you can retrieve it by using req.userAbility;
  • If this is all you're using from the request object, it can be cleaner to inject it directly using the custom param decorator @UserAbilityParam();

This allows you to retrieve documents from the database and call the can or cannot methods against them. Note that here these methods are called using an instance of the model (instead of on the Model class itself).

@Get(':idOrEmail')
async findOne(
  @Param('idOrEmail') idOrEmail: string,
  @UserAbilityParam() userAbility: UserAbility,
) {
  const user = await this.userService.findOne(idOrEmail);
  if (userAbility.cannot(Action.Read, user)) {
    throw new ForbiddenException();
  }
  return user;
}

User Module 👤

  • Defines the User model, schema and DTOs;
  • Defines the services required to create, read, update, delete, list, reset password, and verify email;
  • Most services from this module are consumed by the Auth module for managing accounts;
  • The user controller provides routes that can be used by admins to manage users from a backend outside of the auth flow;
  • Some routes can also be used by users to view or update their own profile;

EmailVerifiedGuard

If the app is configured to use delayed email verification, users will be logged in automatically after account creation, and will be allowed to log in anytime without clicking the verification link.

To protect access to certain routes only to users who have verified their email, you can add the EmailVerifiedGuard to any controller or route.

@UseGuards(EmailVerifiedGuard)

  • If the app is configured to use required email verification, users will be asked to verify their email before being allowed to log in. In that case, this guard is redundant.

  • If the app is configured with email verification off, this guard should not be used, since it will never allow access to the routes under it.

  • The routes from the UserController that allow users to view and edit their own information use this guard. If you set email verification to off, you should also remove this guard from that controller.
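
For example, to restrict a single route to verified users (CommentController and CreateCommentDto are hypothetical names, and the guard is paired with AuthGuard here on the assumption that it needs the logged-in user on the request):

@UseGuards(AuthGuard('jwt'))
@Controller('comments')
export class CommentController {
  @Post()
  @UseGuards(EmailVerifiedGuard) // only users with a verified email can post
  public async create(@Body() createCommentDto: CreateCommentDto) {
    ...
  }
}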

EmailOrIdPipe

Both email and id are unique keys in the user schema. An id provides consistency since it should never be changed, and also provides some privacy if you need to include a user reference in a public link without exposing their email. However, sometimes using an email can be more convenient.

That's why routes and services from the User module accept both an id or an email as the target for their operations. To validate the input parameters in those cases, the app provides the EmailOrIdPipe pipe.

@Controller('user')
export class UserController {
  constructor(private readonly userService: UserService) {}

  @Get(':idOrEmail')
  public async findOne(
    @Param('idOrEmail', EmailOrIdPipe) idOrEmail: string
  ): Promise<User> {
    const user = await this.userService.findOne(idOrEmail);
    ...
  }
}

When used, it makes sure the piped data is either a syntactically valid email or a syntactically valid ObjectId. Note that a pipe can only check for syntax. It will throw an HTTP 400 BadRequestException if the provided information is malformed, but it's possible that the information is valid yet still doesn't match any known user in the DB.

Mailer Module 📮

  • Automatically creates and configures a nodemailer instance using info from the .env file injected by the config module;
  • Defines services for sending emails;
  • Currently this module can send the following emails:
    • Welcome
    • Please confirm your email
    • Forgot your password?
    • Your password was reset

Config Module ⚙️

  • Prevents runtime errors by validating environment variables during app startup;
  • Provides helpful console messages when environment variables are missing or invalid;
  • Parses .env vars into a strongly typed Configuration object that can be dependency injected;
  • Exposes interfaces that can be used to provide types when calling the configService.get<>() generic method;

Example:

@Controller('books')
export class BooksController {
  constructor(private readonly configService: ConfigService) {}

  @Get()
  public async listBooks() {
    const apiConfig = this.configService.get<ApiConfig>('api'); 
    // equivalent to process.env.API_INTERNAL_URL,
    // but parsed, typed, and guaranteed to exist
    console.log(apiConfig.internalUrl);
  }
}

Standard Response Module 📦

StandardResponse has been extracted into a separate package. The full documentation now resides in its own repo.

  • Metadata-based wrapper to provide customizable and standardized API response objects;

  • Built-in handling of pagination, sorting and filtering;

  • Allows route handlers to keep returning classes instead of wrapper objects, so they remain fully compatible with interceptors;

// 👇 just annotate a route with
// @StandardResponse() and choose
// the features you need
@get("/books")
@StandardResponse({
  isPaginated: true,
  isSorted: true,
  isFiltered: true,
})
async listBooks(
  // 👇 then inject a @StandardParam() into
  // the handler to access the features
  @StandardParam() params: StandardParams
): Promise<BookDto[]> {
  const {
    books,
    count
  } = await this.bookService.list({
    // 👇 this route can now be called with
    // query parameters, fully parsed and
    // validated to use in services
    limit: params.pagination.limit,
    offset: params.pagination.offset,
    sort: params.pagination.sort,
    filter: params.pagination.filter,
  });
  // 👆 to see how the 'sort' and 'filter'
  // params are parsed, look at the 
  // SortingInfo and FilteringInfo classes
  // in the @StandardParam() section of
  // StandardResponse's Docs

  // 👇 add extra information into the response
  params.setPaginationInfo({ count: count })
  params.setMessage('Custom message...')
  return books;
}
// response
{
  success: true,
  message: "Custom message...",
  isArray: true,
  isPaginated: true,
  isSorted: true,
  isFiltered: true,
  pagination: {
    limit: 10,
    offset: 0,
    defaultLimit: 10,
    // 👇 added in handler
    count: 33
  },
  sorting: {
    query: ...,
    sortableFields: [...],
    sort: SortingInfo
    // check docs
  },
  filtering: {
    query: ...,
    filterableFields: [...],
    filter: FilteringInfo
    // check docs
  },
  data: [
    { title: "Dune", year: 1965 },
    { title: "Jaws", year: 1974 },
    { title: "Emma", year: 1815 },
  ]
}

// this route can now be called using query params like this:
'/books?limit=8&offset=16&sort=-author,title&filter=author^=Frank;year>=1960;year>=1970'

ℹ️ Check out the full documentation to learn more.

Test Module 🧪

  • Provides end-to-end testing of all user interaction flows;
  • e2e tests run in docker, using NODE_ENV=production;
  • Jest runs tests in parallel, so each test file needs to instantiate the app in its own thread;
  • The DB is shared between all threads. To avoid race conditions, the DB should never be dropped during testing. Use the provided factories to create and destroy resources instead.

To facilitate creating and destroying instances of the NestJS application, as well as registering all kinds of test users, this project provides two utility factories:

TestingServerFactory

When creating new e2e test files, use the beforeAll hook from jest to instantiate a new NestJS app by calling await new TestingServerFactory().create().

This method will create a new TestingModule, mock the mailer service, start the app, auto-increment the port number to avoid conflicts, and return an instance with methods to retrieve all created resources, like getModule(), getApp(), and getBaseUrl().

Since this gives you access to the underlying NestJS TestingModule, you can reach any part of the nest app by using the get() and resolve() methods on the module.

UserStubFactory

To create stub users for testing access control and serialization, use the UserStubFactory. It provides methods for creating regular users, users with verified emails, admin users, etc. It also provides methods to login those users and get their access tokens, as well as to delete them.

The test DB is dropped only once before starting e2e tests. It's a good idea to delete any resource you created in the DB during a test inside the afterAll hook.

Example:

describe('BooksController (e2e)', () => {
  let app: INestApplication;
  let booksService: BooksService;
  let stub: UserStubFactory;
  let verifiedUser: FakeUser;
  let verifiedUserToken: string;

  beforeAll(async () => {
    const testingServer = await new TestingServerFactory().create();
    const testingModule = testingServer.getModule();
    app = testingServer.getApp();
    booksService = await testingModule.resolve(BooksService);

    stub = new UserStubFactory(testingServer);
    verifiedUser = await stub.registerNewVerifiedUser({ firstName: 'Martha' });
    verifiedUserToken = await stub.getLoginTokenForUser(verifiedUser);
  });

  afterAll(async () => {
    await stub.deleteUser(verifiedUser.email);
    await app.close();
  });
});

🏃   TODO Milestones

  • Add a mgob instance to the production docker swarm for automated mongo backups (and add its configurations via .env)
  • Add some tool to the production docker swarm to expose server metrics
  • Add user consent forms with versioned policies
  • Add option for log-in using social media accounts

License

MIT License

Copyright (c) 2022 Ricardo Simioni