@arcteryx/design-system
Table of Contents
- Project Description
- Installation
- Development
- Builds (Rollup), Continuous Integration (Bitbucket) & Publishing
- Issues / Notes
Project Description
This is a React Component Library written in TypeScript that serves as the Design System for Arc'teryx. The library consists of a collection of reusable UI components that can be used to build consistent and cohesive user interfaces across Arc'teryx's web applications.
The library is built using React and TypeScript, which provides strong typing and helps to catch errors early in the development process. The library is strictly a component library, which means that it does not include any application-specific logic or functionality.
To help developers understand how to use the library, we use Storybook to display our stories. Storybook is a tool that allows us to showcase our components in isolation, making it easy for developers to see how each component works and how it can be used in different contexts.
Overall, this React Component Library provides a solid foundation for building high-quality user interfaces that are consistent and easy to maintain. By using this library, Arc'teryx can ensure that its web applications have a consistent look and feel, which helps to improve the user experience and build brand recognition.
Installation
To install this project, you will need to have Node.js and npm installed on your machine.
- Clone the repository:
git clone https://github.com/your-username/your-project.git
- Install the dependencies
nvm use && npm install
- View Component Library
npm run storybook
You will then be able to view our Storybook instance at http://localhost:6006
Development
Folder Structure
Our React Component Library uses the Atomic Design methodology to organize our components. This methodology breaks down components into smaller, reusable parts, which can be combined to create more complex components.
Our folder structure is organized as follows:
src/
└── components/
    ├── atoms/
    │   ├── Button/
    │   │   ├── index.ts
    │   │   ├── Button.tsx
    │   │   ├── Button.stories.tsx
    │   │   ├── Button.types.ts
    │   │   └── helpers.ts
    │   ├── Input/
    │   └── ...
    ├── molecules/
    │   ├── Form/
    │   └── ...
    └── organisms/
        ├── Header/
        └── ...
- src/: The root directory of our project.
- components/: The directory where all of our components are stored.
- atoms/: The directory where our smallest, most basic components are stored.
- molecules/: The directory where our more complex components, made up of multiple atoms, are stored.
- organisms/: The directory where our largest, most complex components, made up of multiple molecules and/or atoms, are stored.
- Button/: An example of an atom component directory.
- index.ts: The entry point for the component, which exports the component and any related types or helpers.
- Button.tsx: The implementation of the component.
- Button.stories.tsx: The Storybook stories for the component.
- Button.types.ts: The TypeScript types for the component.
- helpers.ts: Any helper functions or utilities related to the component.
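For example, a component-level index.ts might look like this (the exact exports are an illustrative assumption, not the real file):
// src/components/atoms/Button/index.ts -- illustrative sketch of a component entry point
export { default as Button } from './Button';   // the component itself
export * from './Button.types';                  // its prop types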
By organizing our components in this way, we can easily find and reuse components across our projects. It also makes it easier to maintain and update our components, as each component is broken down into smaller, more manageable parts.
Barrel Exports
In our TypeScript project, we use barrel exports to simplify the process of importing components and other modules. A barrel export is a way to export multiple modules from a single file, which can make it easier to organize and manage our code.
To use barrel exports, we create an index.ts file in each directory that we want to export from. For example, if we have a components directory that contains multiple subdirectories, we can create an index.ts file in the components directory that exports all of the components in the atoms, molecules, and organisms directories:
src/components/index.ts
export * from './atoms';
export * from './molecules';
export * from './organisms';
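The directory-level barrels follow the same pattern. As a sketch (the actual files may differ), src/components/atoms/index.ts simply re-exports each atom, and the package entry point src/index.ts re-exports the components barrel so that consumers can import everything from the package root:
// src/components/atoms/index.ts -- hypothetical atom barrel
export * from './Button';
export * from './Input';

// src/index.ts -- assumed package entry point
export * from './components';

// In a consuming application:
import { Button } from '@arcteryx/design-system';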
Testing
We use React Testing Library to test our React components. React Testing Library provides a set of utilities for testing React components in a way that simulates how users interact with them.
To test a component, we create a test file with the .test.tsx extension in the same directory as the component. For example, if we have a component called Button in the components directory, we would create a file called Button.test.tsx in the same directory.
Here's an example of how we might test the Button component:
// src/components/Button/Button.test.tsx
import React from "react";
import { render, fireEvent } from "@testing-library/react";
import Button from "./Button";

describe("Button", () => {
  it("renders the button text", () => {
    const { getByText } = render(<Button>Click me</Button>);
    expect(getByText("Click me")).toBeInTheDocument();
  });

  it("calls the onClick handler when clicked", () => {
    const handleClick = jest.fn();
    const { getByText } = render(
      <Button onClick={handleClick}>Click me</Button>,
    );
    fireEvent.click(getByText("Click me"));
    expect(handleClick).toHaveBeenCalledTimes(1);
  });
});
You can run the tests locally by running the command:
npm run test
We have a special command that is used in our lint-staged implementation:
npm run test:staged
This command will run this script:
{
  "scripts": {
    ...
    "test:staged": "jest --bail --findRelatedTests"
  }
}
This command will only run tests on files that are currently staged to be committed. This speeds up our pre-commit hooks.
TypeScript
Our TypeScript configuration is split into two files: tsconfig.json and tsconfig.prod.json. The tsconfig.json file is used for our base TypeScript implementation, which includes type checking, tests, and stories. The tsconfig.prod.json file is used for our production build, which excludes tests and stories.
To run TypeScript, we have added two scripts to our package.json file:
{
  "scripts": {
    ...
    "typescript": "tsc --noEmit",
    "typescript:watch": "tsc --noEmit -w"
  }
}
The typescript script runs the TypeScript compiler with the --noEmit flag, which means that it only performs type checking and does not generate any output files. The typescript:watch script runs the same check in watch mode in your terminal to help you catch issues as you are developing.
You can customize the TypeScript configuration to fit your specific needs by modifying the tsconfig.json and tsconfig.prod.json files.
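As an illustration of how the two configs might relate (a sketch only, not the repository's actual settings), tsconfig.prod.json can extend the base config and exclude the files that should not ship in the production build:
// tsconfig.prod.json -- illustrative sketch
{
  "extends": "./tsconfig.json",
  "exclude": [
    "**/*.test.ts",
    "**/*.test.tsx",
    "**/*.stories.tsx",
    "node_modules",
    "dist"
  ]
}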
Linting, Prettier, Lint-Staged and Husky
Linting
We use ESLint to enforce consistent code style and catch errors in our JavaScript and TypeScript code. ESLint is a highly configurable linter that can be customized to fit our specific needs. Our linting configuration can be found in .eslintrc.js.
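As a rough, hypothetical sketch of what an ESLint config for a React + TypeScript library commonly looks like (the extends, plugins, and env values below are assumptions, not the repository's actual settings):
// .eslintrc.js -- hypothetical example, not the actual project config
module.exports = {
  parser: '@typescript-eslint/parser',
  extends: [
    'eslint:recommended',
    'plugin:@typescript-eslint/recommended',
    'plugin:react/recommended',
    'prettier', // let Prettier own formatting concerns
  ],
  plugins: ['@typescript-eslint', 'react'],
  settings: { react: { version: 'detect' } },
  env: { browser: true, node: true, jest: true },
};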
To run ESLint, we have added two scripts to our package.json file:
{
  "scripts": {
    ...
    "lint": "eslint --ext .js,.jsx,.ts,.tsx .",
    "lint:fix": "eslint --ext .js,.jsx,.ts,.tsx . --fix"
  }
}
The lint script runs ESLint on all files in the current directory and its subdirectories, with the extensions .js, .jsx, .ts, and .tsx. This will output any errors or warnings that ESLint finds.
The lint:fix script runs ESLint with the --fix flag, which automatically fixes any fixable errors or warnings that ESLint finds. This can save us time and effort when fixing code style issues.
To run ESLint locally, use either of the following commands:
npm run lint
npm run lint:fix
Prettier
Prettier is a code formatter that enforces a consistent code style across your entire project. It supports a wide range of languages, including JavaScript and TypeScript, and can be customized to fit your specific needs. By using Prettier, you can save time and effort by automatically formatting your code to a consistent style, making it easier to read and maintain.
{
  "scripts": {
    ...
    "prettier:write": "prettier --write"
  }
}
The --write flag tells Prettier to overwrite the existing files with the formatted code.
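Prettier runs with sensible defaults, but options can be pinned in a .prettierrc file at the project root. A hypothetical example (these values are assumptions, not the project's actual settings):
{
  "singleQuote": true,
  "trailingComma": "all",
  "printWidth": 100
}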
Lint-Staged
lint-staged is a tool that allows us to run linters and other scripts on only the files that have been staged for commit. This can help us catch errors and enforce code quality before the code is even committed.
The lint-staged configuration is added to the package.json file:
"lint-staged": {
"*.{js,jsx,ts,tsx}": [
"npm run lint:fix",
"npm run prettier:write",
"npm run test:staged"
]
}
To run lint-staged locally, you can use the npx command:
npx lint-staged
This will run lint-staged on all files that have been staged for commit.
Husky
Husky is a tool that allows us to easily add Git hooks to our project. Git hooks are scripts that run automatically when certain Git events occur, such as committing or pushing code.
One of the most commonly used Git hooks is the pre-commit hook, which runs automatically before each commit. By using the pre-commit hook, you can ensure that your code is always of high quality and free of errors before it is committed to your repository. Our pre-commit hook has 2 steps:
npm run typescript
npx lint-staged
- npm run typescript: Runs the TypeScript compiler with the --noEmit flag, so it type checks the entire codebase without generating any JavaScript output. This ensures that your TypeScript code is free of type errors before it is committed.
- npx lint-staged: Runs lint-staged on all files that have been staged for commit. lint-staged is configured to run ESLint, Prettier, and the related Jest tests on the staged files, which helps catch errors and enforce a consistent code style.
By running these two commands before each commit, you can ensure that your code is free of errors and meets our code quality standards, which makes the codebase easier to maintain.
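With the standard Husky (v7+) layout, the hook itself lives in the .husky directory. A minimal sketch of what the pre-commit file would contain, assuming that setup:
#!/usr/bin/env sh
# .husky/pre-commit -- sketch assuming the standard Husky v7+ layout
. "$(dirname -- "$0")/_/husky.sh"

npm run typescript   # type check the whole codebase (no files emitted)
npx lint-staged      # lint, format and test only the staged files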
Builds (Rollup), Continuous Integration (Bitbucket) & Publishing
Rollup
Rollup is a module bundler for JavaScript that is designed to create smaller, more efficient bundles. It is particularly well-suited for building libraries and packages that are intended to be consumed by other developers.
In our project, we use Rollup to bundle our component library into a single file that can be consumed in a browser or Node.js environment. Rollup allows us to create one bundle that includes all of our components and their dependencies, while also removing any unused code and optimizing the bundle for performance.
Our Rollup configuration file is located at rollup.config.mjs and specifies how our code should be bundled. The .mjs extension indicates that it is a JavaScript module file, i.e. that it contains ES modules, which are a standardized way of organizing and sharing JavaScript code.
The configuration specifies the entry point for our code (src/index.ts), the output file name and format (dist/esm/index.js in ES module format), and any plugins or other options that should be used during the bundling process. It also points to tsconfig.prod.json, which contains production-specific instructions. This allows us to run the TypeScript compiler in development purely for type checking, whereas in production we rely on Rollup to produce our .d.ts files.
To run the build, use:
npm run build
This will call this script:
{
  "scripts": {
    ...
    "build": "rollup -c"
  }
}
When this is called, a new folder named dist will be created in the root directory of the project, containing all of the build assets.
Rollup Plugins
Our Rollup configuration file uses several plugins to help with the bundling process:
- @rollup/plugin-node-resolve: This plugin allows Rollup to resolve Node.js modules during the bundling process. This is useful when you are using third-party modules in your code.
- @rollup/plugin-commonjs: This plugin allows Rollup to convert CommonJS modules to ES modules during the bundling process. This is useful when you are using modules that were designed for Node.js in a browser environment.
- @rollup/plugin-typescript: This plugin allows Rollup to bundle TypeScript code. It automatically compiles TypeScript code to JavaScript during the bundling process.
- rollup-plugin-dts: This plugin generates TypeScript declaration files (.d.ts) from TypeScript code. This is useful when you are building a library that will be consumed by other developers.
- @rollup/plugin-terser: This plugin compresses (minifies) your code during the bundling process. This can help to reduce the size of your bundle and improve the performance of your application.
- @rollup/plugin-postcss: This plugin allows Rollup to process CSS files using PostCSS during the bundling process, allowing for CSS imports in component files.
- rollup-plugin-peer-deps-external: This plugin excludes peer dependencies from your bundle. Peer dependencies are dependencies that your library requires, but that should be installed by the consumer of your library rather than included in your bundle.
By using these plugins in our Rollup configuration file, we can create a highly optimized and efficient bundle of our JavaScript code that can be easily consumed by other developers.
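Putting the pieces together, a simplified sketch of what such a rollup.config.mjs can look like (plugin order, file paths, and the declaration-bundling step are illustrative assumptions, not the repository's actual config):
// rollup.config.mjs -- simplified sketch, not the actual project config
import peerDepsExternal from 'rollup-plugin-peer-deps-external';
import resolve from '@rollup/plugin-node-resolve';
import commonjs from '@rollup/plugin-commonjs';
import typescript from '@rollup/plugin-typescript';
import terser from '@rollup/plugin-terser';
import postcss from 'rollup-plugin-postcss'; // PostCSS plugin (package name may differ in this repo)
import dts from 'rollup-plugin-dts'; // newer versions export { dts } instead of a default

export default [
  {
    input: 'src/index.ts',
    output: [{ file: 'dist/esm/index.js', format: 'esm', sourcemap: true }],
    plugins: [
      peerDepsExternal(), // keep peer deps such as react out of the bundle
      resolve(),
      commonjs(),
      typescript({ tsconfig: './tsconfig.prod.json' }), // production config excludes tests and stories
      postcss(),
      terser(),
    ],
  },
  {
    // second pass: roll the emitted declarations into a single .d.ts (input path is an assumption)
    input: 'dist/esm/types/index.d.ts',
    output: [{ file: 'dist/index.d.ts', format: 'esm' }],
    plugins: [dts()],
    external: [/\.css$/], // dts cannot process CSS imports
  },
];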
Continuous Integration (Bitbucket)
We use Bitbucket Pipelines for continuous integration and deployment. Our pipeline is defined in the bitbucket-pipelines.yml file in the root directory of the project.
Our pipeline includes the following steps:
- Check for TypeScript Errors: This step checks for TypeScript errors in our codebase using the npm run typescript command.
- Check and Fix Linting Errors and Formatting Errors: This step checks for linting errors and formatting errors in our codebase using the npm run lint:fix and npx prettier --write . commands.
- Unit Tests: This step runs our unit tests using the npm run test command.
- Build and Test: This step builds our application using the npm run build command and runs integration tests using the npm run integration-test command.
- Merge and Check Coverage: This step merges any changes into the main branch and checks code coverage using the npm run coverage command.
- Sonar Scan and Quality Gate: This step performs a Sonar scan and quality gate check using the sonarsource/sonarcloud-scan and sonarsource/sonarcloud-quality-gate pipes.
- Storybook Build for AWS: This step builds our Storybook documentation for deployment to AWS using the npm run build-storybook command.
Our pipeline is configured to run automatically on every push to the main branch and on every pull request. This helps to ensure that our codebase is always up-to-date, well-tested, and meets our quality standards.
Publishing
This repo will automatically publish releases to @arcteryx/design-system on npm when there is a merge to main. We utilize the semantic-release package to handle this for us. It also depends on our commit messages following conventional commits.
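For example, under semantic-release's default rules, the type of each conventional commit determines the next version bump (the commit messages below are illustrative only):
fix(button): prevent duplicate click events   -> patch release
feat(input): add error state                  -> minor release
feat(tokens): rename colour variables
BREAKING CHANGE: colour token names changed   -> major release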
TypeScript
Enums
Enums are a great way to force consistency in variable values throughout your application. An example would be using an enum to define a component type, and then matching on that enum when rendering components dynamically, e.g. in a list of buttons, images and text.
enum ElementEnum {
  Button = 'button',
  Image = 'image',
  Text = 'text'
}

interface Button {
  type: ElementEnum.Button,
  ...
}

interface Image {
  type: ElementEnum.Image,
  ...
}

type Components = Array<Button | Image>

const componentList = (components: Components) => {
  return <>{components.map(({ type, ...rest }) => {
    if (type === ElementEnum.Button) {
      return <Button {...rest} />
    }
  })}</>
}
However, we have experienced a few issues when using this method:
- JSON is invalid when used in tests and Storybook
- When this library is used elsewhere, the values passed into the components also need to be enums
In both cases, the data needs to be adjusted to use the enum where appropriate. This is a cumbersome process and reduces developer flow. Therefore, it is suggested to avoid the use of enums and use string literals instead.
interface Button {
  type: 'button',
  ...
}

interface Image {
  type: 'image',
  ...
}

type Components = Array<Button | Image>

const componentList = (components: Components) => {
  return <>{components.map(({ type, ...rest }) => {
    if (type === 'button') {
      return <Button {...rest} />
    }
    ...
  })}</>
}
In this case, the compiler is smart enough to know that the type is either "button" or "image" and will type check appropriately. If you try to pass in a string that isn't "button" or "image", it will throw a TypeScript error.
Both approaches also give you a type guard (a discriminated union), which means you can only access the props defined on the variant you are narrowing to. For instance, inside the type === 'button' branch, you will not be able to access any Image props.
Yalc
yalc acts as a very simple local repository for your locally developed packages that you want to share across your local environment. Install it globally:
npm i yalc -g
To create a version of the package to be used globally, run:
npm run yalc:publish
This will run the following command:
{
  "scripts": {
    ...
    "yalc:publish": "npm run build && yalc publish --no-scripts --sig"
  }
}
This will first build the package, and then publish it using the --no-scripts and --sig flags. The --no-scripts flag prevents the publish scripts in our package.json file from being executed, which would otherwise publish the package to npm. The --sig flag adds a hash to the package version, which is helpful when you are looking at the package.json file in the consuming app and want to ensure the versions match.
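In the consuming application, the locally published package can then be linked with yalc add and removed again when you are done (shown as a typical flow, not a prescribed one):
# in the consuming application
yalc add @arcteryx/design-system
npm install

# when finished testing locally
yalc remove @arcteryx/design-system
npm install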
Issues / Notes
Testing JS Helper Functions / Components
At this time it is not possible to test the JS Helpers functions / components directly. To test modules that use them, you must mock them with jest.mock:
jest.mock('@arcteryx/components-button', () => {
  return {
    Button: ({ text }: { text: string }) => <button>{text}</button>
  };
});
This pattern of testing is used in other parts of the Arc'teryx infrastructure, specifically in JS Helpers. By mocking these components you can ensure that your tests will always pass even if the underlying component has changed. This improves developer performance, as you need to spend less time focusing on breaking changes / tests.
Lint-Staged
I tried to get TypeScript to work in this flow, but it ignores your tsconfig.json. It is therefore recommended to keep type checking in the pre-commit hook, which unfortunately means it runs on all of your files.
When updating the scripts in the lint-staged section of package.json, you need to stage the file for lint-staged to work correctly. Otherwise the changes you make to package.json will not be reflected when you run npx lint-staged.
External Packages
node_modules/@arcteryx/components-button/dist/es/index.js:4:15:
  4 │ import cx from 'classnames';
    ╵                ~~~~~~~~~~~~
You can mark the path "classnames" as external to exclude it from the bundle, which will remove this error.
This error occurred when I imported @arcteryx/components-button and ran Storybook; it was looking for the classnames library. To get around this, I installed classnames in this package. I do not believe this is the correct solution, as we should not be responsible for installing another package's dependencies.
Troubleshooting
If you encounter an issue where TypeScript is not compiling correctly and you are getting an error such as "Cannot find file", restarting the TypeScript server may help: open the command palette with Cmd + Shift + P and run "Restart TypeScript Server".
- npx lint-staged not reflecting changes in package.json -> stage the package.json file.
- Semantic Release "fatal: tag 'v' already exists"
This happens when the git tags are out of sync. For example, a new beta branch was created that published a new version package that is not in sync with the main branch. When this happens, simply find the next version number and bump it manually.
You can add a new Git tag for version 1.8.0 using the git tag command. Here's how you can do it:
git tag v1.8.0
This command creates a new lightweight tag with the name v1.8.0.
If you want to create an annotated tag, which includes extra information such as the tagger name, email, and date, you can use the -a option:
git tag -a v1.8.0 -m "Release version 1.8.0"
The -m option allows you to specify a message for the tag.
After creating the tag, you can push it to the remote repository using the git push command:
git push origin v1.8.0
This command pushes the v1.8.0 tag to the origin remote.