
@ikhokha/proxy-tools

v1.1.1


A set of tools for getting a GraphQL proxy microservice running by forwarding requests from your service to a 3rd party and relaying the response.


Proxy Tools


Getting Started

  1. Install the package as a dependency: npm install -S @ikhokha/proxy-tools
  2. Create a file called .filterschemarc.js in your project root directory. See the "save-remote-schema" section below for an example of this file and how to configure it.
  3. Execute npx save-remote-schema to introspect the remote schema and apply the transformations defined in your .filterschemarc.js file. Once this is done, you may still need to edit the schema manually until you're happy with it, as this tool is still in the early stages of development.
  4. Where you instantiate the Apollo Server for this service, configure a Federation Proxy object and add it to your context. See the FederationProxy class example below.
  5. In each resolver query / mutation that you want to proxy, call either the query or mutate method on the Federation Proxy object to execute the operation. Either return the response directly if it matches the schema definition you created earlier (in which case there is no need to implement individual type resolvers), or edit the response before returning it to the client.

See the examples folder for a comprehensive example.

Description

This is a collection of tools that help you set up a proxy microservice.

It's made up of both development and runtime tools.

  1. Development tools: these help you introspect a remote schema at development time and apply various filters, which are higher-level abstractions on top of Apollo's transformSchema utilities. The main goal is to keep only what you need from the foreign schema and, in some cases, to prefix it with a "namespace" (useful for federated services that have objects with the same name).
  2. Runtime tools: these are essentially a "data source" to attach to your context object. The data source takes query / mutation information from the current request and is used, in your resolver function for that query, to forward the request on to a 3rd-party service.

API

FederationProxy class

Constructor

  1. options: (Federation proxy options) This options object is used to configure the underlying Apollo Client. It takes the URL that the remote GraphQL service is running on and a key-value object of any headers you want to attach to the client requests (Authorisation headers, etc.).

setOperation

This method is called when adding the proxy to your context object. When the context object is being configured, we have access to the raw GraphQL operation type and query; we use these to set the corresponding values on our proxy class. These values are then used when we execute the proxy from within a resolver.

  1. query: (string) The raw GraphQL query received by our proxy service from the client.
  2. variables: (FederationProxyVariables) A key-value store of any variables to be inserted into the above query, as received by our proxy service from the client.

Query

Executes the Apollo Client query method, sending the information captured by setOperation to the remote GraphQL server. This method is executed from within the resolver of the specific query we want to forward to our remote service.

Generics
  1. T: The shape of the data returned by the remote GraphQL query being executed.
Returns
  1. FetchResult<T>: The response from the remote GraphQL service after the request has been forwarded.

Mutate

Executes the Apollo Client mutate method, sending the information captured by setOperation to the remote GraphQL server. This method is executed from within the resolver of the specific mutation we want to forward to our remote service.

Generics
  1. T: The shape of the data returned by the remote GraphQL mutation being executed.
Returns
  1. FetchResult<T>: The response from the remote GraphQL service after the request has been forwarded.
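
As a rough sketch of how the generic is used (the ProductByHandleResult shape below is a hypothetical placeholder, not a type shipped by this package), the type argument narrows the FetchResult so that response.data is typed rather than any. mutate accepts the same generic:

// hypothetical shape of the data we expect back from the remote service
interface ProductByHandleResult {
	productByHandle: {
		id: string
		title: string
	}
}

// inside a resolver: the generic types the FetchResult returned by the proxy
const response = await context.federationProxy.query<ProductByHandleResult>()
// response.data is now ProductByHandleResult rather than any
const title = response.data?.productByHandle.title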
Example implementation

// before creating your server
const options = {
	uri: "https://ikhokha.myshopify.com/api/graphql",
	headers: {
		// placeholder token; attach your real auth headers here
		Authorisation: "asdasd"
	}
}
const proxy = new FederationProxy(options)

// create your server using the context callback to attach the proxy object to
// the context so it's accessible in resolvers
const server = new ApolloServer({
	schema,
	context: (req: { event: { body: string } }) => {
		// parse the raw request body and hand the operation on to the proxy
		const body = JSON.parse(req.event.body)
		proxy.setOperation(body.query, body.variables)

		return {
			federationProxy: proxy
		}
	}
})

This now allows us to implement remote-schema queries and mutations in our own resolvers, and gives us finer control over editing the request before forwarding it, or editing the response to match our local schema (for example, hiding certain object fields or modifying values on the return object):


const resolvers = {
	QueryRoot: {
		// each proxied operation simply forwards via the proxy on context
		productByHandle: async (root, args, context, info) => {
			const response = await context.federationProxy.query()
			return response.data.productByHandle
		},
		products: async (root, args, context, info) => {
			const response = await context.federationProxy.query()
			return response.data.products
		}
	},
	Mutation: {
		checkoutCreate: async (root, args, context) => {
			const response = await context.federationProxy.mutate()
			return response.data.checkoutCreate
		}
	}
}
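
Because the resolver sits between the remote response and the client, you can also reshape the payload before returning it. A minimal sketch, with hypothetical field names (internalSku, displayTitle) that are not part of any real Shopify schema:

const resolvers = {
	QueryRoot: {
		productByHandle: async (root, args, context, info) => {
			const response = await context.federationProxy.query()
			// hide an internal field and add a derived one before returning
			const { internalSku, ...publicFields } = response.data.productByHandle
			return {
				...publicFields,
				displayTitle: publicFields.title.toUpperCase()
			}
		}
	}
}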

save-remote-schema

Note: this is a command-line tool. Ideally, add it to the scripts section of your package.json, for example: "save-remote-schema": "npx save-remote-schema".
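
For example, in package.json:

{
	"scripts": {
		"save-remote-schema": "npx save-remote-schema"
	}
}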

The save-remote-schema tool should be used in conjunction with the FederationProxy class. It programmatically makes a copy of a remote schema, filters it by removing any unnecessary types that you may not want to expose to your clients, removes any unnecessary mutations and queries, and prefixes the types. Type prefixing / namespacing is relevant when using Apollo Federation if you want to have multiple copies of the same types across your microservices (for example, if you want to use both the Shopify Storefront and Shopify Admin APIs and federate those services together).

TODO: Add filtering for directives, as Apollo Federation currently doesn't support custom directives.
TODO: Review what else is needed to make the development experience seamless, such that you don't have to do any manual schema edits after using this tool. That's the main goal.

This command-line tool looks for a file in the root of your project called .filterschemarc.js, which contains all of the rules for transforming your schema as well as the connection details used at development time by the Apollo client to introspect the schema. Because this file should ALWAYS be committed to the repo, you can use dotenv to keep any sensitive authorisation headers, etc. out of it.

Below is an example of this file:

require('dotenv').config({ path: '.env.development' })

module.exports = {
	url: process.env.SHOPIFY_STOREFRONT_API_URL,
	operationsToKeep: {
		Query: [
			// queries to keep
			'products',
			'productByHandle'
			
		],
		Mutation: [
			// mutations to keep
			'checkoutAttributesUpdateV2',
			'checkoutCreate',
			'checkoutDiscountCodeApplyV2',
			'checkoutDiscountCodeRemove',
			'checkoutEmailUpdateV2',
			'checkoutGiftCardRemoveV2',
			'checkoutGiftCardsAppend',
			'checkoutShippingAddressUpdateV2',
			'checkoutShippingLineUpdate'
		]
	},
	typesToKeep: [
		// general types
		'QueryRoot',
		'Mutation',
		// productTypes query types
		'StringEdge',
	
		// products query types
		'ProductSortKeys',
		'ProductConnection',
		'PageInfo',
		'ProductEdge',
		'Product',
		'Node',
		'ID',
		'String',
		'Boolean',
		'URL',
		'Int',
		'Float',
		'HasMetafields',
		'CollectionConnection',
		'CollectionEdge',
		'Collection',
		'PageInfo',
		'DateTime',
		'HTML',
		'ProductImageSortKeys',
		'CropRegion',
		'ImageConnection',
		'ImageEdge',
		'Image',
		'ImageContentType',
		'Metafield',
		'MetafieldParentResource',
		'ProductVariant',
		'MetafieldValueType',
		'MetafieldConnection',
		'MetafieldEdge',
	
		'ProductOption',
		'ProductPriceRange',
		'MoneyV2',
		'SelectedOptionInput',
		'ProductVariant',
		// 'Money', @deprecated
		'CurrencyCode',
		'ProductVariantPricePairConnection',
		'ProductVariantPricePairEdge',
		'ProductVariantPricePair',
		'ProductVariantSortKeys',
		'ProductVariantConnection',
	
		// checkoutCompleteFree mutation types
		'Checkout',
		'AppliedGiftCard',
		'AvailableShippingRates',
		'ShippingRate',
		'Attribute',
		// 'Customer',
		
		'DiscountApplicationConnection',
		'DiscountApplicationEdge',
		'DiscountApplication',
		'DiscountApplicationAllocationMethod',
		'DiscountApplicationTargetSelection',
		'DiscountApplicationTargetType',
		'PricingValue',
		'PricingPercentageValue',
		'DiscountCodeApplication',
		'ManualDiscountApplication',
		'ScriptDiscountApplication',
		'AutomaticDiscountApplication',
		'CheckoutLineItemConnection',
		'CheckoutLineItemEdge',
		'CheckoutLineItem',
		'DiscountAllocation',
		'Order',
		'OrderLineItemConnection',
		'OrderLineItemEdge',
		'OrderLineItem',
		'MailingAddressConnection',
		'MailingAddressEdge',
		'MailingAddress',
		'CountryCode',
		'CheckoutErrorCode',
		'DisplayableError',
		// 'UserError', @deprecated
	
		// checkoutAttributesUpdateV2
		'CheckoutAttributesUpdateV2Input',
		'AttributeInput',
		'CheckoutAttributesUpdateV2Payload',
	
		// checkoutCreate
		'CheckoutCreateInput',
		'CheckoutLineItemInput',
		'CheckoutCreatePayload',
	
		// checkoutDiscountApplyV2
		'CheckoutUserError',
		'CheckoutErrorCode',
		'CheckoutDiscountCodeApplyV2Payload',
	
		//checkoutDiscountCodeRemove
		'CheckoutDiscountCodeRemovePayload',
	
		//checkoutGiftCardRemoveV2
		'CheckoutGiftCardRemoveV2Payload',
	
		//checkoutGiftCardsAppend
	'CheckoutGiftCardsAppendPayload',
	
		// checkoutShippingAddressUpdateV2
		'MailingAddressInput',
		'CheckoutShippingAddressUpdateV2Payload',
	
		//checkoutShippingLineUpdate
		'CheckoutShippingLineUpdatePayload',
	
		//CheckoutEmailUpdateV2
		'CheckoutEmailUpdateV2Payload',
	
	],
	typeNamePrefix: 'ShopifyStorefront',
	outputFile: './src/schema.graphql',
	headers: {
		'X-Shopify-Storefront-Access-Token': process.env.SHOPIFY_STOREFRONT_READ_SCHEMA_TOKEN
	}
}
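
The two environment variables referenced above would then live in .env.development, which, unlike .filterschemarc.js, should stay out of the repo. The values below are placeholders:

# .env.development
SHOPIFY_STOREFRONT_API_URL=https://your-store.myshopify.com/api/graphql
SHOPIFY_STOREFRONT_READ_SCHEMA_TOKEN=replace-with-your-token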

Testing

If you investigate the Jest configuration you will see 2 projects configured (a sketch follows this list):

  1. package: responsible for running all tests within the src folder. These should be all unit and integration tests.
  2. e2e: responsible for running all tests within the __e2e_tests__ directory. These tests never run when you execute npm run test or npm run test:watch on your host, for performance reasons. They run when you try to commit code to the repo or when the Bitbucket pipeline runs them, and they are executed as part of the Docker image's entrypoint.
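
A minimal sketch of what that two-project Jest configuration might look like (the testMatch globs are assumptions, not copied from this repo):

// jest.config.js
module.exports = {
	projects: [
		{
			displayName: 'package',
			testMatch: ['<rootDir>/src/**/*.test.ts']
		},
		{
			displayName: 'e2e',
			testMatch: ['<rootDir>/__e2e_tests__/**/*.test.ts']
		}
	]
}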

The Docker image is used mainly to closely emulate the production deployment of our package, with real-life tests in demo Node.js projects located in the __e2e_tests__ folder, so that we can truly test the production code without mocking 3rd-party dependencies, the file system, or publishing to npm. Docker executes tasks/e2e-test.sh when you run npm run test:e2e:docker:run, which does the following:

  1. Stops any lingering local registries
  2. Uses Verdaccio to create a local private npm registry
  3. "Fake" publishes the package to your registry using semantic-release, which is what we use to manage releases (see the comments in the bash file as to why this is "fake")
  4. Runs npm pack to create a tarball of your production package
  5. "Real" publishes the tarball to your private registry
  6. Loops through the subprojects in __e2e_tests__ and installs the local package
  7. Goes into the root directory of our project and runs npm run test:e2e:internal, making sure all of our tests (package and e2e) are executed
  8. Cleans up

Based on the above, you may get an error in the __e2e_tests__ subfolders saying Error: Cannot find module *** from ***. You can ignore this, as the module is installed by Docker when executing our end-to-end tests.

If you would like to add more end-to-end tests, feel free to run the commands in tasks/e2e-test.sh to install the tarballed package into each of the __e2e_tests__ subfolders, add test cases there, and run npm run test:e2e:internal on your host machine to test your tests (testception).

Committing work, Versioning, CI/CD

We use Commitizen to structure and format commit messages. We then use Commitlint, which checks each commit message before a commit is made to ensure it conforms to the standard we are implementing, namely the Angular convention. Read more about it here.

We use Semantic Release to manage versioning and deploying to npm. Semantic Release follows the semver versioning standard. To determine what version change should be made, it analyses the commit messages since the last version tag; if a release is warranted, it updates the version in package.json, commits the change to our git repo, and automatically adds a git tag with the new version. Read more about Semantic Release here.
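
With semantic-release's default rules, a fix commit triggers a patch release, a feat commit triggers a minor release, and a BREAKING CHANGE footer triggers a major release. A hypothetical commit in the Angular format:

feat(proxy): allow extra headers to be set per request

BREAKING CHANGE: setOperation now requires a variables argument

Because of the BREAKING CHANGE footer, this commit would trigger a major version bump, even though feat alone would only bump the minor version.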

This entire process is automated by our CI/CD. We use Bitbucket Pipelines, which essentially runs our docker-compose test scripts (see running Docker in Docker (dockerception) on Bitbucket Pipelines here). Once all of these tests pass, it publishes the test reports to Codecov and executes Semantic Release to deploy our package.

We don't like pushing to master, and therefore only PRs make it into master, to make sure we correctly manage our automagic versioning.

License

@ikhokha/proxy-tools is open-source software licensed under the MIT license.

Authored by Daniel Blignaut