
@keystonejs/file-adapters

v7.1.2

Adapters for handling storage of the File type

File adapters

This is the last active development release of this package as Keystone 5 is now in a 6 to 12 month active maintenance phase. For more information please read our Keystone 5 and beyond post.

The File field type can support files hosted in a range of different contexts, e.g. in the local filesystem, or on a cloud based file server.

Different contexts are supported by different file adapters. This package contains the built-in file adapters supported by KeystoneJS.

LocalFileAdapter

Usage

const { LocalFileAdapter } = require('@keystonejs/file-adapters');

const fileAdapter = new LocalFileAdapter({...});

Config

| Option        | Type       | Default        | Description                                                                                                                  |
| ------------- | ---------- | -------------- | ---------------------------------------------------------------------------------------------------------------------------- |
| src           | String     | Required       | The path where uploaded files will be stored on the server.                                                                   |
| path          | String     | Value of src   | The path from which requests for files will be served from the server.                                                        |
| getFilename   | Function   | null           | Function taking a { id, originalFilename } parameter. Should return a string with the name for the uploaded file on disk.     |
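Filling in the placeholder from the usage snippet above, a minimal configuration sketch could look like the following (the paths and naming scheme are illustrative choices, not defaults):

const { LocalFileAdapter } = require('@keystonejs/file-adapters');

const fileAdapter = new LocalFileAdapter({
  // Where uploaded files are written on the server
  src: './files',
  // The path requests for the files are served from
  path: '/files',
  // Optional: receives { id, originalFilename } and returns the on-disk file name
  getFilename: ({ id, originalFilename }) => `${id}-${originalFilename}`,
});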

Note: src and path may be the same. Note 2: You may also need to use a static file server to host the uploaded files.
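For example, if you are mounting Keystone on an Express server, a minimal sketch of such a static file server could look like this (Express is an assumption here; any static file server works):

const express = require('express');

const app = express();

// Serve the directory LocalFileAdapter writes to (src: './files')
// at the public path the adapter was configured with (path: '/files').
app.use('/files', express.static('./files'));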

Methods

delete

Takes a file object (such as the one returned in file field hooks) and deletes that file from disk.

const { File } = require('@keystonejs/fields');

const fileAdapter = new LocalFileAdapter({
  src: './files',
  path: '/files',
});

keystone.createList('UploadTest', {
  fields: {
    file: {
      type: File,
      adapter: fileAdapter,
      hooks: {
        beforeChange: async ({ existingItem }) => {
          if (existingItem && existingItem.file) {
            await fileAdapter.delete(existingItem.file);
          }
        },
      },
    },
  },
  hooks: {
    afterDelete: async ({ existingItem }) => {
      if (existingItem.file) {
        await fileAdapter.delete(existingItem.file);
      }
    },
  },
});

GraphQL Usage

You can upload files directly through the GraphQL API. For example, with the above list you can do the following:

// Query
mutation uploadImageQuery ($file: Upload){
  createUploadTest(data: {
    file: $file,
  }) {
    id
    file {
      publicUrl
    }
  }
}

// Variables
variables: {
  file: // File path
},

Note that you'll need support in your GraphQL client to make this work. Two popular options are apollo-upload-client and urql.

If you're not familiar with file uploads in GraphQL, check out Altair Playground and follow the file upload docs to try it out.
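As a rough illustration of what those clients do under the hood, here is a sketch of the same mutation sent with Node 18's built-in fetch and FormData, following the GraphQL multipart request spec (the endpoint URL and file name are placeholders):

const { readFile } = require('fs/promises');

async function uploadFile() {
  const query = `
    mutation uploadImageQuery($file: Upload) {
      createUploadTest(data: { file: $file }) {
        id
        file { publicUrl }
      }
    }
  `;

  const form = new FormData();
  // operations: the query and variables, with the file variable set to null for now
  form.append('operations', JSON.stringify({ query, variables: { file: null } }));
  // map: which multipart part fills which variable
  form.append('map', JSON.stringify({ 0: ['variables.file'] }));
  // the file contents themselves
  form.append('0', new Blob([await readFile('./example.txt')]), 'example.txt');

  // Placeholder endpoint; Keystone 5 serves its GraphQL API at /admin/api by default
  const res = await fetch('http://localhost:3000/admin/api', { method: 'POST', body: form });
  console.log(await res.json());
}

uploadFile();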

CloudinaryFileAdapter

Usage

const { CloudinaryAdapter } = require('@keystonejs/file-adapters');

const fileAdapter = new CloudinaryAdapter({...});

Config

| Option    | Type   | Default   | Description |
| --------- | ------ | --------- | ----------- |
| cloudName | String | Required  |             |
| apiKey    | String | Required  |             |
| apiSecret | String | Required  |             |
| folder    | String | undefined |             |
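Filling in the placeholder from the usage snippet above, a configuration sketch built from the options in this table could look like this (the credential values are placeholders):

const { CloudinaryAdapter } = require('@keystonejs/file-adapters');

const fileAdapter = new CloudinaryAdapter({
  cloudName: 'my-cloud-name',       // from your Cloudinary dashboard
  apiKey: 'CLOUDINARY_API_KEY',
  apiSecret: 'CLOUDINARY_API_SECRET',
  folder: 'uploads',                // optional
});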

Methods

delete

Deletes the provided file from Cloudinary. Takes a file object (such as the one returned in file field hooks) and an optional options argument passed to the Cloudinary destroy method. For available options refer to the Cloudinary destroy API.
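For example, a hook that removes a previous upload might pass an option through to destroy like this (a sketch; invalidate asks Cloudinary to also invalidate cached CDN copies):

// Delete the old file and invalidate any cached CDN copies of it
if (existingItem && existingItem.file) {
  await fileAdapter.delete(existingItem.file, { invalidate: true });
}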

S3FileAdapter

Usage

const { S3Adapter } = require('@keystonejs/file-adapters');

const CF_DISTRIBUTION_ID = 'cloudfront-distribution-id';
const S3_PATH = 'uploads';

const fileAdapter = new S3Adapter({
  bucket: 'bucket-name',
  folder: S3_PATH,
  publicUrl: ({ id, filename, _meta }) =>
    `https://${CF_DISTRIBUTION_ID}.cloudfront.net/${S3_PATH}/${filename}`,
  s3Options: {
    // Optional parameters to be supplied directly to the AWS.S3 constructor
    apiVersion: '2006-03-01',
    accessKeyId: 'ACCESS_KEY_ID',
    secretAccessKey: 'SECRET_ACCESS_KEY',
    region: 'us-west-2',
  },
  uploadParams: ({ filename, id, mimetype, encoding }) => ({
    Metadata: {
      keystone_id: `${id}`,
    },
  }),
});

Config

| Option       | Type               | Default   | Description                                                                                                                                                                                       |
| ------------ | ------------------ | --------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| bucket       | String             | Required  | S3 bucket name                                                                                                                                                                                    |
| folder       | String             | ''        | Upload folder from root of bucket. By default uploads will be sent to the bucket's root folder.                                                                                                   |
| getFilename  | Function           | null      | Function taking a { id, originalFilename } parameter. Should return a string with the name for the uploaded file on disk.                                                                          |
| publicUrl    | Function           |           | By default the publicUrl returns a url for the S3 bucket in the form https://{bucket}.s3.amazonaws.com/{key}/{filename}. This will only work if the bucket is configured to allow public access.  |
| s3Options    | Object             | undefined | For available options refer to the AWS S3 API                                                                                                                                                     |
| uploadParams | Object / Function  | {}        | A config object or function returning a config object to be passed with each call to S3.upload. For available options refer to the AWS S3 upload API.                                            |

Note: Authentication can be done in many different ways. One option is to include valid accessKeyId and secretAccessKey properties in the s3Options parameter. Other methods include setting environment variables. See Setting Credentials in Node.js for a complete set of options.
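For example, with AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY set in the environment, the adapter can be configured without inline credentials (a sketch; the bucket, folder and region values are placeholders):

// Credentials are picked up from the environment by the AWS SDK,
// so s3Options only needs the non-secret settings.
const fileAdapter = new S3Adapter({
  bucket: 'bucket-name',
  folder: 'uploads',
  s3Options: {
    region: 'us-west-2',
  },
});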

S3-compatible object storage providers

You can also use any S3-compatible object storage provider with this adapter. You must set the s3Options.endpoint config to point to the provider's server. Other options may be required depending on which provider you choose.

// DigitalOcean Spaces
const s3Options = {
  accessKeyId: 'YOUR-ACCESSKEYID',
  secretAccessKey: 'YOUR-SECRETACCESSKEY',
  endpoint: 'https://${REGION}.digitaloceanspaces.com', // REGION is the datacenter region, e.g. nyc3, sgp1
};

// Minio (self-hosted locally or a cloud URL)
const s3Options = {
  accessKeyId: 'YOUR-ACCESSKEYID',
  secretAccessKey: 'YOUR-SECRETACCESSKEY',
  endpoint: 'http://127.0.0.1:9000',
  s3ForcePathStyle: true, // typically required for Minio
  signatureVersion: 'v4', // typically required for Minio
};

For Minio, the ACL: 'public-read' config on the uploadParams.Metadata option does not work as it does with AWS or DigitalOcean. To allow anonymous access you must set a public policy on the bucket; if you want to provide authenticated access instead, use an afterChange hook to populate publicUrl. See the policy command in the Minio client.

Methods

delete

Deletes the provided file in the S3 bucket. Takes a file object (such as the one returned in file field hooks) and an optional options argument for overriding S3.deleteObject options. Options Bucket and Key are set by default. For available options refer to the AWS S3 deleteObject API.

// Optional params
const deleteParams = {
  BypassGovernanceRetention: true,
};

keystone.createList('Document', {
  fields: {
    file: {
      type: File,
      adapter: fileAdapter,
      hooks: {
        beforeChange: async ({ existingItem }) => {
          if (existingItem && existingItem.file) {
            await fileAdapter.delete(existingItem.file, deleteParams);
          }
        },
      },
    },
  },
  hooks: {
    afterDelete: async ({ existingItem }) => {
      if (existingItem.file) {
        await fileAdapter.delete(existingItem.file, deleteParams);
      }
    },
  },
});