crawfishcloud
A Streaming S3 Bucket Glob Crawler
crawfishcloud makes it VERY easy to get started pulling globbed files from S3.
Install
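Install from npm:

npm i crawfishcloud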
Setup
// supports BOTH import and require
import {crawler, asVfile} from 'crawfishcloud'
// Setup AWS-S3 with your credentials
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials})
// crawfish uses your configured S3 client to get data from S3
const crawfish = crawler({s3c})
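Since both module styles are supported, the equivalent CommonJS setup uses require:

const {crawler, asVfile} = require('crawfishcloud')
const {S3, SharedIniFileCredentials} = require('aws-sdk')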
Usage
Async Generator
for await (const vf of crawler({s3c}).vfileIter('s3://Bucket/path/*.jpg')){
console.log({vf})
}
Promise<Array<Vfile | Vinyl>>
const allJpgs = await crawler({s3c}).vinylArray('s3://Bucket/path/*.jpg')
Stream<Vfile | Vinyl>
crawler({s3c}).vfileStream('/prefix/**/*.jpg').pipe(destination())
Why use crawfishcloud?
Ever had a set of files in S3 and thought, "Why can't I use a glob pattern like I would in a unix command, or in gulp, and pull all of those files out together?"
Now you can.
Features
crawfishcloud supports 3 different processing patterns to handle data from your buckets.
- Promised Arrays
  - While this structure is admittedly the most straightforward, it can also blow through your RAM, because collapsing an S3 stream into one array can take more space than is commercially available as RAM. Sure, maybe you are thinking, "I know my data, and I just need the 5 files loaded together from this S3 prefix, and I know it will fit" - then the array pattern is just the ticket.
- Node Streams
  - Node Streams are incredible if you are familiar with them. The .stream() pattern allows you to stream a set of objects out to your downstream processing.
- AsyncGenerators
  - Although Async Generators are a newer addition to the language, for many people they strike a sweet spot of ease of use while still being able to process terribly large amounts of data, since it is pulled from the network on demand.
- Uses Modern Syntax
  - async/await
  - All in about 230 lines of js code spread over about 3 files.
Contributions
Pull requests welcome. Drop me a line in the Discussion area - or submit an issue.
Inspired By
I was about to use one of those - except, "before marrying an old lady," I thought I would see if I really understood these before taking on caring for an old lady, and by the time I understood, I had opinions which are now roughly manifest throughout the package.
License
MIT © Eric D Moore
API Reference
- crawler (s3c, body, maxkeys, ...filters) : crawfishcloud
- Base Functions
- Readable Node Streams
- AsyncGenerators
- Promised Arrays
- Exporting Functions
crawler()
the default export function (aka crawler)
params
- s3c : S3
- body : boolean
- maxkeys : number
- ...filters : string[]
returns
crawfishcloud
import {crawler, asVfile} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c}, 's3://ericdmoore.com-images/*.jpg')
const arr = await crab.all({body: true, using: asVfile})
> Base Functions
iter()
get an AsyncGenerator<T> ready to use in a for await (){} loop, where each element's type is based on the using function; a custom using sketch follows the example below
params
- body : boolean
- using : UsingFunc: (i:S3Item)=><T>
- NextContinuationToken? : string | undefined
- ...filters : string[] - will override any filters already configured on the crawfish (last one in wins)
returns
- AsyncGenerator with elements of type T, where T is the return type of the using function
import {crawler, asVfile} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c}, 's3://ericdmoore.com-images/*.jpg')
for await (const vf of crab.iter({body: true, using: asVfile})){
  console.log(vf)
}
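The bundled using functions (asVfile, asVinyl, asS3) are documented under Exporting Functions below. As a minimal sketch of a custom one - assuming an S3Item exposes the Key from the list call, and with asKey as a hypothetical name not shipped by the package - you can shape each element however you like:

import {crawler} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c}, 's3://ericdmoore.com-images/*.jpg')
// hypothetical custom using function: reduce each S3Item to just its Key
const asKey = (i) => i.Key
for await (const key of crab.iter({body: false, using: asKey})){
  console.log(key)
}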
stream()
get a Readable Node Stream ready to pipe to a transform or writable stream
params
- body : boolean
- using : UsingFunc: (i:S3Item)=><T>
- ...filters : string[] - will override any filters already configured on the crawfish (last one in wins)
returns
- Readable
import {crawler, asVfile} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c}, 's3://ericdmoore.com-images/*.jpg')
crab.stream({body: true, using: asVfile})
  .pipe(rehypePipe())
  .pipe(destinationFolder())
all()
load everything matching the s3 url into an array, where the returned promise resolves once all of the elements are populated in the array.
params
- body : boolean
- using : UsingFunc: (i:S3Item)=><T>
- ...filters : string[] - will override any filters already configured on the crawfish (last one in wins)
returns
Promise<T[]>
import {crawler, asVfile} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c}, 's3://ericdmoore.com-images/*.jpg')
const arr = await crab.all({body: true, using: asVfile})
reduce()
Reduce the files represented by the glob into a new type. The process batches sets of 1000 elements into memory and reduces them.
params
- init : <OutputType> - starting value for the reducer
- using : UsingFunc: (i:S3Item)=><ElementType>
- reducer : (prior:OutputType, current:ElementType, i:number)=>OutputType
- ...filters : string[] - will override any filters already configured on the crawfish (last one in wins)
returns
Promise<OutputType>
import {crawler, asS3} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c}, 's3://ericdmoore.com-images/*.jpg')
// count the matched objects
const count = await crab.reduce(0, {using: asS3, reducer: (p) => p + 1})
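As a slightly larger sketch - assuming the same {using, reducer} options shape as above, and that each S3Item carries its listed Key - a reducer can tally objects by file extension:

import {crawler, asS3} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c}, 's3://ericdmoore.com-images/**/*')
// tally matched objects by file extension, e.g. {jpg: 12, png: 3}
const byExt = await crab.reduce({}, {
  using: asS3,
  reducer: (acc, item) => {
    const ext = (item.Key || '').split('.').pop() || '(none)'
    return {...acc, [ext]: (acc[ext] || 0) + 1}
  }
})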
> Streams
vfileStream()
a stream of vfiles
params
- ...filters : string[] - will override any filters already configured on the crawfish (last one in wins)
returns
Readable
import {crawler} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c})
crab.vfileStream('s3://ericdmoore.com-images/*.jpg')
  .pipe(jpgOptim())
  .pipe(destinationFolder())
vinylStream()
a stream of vinyls
params
- ...filters : string[] - will override any filters already configured on the crawfish (last one in wins)
returns
Readable
import {crawler} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c})
crab.vinylStream('s3://ericdmoore.com-images/*.jpg')
  .pipe(jpgOptim())
  .pipe(destinationFolder())
s3Stream()
a stream of S3 items, where the S3 listObjects keys are mixed in with the getObject keys - together called an S3Item (a rough sketch of this merged shape follows the example below)
params
- ...filters : string[] - will override any filters already configured on the crawfish (last one in wins)
returns
Readable
import {crawler} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c})
crab.s3Stream('s3://ericdmoore.com-images/*.jpg')
  .pipe(S3ImageOptim())
  .pipe(destinationFolder())
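Based on the description above, a rough TypeScript sketch of the merged S3Item shape might look like the following; the exact field list is an assumption here, not the package's published type:

import type {S3} from 'aws-sdk'
// hypothetical sketch: listObjects entry fields (Key, Size, ...) merged with
// getObject output fields (Body, ContentType, ...) when bodies are loaded
type S3ItemSketch = S3.Object & Partial<S3.GetObjectOutput>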
> AsyncGenerators
vfileIter()
get an AsyncGenerator that is ready to run through a set of VFiles
params
- ...filters : string[] - will override any filters already configured on the crawfish (last one in wins)
returns
AsyncGenerator<VFile, void, undefined>
import {crawler} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c})
for await (const vf of crab.vfileIter('s3://ericdmoore.com-images/*.jpg')){
  console.log(vf)
}
vinylIter()
get an AsyncGenerator that is ready to run through a set of Vinyls
params
- ...filters : string[] - will override any filters already configured on the crawfish (last one in wins)
returns
AsyncGenerator<Vinyl, void, undefined>
import {crawler} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c})
for await (const v of crab.vinylIter('s3://ericdmoore.com-images/*.jpg')){
  console.log(v)
}
s3Iter()
get an AsyncGenerator that is ready to run through a set of S3Items
params
- ...filters : string[] - will override any filters already configured on the crawfish (last one in wins)
returns
AsyncGenerator<S3Item, void, undefined>
import {crawler} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c})
for await (const s3i of crab.s3Iter('s3://ericdmoore.com-images/*.jpg')){
  console.log(s3i)
}
> Promised Arrays
vfileArray()
get an array of vfiles, all loaded into a variable
params
- ...filters : string[] - will override any filters already configured on the crawfish (last one in wins)
returns
- Promise<Vfile[]>
import {crawler} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c}, 's3://ericdmoore.com-images/*.jpg')
const vfArr = await crab.vfileArray()
vinylArray()
get an array of vinyls, all loaded into a variable
params
- ...filters : string[] - will override any filters already configured on the crawfish (last one in wins)
returns
- Promise<Vinyl[]>
import {crawler} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c}, 's3://ericdmoore.com-images/*.jpg')
const vArr = await crab.vinylArray()
s3Array()
get an array of S3Items, all loaded into a variable
params
- ...filters : string[] - will override any filters already configured on the crawfish (last one in wins)
returns
- Promise<S3Item[]>
import {crawler} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c}, 's3://ericdmoore.com-images/*.jpg')
const arr = await crab.s3Array()
> Exporting Functions
asVfile()
turn an S3 object into a vfile
params
- s3i : S3Item
- i : number
returns
Vfile
import {crawler, asVfile} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c}, 's3://ericdmoore.com-images/*.jpg')
for await (const vf of crab.iter({body: true, using: asVfile})){
  console.log(vf)
}
asVinyl()
turn an S3 object into a vinyl
params
- s3i : S3Item
- i : number
returns
Vinyl
import {crawler, asVinyl} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c}, 's3://ericdmoore.com-images/*.jpg')
for await (const v of crab.iter({body: true, using: asVinyl})){
  console.log(v)
}
asS3()
Just pass the S3 object structure along
params
- s3i : S3Item
- i : number
returns
S3Item
import {crawler, asS3} from 'crawfishcloud'
import {S3, SharedIniFileCredentials} from 'aws-sdk'
const credentials = new SharedIniFileCredentials({profile: 'default'})
const s3c = new S3({credentials, region: 'us-west-2'})
const crab = crawler({s3c}, 's3://ericdmoore.com-images/*.jpg')
for await (const s3i of crab.iter({body: true, using: asS3})){
  console.log(s3i)
}
namesake
crawfishcloud: because why not, and because regular crawfish are delightful, and they crawl around in a bucket for a time. So clearly crawfishcloud is a crawler of cloud buckets.
Logo credit: deepart.io