gulp-awspublish
awspublish plugin for gulp
Usage
First, install gulp-awspublish as a development dependency:

```bash
npm install --save-dev gulp-awspublish
```
Then, add it to your gulpfile.js:
```js
var awspublish = require('gulp-awspublish');

gulp.task('publish', function () {
  // create a new publisher
  var publisher = awspublish.create({
    key: '...',
    secret: '...',
    bucket: '...',
    region: '...',
  });

  // define custom headers
  var headers = {
    'Cache-Control': 'max-age=315360000, no-transform, public',
    // ...
  };

  return (
    gulp
      .src('./public/*.js')
      // gzip, set Content-Encoding headers and add .gz extension
      .pipe(awspublish.gzip({ ext: '.gz' }))
      // publisher will add Content-Length, Content-Type and the headers specified above
      // if not specified it will set x-amz-acl to public-read by default
      .pipe(publisher.publish(headers))
      // create a cache file to speed up consecutive uploads
      .pipe(publisher.cache())
      // print upload updates to console
      .pipe(awspublish.reporter())
  );
});

// output
// [gulp] [create] file1.js.gz
// [gulp] [create] file2.js.gz
// [gulp] [update] file3.js.gz
// [gulp] [cache] file3.js.gz
// ...
```
- Note: If you follow the aws-sdk suggestions for providing your credentials you don't need to pass them in to create the publisher.
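As a rough sketch of that case, when credentials come from the aws-sdk credential chain (environment variables, shared credentials file, etc.), only the bucket option is needed; the bucket name below is a placeholder:

```js
// credentials resolved by the aws-sdk credential chain;
// "my-bucket" is an illustrative placeholder
var publisher = awspublish.create({
  bucket: 'my-bucket',
});
```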
Testing
Add an aws-credentials.json file to the project directory with your bucket credentials, then run mocha.

```json
{
  "key": "...",
  "secret": "...",
  "bucket": "..."
}
```
API
awspublish.gzip(options)
Create a through stream that gzips files and adds a Content-Encoding header (see the sketch after the options list).
Available options:
- ext: file extension to add to gzipped files (e.g. { ext: '.gz' })
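For instance, a short pipeline that gzips files before publishing, mirroring the usage example above:

```js
// gzip files, add a .gz extension, then upload
gulp
  .src('./public/*.js')
  .pipe(awspublish.gzip({ ext: '.gz' }))
  .pipe(publisher.publish())
  .pipe(awspublish.reporter());
```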
awspublish.create(options)
Create a Publisher. Options are used to create an aws-sdk S3 client. At a minimum you must pass a bucket option to define the destination bucket. If you are using the aws-sdk suggestions for credentials you do not need to provide anything else.

Also supports credentials specified in the old knox format, a profile property for choosing a specific set of shared AWS credentials, or an accessKeyId and secretAccessKey provided explicitly.
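For example, a minimal sketch using a named profile from the shared AWS credentials file; the profile and bucket names are placeholders:

```js
// pick a named profile from the shared credentials file;
// "my-profile" and "my-bucket" are illustrative placeholders
var publisher = awspublish.create({
  profile: 'my-profile',
  bucket: 'my-bucket',
});
```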
Publisher.publish([headers], [options])
Create a through stream that pushes files to S3 (a usage sketch follows the list below).

- headers: hash of headers to add to or override existing S3 headers.
- options: optional additional publishing options
- force: bypass cache / skip
- simulate: debugging option to simulate an S3 upload
- createOnly: skip file updates
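As a sketch, forcing a re-upload that bypasses the cache could look like this; headers is assumed to be a headers hash as in the usage example above:

```js
// force: ignore the cache and re-upload every file
gulp
  .src('./public/*.js')
  .pipe(publisher.publish(headers, { force: true }))
  .pipe(awspublish.reporter());
```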
Files that go through the stream receive extra properties:
- s3.path: s3 path
- s3.etag: file etag
- s3.date: file last modified date
- s3.state: publication state (create, update, delete, cache or skip)
- s3.headers: s3 headers for this file. Default headers are:
- x-amz-acl: public-read
- Content-Type
- Content-Length
publisher.cache()

Create a through stream that creates or updates a cache file using the file's S3 path and etag. Consecutive runs of publish will use this file to avoid re-uploading identical files.

The cache file is saved in the current working directory and is named .awspublish-<bucket>. The cache file is flushed to disk every 10 files just to be safe.
Publisher.sync([prefix])
Create a transform stream that deletes old files from the bucket. You can specify a prefix to sync a specific directory.
Warning: sync will delete files in your bucket that are not in your local folder.
```js
// this will publish and sync bucket files with the ones in your public directory
gulp
  .src('./public/*')
  .pipe(publisher.publish())
  .pipe(publisher.sync())
  .pipe(awspublish.reporter());

// output
// [gulp] [create] file1.js
// [gulp] [update] file2.js
// [gulp] [delete] file3.js
// ...
```
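To sync only part of the bucket, a hedged sketch passing a prefix; the "images" prefix is illustrative:

```js
// keep keys under the "images/" prefix by fixing the base directory,
// so sync('images') only considers files under that prefix for deletion
gulp
  .src('./public/images/**/*', { base: './public' })
  .pipe(publisher.publish())
  .pipe(publisher.sync('images'))
  .pipe(awspublish.reporter());
```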
Publisher.client

The aws-sdk S3 client is exposed to let you do other S3 operations.
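As an illustration, a hedged sketch using the exposed client to list bucket contents, assuming the aws-sdk v2 callback style; the bucket name is a placeholder:

```js
// list object keys via the exposed aws-sdk S3 client
publisher.client.listObjects({ Bucket: 'my-bucket' }, function (err, data) {
  if (err) return console.error(err);
  data.Contents.forEach(function (obj) {
    console.log(obj.Key);
  });
});
```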
awspublish.reporter([options])
Create a reporter that logs s3.path and s3.state (delete, create, update, cache, skip).
Available options:
- states: list of states to log (defaults to all)
```js
// this will publish and sync bucket files and print created, updated and deleted files
gulp
  .src('./public/*')
  .pipe(publisher.publish())
  .pipe(publisher.sync())
  .pipe(
    awspublish.reporter({
      states: ['create', 'update', 'delete'],
    })
  );
```
Examples
rename file & directory
You can use gulp-rename to rename your files on S3:
```js
// see examples/rename.js
gulp
  .src('examples/fixtures/*.js')
  .pipe(
    rename(function (path) {
      path.dirname += '/s3-examples';
      path.basename += '-s3';
    })
  )
  .pipe(publisher.publish())
  .pipe(awspublish.reporter());

// output
// [gulp] [create] s3-examples/bar-s3.js
// [gulp] [create] s3-examples/foo-s3.js
```
upload files in parallel
You can use concurrent-transform to upload files in parallel to your Amazon bucket:
```js
var parallelize = require('concurrent-transform');

gulp
  .src('examples/fixtures/*.js')
  .pipe(parallelize(publisher.publish(), 10))
  .pipe(awspublish.reporter());
```
Plugins
gulp-awspublish-router
A router for defining file-specific rules: https://www.npmjs.org/package/gulp-awspublish-router
License
MIT License