@adobe/aem-upload
v2.0.3
AEM Assets direct binary access uploading tool
Background
In AEM Assets 6.5 and prior, a single POST request to the servlet that manages asset binaries is enough to upload files. Newer versions of AEM can be configured to use direct binary upload, which means that asset binaries are no longer uploaded straight to AEM; because of this, there is a more complex algorithm to follow when uploading asset binaries. This library will check the configuration of the target AEM instance and will use either the direct binary upload algorithm or the create asset servlet, depending on that configuration.
This tool is provided for making uploading easier, and can be used as a command line executable or required as a Node.js module.
Command Line
A command line tool for uploading assets to an AEM instance is available as a plugin for the Adobe I/O CLI. Please see the plugin repository for more information.
Usage
This library supports uploading files to a target instance, while providing support for monitoring transfer progress, cancelling transfers, and other features.
Install
This project uses node and npm. Go check them out if you don't have them locally installed.
It can be installed like any other Node.js module.
```
$ npm install @adobe/aem-upload
```
Requiring the Module
To add the module to your Node.js project:
- Install the module in your project.
- Require the module in the JavaScript file where it will be consumed:

```javascript
const DirectBinary = require('@adobe/aem-upload');
```
Uploading Files
Following is the minimum amount of code required to upload files to a target AEM instance.
```javascript
const DirectBinary = require('@adobe/aem-upload');

// URL of the folder in AEM where assets will be uploaded. The folder
// must already exist.
const targetUrl = 'http://localhost:4502/content/dam/target';

// list of all local files that will be uploaded.
const uploadFiles = [
  {
    fileName: 'file1.jpg', // name of the file as it will appear in AEM
    fileSize: 1024, // total size, in bytes, of the file
    filePath: '/Users/me/Documents/my_file.jpg' // full path to the local file
  },
  {
    fileName: 'file2.jpg',
    fileSize: 512,
    filePath: '/Users/me/Documents/file2.jpg'
  }
];

const upload = new DirectBinary.DirectBinaryUpload();
const options = new DirectBinary.DirectBinaryUploadOptions()
  .withUrl(targetUrl)
  .withUploadFiles(uploadFiles);

// this call will upload the files. The method returns a Promise, which will
// be resolved when all files have uploaded.
upload.uploadFiles(options)
  .then(result => {
    // "result" contains various information about the upload process, including
    // performance metrics and errors that may have occurred for individual files.
    // At this point, assuming no errors, there will be two new assets in AEM:
    //  http://localhost:4502/content/dam/target/file1.jpg
    //  http://localhost:4502/content/dam/target/file2.jpg
  })
  .catch(err => {
    // the Promise will reject if something causes the upload process to fail at
    // a high level. Note that individual file failures will NOT trigger this.
    // "err" will be an instance of UploadError. See "Error Handling"
    // for more information.
  });
```
Supported Options
The `DirectBinaryUploadOptions` class supports the following options. Items marked with * are required.
Error Handling
If a file fails to upload, the process will move to the next file in the list. The overall process itself will only fail if something catastrophic happens that prevents it from continuing to upload files. It's left up to the consumer to determine if there were individual file upload failures and react to them accordingly.
All errors reported by the upload process will be instances of `UploadError`, which are standard JavaScript `Error` instances with an additional `code` value that indicates the type of error. Specific codes are available in `DirectBinary.DirectBinaryUploadErrorCodes`.
The following is an example of handling errors, at either the process or file level.
```javascript
const codes = DirectBinary.DirectBinaryUploadErrorCodes;
const upload = new DirectBinary.DirectBinaryUpload();
upload.uploadFiles(options) // assume that options is defined previously
  .then(result => {
    // use this approach to retrieve ALL errors encountered during the process
    const { errors = [] } = result;
    errors.forEach(error => {
      if (error.code === codes.ALREADY_EXISTS) {
        // handle case where a file already exists
      }
    });

    // or retrieve individual file errors
    const { detailedResult = [] } = result;
    detailedResult.forEach((fileResult) => {
      const { result = {} } = fileResult;
      const { errors = [] } = result;
      errors.forEach((fileErr) => {
        // content of fileErr may vary
      });
    });
  })
  .catch(err => {
    if (err.code === codes.NOT_SUPPORTED) {
      // handle case where direct binary access is not enabled
      // on the target instance
    }
  });
```
Another way of handling individual file errors is to listen for the upload process's Events.
The process implements automatic HTTP retry handling, meaning that if an HTTP request fails then the process will wait for a specified interval and retry the same HTTP request a given number of times. If the request still fails after the given number of retries, it will report the error as normal using the last error. Any errors that caused a retry, in either a success scenario or failure scenario, will be reported in the result in a dedicated structure.
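The retry behavior described above can be pictured with a small standalone sketch. This is illustrative only, not the library's internal implementation, and the `retryCount` and `retryDelay` parameter names are assumptions:

```javascript
// Illustrative sketch of fixed-interval retry handling: run an async
// operation up to retryCount times, waiting retryDelay milliseconds between
// attempts, and rethrow the last error if every attempt fails.
async function withRetry(operation, { retryCount = 3, retryDelay = 100 } = {}) {
  let lastError;
  for (let attempt = 1; attempt <= retryCount; attempt += 1) {
    try {
      return await operation(attempt);
    } catch (err) {
      lastError = err;
      if (attempt < retryCount) {
        await new Promise((resolve) => setTimeout(resolve, retryDelay));
      }
    }
  }
  throw lastError;
}
```

For example, `withRetry(() => sendRequest(), { retryCount: 3, retryDelay: 5000 })` would attempt the (hypothetical) request function up to three times before surfacing the final error.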
Upload Events
As the upload process moves through individual files, it will send events as it goes through the stages of uploading a file. These events are listed below.
The following is an example of how to handle various events.
```javascript
const upload = new DirectBinary.DirectBinaryUpload();
upload.on('filestart', data => {
  const { fileName } = data;
  // specific handling that should occur when a file begins uploading
});
upload.on('fileprogress', data => {
  const { fileName, transferred } = data;
  // specific handling that should occur as a file uploads
});
upload.on('fileend', data => {
  const { fileName } = data;
  // specific handling that should occur when a file finishes uploading successfully
});
upload.on('fileerror', data => {
  const { fileName, errors } = data;
  // specific handling that should occur when a file fails to upload
});

// assume options has been declared previously
upload.uploadFiles(options);
```
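One common use of the `fileprogress` event is computing a percent-complete value. A minimal sketch, assuming the payload also carries the file's total size as `fileSize` (that field name is an assumption):

```javascript
// Compute percent complete from a fileprogress payload. "transferred" is the
// number of bytes sent so far; "fileSize" (an assumed field) is the total
// size of the file in bytes.
function percentComplete({ transferred, fileSize }) {
  if (!fileSize) {
    return 0;
  }
  return Math.min(100, Math.round((transferred / fileSize) * 100));
}

console.log(percentComplete({ transferred: 512, fileSize: 1024 })); // → 50
```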
Uploading Local Files
The library supports uploading local files and folders. For folders, the tool will include all immediate children files in the folder. It will not process sub-folders unless the "deep upload" option is specified.
When deep uploading, the library will create a folder structure in the target that mirrors the folder being uploaded. The title of the newly created folders will match the name of the folder as it exists on the local filesystem. The path of the target may be modified depending on path character restrictions in AEM, and depending on the options provided in the upload (see "Function for processing folder node names" for more information).
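To illustrate the kind of adjustment involved, here is a sketch of a function that replaces characters that are problematic in AEM/JCR node names. This is illustrative only; the library's actual rules may differ:

```javascript
// Illustrative only: build a node-name-safe version of a folder name by
// replacing runs of characters that are problematic in AEM/JCR node names
// (whitespace, slashes, colons, brackets, wildcards, etc.) with a dash.
function toNodeName(folderName) {
  return folderName
    .toLowerCase()
    .replace(/[\s/\\:*?"<>|%[\]]+/g, '-');
}

console.log(toNodeName('My Folder')); // → my-folder
```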
Whenever the library creates a new folder, it will emit the `foldercreated` event. See the event documentation for details.
The following example illustrates how to upload local files.
```javascript
const {
  FileSystemUploadOptions,
  FileSystemUpload
} = require('@adobe/aem-upload');

// configure options to use basic authentication. Note the "Basic " scheme
// prefix on the header value.
const options = new FileSystemUploadOptions()
  .withUrl('http://localhost:4502/content/dam/target-folder')
  .withHttpOptions({
    headers: {
      Authorization: `Basic ${Buffer.from('admin:admin').toString('base64')}`
    }
  });

// upload a single asset and all assets in a given folder. This must run
// inside an async function, since upload() returns a Promise.
const fileUpload = new FileSystemUpload();
await fileUpload.upload(options, [
  '/Users/me/myasset.jpg',
  '/Users/me/myfolder'
]);
```
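Basic authentication header values must carry the `Basic ` scheme prefix ahead of the base64-encoded `user:password` pair. A small standalone helper makes the construction explicit (illustrative; not part of the library):

```javascript
// Build an HTTP basic authentication header value. The "Basic " scheme
// prefix is required ahead of the base64-encoded "user:password" pair.
function basicAuthHeader(user, password) {
  const encoded = Buffer.from(`${user}:${password}`).toString('base64');
  return `Basic ${encoded}`;
}

console.log(basicAuthHeader('admin', 'admin')); // → Basic YWRtaW46YWRtaW4=
```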
Supported File Options
There is a set of options, `FileSystemUploadOptions`, that is specific to uploading local files. In addition to the default options, the following options are available.
Logging
The library will log various messages as it goes through the process of uploading items. It will use whichever logger it's given, as long as the object supports the methods `debug()`, `info()`, `warn()`, and `error()`. For maximum detail, the library also assumes that each of these methods can accept formatted messages, e.g. `log.info('message with %s', 'formatting')`. Logging will work regardless of formatting support, but the messages will contain more information when formatting works correctly.
To provide a logger to the library, pass a `log` element in the options sent into the `DirectBinaryUpload` constructor. Here is a simple example that will log all of the library's messages to `console`:
```javascript
const upload = new DirectBinary.DirectBinaryUpload({
  log: {
    debug: (...theArguments) => console.log.apply(null, theArguments),
    info: (...theArguments) => console.log.apply(null, theArguments),
    warn: (...theArguments) => console.log.apply(null, theArguments),
    error: (...theArguments) => console.log.apply(null, theArguments),
  }
});
```
Note that this will also work with the `FileSystemUpload` constructor.
Proxy Support
The library utilizes the Fetch API, so when running in a browser, proxy settings are detected and applied by the browser. In Node.js, all HTTP requests are sent directly to the target without going through a proxy; auto-detecting a system's proxy settings is not supported. However, consumers can use `DirectBinaryUploadOptions.withHttpOptions()` to provide an `agent` value, as recommended by node-fetch.
Features
- Well tuned to take advantage of Node.js for the best upload performance
- Track transfer progress of files
- Cancel in-progress transfers
- Transfer multiple files "in batch"
- Upload local folder/file structures
Releasing
This module uses semantic-release when publishing new versions. The process is initiated upon merging commits to the master branch. Review semantic-release's documentation for the expected commit message format; PRs whose messages do not meet that format will not generate a new release.
Release notes are generated from git commit messages and will appear in CHANGELOG.md.
Todo
- Pause/resume uploads
Contributing
Contributions are welcome! Read the Contributing Guide for more information.
Licensing
This project is licensed under the Apache V2 License. See LICENSE for more information.