@nqminds/aliyun-oss-bridge v0.2.66

A databot/application that stores metadata on the TDX, but stores data on Aliyun OSS.
Aliyun OSS Bridge Server
Code for uploading metadata to TDX and actual data to Aliyun OSS or AWS S3.
Installation
As this package relies on the cropdoc-schemas package, use lerna to bootstrap it:
npx lerna bootstrap --scope @nqminds/aliyun-oss-bridge --include-dependencies

Usage
Creating/loading the appropriate TDX Dataset and OSS Bucket
const {GeoTiffStorer} = require("@nqminds/aliyun-oss-bridge");
const TDXApi = require("nqm-api-tdx");
// assumed here: nqmUtils comes from the nqm-utils helper package
const nqmUtils = require("nqm-utils");

const tdxApi = new TDXApi(tdxConfig);
// tdxAuthentication holds the TDX share-key id and secret
const folderId = nqmUtils.shortHash(nqmUtils.constants.applicationServerDataFolderPrefix + tdxAuthentication.id);
await tdxApi.authenticate(tdxAuthentication.id, tdxAuthentication.secret);
const tdxResourceParams = {
// schema is automatically injected
name: "my-super-cool-example-geotiff-dataset",
parentId: folderId, // optional
tags: ["geotiff", "example", "super-cool"],
};
const bucketParams = {
bucketProvider: "AWS_S3", // or "ALIYUN_OSS"
bucketName: "my-cool-example-geotiff-bucket",
region: "eu-west-1", // Aliyun endpoints can be found here https://www.alibabacloud.com/help/doc-detail/31837.htm
accessKeyId: "MY_SECRET_AWS_LOGIN_ID",
secretAccessKey: "MY_SECRET_AWS_LOGIN_SECRET",
};
// create a new GeoTiffStorer, bucket, and TDX dataset
let geoTiffStorer = await GeoTiffStorer.createNew(
tdxApi, tdxResourceParams, bucketParams,
);
// we can use datasetId with the TDXApi to run queries on the metadata
const datasetId = geoTiffStorer.datasetId;
const allCredentials = {
"AWS_S3": {accessKeyId: "LOGIN", secretAccessKey: "Secret"},
"ALIYUN_OSS": {accessKeyId: "LOGIN", secretAccessKey: "Secret"},
};
// automatically load the bucket based on an existing dataset stored on the TDX
geoTiffStorer = await GeoTiffStorer.fromExistingDataset(
tdxApi, datasetId, allCredentials
);

Uploading and downloading GeoTiffs
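Once a storer is loaded, the datasetId from above can be used to query the per-GeoTiff metadata documents on the TDX. A hedged sketch: the getData() call name and the Mongo-style filter below are assumptions based on the nqm-api-tdx client, not verified against this package.

```javascript
// Build a Mongo-style filter matching metadata documents tagged "example".
const filter = {tags: "example"};
// Assumed nqm-api-tdx call for reading dataset data:
// const {data: metadata} = await tdxApi.getData(datasetId, filter);
// `metadata` would then be an array of documents, one per stored GeoTiff.
```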
// load a GeoTIFF from file
const fsPromises = require("fs").promises;
const geotiff = await fsPromises.readFile("example.tif");
await geoTiffStorer.push(geotiff, "example-key", {extraMetadata: "hi"});
// push a big file and view progress
await geoTiffStorer.push(geotiff, "example-key", {extraMetadata: "hi"}, {
progress(percentage, checkpoint) {
// you can store checkpoint if you want to resume the upload on failure
console.log(`Uploading ${percentage * 100}% done`);
},
partSize: 128 * 1000, // upload in 128 kB chunks
retries: 5, // retry up to 5 times
});
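The checkpoint argument passed to progress() can be saved and fed back into a manual retry. A sketch, assuming push() accepts a checkpoint option on a later call; that option name is an assumption, so check the package source before relying on it.

```javascript
// Hedged sketch: save the latest checkpoint from progress(), and on failure
// retry once, passing the checkpoint back (the `checkpoint` option is assumed).
async function pushWithResume(storer, buffer, key, metadata) {
  let lastCheckpoint = null;
  const options = {
    progress(percentage, checkpoint) {
      lastCheckpoint = checkpoint; // remember the most recent checkpoint
    },
    retries: 5,
  };
  try {
    await storer.push(buffer, key, metadata, options);
  } catch (err) {
    // one manual retry resuming from the saved checkpoint
    await storer.push(buffer, key, metadata, {...options, checkpoint: lastCheckpoint});
  }
}
```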
const {data} = await geoTiffStorer.get("example-key");
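get() downloads the whole object; for partial reads of large GeoTiffs, the URL returned by getUrl() below can be fetched with a standard HTTP Range header. A minimal sketch; the header construction is plain HTTP, not specific to this package.

```javascript
// Build a standard HTTP Range header requesting bytes startByte..endByte inclusive.
function rangeHeader(startByte, endByte) {
  return {Range: `bytes=${startByte}-${endByte}`};
}
// Usage with any HTTP client (fetch availability assumed):
// const res = await fetch(downloadUrl, {headers: rangeHeader(0, 1023)});
```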
// useful for HTTP range requests
const downloadUrl = await geoTiffStorer.getUrl("example-key");

Testing
Tests are written in mocha.
Make sure to set the following environment variables (you can put them into a .env file):
S3_ID='AWS IAM ACCOUNT ID WITH PERMISSION TO PUSH TO S3'
S3_SECRET='AWS IAM ACCOUNT SECRET'
TDX_ID='TDX ID ON NQM-1.COM'
TDX_SECRET='TDX SECRET ON NQM-1.COM'
ALIYUN_OSS_ID='ALIYUN RAM ACCOUNT ID WITH PERMISSION TO PUSH TO OSS'
ALIYUN_OSS_SECRET='ALIYUN RAM ACCOUNT SECRET'

Tests can be run with:
npm test
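These variables map directly onto the allCredentials shape shown in Usage. A sketch, assuming the variables are already exported in the shell (or loaded from .env with a tool such as dotenv):

```javascript
// Map the test environment variables into the credentials shape expected by
// GeoTiffStorer.fromExistingDataset() (see Usage above).
const allCredentials = {
  AWS_S3: {
    accessKeyId: process.env.S3_ID,
    secretAccessKey: process.env.S3_SECRET,
  },
  ALIYUN_OSS: {
    accessKeyId: process.env.ALIYUN_OSS_ID,
    secretAccessKey: process.env.ALIYUN_OSS_SECRET,
  },
};
```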