@push.rocks/smartarchive

@push.rocks/smartarchive is a powerful library designed for managing archive files. It provides utilities for compressing and decompressing data in various formats such as zip, tar, gzip, and bzip2. This library aims to simplify the process of handling archive files, making it an ideal choice for projects that require manipulation of archived data.

Install

To install @push.rocks/smartarchive, you can use either npm or yarn. Run one of the following commands in your project directory:

npm install @push.rocks/smartarchive --save

or if you prefer yarn:

yarn add @push.rocks/smartarchive

This will add @push.rocks/smartarchive to your project's dependencies.

Usage

@push.rocks/smartarchive provides an easy-to-use API for extracting, creating, and analyzing archive files. Below, we'll cover how to get started and explore various features of the module.

Importing SmartArchive

First, import SmartArchive from @push.rocks/smartarchive using ESM syntax:

import { SmartArchive } from '@push.rocks/smartarchive';

Extracting Archive Files

You can extract archive files from different sources using SmartArchive.fromArchiveUrl, SmartArchive.fromArchiveFile, and SmartArchive.fromArchiveStream. Here's an example of extracting an archive from a URL:

import { SmartArchive } from '@push.rocks/smartarchive';

async function extractArchiveFromURL() {
  const url = 'https://example.com/archive.zip';
  const targetDir = '/path/to/extract';

  const archive = await SmartArchive.fromArchiveUrl(url);
  await archive.exportToFs(targetDir);

  console.log('Archive extracted successfully.');
}

extractArchiveFromURL();
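
In practice you will usually want to handle network or extraction failures. The following is a minimal sketch of the same flow with basic error handling added; it uses only the fromArchiveUrl and exportToFs calls shown above.

import { SmartArchive } from '@push.rocks/smartarchive';

async function extractArchiveFromURLSafely() {
  const url = 'https://example.com/archive.zip';
  const targetDir = '/path/to/extract';

  try {
    const archive = await SmartArchive.fromArchiveUrl(url);
    await archive.exportToFs(targetDir);
    console.log('Archive extracted successfully.');
  } catch (error) {
    // Network errors, unsupported formats, or filesystem problems end up here.
    console.error('Extraction failed:', error);
  }
}

extractArchiveFromURLSafely();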

Extracting an Archive from a File

Similarly, you can extract an archive from a local file:

import { SmartArchive } from '@push.rocks/smartarchive';

async function extractArchiveFromFile() {
  const filePath = '/path/to/archive.zip';
  const targetDir = '/path/to/extract';

  const archive = await SmartArchive.fromArchiveFile(filePath);
  await archive.exportToFs(targetDir);

  console.log('Archive extracted successfully.');
}

extractArchiveFromFile();

Stream-Based Extraction

For larger files, you might prefer a streaming approach to prevent high memory consumption. Here’s an example:

import { SmartArchive } from '@push.rocks/smartarchive';
import { createReadStream, createWriteStream } from 'fs';

async function extractArchiveUsingStream() {
  const archiveStream = createReadStream('/path/to/archive.zip');
  const archive = await SmartArchive.fromArchiveStream(archiveStream);
  const extractionStream = await archive.exportToStreamOfStreamFiles();
  
  extractionStream.pipe(createWriteStream('/path/to/destination'));
}

extractArchiveUsingStream();
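
Because exportToStreamOfStreamFiles returns a stream of file objects rather than raw bytes, it can also be consumed entry by entry. The sketch below assumes the returned stream is a standard Node readable in object mode and simply logs each emitted object; the exact shape of these objects is not documented here, so treat the logging as exploratory.

import { createReadStream } from 'fs';
import { SmartArchive } from '@push.rocks/smartarchive';

async function inspectArchiveEntries() {
  const archiveStream = createReadStream('/path/to/archive.zip');
  const archive = await SmartArchive.fromArchiveStream(archiveStream);
  const extractionStream = await archive.exportToStreamOfStreamFiles();

  extractionStream.on('data', (fileEntry) => {
    // Each chunk represents one extracted file; log it to see what it contains.
    console.log(fileEntry);
  });

  extractionStream.on('end', () => {
    console.log('All entries processed.');
  });
}

inspectArchiveEntries();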

Analyzing Archive Files

Sometimes, you may need to inspect the contents of an archive before extracting it. The following example shows how to analyze an archive:

import { SmartArchive } from '@push.rocks/smartarchive';

async function analyzeArchive() {
  const filePath = '/path/to/archive.zip';
  
  const archive = await SmartArchive.fromArchiveFile(filePath);
  const analysisResult = await archive.analyzeContent();
  
  console.log(analysisResult); // Outputs details about the archive content
}

analyzeArchive();

Creating Archive Files

Creating an archive file is straightforward. Here we demonstrate creating a tar.gz archive:

import { createWriteStream } from 'fs';
import { SmartArchive } from '@push.rocks/smartarchive';

async function createTarGzArchive() {
  const archive = new SmartArchive();
  
  // Add directories and files
  archive.addedDirectories.push('/path/to/directory1');
  archive.addedFiles.push('/path/to/file1.txt');
  
  // Export as tar.gz
  const tarGzStream = await archive.exportToTarGzStream();
  
  // Save to filesystem or handle as needed
  tarGzStream.pipe(createWriteStream('/path/to/destination.tar.gz'));
}

createTarGzArchive();
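
If you want to wait until the archive has been fully written and have stream errors propagate to your code, Node's built-in pipeline helper can replace the bare pipe call. This is a sketch using the same exportToTarGzStream call as above; pipeline comes from Node's standard library, not from smartarchive.

import { createWriteStream } from 'fs';
import { pipeline } from 'stream/promises';
import { SmartArchive } from '@push.rocks/smartarchive';

async function createTarGzArchiveAndWait() {
  const archive = new SmartArchive();
  archive.addedDirectories.push('/path/to/directory1');
  archive.addedFiles.push('/path/to/file1.txt');

  const tarGzStream = await archive.exportToTarGzStream();

  // pipeline resolves once the write stream has finished and rejects on any stream error.
  await pipeline(tarGzStream, createWriteStream('/path/to/destination.tar.gz'));
  console.log('tar.gz archive written to disk.');
}

createTarGzArchiveAndWait();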

Stream Operations

Here's an example of using smartarchive's streaming capabilities:

import { createReadStream, createWriteStream } from 'fs';
import { SmartArchive } from '@push.rocks/smartarchive';

async function extractArchiveUsingStreams() {
  const archiveStream = createReadStream('/path/to/archive.zip');
  const archive = await SmartArchive.fromArchiveStream(archiveStream);
  const extractionStream = await archive.exportToStreamOfStreamFiles();
  
  extractionStream.pipe(createWriteStream('/path/to/extracted'));
}

extractArchiveUsingStreams();

Advanced Decompression Usage

smartarchive supports multiple compression formats and gives you detailed control over the decompression process:

  • For ZIP files, ZipTools handles decompression using the fflate library.
  • For TAR files, TarTools uses tar-stream.
  • For GZIP files, GzipTools provides a CompressGunzipTransform and DecompressGunzipTransform.
  • For BZIP2 files, Bzip2Tools utilizes custom streaming decompression.

Example: Working with a GZIP-compressed archive:

import { createReadStream, createWriteStream } from 'fs';
import { SmartArchive } from '@push.rocks/smartarchive';

async function decompressGzipArchive() {
  const filePath = '/path/to/archive.gz';
  const targetDir = '/path/to/extract';

  const archive = await SmartArchive.fromArchiveFile(filePath);
  await archive.exportToFs(targetDir);

  console.log('GZIP archive decompressed successfully.');
}

decompressGzipArchive();

Advanced: Custom Decompression Streams

You can inject custom decompression streams where needed:

import { createReadStream, createWriteStream } from 'fs';
import { SmartArchive, GzipTools } from '@push.rocks/smartarchive';

async function customDecompression() {
  const filePath = '/path/to/archive.gz';
  const targetDir = '/path/to/extract';

  const archive = await SmartArchive.fromArchiveFile(filePath);
  const gzipTools = new GzipTools();
  const decompressionStream = gzipTools.getDecompressionStream();

  const archiveStream = await archive.getArchiveStream();
  archiveStream.pipe(decompressionStream).pipe(createWriteStream(targetDir));

  console.log('Custom GZIP decompression successful.');
}

customDecompression();

Custom Pack and Unpack Tar

When dealing with tar archives, you may need to perform custom packing and unpacking:

import { SmartArchive, TarTools } from '@push.rocks/smartarchive';
import { createReadStream, createWriteStream } from 'fs';

async function customTarOperations() {
  const tarTools = new TarTools();

  // Packing a directory into a tar stream
  const packStream = await tarTools.packDirectory('/path/to/directory');
  packStream.pipe(createWriteStream('/path/to/archive.tar'));

  // Extracting files from a tar stream
  const extractStream = tarTools.getDecompressionStream();
  createReadStream('/path/to/archive.tar').pipe(extractStream).on('entry', (header, stream, next) => {
    const writeStream = createWriteStream(`/path/to/extract/${header.name}`);
    stream.pipe(writeStream);
    stream.on('end', next);
  });
}

customTarOperations();
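
If you need to know when a custom unpack like the one above has completed, you can wrap the extraction in a promise. This sketch assumes the stream returned by getDecompressionStream emits the standard 'finish' and 'error' events of a Node writable/transform stream; that assumption is noted in the comments.

import { TarTools } from '@push.rocks/smartarchive';
import { createReadStream, createWriteStream } from 'fs';

async function unpackTarAndWait() {
  const tarTools = new TarTools();
  const extractStream = tarTools.getDecompressionStream();

  await new Promise((resolve, reject) => {
    extractStream.on('entry', (header, stream, next) => {
      const writeStream = createWriteStream(`/path/to/extract/${header.name}`);
      stream.pipe(writeStream);
      stream.on('end', next);
    });
    // Assumption: 'finish' fires once the tar stream has been fully consumed.
    extractStream.on('finish', resolve);
    extractStream.on('error', reject);

    createReadStream('/path/to/archive.tar').pipe(extractStream);
  });

  console.log('Tar archive unpacked.');
}

unpackTarAndWait();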

Extract and Analyze All-in-One

To extract and simultaneously analyze archive content:

import { createReadStream, createWriteStream } from 'fs';
import { SmartArchive } from '@push.rocks/smartarchive';

async function extractAndAnalyze() {
  const filePath = '/path/to/archive.zip';
  const targetDir = '/path/to/extract';

  const archive = await SmartArchive.fromArchiveFile(filePath);
  const analyzedStream = archive.archiveAnalyzer.getAnalyzedStream();
  const extractionStream = await archive.exportToStreamOfStreamFiles();

  analyzedStream.pipe(extractionStream).pipe(createWriteStream(targetDir));

  analyzedStream.on('data', (chunk) => {
    console.log(JSON.stringify(chunk, null, 2));
  });
}

extractAndAnalyze();

Final Words

These examples demonstrate various use cases for @push.rocks/smartarchive. Depending on your specific project requirements, you can adapt these examples to suit your needs. Always refer to the latest documentation for the most current information and methods available in @push.rocks/smartarchive.

For more information and API references, check the official @push.rocks/smartarchive GitHub repository.

License and Legal Information

This repository contains open-source code that is licensed under the MIT License. A copy of the MIT License can be found in the license file within this repository.

Please note: The MIT License does not grant permission to use the trade names, trademarks, service marks, or product names of the project, except as required for reasonable and customary use in describing the origin of the work and reproducing the content of the NOTICE file.

Trademarks

This project is owned and maintained by Task Venture Capital GmbH. The names and logos associated with Task Venture Capital GmbH and any related products or services are trademarks of Task Venture Capital GmbH and are not included within the scope of the MIT license granted herein. Use of these trademarks must comply with Task Venture Capital GmbH's Trademark Guidelines, and any usage must be approved in writing by Task Venture Capital GmbH.

Company Information

Task Venture Capital GmbH
Registered at the District Court of Bremen, HRB 35230 HB, Germany

For any legal inquiries or if you require further information, please contact us via email at [email protected].

By using this repository, you acknowledge that you have read this section, agree to comply with its terms, and understand that the licensing of the code does not imply endorsement by Task Venture Capital GmbH of any derivative works.