
@serve.zone/spark

v1.0.90


Downloads

11

Readme

@serve.zone/spark

A comprehensive tool for maintaining and configuring servers, integrating with Docker and supporting advanced task scheduling, targeted at the serve.zone infrastructure. It's mainly designed to be utilized by @serve.zone/cloudly as a cluster node server system manager, maintaining and configuring servers on the base OS level.

Install

To install @serve.zone/spark, run the following command in your terminal:

npm install @serve.zone/spark --save

Ensure you have both Node.js and npm installed on your machine.
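If you want to confirm the prerequisite from a script, a minimal sketch follows; note that the v18 minimum used here is an assumption for illustration, not a documented requirement of @serve.zone/spark.

```typescript
// Sanity-check the running Node.js version before using the package.
// The v18 threshold is an assumed minimum, not Spark's documented one.
const [major] = process.versions.node.split('.').map(Number);
if (major < 18) {
  console.warn(`Node.js ${process.versions.node} may be too old; v18+ assumed.`);
}
console.log(`Running on Node.js ${process.versions.node}`);
```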

Usage

Getting Started

To use @serve.zone/spark in your project, you need to include and initiate it in your TypeScript project. Ensure you have TypeScript and the necessary build tools set up in your project.

First, import @serve.zone/spark:

import { Spark } from '@serve.zone/spark';

Initializing Spark

Create an instance of the Spark class to start using Spark. This instance will serve as the main entry point for interacting with Spark functionalities.

const sparkInstance = new Spark();

Running Spark as a Daemon

To run Spark as a daemon, which is useful for maintaining and configuring servers at the OS level, use the CLI bundled with Spark. This is typically done from a command-line terminal, though it can also be automated from your Node.js scripts if required.

spark installdaemon

The command above sets up Spark as a system service, enabling it to run and maintain server configurations automatically.
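If you prefer to drive the CLI from a Node.js script rather than a terminal, the following is a minimal sketch using the standard child_process module. The dryRun flag is purely illustrative and not part of Spark; the sketch assumes the spark binary is on your PATH after installation.

```typescript
// Sketch: invoking the Spark CLI from Node.js instead of a terminal.
// Assumes the `spark` binary is on PATH; dryRun only builds the command
// string without executing anything.
import { execFileSync } from 'node:child_process';

function runSparkCommand(args: string[], dryRun = false): string {
  if (dryRun) {
    return ['spark', ...args].join(' ');
  }
  return execFileSync('spark', args, { encoding: 'utf8' });
}

console.log(runSparkCommand(['installdaemon'], true)); // prints "spark installdaemon"
```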

Updating Spark or Maintained Services

Spark can self-update and manage updates for its maintained services. Trigger an update check by calling the updateServices method on the instance's sparkUpdateManager.

await sparkInstance.sparkUpdateManager.updateServices();

Managing Configuration and Logging

Spark allows extensive configuration and logging customization. Use the SparkLocalConfig and logging features to tailor Spark's operation to your needs.

// Accessing the local configuration
const localConfig = sparkInstance.sparkLocalConfig;

// Utilizing the logger for custom log messages
import { logger } from '@serve.zone/spark';

logger.log('info', 'Custom log message');

Advanced Usage

@serve.zone/spark offers tools for detailed server and service management, including but not limited to task scheduling, daemon management, and service updates. Explore the SparkTaskManager for scheduling specific tasks, SparkUpdateManager for handling service updates, and SparkLocalConfig for configuration.

Example: Scheduling Custom Tasks

import { Spark } from '@serve.zone/spark';

const sparkInstance = new Spark();
const myTask = {
  name: 'customTask',
  taskFunction: async () => {
    console.log('Running custom task');
  },
};

sparkInstance.sparkTaskManager.taskmanager.addAndScheduleTask(myTask, '* * * * * *');

The example above creates a simple task that logs a message every second, demonstrating how to use Spark's task manager for custom scheduled tasks.
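The schedule string above is a six-field cron expression with a leading seconds field, which is why '* * * * * *' fires every second. A small helper (not part of Spark's API, purely illustrative) that labels the fields:

```typescript
// Labels the six cron fields in the seconds-first format used above.
const cronFields = ['second', 'minute', 'hour', 'dayOfMonth', 'month', 'dayOfWeek'];

function describeCron(expr: string): Record<string, string> {
  const parts = expr.trim().split(/\s+/);
  if (parts.length !== cronFields.length) {
    throw new Error(`expected ${cronFields.length} cron fields, got ${parts.length}`);
  }
  return Object.fromEntries(cronFields.map((name, i) => [name, parts[i]]));
}

console.log(describeCron('0 30 3 * * *')); // a schedule that fires daily at 03:30:00
```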

Detailed Service Management

For advanced configurations, including Docker and service management, you can utilize the following patterns:

  • Use SparkUpdateManager to handle Docker image updates, service creation, and management.
  • Access and modify Docker and service configurations through Spark's integration with configuration files and environment variables.
// Managing Docker services with Spark
await sparkInstance.sparkUpdateManager.dockerHost.someDockerMethod();

// Example: Creating a Docker service
const newServiceDefinition = {...};
await sparkInstance.sparkUpdateManager.createService(newServiceDefinition);

CLI Commands

Spark provides several CLI commands to interact with and manage the system services:

Installing Spark as a Daemon

spark installdaemon

Sets up Spark as a system service to maintain server configurations automatically.

Updating the Daemon

spark updatedaemon

Updates the daemon service if a new version is available.

Running Spark as a Daemon

spark asdaemon

Runs Spark in daemon mode, which is suitable for executing automated tasks.

Viewing Logs

spark logs

Displays the logs of the Spark daemon service.

Cleaning Up Services

spark prune

Stops and cleans up all Docker services (stacks, networks, secrets, etc.) and prunes the Docker system.

Programmatic Daemon Management

You can also manage the daemon programmatically:

import { SmartDaemon } from '@push.rocks/smartdaemon';
import { Spark } from '@serve.zone/spark';

const sparkInstance = new Spark();
const smartDaemon = new SmartDaemon();

const startDaemon = async () => {
  const sparkService = await smartDaemon.addService({
    name: 'spark',
    version: sparkInstance.sparkInfo.projectInfo.version,
    command: 'spark asdaemon',
    description: 'Spark daemon service',
    workingDir: '/path/to/project',
  });
  await sparkService.save();
  await sparkService.enable();
  await sparkService.start();
};

const updateDaemon = async () => {
  const sparkService = await smartDaemon.addService({
    name: 'spark',
    version: sparkInstance.sparkInfo.projectInfo.version,
    command: 'spark asdaemon',
    description: 'Spark daemon service',
    workingDir: '/path/to/project',
  });
  await sparkService.reload();
};

startDaemon()
  .then(() => updateDaemon())
  .catch(console.error);

This illustrates how to initiate and update the Spark daemon using the SmartDaemon class from @push.rocks/smartdaemon.

Configuration Management

Extensive configuration management is possible through SparkLocalConfig and other configuration classes, letting you adapt Spark's behavior to different environments and requirements.

// Example on setting local config
import { SparkLocalConfig } from '@serve.zone/spark';

const localConfig = new SparkLocalConfig(sparkInstance);
await localConfig.kvStore.set('someKey', 'someValue');

// Retrieving a value from local config
const someConfigValue = await localConfig.kvStore.get('someKey');

console.log(someConfigValue); // Outputs: someValue

Detailed Log Management

Logging is a crucial aspect of any automation tool, and @serve.zone/spark offers rich logging functionality through its built-in logging library.

import { logger, Spark } from '@serve.zone/spark';

const sparkInstance = new Spark();

logger.log('info', 'Spark instance created.');

// Using logger in various levels of severity
logger.log('debug', 'This is a debug message');
logger.log('warn', 'This is a warning message');
logger.log('error', 'This is an error message');
logger.log('ok', 'This is a success message');
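When wiring these calls into a larger application, it can help to gate messages by severity. Below is a minimal sketch of level filtering layered on top of the log(level, message) style shown above; the ordering of the levels is an assumption for illustration, not part of Spark's documented API.

```typescript
// Assumed severity ordering (lowest to highest) for the levels used above.
const levelOrder = ['debug', 'info', 'ok', 'warn', 'error'] as const;
type Level = (typeof levelOrder)[number];

// Returns true when a message at `messageLevel` meets the configured minimum.
function shouldLog(messageLevel: Level, minimumLevel: Level): boolean {
  return levelOrder.indexOf(messageLevel) >= levelOrder.indexOf(minimumLevel);
}

console.log(shouldLog('warn', 'info'));  // true
console.log(shouldLog('debug', 'info')); // false
```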

Real-World Scenarios

Automated System Update and Restart

In real-world scenarios, you might want to automate system updates and reboots to ensure your services are running the latest security patches and features.

import { Spark } from '@serve.zone/spark';
import { SmartShell } from '@push.rocks/smartshell';

const sparkInstance = new Spark();
const shell = new SmartShell({ executor: 'bash' });

const updateAndRestart = async () => {
  await shell.exec('apt-get update && apt-get upgrade -y');
  console.log('System updated.');
  await shell.exec('reboot');
};

sparkInstance.sparkTaskManager.taskmanager.addAndScheduleTask(
  { name: 'updateAndRestart', taskFunction: updateAndRestart },
  '0 0 3 * * 0' // Every Sunday at 3 AM (seconds-first cron format)
);

This example demonstrates creating and scheduling a task to update and restart the server every Sunday at 3 AM using Spark's task management capabilities.

Integrating with Docker for Service Deployment

Spark's tight integration with Docker makes it an excellent tool for deploying containerized applications across your infrastructure.

import { Spark } from '@serve.zone/spark';
import { DockerHost } from '@apiclient.xyz/docker';

const sparkInstance = new Spark();
const dockerHost = new DockerHost({});

const deployService = async () => {
  const image = await dockerHost.pullImage('my-docker-repo/my-service:latest');
  const newService = await dockerHost.createService({
    name: 'my-service',
    image,
    ports: ['80:8080'],
    environmentVariables: {
      NODE_ENV: 'production',
    },
  });
  console.log(`Service ${newService.name} deployed.`);
};

deployService();

This example demonstrates how to pull a Docker image and deploy it as a new service in your infrastructure using Spark's Docker integration.

Managing Secrets

Managing secrets and sensitive data is crucial in any configuration and automation tool. Spark's integration with Docker allows you to handle secrets securely.

import { Spark, SparkUpdateManager } from '@serve.zone/spark';
import { DockerSecret } from '@apiclient.xyz/docker';

const sparkInstance = new Spark();
const updateManager = new SparkUpdateManager(sparkInstance);

const createDockerSecret = async () => {
  const secret = await DockerSecret.createSecret(updateManager.dockerHost, {
    name: 'dbPassword',
    contentArg: 'superSecretPassword',
  });
  console.log(`Secret ${secret.Spec.Name} created.`);
};

createDockerSecret();

This example shows how to create a Docker secret using Spark's SparkUpdateManager class, ensuring that sensitive information is securely stored and managed.

License and Legal Information

This repository contains open-source code that is licensed under the MIT License. A copy of the MIT License can be found in the license file within this repository.

Please note: The MIT License does not grant permission to use the trade names, trademarks, service marks, or product names of the project, except as required for reasonable and customary use in describing the origin of the work and reproducing the content of the NOTICE file.

Trademarks

This project is owned and maintained by Task Venture Capital GmbH. The names and logos associated with Task Venture Capital GmbH and any related products or services are trademarks of Task Venture Capital GmbH and are not included within the scope of the MIT license granted herein. Use of these trademarks must comply with Task Venture Capital GmbH's Trademark Guidelines, and any usage must be approved in writing by Task Venture Capital GmbH.

Company Information

Task Venture Capital GmbH
Registered at the District Court of Bremen, HRB 35230 HB, Germany

For any legal inquiries or if you require further information, please contact us via email at [email protected].

By using this repository, you acknowledge that you have read this section, agree to comply with its terms, and understand that the licensing of the code does not imply endorsement by Task Venture Capital GmbH of any derivative works.