
@google-cloud/logging-min

v11.2.0

Cloud Logging Client Library for Node.js

Downloads: 362,687

Cloud Logging: Node.js Client


Google Cloud Logging allows you to store, search, analyze, monitor, and alert on log data and events from Google Cloud Platform and Amazon Web Services.

If you require lightweight dependencies, an experimental, minified version of this library is available at @google-cloud/logging-min. Note: logging-min is experimental, and its feature surface is subject to change. To install the @google-cloud/logging-min library, run the following command:

npm install @google-cloud/logging-min

For an interactive tutorial on using the client library in a Node.js application, click Guide Me.

A comprehensive list of changes in each version may be found in the CHANGELOG.

Read more about the client libraries for Cloud APIs, including the older Google APIs Client Libraries, in Client Libraries Explained.

Table of contents:

  • Quickstart
  • Samples
  • Supported Node.js Versions
  • Versioning
  • Contributing
  • License

Quickstart

Before you begin

  1. Select or create a Cloud Platform project.
  2. Enable the Cloud Logging API.
  3. Set up authentication with a service account so you can access the API from your local workstation.

Installing the client library

npm install @google-cloud/logging

Using the client library

// Imports the Google Cloud client library
const {Logging} = require('@google-cloud/logging');

async function quickstart(
  projectId = 'YOUR_PROJECT_ID', // Your Google Cloud Platform project ID
  logName = 'my-log' // The name of the log to write to
) {
  // Creates a client
  const logging = new Logging({projectId});

  // Selects the log to write to
  const log = logging.log(logName);

  // The data to write to the log
  const text = 'Hello, world!';

  // The metadata associated with the entry
  const metadata = {
    resource: {type: 'global'},
    // See: https://cloud.google.com/logging/docs/reference/v2/rest/v2/LogEntry#logseverity
    severity: 'INFO',
  };

  // Prepares a log entry
  const entry = log.entry(metadata, text);

  async function writeLog() {
    // Writes the log entry
    await log.write(entry);
    console.log(`Logged: ${text}`);
  }
  writeLog();
}
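
The sample above only defines the quickstart function; to try it locally you could invoke it with your own values (the project ID and log name below are hypothetical):

quickstart('my-gcp-project', 'my-log');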

Batching Writes

High throughput applications should avoid awaiting calls to the logger:

await log.write(logEntry1);
await log.write(logEntry2);

Rather, applications should use a fire and forget approach:

log.write(logEntry1);
log.write(logEntry2);

The @google-cloud/logging library will handle batching and dispatching these log lines to the API.
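
If you already have several entries in hand, log.write also accepts an array of entries, so they can be submitted in a single call; a minimal sketch (the log name and messages are made up):

const {Logging} = require('@google-cloud/logging');

const logging = new Logging();
const log = logging.log('my-log');

const entries = [
  log.entry({resource: {type: 'global'}}, 'First message'),
  log.entry({resource: {type: 'global'}}, 'Second message'),
];

// Fire and forget: the library batches and dispatches the entries to the API
log.write(entries);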

Writing to Stdout

The LogSync class helps users easily write context-rich structured logs to stdout or any custom transport. It extracts additional log properties like trace context from HTTP headers and can be used as an on/off toggle between writing to the API or to stdout during local development.

Logs written to stdout are then picked up, out-of-process, by a Logging agent in the respective GCP environment. Logging agents can add more properties to each entry before streaming it to the Logging API.

Read more about Logging agents.

Serverless applications such as Cloud Functions, Cloud Run, and App Engine are strongly encouraged to use the LogSync class, since asynchronously written logs may be dropped due to lack of CPU.

Read more about structured logging.

// Optional: Create and configure a client
const logging = new Logging();
await logging.setProjectId()
await logging.setDetectedResource()

// Create a LogSync transport, defaulting to `process.stdout`
const log = logging.logSync('my-log');
const meta = {}; // optional field overrides here
const entry = log.entry(meta, 'Your log message');
log.write(entry);

// Syntax sugar for logging at a specific severity
log.alert(entry);
log.warning(entry);
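
As a rough sketch of the on/off toggle mentioned above, you could pick between the API-backed Log and the stdout-backed LogSync with an environment variable; the variable name and log name here are assumptions, not part of the library:

const {Logging} = require('@google-cloud/logging');

async function getLog() {
  const logging = new Logging();

  // Hypothetical toggle: write to stdout during local development, to the API otherwise
  if (process.env.LOG_TO_STDOUT === 'true') {
    await logging.setProjectId();
    await logging.setDetectedResource();
    return logging.logSync('my-log');
  }
  return logging.log('my-log');
}

getLog().then(log => log.write(log.entry({resource: {type: 'global'}}, 'Hello from either transport')));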

Populating HTTP request metadata

Metadata about the HTTP request is part of the structured log information that can be captured within each log entry. It provides context for application logs and is used to group multiple log entries under the corresponding load balancer request log. See the sample for how to populate HTTP request metadata for log entries.

If you already have a "raw" HTTP request object, you can assign it to entry.metadata.httpRequest directly. More information about how the request is interpreted as raw can be found in the code.
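
For illustration, here is a minimal sketch of attaching structured HTTP request metadata to a log entry; the request values are made up, and the field names follow the LogEntry HttpRequest schema:

const {Logging} = require('@google-cloud/logging');

async function writeRequestLog() {
  const logging = new Logging();
  const log = logging.log('my-log');

  const metadata = {
    resource: {type: 'global'},
    // Hypothetical request details
    httpRequest: {
      requestMethod: 'GET',
      requestUrl: 'https://example.com/some/path',
      status: 200,
      userAgent: 'my-user-agent/1.0.0',
      remoteIp: '192.0.2.1',
    },
  };

  await log.write(log.entry(metadata, 'Handled GET /some/path'));
}
writeRequestLog();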

Automatic Trace/Span ID Extraction

Cloud Logging libraries use trace fields within LogEntry to capture trace contexts, which enables the correlation of logs and traces, and distributed tracing troubleshooting. These tracing fields, including trace, spanId, and traceSampled, define the trace context for a LogEntry.

If not provided explicitly in a LogEntry, the Cloud Logging library automatically populates trace, span_id, and trace_sampled fields from detected OpenTelemetry span contexts, or from HTTP request headers.
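
If you prefer to set the trace context yourself, the fields can be provided on the entry metadata; a minimal sketch with hypothetical trace and span IDs:

const {Logging} = require('@google-cloud/logging');

const logging = new Logging();
const log = logging.log('my-log');

const metadata = {
  resource: {type: 'global'},
  // Trace format: projects/[PROJECT_ID]/traces/[TRACE_ID]
  trace: 'projects/YOUR_PROJECT_ID/traces/4bf92f3577b34da6a3ce929d0e0e4736',
  spanId: '00f067aa0ba902b7',
  traceSampled: true,
};

log.write(log.entry(metadata, 'Entry with an explicit trace context'));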

Extracting Trace/Span ID from OpenTelemetry Context

If you are using OpenTelemetry and there is an active span in the OpenTelemetry Context, the trace, span_id, and trace_sampled fields in the log entry are automatically populated from the active span. More information about OpenTelemetry can be found here.
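
As a rough sketch, assuming an OpenTelemetry tracer provider has already been registered (for example via @opentelemetry/sdk-trace-node, not shown here), a log entry written inside an active span should pick up that span's context:

const {trace} = require('@opentelemetry/api');
const {Logging} = require('@google-cloud/logging');

const logging = new Logging();
const log = logging.log('my-log');

// Assumes a TracerProvider is registered elsewhere; otherwise spans are non-recording
const tracer = trace.getTracer('example-tracer');

tracer.startActiveSpan('handle-request', async span => {
  // Written while the span is active, so trace, span_id and trace_sampled
  // should be populated from the span context automatically
  await log.write(log.entry({resource: {type: 'global'}}, 'Written inside an active span'));
  span.end();
});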

Extracting Trace/Span ID from HTTP Headers

If tracing fields are not provided explicitly and no OpenTelemetry context is detected, the trace / span_id fields are extracted automatically from HTTP headers. Trace information can be automatically populated from either the W3C Traceparent or X-Cloud-Trace-Context headers.

Error handling with logs written or deleted asynchronously

The Log class provides the ability to write and delete logs asynchronously. However, there are cases when log entries cannot be written or deleted and an error is thrown; if the error is not handled properly, it can crash the application. One way to catch the error is to await the log write/delete call and wrap it in a try/catch, as in the example below:

    // Write the log entry and catch any errors
    try {
      await log.write(entry);
    } catch (err) {
      console.log('Error is: ' + err);
    }

However, awaiting every log.write or log.delete call may introduce delays that can be avoided by simply providing a callback, as in the example below. This way the log entry is queued for processing and code execution continues without further delay; the callback is called once the operation completes:

    // Asynchronously write the log entry and handle the response or any errors in the provided callback
    log.write(entry, err => {
      if (err) {
        // The log entry was not written.
        console.log(err.message);
      } else {
        console.log('No error in write callback!');
      }
    });

Adding a callback to every log.write or log.delete call can be a burden, especially if the error-handling code is always the same. For this purpose, the Log class accepts a default callback, set through the LogOptions passed to the Log constructor as in the example below. This way you can define a global callback once for all log.write and log.delete calls and still handle errors:

  const {Logging} = require('@google-cloud/logging');
  const logging = new Logging();
  
  // Create options with default callback to be called on every write/delete response or error
  const options = {
    defaultWriteDeleteCallback: function (err) {
      if (err) {
        console.log('Error is: ' + err);
      } else {
        console.log('No error, all is good!');
      }
    },
  };

  const log = logging.log('my-log', options);

See the full sample in the writeLogWithCallback function here.

Samples

Samples are in the samples/ directory. Each sample's README.md has instructions for running its sample.

| Sample           | Source Code | Try it              |
| ---------------- | ----------- | ------------------- |
| Fluent           | source code | Open in Cloud Shell |
| Log HTTP Request | source code | Open in Cloud Shell |
| Logs             | source code | Open in Cloud Shell |
| Quickstart       | source code | Open in Cloud Shell |
| Sinks            | source code | Open in Cloud Shell |

The Cloud Logging Node.js Client API Reference documentation also contains samples.

Supported Node.js Versions

Our client libraries follow the Node.js release schedule. Libraries are compatible with all current active and maintenance versions of Node.js. If you are using an end-of-life version of Node.js, we recommend that you update as soon as possible to an actively supported LTS version.

Google's client libraries support legacy versions of Node.js runtimes on a best-efforts basis with the following warnings:

  • Legacy versions are not tested in continuous integration.
  • Some security patches and features cannot be backported.
  • Dependencies cannot be kept up-to-date.

Client libraries targeting some end-of-life versions of Node.js are available, and can be installed through npm dist-tags. The dist-tags follow the naming convention legacy-(version). For example, npm install @google-cloud/logging@legacy-8 installs client libraries for versions compatible with Node.js 8.

Versioning

This library follows Semantic Versioning.

This library is considered to be stable. The code surface will not change in backwards-incompatible ways unless absolutely necessary (e.g. because of critical security issues) or with an extensive deprecation period. Issues and requests against stable libraries are addressed with the highest priority.

More Information: Google Cloud Platform Launch Stages

Contributing

Contributions welcome! See the Contributing Guide.

Please note that this README.md, the samples/README.md, and a variety of configuration files in this repository (including .nycrc and tsconfig.json) are generated from a central template. To edit one of these files, make an edit to its template in the central templates directory.

License

Apache Version 2.0

See LICENSE