lilo-cli v0.0.2

CLI to download your Google Cloud Platform logs to a SQLite database.

lilo

lilo is a CLI to download and tail Google Cloud Platform logs to a SQLite database.

# Install
npm install -g bun
npm install -g lilo-cli

# Note: `gcloud` also needs to be installed and signed in. 
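
# Assumption (the README only says "signed in"): lilo appears to use your
# local gcloud credentials, so authenticate first if needed.
gcloud auth login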

# Usage
lilo \
    --resource-names "projects/YOUR_GCP_PROJECT_ID_HERE" \
    --filter "timestamp > 2023-06-01" \
    --db ./db.sqlite \
    --watch 2000

See `lilo help` for detailed flag docs.

  • Log JSON values from your application and ask questions later with SQL queries.

    • Avoid creating a data schema for ad-hoc events/data when developing prototypes.
    • GCP Logging supports JSON lines - your application can log JSON lines to stdout.
  • Use fast & familiar local tools.

    • Use a native GUI such as TablePlus for viewing tables.
    • Use SQLite's SQL dialect instead of BigQuery's.
    • SQLite indexes on expressions can index a JSON path to speed up queries.
      • Like this: `create index i_01 on logs(jsonPayload ->> "$.msg")`
    • Use the SQLite CLI to query and pipe JSON output to Excel or other tools to create reports (see the query sketch after this list).
    • Query from any programming language that has SQLite bindings.
    • Fast interactivity for small to medium datasets.
  • Avoid having to store large amounts of data forever in the cloud.

    • Set your Logging retention to one month, and use lilo to archive your data to a local SQLite DB.
  • Query the db.sqlite on the server to respond to events.

    • Use polling SQL queries to observe events (see the polling sketch after this list).
  • Understand what's happening in your GCP account via the audit logs.

  • Convert the SQLite file to DuckDB for fast analytical queries (see the DuckDB sketch after this list).
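
As a concrete example, here is a hedged sketch of querying the JSON payloads with the sqlite3 CLI. It assumes lilo writes a logs table with a jsonPayload column (consistent with the create index example above, but otherwise an assumption):

# Count the most common messages by JSON path (needs sqlite3 3.38+ for ->>).
sqlite3 db.sqlite \
    'select jsonPayload ->> "$.msg" as msg, count(*) as n
     from logs group by msg order by n desc limit 20'

# Emit JSON for piping into Excel or other reporting tools (sqlite3 3.33+).
sqlite3 -json db.sqlite 'select * from logs limit 100' > report.json

# Speed up repeated queries on a JSON path with an expression index.
sqlite3 db.sqlite 'create index i_01 on logs(jsonPayload ->> "$.msg")'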
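
Along the same lines, a minimal polling sketch for observing new events on the server. Using rowid as a cursor assumes the logs table is a plain rowid table; the table and column names are the same assumptions as above:

# Poll for new rows every 5 seconds, using rowid as a cursor.
# (A more careful poller would take the cursor from the rows it just read.)
last=$(sqlite3 db.sqlite 'select coalesce(max(rowid), 0) from logs')
while true; do
    sqlite3 -json db.sqlite \
        "select rowid, jsonPayload ->> '\$.msg' as msg
         from logs where rowid > $last"
    last=$(sqlite3 db.sqlite 'select coalesce(max(rowid), 0) from logs')
    sleep 5
done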
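
And the DuckDB conversion, sketched with DuckDB's sqlite extension (same assumed table name):

# Copy the logs table into a DuckDB file for fast analytical queries.
duckdb logs.duckdb \
    "install sqlite;
     load sqlite;
     attach 'db.sqlite' as src (type sqlite);
     create table logs as select * from src.logs;"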

Name origin: SQLite GCP Logs

[Image: example table schema and rows in TablePlus]

FAQ

How long does it take to do a full download?

  • It depends on how many logs you have. By default, GCP accounts are limited to 60 HTTP read calls per minute, and each call returns up to 1,000 entries, so a full download proceeds at roughly 60,000 entries per minute (about 3.6 million per hour).
  • lilo resumes from the last log entry it downloaded.
    • You can start and stop the CLI at any time.
    • Each run only downloads the log entries written since the previous run.

Why Bun?

  • Bun has SQLite built in, so no native SQLite module needs to be compiled when installing lilo via npm.
  • I wanted to test it out.

Alternatives & Notes

  • Set up a log sink to BigQuery, then pipe a BigQuery query result to Google Sheets.

  • Copy your existing logs to a Storage Bucket

    • Useful for log entries that were written before you set up a streaming log sink (see the sketch after this list).
    • https://cloud.google.com/logging/docs/routing/copy-logs
  • https://cloud.google.com/logging/docs/samples/logging-tail-log-entries

  • https://cloud.google.com/logging/docs/reference/v2/rest/v2/entries/tail

    • The tail endpoint does not support JSON, only gRPC.
    • Bun does not currently support gRPC, so lilo polls entries:list instead.
  • `gcloud alpha logging tail` (see the sketch after this list)

  • https://til.simonwillison.net/cloudrun/tailing-cloud-run-request-logs
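
For reference, the gcloud-based alternatives look roughly like this (a hedged sketch; the bucket IDs and filter are illustrative, so check the linked docs for exact arguments):

# Copy historical logs from a Logging bucket to a Cloud Storage bucket.
gcloud logging copy _Default storage.googleapis.com/MY_ARCHIVE_BUCKET \
    --location=global

# Live-tail logs in the terminal (alpha component; requires the grpcio
# client library).
gcloud alpha logging tail "resource.type=cloud_run_revision"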