
adist v1.0.17

A project indexing and distribution management tool for LLMs

Downloads: 1,157

Readme

Adist

A powerful CLI tool for indexing, searching, and having AI-powered conversations about your projects.

Developed by okik.ai.

Contributing

Contributions are welcome! Feel free to submit issues and pull requests to help improve Adist.

The repository is hosted at github.com/okikorg/adist.

⚠️ IMPORTANT: This is an active development project. Breaking changes may occur between versions as we continue to improve the tool. Please check the changelog when updating.

Features

  • 🔍 Fast document indexing and semantic searching
  • 📁 Support for multiple projects
  • 🎯 Project-specific search
  • 🧩 Block-based indexing for more precise document analysis
  • 🤖 LLM-powered document summarization using Anthropic's Claude, OpenAI's GPT, or local Ollama models
  • 🗣️ Interactive chat with AI about your codebase
  • 📊 Project statistics and file analysis
  • 🔄 Easy project switching and reindexing
  • ⚡ Real-time streaming responses for chat and queries

Installation

npm install -g adist

Usage

Initialize a Project

adist init <project-name>

This will:

  1. Create a new project configuration
  2. Index all supported files in the current directory
  3. Optionally generate LLM summaries if you have the ANTHROPIC_API_KEY set

Search Documents

adist get "<query>"

Search for documents in the current project using natural language queries.

Query Your Project with AI

adist query "<question>"

Ask questions about your project and get AI-powered answers. The AI analyzes relevant documents from your codebase to provide contextual answers with proper code highlighting.

For real-time streaming responses (note that code highlighting may be limited):

adist query "<question>" --stream

Chat with AI About Your Project

adist chat

Start an interactive chat session with AI about your project. This mode provides:

  • Persistent conversation history within the session
  • Context awareness across multiple questions
  • Code syntax highlighting for better readability
  • Automatic retrieval of relevant documents for each query

By default, chat mode displays a loading spinner while generating responses. For real-time streaming responses, use:

adist chat --stream

Note that code highlighting may be limited in streaming mode.

Type /exit to end the chat session.
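The persistent history and context awareness described above can be illustrated with a minimal sketch. Note that `ChatSession` and `answer_fn` are hypothetical names for illustration, not adist's actual API:

```python
class ChatSession:
    """Minimal sketch of a chat loop that keeps history across questions."""

    def __init__(self):
        self.history = []  # grows across the session, giving later answers context

    def ask(self, question, answer_fn):
        # In adist, relevant documents would also be retrieved here for each query.
        self.history.append({"role": "user", "content": question})
        answer = answer_fn(self.history)  # the LLM sees the full history
        self.history.append({"role": "assistant", "content": answer})
        return answer
```

Because the whole history is passed on every turn, a follow-up like "what about error handling?" can be resolved against the earlier questions.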

Switch Projects

adist switch <project-name>

Switch to a different project for searching.

List Projects

adist list

View all configured projects.

Reindex Project

adist reindex

Reindex the current project. Use --summarize to generate LLM summaries:

adist reindex --summarize

This will:

  1. Show project statistics (total files, size, word count)
  2. Ask for confirmation before proceeding with summarization
  3. Generate summaries for each file
  4. Create an overall project summary

View Summaries

adist summary

View the overall project summary. To view a specific file's summary:

adist summary --file <filename>

Configure LLM Provider

adist llm-config

Configure which LLM provider to use:

  • Anthropic Claude (cloud-based, requires API key)
    • Claude 3 Opus
    • Claude 3 Sonnet
    • Claude 3 Haiku
  • OpenAI (cloud-based, requires API key)
    • GPT-4o
    • GPT-4 Turbo
    • GPT-3.5 Turbo
  • Ollama (run locally, no API key needed)
    • Choose from any locally installed models

When using Ollama, you can select from your locally installed models and customize the API URL if needed.

LLM Features

The tool supports several LLM-powered features using Anthropic's Claude models, OpenAI's GPT models, or Ollama models (local):

Document Summarization

Generate summaries of your project files to help understand large codebases quickly.

Question Answering

Get specific answers about your codebase without having to manually search through files.

Interactive Chat

Have a natural conversation about your project, with the AI maintaining context between questions.

Streaming Responses

AI interactions support two modes:

  • Default mode: shows a loading spinner while generating responses, with full code highlighting
  • Streaming mode: shows responses in real time as they are generated (use the --stream flag)

# Default mode with loading spinner and code highlighting
adist query "How does authentication work?"

# Streaming mode with real-time responses
adist query "How does authentication work?" --stream

Setting Up

You have three options for using LLM features:

Option 1: Anthropic Claude (Cloud)

  1. Set your Anthropic API key in the environment:

    export ANTHROPIC_API_KEY='your-api-key-here'
  2. Configure to use Anthropic and select your preferred model:

    adist llm-config

Option 2: OpenAI (Cloud)

  1. Set your OpenAI API key in the environment:

    export OPENAI_API_KEY='your-api-key-here'
  2. Configure to use OpenAI and select your preferred model:

    adist llm-config

Option 3: Ollama (Local)

  1. Install Ollama from ollama.com/download

  2. Run Ollama and pull a model (e.g., llama3):

    ollama pull llama3
  3. Configure adist to use Ollama:

    adist llm-config
  4. Select Ollama and choose your preferred model from the list.

Initialize Your Project

After setting up your preferred LLM provider:

  1. Initialize your project:

    adist init <project-name>
  2. Start interacting with your codebase:

    adist query "How does the authentication system work?"
    # or
    adist chat

Supported File Types

The tool indexes a wide range of file types including:

  • Markdown (.md)
  • Text (.txt)
  • Code files (.js, .ts, .py, .go, etc.)
  • Documentation (.rst, .asciidoc)
  • Configuration files (.json, .yaml, .toml)
  • And many more
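Deciding whether a file is indexed amounts to an extension check, which can be sketched as below. The extension set here is an illustrative subset; adist's real list is longer:

```python
from pathlib import Path

# Illustrative subset of the extensions listed above; the real list is longer.
INDEXABLE = {".md", ".txt", ".js", ".ts", ".py", ".go",
             ".rst", ".asciidoc", ".json", ".yaml", ".toml"}


def is_indexable(path: str) -> bool:
    """Return True if the file's extension is one the indexer handles."""
    return Path(path).suffix.lower() in INDEXABLE
```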

Configuration

The tool stores its configuration in:

  • macOS: ~/Library/Application Support/adist
  • Linux: ~/.config/adist
  • Windows: %APPDATA%\adist
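The per-platform lookup in the table above can be sketched as follows (a sketch mirroring the documented paths; adist itself may resolve them differently, e.g. via a library):

```python
import os
import sys
from pathlib import Path


def config_dir(app: str = "adist") -> Path:
    """Return the per-platform config directory matching the table above."""
    if sys.platform == "darwin":
        return Path.home() / "Library" / "Application Support" / app
    if sys.platform.startswith("win"):
        # %APPDATA% normally points at ...\AppData\Roaming
        return Path(os.environ.get("APPDATA", Path.home() / "AppData" / "Roaming")) / app
    # Linux and other Unix-likes follow the XDG convention
    return Path(os.environ.get("XDG_CONFIG_HOME", Path.home() / ".config")) / app
```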

Recent Updates

  • Improved chat and query commands with better code highlighting in non-streaming mode (default)
  • Added --stream flag to chat and query commands for real-time streaming responses
  • Added support for OpenAI models (GPT-4o, GPT-4 Turbo, GPT-3.5 Turbo)
  • Added support for all Claude 3 models (Opus, Sonnet, Haiku)
  • Added block-based indexing as the default method for faster and more precise document analysis
  • Made block-based search the default search method for better contextual understanding
  • Legacy indexing and search methods are still available under legacy-reindex and legacy-get
  • Added support for Ollama to run LLM features locally without an API key
  • Added LLM provider configuration command for easy switching between Anthropic, OpenAI, and Ollama
  • Enhanced document relevance ranking for more accurate results
  • Added automatic related document discovery for richer context
  • Optimized token usage to reduce API costs

Block-Based Indexing

The latest version of adist uses block-based indexing by default, which:

  1. Splits documents into semantic blocks (functions, sections, paragraphs)
  2. Indexes each block individually with its metadata
  3. Allows for more precise searching and better context understanding
  4. Improves AI interactions by providing more relevant code snippets

The previous full-document indexing method is still available as legacy-reindex and legacy-get commands.
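Step 1 above, splitting a document into semantic blocks, can be sketched for a markdown file as below. This is a simplification: adist's actual splitter is presumably language-aware (functions, sections) rather than heading-only:

```python
def split_markdown_blocks(text: str) -> list[dict]:
    """Split markdown into heading-delimited blocks with simple metadata."""
    blocks, current, title = [], [], "(preamble)"
    for line in text.splitlines():
        if line.startswith("#"):
            # A new heading closes the previous block.
            if current:
                blocks.append({"title": title, "body": "\n".join(current).strip()})
            title, current = line.lstrip("#").strip(), []
        else:
            current.append(line)
    blocks.append({"title": title, "body": "\n".join(current).strip()})
    # Drop an empty preamble block if the document starts with a heading.
    return [b for b in blocks if b["body"] or b["title"] != "(preamble)"]
```

Each block can then be indexed individually with its title as metadata, which is what makes the search results and AI context more precise than whole-file matches.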

License

MIT