Adist
A powerful CLI tool for indexing, searching, and having AI-powered conversations about your projects.
Developed by okik.ai.
Contributing
Contributions are welcome! Feel free to submit issues and pull requests to help improve Adist.
The repository is hosted at github.com/okikorg/adist.
⚠️ IMPORTANT: This is an active development project. Breaking changes may occur between versions as we continue to improve the tool. Please check the changelog when updating.
Features
- 🔍 Fast document indexing and semantic search
- 📁 Support for multiple projects
- 🎯 Project-specific search
- 🧩 Block-based indexing for more precise document analysis
- 🤖 LLM-powered document summarization using Anthropic's Claude or local Ollama models
- 🗣️ Interactive chat with AI about your codebase
- 📊 Project statistics and file analysis
- 🔄 Easy project switching and reindexing
- ⚡ Real-time streaming responses for chat and queries
Installation
npm install -g adist
Usage
Initialize a Project
adist init <project-name>
This will:
- Create a new project configuration
- Index all supported files in the current directory
- Optionally generate LLM summaries if the ANTHROPIC_API_KEY environment variable is set
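For example (the project name and API key below are placeholders):
# Optional: set an Anthropic API key so init can generate summaries
export ANTHROPIC_API_KEY='your-api-key-here'
# Create and index a project from the current directory
adist init my-app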
Search Documents
adist get "<query>"
Search for documents in the current project using natural language queries.
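For example, with an illustrative query:
# Find documents related to authentication
adist get "where is the authentication middleware defined?"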
Query Your Project with AI
adist query "<question>"
Ask questions about your project and get AI-powered answers. The AI analyzes relevant documents from your codebase to provide contextual answers with proper code highlighting.
For real-time streaming responses (note that code highlighting may be limited):
adist query "<question>" --stream
Chat with AI About Your Project
adist chat
Start an interactive chat session with AI about your project. This mode provides:
- Persistent conversation history within the session
- Context awareness across multiple questions
- Code syntax highlighting for better readability
- Automatic retrieval of relevant documents for each query
By default, chat mode displays a loading spinner while generating responses. For real-time streaming responses, use:
adist chat --stream
Note that code highlighting may be limited in streaming mode.
Type /exit to end the chat session.
Switch Projects
adist switch <project-name>
Switch to a different project for searching.
List Projects
adist list
View all configured projects.
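A typical flow, using hypothetical project names:
# See which projects are configured
adist list
# Make another project the active one before searching or querying
adist switch my-other-app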
Reindex Project
adist reindex
Reindex the current project. Use --summarize to generate LLM summaries:
adist reindex --summarize
This will:
- Show project statistics (total files, size, word count)
- Ask for confirmation before proceeding with summarization
- Generate summaries for each file
- Create an overall project summary
View Summaries
adist summary
View the overall project summary. To view a specific file's summary:
adist summary --file <filename>
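For example (the file path is illustrative):
# View the overall project summary
adist summary
# View the summary for a single file
adist summary --file src/index.ts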
Configure LLM Provider
adist llm-config
Configure which LLM provider to use:
- Anthropic Claude (cloud-based, requires API key)
  - Claude 3 Opus
  - Claude 3 Sonnet
  - Claude 3 Haiku
- OpenAI (cloud-based, requires API key)
  - GPT-4o
  - GPT-4 Turbo
  - GPT-3.5 Turbo
- Ollama (runs locally, no API key needed)
  - Choose from any locally installed model
When using Ollama, you can select from your locally installed models and customize the API URL if needed.
LLM Features
The tool supports several LLM-powered features using Anthropic's Claude models, OpenAI's GPT models, or Ollama models (local):
Document Summarization
Generate summaries of your project files to help understand large codebases quickly.
Question Answering
Get specific answers about your codebase without having to manually search through files.
Interactive Chat
Have a natural conversation about your project, with the AI maintaining context between questions.
Streaming Responses
AI interactions are available in two modes:
- Default mode: Shows a loading spinner while generating responses with full code highlighting
- Streaming mode: Shows real-time responses as they're being generated (use the --stream flag)
# Default mode with loading spinner and code highlighting
adist query "How does authentication work?"
# Streaming mode with real-time responses
adist query "How does authentication work?" --stream
Setting Up
You have three options for using LLM features:
Option 1: Anthropic Claude (Cloud)
Set your Anthropic API key in the environment:
export ANTHROPIC_API_KEY='your-api-key-here'
Configure to use Anthropic and select your preferred model:
adist llm-config
Option 2: OpenAI (Cloud)
Set your OpenAI API key in the environment:
export OPENAI_API_KEY='your-api-key-here'
Configure to use OpenAI and select your preferred model:
adist llm-config
Option 3: Ollama (Local)
Install Ollama from ollama.com/download
Run Ollama and pull a model (e.g., llama3):
ollama pull llama3
Configure adist to use Ollama:
adist llm-config
Select Ollama and choose your preferred model from the list.
Initialize Your Project
After setting up your preferred LLM provider:
Initialize your project:
adist init <project-name>
Start interacting with your codebase:
adist query "How does the authentication system work?" # or adist chat
Supported File Types
The tool indexes a wide range of file types including:
- Markdown (.md)
- Text (.txt)
- Code files (.js, .ts, .py, .go, etc.)
- Documentation (.rst, .asciidoc)
- Configuration files (.json, .yaml, .toml)
- And many more
Configuration
The tool stores its configuration in:
- macOS: ~/Library/Application Support/adist
- Linux: ~/.config/adist
- Windows: %APPDATA%\adist
Recent Updates
- Improved chat and query commands with better code highlighting in non-streaming mode (default)
- Added --stream flag to chat and query commands for real-time streaming responses
- Added support for OpenAI models (GPT-4o, GPT-4 Turbo, GPT-3.5 Turbo)
- Added support for all Claude 3 models (Opus, Sonnet, Haiku)
- Added block-based indexing as the default method for faster and more precise document analysis
- Made block-based search the default search method for better contextual understanding
- Legacy indexing and search methods are still available via the legacy-reindex and legacy-get commands
- Added support for Ollama to run LLM features locally without an API key
- Added LLM provider configuration command for easy switching between Anthropic, OpenAI, and Ollama
- Enhanced document relevance ranking for more accurate results
- Added automatic related document discovery for richer context
- Optimized token usage to reduce API costs
Block-Based Indexing
The latest version of adist uses block-based indexing by default, which:
- Splits documents into semantic blocks (functions, sections, paragraphs)
- Indexes each block individually with its metadata
- Allows for more precise searching and better context understanding
- Improves AI interactions by providing more relevant code snippets
The previous full-document indexing method is still available via the legacy-reindex and legacy-get commands.
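For example, to fall back to full-document indexing (the query is illustrative):
# Rebuild the index with the legacy full-document method
adist legacy-reindex
# Search using the legacy method
adist legacy-get "error handling"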
License
MIT