@fnet/ollama-chat
The `@fnet/ollama-chat` project offers a straightforward chat interface with AI models from Ollama. This application allows you to easily communicate with various AI models by typing messages and receiving responses. It aims to provide a simple way to explore and interact with different AI models through an accessible command line interface.
How It Works
The application connects to the Ollama API to fetch available AI models. Once connected, you can start a chat session by typing messages, which are processed by the selected AI model, and the responses are displayed in real-time. The app supports various commands for managing chat history, switching models, saving conversations, and more, making it a flexible tool for interacting with AI.
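For context, the list of available models comes from Ollama's standard HTTP API. A minimal sketch of that lookup (assuming Node 18+ with global `fetch` and a local Ollama server on the default port; `/api/tags` is Ollama's model-listing endpoint, not part of this package's own API):

// List locally installed Ollama models, as the app does when it starts.
const res = await fetch("http://127.0.0.1:11434/api/tags");
const { models } = await res.json();
console.log(models.map(m => m.name)); // e.g. [ 'gemma2:latest', ... ]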
Key Features
- Model Selection: Choose from a list of available Ollama models to drive the chat conversation.
- Command Interface: Use simple commands like `/help`, `/exit`, `/history`, `/save`, and `/load` to interact with the chat session.
- History Management: Save and load conversation history to or from a file for reference or continued interaction.
- Retry Messages: Option to retry the last input, in case a different response is desired.
- Interaction Customization: Add system, user, or assistant messages manually to shape the conversation dynamics (a programmatic variant is sketched after this list).
- API URL Management: Update the API URL to switch between different Ollama servers and their respective models.
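Related to the customization point above, the programmatic interface (covered in the Developer Guide below) lets you pre-seed the conversation by passing messages in the `history` array. A minimal sketch, assuming a local Ollama server and the default `gemma2:latest` model; the system prompt text is only an example:

import ollamaChat from '@fnet/ollama-chat';

ollamaChat({
  url: "http://127.0.0.1:11434",   // switch this to target a different Ollama server
  model: "gemma2:latest",
  history: [
    // A manually added system message shaping the rest of the conversation.
    { role: "system", content: "You are a concise assistant. Answer in one sentence." },
  ],
}).then(history => {
  console.log("Final history:", history);
});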
Conclusion
`@fnet/ollama-chat` is a practical utility for those looking to engage with AI models in a chat format. It offers the essential functionality to manage conversations and interact with various AI models, giving users hands-on experience of AI-driven dialogue.
@fnet/ollama-chat Developer Guide
Overview
The `@fnet/ollama-chat` library provides a simplified interface for interacting with AI models using the Ollama platform. It facilitates real-time chat sessions with a conversational AI model, enabling users to engage in dialogue while leveraging various functionalities like managing conversation histories, switching models, and retrying messages.
Installation
To install the library, use either npm or yarn. Choose one of the following commands:
npm install @fnet/ollama-chat
or
yarn add @fnet/ollama-chat
Usage
The `@fnet/ollama-chat` library is intended for use in Node.js environments. Here is an outline of how to initiate a simple chat session:
Import the Library: Begin by importing the main functionality from the library into your Node.js application:
import ollamaChat from '@fnet/ollama-chat';
Start a Chat Session: You can create a new chat session by calling the `ollamaChat` function with appropriate parameters. It returns the chat history upon completion:

ollamaChat({
  url: "http://127.0.0.1:11434",
  model: "gemma2:latest",
  history: [],
}).then(history => {
  console.log("Chat session ended. Here's the full conversation history:", history);
});
Examples
Here are some examples showcasing how to interact with the library’s core functionalities:
Example 1: Basic Chat Session
import ollamaChat from '@fnet/ollama-chat';

ollamaChat({
  url: "http://127.0.0.1:11434",
  model: "gemma2:latest",
  history: [],
}).then(history => {
  console.log("Chat session ended. Here's the full conversation history:", history);
});
Example 2: Saving and Loading History
In your chat session, you can save and load conversation histories, which can be useful for reviewing past interactions or resuming sessions:
Save History
// Within the session, type /save to save the history.
// You will be prompted to enter a filename.
Load History
// Before starting a session or during one, type /load to load a saved history.
// You will be prompted to enter a filename.
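The same effect is available programmatically, since `ollamaChat` accepts a `history` array and resolves with the final history. A minimal sketch of saving and resuming a conversation yourself (the `chat-history.json` file name and the JSON-on-disk format are illustrative assumptions, unrelated to what `/save` writes):

import fs from 'node:fs/promises';
import ollamaChat from '@fnet/ollama-chat';

// Resume from a history we previously wrote to disk ourselves.
const saved = JSON.parse(await fs.readFile('./chat-history.json', 'utf8'));

const history = await ollamaChat({
  url: "http://127.0.0.1:11434",
  model: "gemma2:latest",
  history: saved,
});

// Persist the updated history for the next session.
await fs.writeFile('./chat-history.json', JSON.stringify(history, null, 2));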
Example 3: Model Management
Switch AI models during a chat session:
// Within the session, type /model to interactively choose from the list of available models.
Acknowledgement
This library is built using the Ollama API for engaging with conversational AI models. It leverages internal functionalities such as `Ollama` and external libraries like `prompt` and `chalk` for enhanced interactivity and output styling.
Input Schema
$schema: https://json-schema.org/draft/2020-12/schema
type: object
properties:
  url:
    type: string
    description: API URL for Ollama.
    default: http://127.0.0.1:11434
  model:
    type: string
    description: Default AI model to use.
    default: gemma2:latest
  history:
    type: array
    description: Chat history.
    items:
      type: object
      properties:
        role:
          type: string
          description: The role of the message sender (user, assistant, system, tool).
        content:
          type: string
          description: The content of the message.
  format:
    type: object
    description: JSON schema for the response.
  tools:
    type: array
    description: List of tools with definitions and call capabilities.
    items:
      type: object
      properties:
        definition:
          type: object
          description: Definition of the tool.
        call:
          type: object
          description: Tool call method.
required: []
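To illustrate the `format` and `tools` fields above, here is a hedged sketch. The top-level call shape follows the schema; the internal layout of a tool's `definition` and the signature of its `call` handler are assumptions modeled on Ollama-style function calling, not something this README specifies:

import ollamaChat from '@fnet/ollama-chat';

ollamaChat({
  url: "http://127.0.0.1:11434",
  model: "gemma2:latest",
  history: [],
  // `format` is a JSON schema describing the shape of the model's reply.
  format: {
    type: "object",
    properties: { answer: { type: "string" } },
    required: ["answer"],
  },
  // Each tool pairs a definition with a call implementation (shapes assumed).
  tools: [
    {
      definition: {
        type: "function",
        function: {
          name: "get_time",
          description: "Return the current time as an ISO string.",
          parameters: { type: "object", properties: {} },
        },
      },
      call: async () => new Date().toISOString(),
    },
  ],
}).then(history => {
  console.log("Structured chat ended:", history);
});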