# Velosify AI

`velosify-ai` is a Node.js library designed to seamlessly integrate AI services such as OpenAI's ChatGPT into your Node.js projects. It lets developers interact with AI APIs and get powerful results with minimal setup.
## Author

- Pedro Veloso ([email protected])

## Repository

- GitHub: https://github.com/s2925534/velosify-ai
## Table of Contents

- [Installation](#installation)
- [Usage](#usage)
- [API Methods](#api-methods)
- [Future Implementations](#future-implementations)
- [Running Tests](#running-tests)
- [Contributing](#contributing)
- [License](#license)
- [Contact](#contact)
## Installation

To install the `velosify-ai` library, run:

```bash
npm install velosify-ai
```

Make sure you have Node.js and npm installed before proceeding.
## Usage

Once installed, you can use `velosify-ai` to interact with various AI services. The first version of this library supports OpenAI's ChatGPT; more AI platforms will be integrated in future updates.
### Example with ChatGPT (OpenAI)

1. Install dependencies (if not already installed):

   ```bash
   npm install velosify-ai axios dotenv
   ```

2. Create a `.env` file in your project root:

   ```
   OPENAI_API_KEY=your_openai_api_key_here
   ```

3. Create a simple JavaScript file (e.g., `index.js`) that uses the library:
```javascript
require('dotenv').config(); // load OPENAI_API_KEY from .env

const AIClient = require('velosify-ai');

// Initialize the AIClient with your OpenAI API key
const client = new AIClient(process.env.OPENAI_API_KEY);

async function getChatGPTResponse() {
  try {
    const response = await client.sendMessage('What is the capital of France?');
    console.log('ChatGPT says:', response);
  } catch (error) {
    console.error('Error:', error.message);
  }
}

getChatGPTResponse();
```
### Detailed Explanation

- `AIClient` constructor:
  - Takes your API key (from OpenAI) as the first parameter.
  - The second parameter is optional and specifies the AI service provider (defaults to `'openai'`).
- `sendMessage` method:
  - Sends a message to the specified AI model (defaults to `'gpt-3.5-turbo'`).
  - Returns the AI's response.
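For orientation, the interface described above could be sketched roughly as follows. This is a minimal illustration, not the library's actual source: the class name `SketchAIClient` is made up for the example, the endpoint URL and response shape follow OpenAI's public Chat Completions API, and Node 18+'s built-in `fetch` is used in place of whatever HTTP client the library uses internally.

```javascript
// Minimal sketch of a ChatGPT wrapper with the interface described above.
// Illustrative only; not the velosify-ai source code.
class SketchAIClient {
  constructor(apiKey, provider = 'openai') {
    this.apiKey = apiKey;
    this.provider = provider; // only 'openai' is supported in v1.0.0
  }

  async sendMessage(message, model = 'gpt-3.5-turbo') {
    const res = await fetch('https://api.openai.com/v1/chat/completions', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${this.apiKey}`,
      },
      body: JSON.stringify({
        model,
        messages: [{ role: 'user', content: message }],
      }),
    });
    const data = await res.json();
    return data.choices[0].message.content; // the AI's reply as a string
  }
}

module.exports = SketchAIClient;
```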
## API Methods

### `new AIClient(apiKey, provider = 'openai')`

- **Description**: Initializes the `AIClient` with the provided API key and AI provider (currently only `'openai'` is supported).
- **Parameters**:
  - `apiKey` (string): Your API key for the AI service (required).
  - `provider` (string, optional): The AI provider to use. Defaults to `'openai'`.

### `sendMessage(message, model = 'gpt-3.5-turbo')`

- **Description**: Sends a message to the specified AI model and returns the response.
- **Parameters**:
  - `message` (string): The text message to send to the AI model.
  - `model` (string, optional): The AI model to use. Defaults to `'gpt-3.5-turbo'`.
- **Returns**: The AI's response as a string.
## Future Implementations

In upcoming versions, support for additional AI platforms will be added. Planned integrations include:
- Google Cloud AI (Vertex AI)
- Microsoft Azure Cognitive Services
- Hugging Face API
- Cohere
- Anthropic’s Claude
- Stability AI
- IBM Watson
- AssemblyAI
- DeepL
## Running Tests

The library includes tests to verify that everything works as expected. To run them:

1. Install dependencies:

   ```bash
   npm install
   ```

2. Run the tests with Jest:

   ```bash
   npm test
   ```

This executes the test suite in the `test/` directory, ensuring the library's functionality is intact.
## Contributing

We welcome contributions to the `velosify-ai` library! To contribute:
- Fork the repository.
- Create a new branch for your feature or fix.
- Make your changes.
- Run the tests to ensure everything works.
- Submit a pull request with a description of your changes.
## License

This project is licensed under the MIT License. See the `LICENSE` file for details.
## Contact

For questions, feature requests, or other inquiries, feel free to contact me:
- Pedro Veloso ([email protected])