Commit: init
tomatyss committed Mar 11, 2024
1 parent b4eb252 commit c38f1bc
Showing 8 changed files with 520 additions and 78 deletions.
README.md (53 changes: 15 additions & 38 deletions)
@@ -1,52 +1,29 @@
# Simple Echo Connector
# Prompt Mixer Cohere Connector

This repository contains a simplified connector that uses an echo function to simulate responses from a chat model. It's designed to mimic the behavior of more complex connectors (like those interfacing with models such as Ollama), but without the need for external API calls. This can be particularly useful for testing, development, or educational purposes.
This connector lets you access the Cohere API from within Prompt Mixer.

## Features

- Echo function that simulates chat responses
- Mapping of chat completions to a standardized response format
- Simple integration into existing TypeScript projects
- Configurable to simulate different model types
- Generate text using Cohere's language models
- Pass prompts and settings to the Cohere API easily
- View generated output directly in Prompt Mixer

## Installation

Before installing this connector, ensure you have [Node.js](https://nodejs.org/) installed on your system.
To install the Cohere Connector:

1. **Clone the repository**
1. Open Prompt Mixer and navigate to Connectors
2. Select the Cohere Connector from the list of available connectors
3. Click on "Install Plugin"
4. Go to Connectors > Installed > Cohere to enter your Cohere API key

```bash
git clone https://github.com/PromptMixerDev/prompt-mixer-sample-connector.git
cd prompt-mixer-sample-connector
```
## Usage
Once you have installed the connector and configured your API key, you can start using Cohere's language models through the prompt editor in Prompt Mixer.

2. **Install dependencies**

```bash
npm install
```

This will install all necessary dependencies, including TypeScript and any types required for development.

## Configuration

The `config` object can be adjusted to suit your needs. It's located in `config.ts`. By default, it might include placeholders for various configurations. Ensure you review and update it as necessary for your project.
Please refer to the [Cohere API documentation](https://docs.cohere.com/reference/about) for details on supported models, prompt formatting, and available parameters.

## Contributing

Contributions are welcome! If you have improvements or bug fixes, please follow these steps:

1. Fork the repository
2. Create your feature branch (`git checkout -b feature/AmazingFeature`)
3. Commit your changes (`git commit -am 'Add some AmazingFeature'`)
4. Push to the branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request
Contributions to improve the Cohere Connector are welcome! Feel free to submit pull requests or open issues on the GitHub repository if you encounter any problems or have suggestions for enhancements.

## License

This project is licensed under the MIT License

## Acknowledgments

- This project is inspired by the need for simple, mock connectors in development environments.
- Thanks to all contributors and users for their interest and feedback.
This connector is released under the MIT License.
config.d.ts (25 changes: 25 additions & 0 deletions)
@@ -0,0 +1,25 @@
export interface ModelConfig {
connectorName: string;
models: string[];
properties: Property[];
settings: Setting[];
iconBase64: string;
description?: string;
author?: string;
}

export interface Property {
id: string;
name: string;
value: string | number | boolean | string[];
type: 'string' | 'number' | 'boolean' | 'array';
}

export interface Setting {
id: string;
name: string;
value: string;
type: 'string';
}

export declare const config: ModelConfig;
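
For reference, a minimal object satisfying this declaration might look like the sketch below; the values are illustrative placeholders rather than the connector's shipped defaults (those are in config.js, shown next).

```typescript
// Hypothetical minimal ModelConfig — placeholder values for illustration only.
import type { ModelConfig } from './config';

export const exampleConfig: ModelConfig = {
  connectorName: 'Cohere',
  models: ['command-r'],
  properties: [
    { id: 'temperature', name: 'Temperature', value: 0.3, type: 'number' },
  ],
  settings: [
    { id: 'COHERE_API_KEY', name: 'API Key', value: '', type: 'string' },
  ],
  iconBase64: '', // base64-encoded SVG icon shown in the connector list
};
```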
config.js (61 changes: 56 additions & 5 deletions)
@@ -1,10 +1,61 @@
export const config = {
connectorName: 'Sample Connector',
models: ['sample-model'],
settings: [],
connectorName: 'Cohere',
models: [
'command-light',
'command-light-nightly',
'command',
'command-nightly',
'command-r',
],
settings: [
{
id: 'COHERE_API_KEY',
name: 'API Key',
value: '',
type: 'string',
},
],
properties: [
{
id: 'max_tokens',
name: 'Max Tokens',
value: 4096,
type: 'number',
},
{
id: 'temperature',
name: 'Temperature',
value: 0.3,
type: 'number',
},
{
id: 'k',
name: 'K',
value: 0,
type: 'number',
},
{
id: 'p',
name: 'P',
value: 0.75,
type: 'number',
},
{
id: 'frequency_penalty',
name: 'Frequency Penalty',
value: 0.75,
type: 'number',
},
{
id: 'presence_penalty',
name: 'Presence Penalty',
value: 0.0,
type: 'number',
},
],
description:
'This is a sample connector to demonstrate the Prompt Mixer connector API. It does not connect to any external service.',
author: 'Prompt Mixer team',
author: 'Prompt Mixer',
iconBase64:
'data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMTYiIGhlaWdodD0iMTYiIHZpZXdCb3g9IjAgMCAxNiAxNiIgZmlsbD0ibm9uZSIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj4KPHBhdGggZD0iTTQuNjY2NjcgMy4zMzMzNUM0LjY2NjY3IDEuODYwNTkgNS44NjA1NyAwLjY2NjY4NyA3LjMzMzMzIDAuNjY2Njg3QzguODA2MDcgMC42NjY2ODcgMTAgMS44NjA1OSAxMCAzLjMzMzM1SDEyQzEyLjM2ODIgMy4zMzMzNSAxMi42NjY3IDMuNjMxODMgMTIuNjY2NyA0LjAwMDAyVjYuMDAwMDJDMTQuMTM5NCA2LjAwMDAyIDE1LjMzMzMgNy4xOTM5NSAxNS4zMzMzIDguNjY2NjlDMTUuMzMzMyAxMC4xMzk0IDE0LjEzOTQgMTEuMzMzNCAxMi42NjY3IDExLjMzMzRWMTMuMzMzNEMxMi42NjY3IDEzLjcwMTYgMTIuMzY4MiAxNCAxMiAxNEgyLjY2NjY3QzIuMjk4NDggMTQgMiAxMy43MDE2IDIgMTMuMzMzNFY0LjAwMDAyQzIgMy42MzE4MyAyLjI5ODQ4IDMuMzMzMzUgMi42NjY2NyAzLjMzMzM1SDQuNjY2NjdaTTcuMzMzMzMgMi4wMDAwMkM2LjU5Njk1IDIuMDAwMDIgNiAyLjU5Njk3IDYgMy4zMzMzNUM2IDMuNDkwMzggNi4wMjY4NyAzLjYzOTcgNi4wNzU3IDMuNzc3ODVDNi4xNDc4MSAzLjk4MTkgNi4xMTY0MSA0LjIwODI1IDUuOTkxNDUgNC4zODQ5NUM1Ljg2NjQ5IDQuNTYxNjQgNS42NjM1NSA0LjY2NjY5IDUuNDQ3MTQgNC42NjY2OUgzLjMzMzMzVjEyLjY2NjdIMTEuMzMzM1YxMC41NTI5QzExLjMzMzMgMTAuMzM2NSAxMS40Mzg0IDEwLjEzMzYgMTEuNjE1MSAxMC4wMDg2QzExLjc5MTggOS44ODM2MiAxMi4wMTgxIDkuODUyMjIgMTIuMjIyMSA5LjkyNDM1QzEyLjM2MDMgOS45NzMxNSAxMi41MDk3IDEwIDEyLjY2NjcgMTBDMTMuNDAzMSAxMCAxNCA5LjQwMzA5IDE0IDguNjY2NjlDMTQgNy45MzAyOSAxMy40MDMxIDcuMzMzMzUgMTIuNjY2NyA3LjMzMzM1QzEyLjUwOTcgNy4zMzMzNSAxMi4zNjAzIDcuMzYwMjIgMTIuMjIyMSA3LjQwOTAyQzEyLjAxODEgNy40ODExNSAxMS43OTE4IDcuNDQ5NzUgMTEuNjE1MSA3LjMyNDgyQzExLjQzODQgNy4xOTk4MiAxMS4zMzMzIDYuOTk2ODkgMTEuMzMzMyA2Ljc4MDQ5VjQuNjY2NjlIOS4yMTk1M0M5LjAwMzEzIDQuNjY2NjkgOC44MDAyIDQuNTYxNjQgOC42NzUyIDQuMzg0OTVDOC41NTAyNyA0LjIwODI1IDguNTE4ODcgMy45ODE5IDguNTkxIDMuNzc3ODVDOC42Mzk4IDMuNjM5NyA4LjY2NjY3IDMuNDkwMzkgOC42NjY2NyAzLjMzMzM1QzguNjY2NjcgMi41OTY5NyA4LjA2OTczIDIuMDAwMDIgNy4zMzMzMyAyLjAwMDAyWiIgZmlsbD0iIzZGNzM3QSIvPgo8L3N2Zz4K',
'data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMTYiIGhlaWdodD0iMTYiIHZpZXdCb3g9IjAgMCAxNiAxNiIgZmlsbD0ibm9uZSIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj4KPG1hc2sgaWQ9Im1hc2swXzMyMF80ODc4OSIgc3R5bGU9Im1hc2stdHlwZTpsdW1pbmFuY2UiIG1hc2tVbml0cz0idXNlclNwYWNlT25Vc2UiIHg9IjEiIHk9IjEiIHdpZHRoPSIxNCIgaGVpZ2h0PSIxNCI+CjxwYXRoIGQ9Ik0xNC40IDEuNTk5OThIMS42MDAwMVYxNC40SDE0LjRWMS41OTk5OFoiIGZpbGw9IndoaXRlIi8+CjwvbWFzaz4KPGcgbWFzaz0idXJsKCNtYXNrMF8zMjBfNDg3ODkpIj4KPHBhdGggZmlsbC1ydWxlPSJldmVub2RkIiBjbGlwLXJ1bGU9ImV2ZW5vZGQiIGQ9Ik01Ljc0NyA5LjIyMTE3QzYuMDkxNTMgOS4yMjExNyA2Ljc3Njg1IDkuMjAyMjcgNy43MjQxNCA4LjgxMjI1QzguODI4MDQgOC4zNTc3NiAxMS4wMjQzIDcuNTMyNzIgMTIuNjA4NiA2LjY4NTI3QzEzLjcxNjYgNi4wOTI1MyAxNC4yMDI0IDUuMzA4NjEgMTQuMjAyNCA0LjI1MjkyQzE0LjIwMjQgMi43ODc3NSAxMy4wMTQ2IDEuNTk5OTggMTEuNTQ5NCAxLjU5OTk4SDUuNDEwNkMzLjMwNjA3IDEuNTk5OTggMS42MDAwMSAzLjMwNjAzIDEuNjAwMDEgNS40MTA1N0MxLjYwMDAxIDcuNTE1MTEgMy4xOTczOCA5LjIyMTE3IDUuNzQ3IDkuMjIxMTdaIiBmaWxsPSIjNkY3MzdBIi8+CjxwYXRoIGZpbGwtcnVsZT0iZXZlbm9kZCIgY2xpcC1ydWxlPSJldmVub2RkIiBkPSJNNi43ODU5NSAxMS44NDUzQzYuNzg1OTUgMTAuODEzNyA3LjQwNyA5Ljg4MzU1IDguMzU5OCA5LjQ4ODFMMTAuMjkzMSA4LjY4NTc3QzEyLjI0ODUgNy44NzQyMSAxNC40MDA5IDkuMzExMjQgMTQuNDAwOSAxMS40Mjg1QzE0LjQwMDkgMTMuMDY4OCAxMy4wNzA5IDE0LjM5ODQgMTEuNDMwNSAxNC4zOThMOS4zMzc0MyAxNC4zOTc0QzcuOTI4MTYgMTQuMzk3MSA2Ljc4NTk1IDEzLjI1NDUgNi43ODU5NSAxMS44NDUzWiIgZmlsbD0iIzZGNzM3QSIvPgo8cGF0aCBkPSJNMy43OTY3IDkuNzIzNTFDMi41ODM1MiA5LjcyMzUxIDEuNjAwMDEgMTAuNzA2OSAxLjYwMDAxIDExLjkyMDFWMTIuMjA0N0MxLjYwMDAxIDEzLjQxNzggMi41ODM0OCAxNC40MDEzIDMuNzk2NjcgMTQuNDAxM0M1LjAwOTg1IDE0LjQwMTMgNS45OTMzNyAxMy40MTc4IDUuOTkzMzcgMTIuMjA0N1YxMS45MjAxQzUuOTkzMzcgMTAuNzA2OSA1LjAwOTg5IDkuNzIzNTEgMy43OTY3IDkuNzIzNTFaIiBmaWxsPSIjNkY3MzdBIi8+CjwvZz4KPC9zdmc+Cg==',
};
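
Assuming Prompt Mixer collects each property's current value under its `id` (an assumption about the host app, not something shown in this commit), the defaults above would reach the connector's `main` function as an object roughly like this:

```typescript
// Hypothetical shape of the `properties` argument assembled from the defaults above.
const properties: Record<string, unknown> = {
  max_tokens: 4096,
  temperature: 0.3,
  k: 0,
  p: 0.75,
  frequency_penalty: 0.75,
  presence_penalty: 0.0,
};
```

In main.ts (below), these values are spread directly into the `cohere.chatStream` call.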
main.ts (48 changes: 30 additions & 18 deletions)
@@ -1,8 +1,11 @@
import { config } from './config';
import { CohereClient } from 'cohere-ai';

const API_KEY = 'COHERE_API_KEY';

interface Message {
role: string;
content: string;
role: 'USER' | 'CHATBOT';
message: string;
}

interface Completion {
@@ -24,40 +27,49 @@ const mapToResponse = (outputs: ChatCompletion[]): ConnectorResponse => {
return {
Completions: outputs.map((output) => ({
Content: output.output,
TokenUsage: undefined, // Token usage is not provided in this simple echo connector
TokenUsage: undefined, // Token usage is not mapped by this connector
})),
ModelType: outputs[0].stats.model,
};
};

// This is a mock function that simulates the behavior of an echo chat model
async function echoChatModel(model: string, message: string): Promise<string> {
// Simply return the message as is, simulating an echo
return message;
}

async function main(
model: string,
prompts: string[],
properties: Record<string, unknown>,
settings: Record<string, unknown>,
): Promise<ConnectorResponse> {
const messageHistory: Message[] = [
{ role: 'system', content: 'You are a helpful assistant.' },
];
const cohere = new CohereClient({
token: settings?.[API_KEY] as string,
});

const { ...restProperties } = properties;

const messageHistory: Message[] = [];

const outputs: ChatCompletion[] = [];

try {
for (const prompt of prompts) {
messageHistory.push({ role: 'user', content: prompt });
messageHistory.push({ role: 'USER', message: prompt });

// Using the echoChatModel instead of an external API
const assistantResponse = await echoChatModel(model, prompt);
const response = await cohere.chatStream({
chatHistory: messageHistory,
message: prompt,
model,
...restProperties,
});

messageHistory.push({ role: 'assistant', content: assistantResponse });
let assistantResponse = '';
for await (const message of response) {
if (message.eventType === 'text-generation') {
assistantResponse += message.text;
}
}

messageHistory.push({ role: 'CHATBOT', message: assistantResponse });
outputs.push({ output: assistantResponse, stats: { model } });

console.log(`Echo response to prompt: ${prompt}`, assistantResponse);
console.log(`Cohere response to prompt: ${prompt}`, assistantResponse);
}

return mapToResponse(outputs);
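
For illustration only, a standalone call to `main` might look like the sketch below. In practice Prompt Mixer supplies these arguments itself; the model, prompt, and API key values are placeholders, and direct invocation assumes `main` is exported from the module.

```typescript
// Hypothetical direct invocation of the connector entry point (placeholders throughout).
import { main } from './main'; // assumes main is exported

const response = await main(
  'command-r',                                 // one of the models listed in config.js
  ['Write a haiku about connectors.'],         // prompts from the prompt editor
  { temperature: 0.3, max_tokens: 4096 },      // property values defined in config.js
  { COHERE_API_KEY: '<your-cohere-api-key>' }, // setting defined in config.js
);

console.log(response.Completions[0]?.Content); // generated text
console.log(response.ModelType);               // model reported by the connector
```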
manifest.json (8 changes: 4 additions & 4 deletions)
@@ -1,11 +1,11 @@
{
"id": "prompt-mixer-ollama-connector",
"name": "Ollama Connector",
"id": "prompt-mixer-cohere-connector",
"name": "Cohere Connector",
"version": "1.0.0",
"minAppVersion": "0.1.0",
"description": "Connector for model installed by Ollama",
"description": "Cohere Prompt Mixer connector to run Cohere's language models",
"author": "Prompt Mixer",
"authorUrl": "",
"authorUrl": "https://promptmixer.dev",
"fundingUrl": "",
"isDesktopOnly": true
}