
Custom system prompt? #171

Open
ionflow opened this issue Apr 26, 2024 · 4 comments

Comments


ionflow commented Apr 26, 2024

Where is this system prompt coming from and can it be customized?

Given a user prompt, you will return fully valid JSON based on the following description and schema.
You will return no other prose. You will take into account any descriptions or required parameters within the schema
and return a valid and fully escaped JSON object that matches the schema and those instructions.

description: 
json schema:

andraz commented Apr 28, 2024

The system prompt is simply the first message in the chat, sent with the `system` role:

const UserSchema = z.object({
  // Description will be used in the prompt
  age: z.number().min(0).max(120).int().describe('The age of the user'),
  firstName: z
    .string()
    .describe(
      'The first name of the user, lowercase with capital first letter'
    ),
  surname: z
    .string()
    .describe('The surname of the user, lowercase with capital first letter'),
  sex: z
    .enum(['M', 'F'])
    .describe('The sex of the user, guess if not provided'),
})

// User will be of type z.infer<typeof UserSchema>
const user = await client.chat.completions.create({
  messages: [
    {
      role: 'system',
      content:
        'You are a world class extractor. You always respond in JSON. Current date is ' +
        new Date().toISOString(),
    },
    {
      role: 'user',
      content: 'John Doe born in 1988',
    },
  ],
  model: 'llama3-70b-8192',
  temperature: 0.0,
  max_retries: 3,
  response_model: { schema: UserSchema, name: 'UserSchema' },
})

ionflow (Author) commented Apr 29, 2024

The system prompt I mentioned is added to the messages array before your "You are a world class extractor..." message, so with your example there are two system prompts stacked.
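To make the stacking concrete, here is a minimal sketch of what the final messages array looks like in this scenario. The variable names and the truncated prompt text are illustrative assumptions, not the actual internals of instructor-js:

```typescript
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

// The MD_JSON-style system prompt that the library prepends internally
// (content abbreviated here).
const injectedSystemPrompt: ChatMessage = {
  role: 'system',
  content:
    'Given a user prompt, you will return fully valid JSON based on the following description and schema. ...',
};

// The system prompt supplied by the caller in the example above.
const userSuppliedSystemPrompt: ChatMessage = {
  role: 'system',
  content: 'You are a world class extractor. You always respond in JSON.',
};

// What the model actually receives: the injected prompt comes first,
// so both system prompts end up stacked.
const finalMessages: ChatMessage[] = [
  injectedSystemPrompt,
  userSuppliedSystemPrompt,
  { role: 'user', content: 'John Doe born in 1988' },
];

console.log(finalMessages.filter((m) => m.role === 'system').length); // 2
```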

roodboi (Collaborator) commented May 2, 2024

This system prompt is only used in "MD_JSON" mode — it gets added in the params resolver we use here:
https://github.com/hack-dance/island-ai/blob/main/public-packages/zod-stream/src/oai/params.ts#L77

I haven't had any requests to customize it, but it would be relatively straightforward to add an option. I've mostly tried to avoid adding too many options, but maybe a straight-up pass-through to zod-stream would work?
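A pass-through option could be sketched roughly like this. To be clear, this is a hypothetical shape, not the actual zod-stream API: the option name `systemPromptOverride`, the `resolveMessages` helper, and the default prompt text are all assumptions for illustration:

```typescript
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

// Stand-in for the built-in MD_JSON prompt (content abbreviated).
const DEFAULT_MD_JSON_PROMPT =
  'Given a user prompt, you will return fully valid JSON based on the following description and schema. ...';

interface ResolveParamsOptions {
  messages: ChatMessage[];
  // Hypothetical pass-through: if set, it replaces the injected prompt.
  systemPromptOverride?: string;
}

// Prepend either the caller's override or the default MD_JSON prompt.
function resolveMessages({
  messages,
  systemPromptOverride,
}: ResolveParamsOptions): ChatMessage[] {
  const systemContent = systemPromptOverride ?? DEFAULT_MD_JSON_PROMPT;
  return [{ role: 'system', content: systemContent }, ...messages];
}
```

Callers who never set the option would see no behavior change, which keeps the option surface small.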

swyxio (Contributor) commented Sep 12, 2024

This issue is now urgent: the o1 models error out when a system prompt is given:

BadRequestError: 400 litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}

and you seem to inject it in EVERY mode, not just MD_JSON
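Until the library handles this, one possible client-side workaround is to downgrade `system` messages to `user` before sending to an o1 model. This is a hedged sketch, not part of instructor-js; the function name is made up, and whether a downgraded prompt behaves identically to a true system prompt is model-dependent:

```typescript
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

// Rewrite any 'system' messages as 'user' messages so the request is not
// rejected by models that refuse the 'system' role. Content is preserved.
function downgradeSystemMessages(messages: ChatMessage[]): ChatMessage[] {
  return messages.map((m) =>
    m.role === 'system' ? { ...m, role: 'user' as const } : m
  );
}
```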
