Llama 3.2-Vision Implementation #1160

Open. mktexan wants to merge 9 commits into Skyvern-AI:main from mktexan:Llama (base: main).
Commits (9)
- 36403e3 (mktexan): Adding in support for Llama 3.2
- aecaa23 (mktexan): Data return instruction updates for Ollama. Added function for markdo…
- 6c646b5 (mktexan): Provided more explicit instructions to Llama on its expected output.
- 35d94e0 (mktexan): Testing llama specific instructions.
- 44a04cd (mktexan): Refactor of prompts. Temporary disabling of skyvern prompts for Llama…
- 4219ecf (mktexan): Further testing of custom prompts for llama.
- 4b3ee6b (mktexan): Switched LLM server. Fine-tuned extract-actions for llama.
- d14dd20 (mktexan): Llama extract-action outputting JSON response.
- ba837f3 (mktexan): Added utils function to compensate for Llama conversational responses…
Files changed

Prompt engine initialization (combined diff from all commits):

```diff
@@ -1,4 +1,5 @@
 from skyvern.forge.sdk.prompting import PromptEngine
 
 # Initialize the prompt engine
-prompt_engine = PromptEngine("skyvern")
+prompt_engine = PromptEngine("ollama")
+prompt_engine_llama = PromptEngine("ollama")
```
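
A hedged usage sketch (not part of the diff) of how the new Ollama prompt directory might be exercised. It assumes `PromptEngine` exposes a `load_prompt(template_name, **kwargs)` method that renders `skyvern/forge/prompts/<engine>/<template_name>.j2`; that method name, and all of the example values, are assumptions for illustration only.

```python
from skyvern.forge.sdk.prompting import PromptEngine

# Point the engine at the new ollama prompt directory added in this PR.
prompt_engine_llama = PromptEngine("ollama")

# Render the new template with the variables it references
# (navigation_goal, navigation_payload, queries_and_answers).
# load_prompt() is an assumed API; values are made up for illustration.
rendered_prompt = prompt_engine_llama.load_prompt(
    "answer-user-detail-questions",
    navigation_goal="Book a one-way flight from SFO to JFK",
    navigation_payload={"first_name": "Ada", "last_name": "Lovelace"},
    queries_and_answers={"question_1": "What is the traveler's first name?"},
)
print(rendered_prompt)
```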
skyvern/forge/prompts/ollama/answer-user-detail-questions.j2 (new file, 45 additions, 0 deletions):

````jinja
You are a JSON API endpoint that answers questions based on user details and goals. API endpoints ONLY return data - no explanations allowed.

Purpose:
- Answer user questions based on provided information
- Use exact information from user details
- Keep answers direct and concise
- Fill in answers as JSON key-value pairs

Input data:
User's goal: {{ navigation_goal }}
User's details: {{ navigation_payload }}
User's questions: {{ queries_and_answers }}

Instructions for answering:
1. Read each question carefully
2. Find relevant information in user's goal and details
3. Provide only the exact information needed
4. Include answers in the JSON response
5. Keep answers direct - no explanations
6. Use precise values from provided details

CRITICAL FORMATTING RULES:
1. Start response with { and end with }
2. NO text before or after JSON
3. NO markdown formatting or code blocks
4. NO explanations, notes, or comments
5. NO additional formatting or whitespace
6. Response must be pure JSON only

Response format (replace with actual answers):
{
  "question_1": "",
  "question_2": "",
  "question_3": ""
}

AUTOMATIC FAILURE TRIGGERS:
- Text before the opening {
- Text after the closing }
- Explanations or markdown
- Notes or comments
- Code blocks or ```
- Any content outside JSON structure

These answers will be used to fill out information on a webpage automatically. Invalid format will cause system errors.
````
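
Commit ba837f3 mentions a utility that compensates for Llama's conversational responses, which pairs naturally with the strict JSON-only contract above. A minimal sketch of that idea, assuming the goal is to pull the first JSON object out of a reply that may still be wrapped in prose or code fences; the function name and logic here are illustrative, not the PR's actual implementation.

```python
import json
import re


def extract_json_object(response: str) -> dict:
    """Best-effort extraction of a JSON object from a model response.

    Strips markdown code fences, then parses the substring between the
    first '{' and the last '}' so that leading or trailing chatter from
    a conversational model does not break json.loads().
    """
    # Drop ```json ... ``` style fences if the model added them anyway.
    cleaned = re.sub(r"```(?:json)?", "", response)
    start = cleaned.find("{")
    end = cleaned.rfind("}")
    if start == -1 or end == -1 or end < start:
        raise ValueError("No JSON object found in model response")
    return json.loads(cleaned[start : end + 1])


# Example: a conversational reply that still contains the required JSON.
raw = (
    "Sure! Here is the data you asked for:\n"
    '{"question_1": "Ada", "question_2": "Lovelace"}\n'
    "Let me know if you need anything else."
)
print(extract_json_object(raw))  # {'question_1': 'Ada', 'question_2': 'Lovelace'}
```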
Review comment: The new `llama_handler` function appears to duplicate existing functionality in `skyvern/forge/sdk/api/llm/llama_handler.py`. Consider reusing or extending the existing function instead of adding a new one.
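
If the duplication the reviewer flags is real, the usual fix is to import and call the existing helper rather than redefining it. A hypothetical sketch only: the actual signature of `llama_handler` is not visible in this view, so the raw-text-in, parsed-dict-out shape below is an assumption.

```python
# Hypothetical illustration of the reviewer's suggestion: reuse the existing
# helper instead of adding a second llama_handler. The import path comes from
# the review comment; the signature shown here is assumed for illustration.
from skyvern.forge.sdk.api.llm.llama_handler import llama_handler


def handle_llama_response(raw_response: str) -> dict:
    # Delegate to the existing implementation rather than duplicating it.
    return llama_handler(raw_response)
```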