Commit

Merge branch 'microsoft:main' into main
ShobhitVishnoi30 authored Jan 10, 2024
2 parents 02c46c3 + b548e55 commit fc8fb78
Showing 124 changed files with 14,777 additions and 5,164 deletions.
2 changes: 1 addition & 1 deletion .devcontainer/Dockerfile
@@ -3,7 +3,7 @@
# Licensed under the MIT License. See LICENSE file in the project root for license information.
#-------------------------------------------------------------------------------------------------------------

FROM mcr.microsoft.com/vscode/devcontainers/python:0-3.10
FROM mcr.microsoft.com/vscode/devcontainers/python:3.10

#
# Update the OS and maybe install packages
6 changes: 2 additions & 4 deletions .github/workflows/build.yml
@@ -41,17 +41,15 @@ jobs:
pip install -e .
python -c "import autogen"
pip install -e. pytest mock
pip uninstall -y openai
- name: Test with pytest
if: matrix.python-version != '3.10'
run: |
pytest test
pytest test --skip-openai
- name: Coverage
if: matrix.python-version == '3.10'
run: |
pip install -e .[test]
pip uninstall -y openai
coverage run -a -m pytest test --ignore=test/agentchat/contrib
coverage run -a -m pytest test --ignore=test/agentchat/contrib --skip-openai
coverage xml
- name: Upload coverage to Codecov
if: matrix.python-version == '3.10'
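> Note on the change above: the workflow now runs `pytest test --skip-openai` instead of uninstalling `openai` before the test step, so the suite can run without live API access. The flag itself is defined in the repository's test configuration, which this diff does not show; the snippet below is only a sketch of how such an option is typically registered in a `conftest.py`, not the project's actual implementation.

```python
# Hypothetical conftest.py sketch -- illustrative only, not taken from this commit.
import pytest


def pytest_addoption(parser):
    # Register the --skip-openai flag so it can be passed on the pytest command line.
    parser.addoption(
        "--skip-openai",
        action="store_true",
        default=False,
        help="Skip tests that require access to the OpenAI API.",
    )


def pytest_collection_modifyitems(config, items):
    # When the flag is set, mark OpenAI-dependent tests as skipped.
    if not config.getoption("--skip-openai"):
        return
    skip_marker = pytest.mark.skip(reason="--skip-openai was given")
    for item in items:
        if "openai" in item.keywords:
            item.add_marker(skip_marker)
```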
3 changes: 3 additions & 0 deletions .github/workflows/contrib-openai.yml
@@ -200,6 +200,9 @@ jobs:
pip install -e .
python -c "import autogen"
pip install coverage pytest-asyncio
- name: Install packages for test when needed
run: |
pip install -e .[autobuild]
- name: Coverage
env:
OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
27 changes: 11 additions & 16 deletions .github/workflows/contrib-tests.yml
@@ -45,15 +45,14 @@ jobs:
- name: Install packages and dependencies for RetrieveChat
run: |
pip install -e .[retrievechat]
pip uninstall -y openai
- name: Test RetrieveChat
run: |
pytest test/test_retrieve_utils.py test/agentchat/contrib/test_retrievechat.py test/agentchat/contrib/test_qdrant_retrievechat.py
pytest test/test_retrieve_utils.py test/agentchat/contrib/test_retrievechat.py test/agentchat/contrib/test_qdrant_retrievechat.py --skip-openai
- name: Coverage
if: matrix.python-version == '3.10'
run: |
pip install coverage>=5.3
coverage run -a -m pytest test/test_retrieve_utils.py test/agentchat/contrib
coverage run -a -m pytest test/test_retrieve_utils.py test/agentchat/contrib/test_retrievechat.py test/agentchat/contrib/test_qdrant_retrievechat.py --skip-openai
coverage xml
- name: Upload coverage to Codecov
if: matrix.python-version == '3.10'
@@ -82,16 +81,15 @@ jobs:
- name: Install packages and dependencies for Compression
run: |
pip install -e .
pip uninstall -y openai
- name: Test Compression
if: matrix.python-version != '3.10' # diversify the python versions
run: |
pytest test/agentchat/contrib/test_compressible_agent.py
pytest test/agentchat/contrib/test_compressible_agent.py --skip-openai
- name: Coverage
if: matrix.python-version == '3.10'
run: |
pip install coverage>=5.3
coverage run -a -m pytest test/agentchat/contrib/test_compressible_agent.py
coverage run -a -m pytest test/agentchat/contrib/test_compressible_agent.py --skip-openai
coverage xml
- name: Upload coverage to Codecov
if: matrix.python-version == '3.10'
@@ -120,16 +118,15 @@ jobs:
- name: Install packages and dependencies for GPTAssistantAgent
run: |
pip install -e .
pip uninstall -y openai
- name: Test GPTAssistantAgent
if: matrix.python-version != '3.11' # diversify the python versions
run: |
pytest test/agentchat/contrib/test_gpt_assistant.py
pytest test/agentchat/contrib/test_gpt_assistant.py --skip-openai
- name: Coverage
if: matrix.python-version == '3.11'
run: |
pip install coverage>=5.3
coverage run -a -m pytest test/agentchat/contrib/test_gpt_assistant.py
coverage run -a -m pytest test/agentchat/contrib/test_gpt_assistant.py --skip-openai
coverage xml
- name: Upload coverage to Codecov
if: matrix.python-version == '3.11'
@@ -155,19 +152,18 @@
run: |
python -m pip install --upgrade pip wheel
pip install pytest
- name: Install packages and dependencies for TeachableAgent
- name: Install packages and dependencies for Teachability
run: |
pip install -e .[teachable]
pip uninstall -y openai
- name: Test TeachableAgent
if: matrix.python-version != '3.9' # diversify the python versions
run: |
pytest test/agentchat/contrib/test_teachable_agent.py
pytest test/agentchat/contrib/test_teachable_agent.py --skip-openai
- name: Coverage
if: matrix.python-version == '3.9'
run: |
pip install coverage>=5.3
coverage run -a -m pytest test/agentchat/contrib/test_teachable_agent.py
coverage run -a -m pytest test/agentchat/contrib/test_teachable_agent.py --skip-openai
coverage xml
- name: Upload coverage to Codecov
if: matrix.python-version == '3.9'
@@ -196,15 +192,14 @@ jobs:
- name: Install packages and dependencies for LMM
run: |
pip install -e .[lmm]
pip uninstall -y openai
- name: Test LMM and LLaVA
run: |
pytest test/agentchat/contrib/test_img_utils.py test/agentchat/contrib/test_lmm.py test/agentchat/contrib/test_llava.py
pytest test/agentchat/contrib/test_img_utils.py test/agentchat/contrib/test_lmm.py test/agentchat/contrib/test_llava.py --skip-openai
- name: Coverage
if: matrix.python-version == '3.10'
run: |
pip install coverage>=5.3
coverage run -a -m pytest test/agentchat/contrib/test_img_utils.py test/agentchat/contrib/test_lmm.py test/agentchat/contrib/test_llava.py
coverage run -a -m pytest test/agentchat/contrib/test_img_utils.py test/agentchat/contrib/test_lmm.py test/agentchat/contrib/test_llava.py --skip-openai
coverage xml
- name: Upload coverage to Codecov
if: matrix.python-version == '3.10'
37 changes: 37 additions & 0 deletions .github/workflows/dotnet-build.yml
@@ -0,0 +1,37 @@
# This workflow will build a .NET project
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-net

name: dotnet-ci

on:
pull_request:
branches: [ "main" ]
paths:
- 'dotnet/**'

concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-${{ github.head_ref }}
cancel-in-progress: ${{ github.ref != 'refs/heads/main' }}

permissions:
contents: read

jobs:
build:
name: CI
runs-on: ubuntu-latest
defaults:
run:
working-directory: dotnet
steps:
- uses: actions/checkout@v3
- name: Setup .NET
uses: actions/setup-dotnet@v3
with:
global-json-file: global.json
- name: Restore dependencies
run: dotnet restore
- name: Build
run: dotnet build --no-restore
- name: Unit Test
run: dotnet test --no-build --verbosity normal
54 changes: 54 additions & 0 deletions .github/workflows/dotnet-run-openai-test-and-notebooks.yml
@@ -0,0 +1,54 @@
name: run-openai-test-and-notebooks

on:
pull_request_target:
branches: [ "main" ]
paths:
- 'dotnet/**'
env:
BUILD_CONFIGURATION: Release # set this to the appropriate build configuration

jobs:
build:
environment: dotnet
name: run-openai-test-and-notebooks
runs-on: ubuntu-latest
defaults:
run:
working-directory: dotnet
steps:
- name: Checkout
uses: actions/checkout@v3
with:
ref: ${{ github.event.pull_request.head.sha }}
- name: Setup .NET
uses: actions/setup-dotnet@v3
with:
global-json-file: dotnet/global.json

- name: Restore dependencies
run: dotnet restore
- name: Restore tool
run: dotnet tool restore
- name: Build
run: dotnet build --no-restore -p:VersionSuffix=$GITHUB_RUN_ID --configuration '${{ env.BUILD_CONFIGURATION }}'
- name: Pack
run: dotnet pack --no-restore -p:VersionSuffix=$GITHUB_RUN_ID --no-build --configuration '${{ env.BUILD_CONFIGURATION }}' --output ./artifacts
- name: run all tests
run: dotnet test --no-restore --no-build --configuration '${{ env.BUILD_CONFIGURATION }}'
env:
AZURE_OPENAI_API_KEY: ${{ secrets.AZURE_OPENAI_API_KEY }}
AZURE_OPENAI_ENDPOINT: ${{ secrets.AZURE_OPENAI_ENDPOINT }}
AZURE_GPT_35_MODEL_ID: ${{ secrets.AZURE_GPT_35_MODEL_ID }}

- name: Add local feed
run: dotnet nuget add source --name local artifacts --configfile NuGet.config
- name: Perform a Pester test from the .tools/run_all_notebooks.ps1
shell: pwsh
run: |
Invoke-Pester .tools/run_all_notebook.ps1 -Passthru
env:
AZURE_OPENAI_API_KEY: ${{ secrets.AZURE_OPENAI_API_KEY }}
AZURE_OPENAI_ENDPOINT: ${{ secrets.AZURE_OPENAI_ENDPOINT }}
AZURE_GPT_35_MODEL_ID: ${{ secrets.AZURE_GPT_35_MODEL_ID }}

8 changes: 4 additions & 4 deletions .github/workflows/pre-commit.yml
@@ -1,10 +1,10 @@
name: Code formatting

# see: https://help.github.com/en/actions/reference/events-that-trigger-workflows
on: # Trigger the workflow on push or pull request, but only for the main branch
push:
branches: [main]
pull_request: {}
on: # Trigger the workflow on pull request or merge
pull_request:
merge_group:
types: [checks_requested]

defaults:
run:
6 changes: 5 additions & 1 deletion .gitignore
@@ -165,9 +165,13 @@ key_aoai.txt
base_aoai.txt
wolfram.txt

# DB on disk for TeachableAgent
# DB on disk for Teachability
tmp/
test/my_tmp/*

# Storage for the AgentEval output
test/test_files/agenteval-in-out/out/

# Files created by tests
*tmp_code_*
test/agentchat/test_agent_scripts/*
20 changes: 19 additions & 1 deletion .pre-commit-config.yaml
@@ -1,6 +1,6 @@
default_language_version:
python: python3

exclude: 'dotnet'
ci:
autofix_prs: true
autoupdate_commit_msg: '[pre-commit.ci] pre-commit suggestions'
@@ -31,3 +31,21 @@ repos:
hooks:
- id: ruff
args: ["--fix"]
- repo: https://github.com/codespell-project/codespell
rev: v2.2.6
hooks:
- id: codespell
args: ["-L", "ans,linar,nam,"]
exclude: |
(?x)^(
pyproject.toml |
website/static/img/ag.svg |
website/yarn.lock |
notebook/.*
)$
- repo: https://github.com/nbQA-dev/nbQA
rev: 1.7.1
hooks:
- id: nbqa-ruff
args: ["--fix"]
- id: nbqa-black
6 changes: 4 additions & 2 deletions OAI_CONFIG_LIST_sample
@@ -1,5 +1,7 @@
// Please modify the content, remove these two lines of comment and rename this file to OAI_CONFIG_LIST to run the sample code.
// if using pyautogen v0.1.x with Azure OpenAI, please replace "base_url" with "api_base" (line 11 and line 18 below). Use "pip list" to check version of pyautogen installed.
// Please modify the content, remove these four lines of comment and rename this file to OAI_CONFIG_LIST to run the sample code.
// If using pyautogen v0.1.x with Azure OpenAI, please replace "base_url" with "api_base" (line 13 and line 20 below). Use "pip list" to check version of pyautogen installed.
//
// NOTE: This configuration lists GPT-4 as the default model, as this represents our current recommendation, and is known to work well with AutoGen. If you use a model other than GPT-4, you may need to revise various system prompts (especially if using weaker models like GPT-3.5-turbo). Moreover, if you use models other than those hosted by OpenAI or Azure, you may incur additional risks related to alignment and safety. Proceed with caution if updating this default.
[
{
"model": "gpt-4",
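> For context on how the sample above is consumed (not part of this diff): once the file is renamed to `OAI_CONFIG_LIST`, it is typically loaded with `autogen.config_list_from_json`. A minimal sketch, assuming the standard pyautogen v0.2 API and that the file sits in the working directory:

```python
# Minimal sketch (illustrative only): load the config list and keep only GPT-4
# entries, matching the note above that GPT-4 is the recommended default.
import autogen

config_list = autogen.config_list_from_json(
    "OAI_CONFIG_LIST",
    filter_dict={"model": ["gpt-4"]},  # drop weaker models unless prompts are adjusted
)
```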
35 changes: 17 additions & 18 deletions README.md
@@ -7,26 +7,29 @@


# AutoGen

[📚 Cite paper](#related-papers).
<!-- <p align="center">
<img src="https://github.com/microsoft/autogen/blob/main/website/static/img/flaml.svg" width=200>
<br>
</p> -->
:fire: Nov 24: pyautogen [v0.2](https://github.com/microsoft/autogen/releases/tag/v0.2.0) is released with many updates and new features compared to v0.1.1. It switches to using openai-python v1. Please read the [migration guide](https://microsoft.github.io/autogen/docs/Installation#python).

:fire: Nov 11: OpenAI's Assistants are available in AutoGen and interoperatable with other AutoGen agents! Checkout our [blogpost](https://microsoft.github.io/autogen/blog/2023/11/13/OAI-assistants) for details and examples.
:fire: Dec 31: [AutoGen: Enabling Next-Gen LLM Applications via Multi-Agent Conversation Framework](https://arxiv.org/abs/2308.08155) is selected by [TheSequence: My Five Favorite AI Papers of 2023](https://thesequence.substack.com/p/my-five-favorite-ai-papers-of-2023).

<!-- :fire: Nov 24: pyautogen [v0.2](https://github.com/microsoft/autogen/releases/tag/v0.2.0) is released with many updates and new features compared to v0.1.1. It switches to using openai-python v1. Please read the [migration guide](https://microsoft.github.io/autogen/docs/Installation#python). -->

<!-- :fire: Nov 11: OpenAI's Assistants are available in AutoGen and interoperatable with other AutoGen agents! Checkout our [blogpost](https://microsoft.github.io/autogen/blog/2023/11/13/OAI-assistants) for details and examples. -->

:fire: Nov 8: AutoGen is selected into [Open100: Top 100 Open Source achievements](https://www.benchcouncil.org/evaluation/opencs/annual.html) 35 days after spinoff.

:fire: Nov 6: AutoGen is mentioned by Satya Nadella in a [fireside chat](https://youtu.be/0pLBvgYtv6U) around 13:20.

:fire: Nov 1: AutoGen is the top trending repo on GitHub in October 2023.

:tada: Oct 03: AutoGen spins off from [FLAML](https://github.com/microsoft/FLAML) on Github and has a major paper update.
:tada: Oct 03: AutoGen spins off from FLAML on Github and has a major paper update (first version on Aug 16).

:tada: Aug 16: Paper about AutoGen on [arxiv](https://arxiv.org/abs/2308.08155). [📚 Cite paper](#related-papers).
<!-- :tada: Aug 16: Paper about AutoGen on [arxiv](https://arxiv.org/abs/2308.08155). -->

:tada: Mar 29: AutoGen is first created in [FLAML](https://github.com/microsoft/FLAML/pull/968).
:tada: Mar 29: AutoGen is first created in [FLAML](https://github.com/microsoft/FLAML).

<!--
:fire: FLAML is highlighted in OpenAI's [cookbook](https://github.com/openai/openai-cookbook#related-resources-from-around-the-web).
@@ -58,17 +61,13 @@ The easiest way to start playing is
2. Copy OAI_CONFIG_LIST_sample to ./notebook folder, name to OAI_CONFIG_LIST, and set the correct configuration.
3. Start playing with the notebooks!

## Using existing docker image
Install docker, save your oai key into an environment variable name OPENAI_API_KEY, and then run the following.

```
docker pull yuandongtian/autogen:latest
docker run -it -e OPENAI_API_KEY=$OPENAI_API_KEY -p 8081:8081 docker.io/yuandongtian/autogen:latest
```
*NOTE*: OAI_CONFIG_LIST_sample lists GPT-4 as the default model, as this represents our current recommendation, and is known to work well with AutoGen. If you use a model other than GPT-4, you may need to revise various system prompts (especially if using weaker models like GPT-3.5-turbo). Moreover, if you use models other than those hosted by OpenAI or Azure, you may incur additional risks related to alignment and safety. Proceed with caution if updating this default.
## [Installation](https://microsoft.github.io/autogen/docs/Installation)
### Option 1. Install and Run AutoGen in Docker

Then open `http://localhost:8081/` in your browser to use AutoGen. The UI is from `./samples/apps/autogen-assistant`. See docker hub [link](https://hub.docker.com/r/yuandongtian/autogen) for more details.
Find detailed instructions for users [here](https://microsoft.github.io/autogen/docs/Installation#option-1-install-and-run-autogen-in-docker), and for developers [here](https://microsoft.github.io/autogen/docs/Contribute#docker).

## Installation
### Option 2. Install AutoGen Locally

AutoGen requires **Python version >= 3.8, < 3.12**. It can be installed from pip:

@@ -83,11 +82,11 @@ Minimal dependencies are installed without extra options. You can install extra
pip install "pyautogen[blendsearch]"
``` -->

Find more options in [Installation](https://microsoft.github.io/autogen/docs/Installation).
Find more options in [Installation](https://microsoft.github.io/autogen/docs/Installation#option-2-install-autogen-locally-using-virtual-environment).

<!-- Each of the [`notebook examples`](https://github.com/microsoft/autogen/tree/main/notebook) may require a specific option to be installed. -->

For [code execution](https://microsoft.github.io/autogen/docs/FAQ/#code-execution), we strongly recommend installing the Python docker package and using docker.
Even if you are installing AutoGen locally out of docker, we recommend performing [code execution](https://microsoft.github.io/autogen/docs/FAQ/#code-execution) in docker. Find more instructions [here](https://microsoft.github.io/autogen/docs/Installation#docker).

For LLM inference configurations, check the [FAQs](https://microsoft.github.io/autogen/docs/FAQ#set-your-api-endpoints).
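> As a rough illustration of where such an inference configuration ends up (a sketch assuming the standard pyautogen v0.2 API, not text from this diff), the loaded config list is passed to an agent through `llm_config`:

```python
# Illustrative sketch only: a typical llm_config handed to an AssistantAgent.
import autogen

config_list = autogen.config_list_from_json("OAI_CONFIG_LIST")
llm_config = {
    "config_list": config_list,
    "temperature": 0,  # keep outputs close to deterministic for reproducible runs
    "timeout": 120,    # per-request timeout in seconds
}
assistant = autogen.AssistantAgent("assistant", llm_config=llm_config)
```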

@@ -222,7 +221,7 @@ contact [[email protected]](mailto:[email protected]) with any additio

## Contributors Wall
<a href="https://github.com/microsoft/autogen/graphs/contributors">
<img src="https://contrib.rocks/image?repo=microsoft/autogen" />
<img src="https://contrib.rocks/image?repo=microsoft/autogen&max=200" />
</a>

# Legal Notices