Google Gemini API support #395

Closed
richlysakowski opened this issue Sep 22, 2023 · 14 comments
Labels
enhancement New feature or request
@richlysakowski

richlysakowski commented Sep 22, 2023

Problem

Support / wrapper for Google PaLM and Bard is missing. I don't find any mention of it here, yet PaLM is one of the major LLMs. (@JasonWeill on 2023-12-29: The Gemini API is replacing PaLM. Thanks @simonff)

Proposed Solution

Can we start a discussion about the requirements, packages, and tools needed to bring PaLM and Bard functionality into Jupyter AI?

Is there a description of how to wrap new LLM engines?

Are there template wrappers or decorators to make this easy?

What limitations do we need to take into account for this integration? I know that Google has been blocking Python wrappers that reverse engineer Bard, but what about PaLM API? What are the most effective workarounds for using a wrapper around Bard Chat?

Please help with this integration so we can get it done soon, because I am getting requests from the Jupyter AI users I am training.

@richlysakowski richlysakowski added the enhancement New feature or request label Sep 22, 2023
@3coins
Collaborator

3coins commented Sep 22, 2023

@richlysakowski
As long as LangChain has an LLM class for the provider you are interested in, it can be added to Jupyter AI. To get started, here is an existing example of a provider implementation:
https://github.com/jupyterlab/jupyter-ai/blob/main/packages/jupyter-ai-magics/jupyter_ai_magics/providers.py#L450

Here are the steps to add a new provider:

  1. Add a new class to providers.py, which should extend both BaseProvider and VertexAI.
  2. Look at the existing implementation referred to above, and add the model list plus any fields required for API keys, config, etc.
  3. Add the provider id to the pyproject.toml file; the name should match the id of the class you created in step 1.
  4. Add the new class in the import here.
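The steps above can be sketched as follows. This is a hypothetical outline, not the actual providers.py code: the class name, field names, and model ids here are illustrative, and a real provider would extend the LangChain LLM class for the service (e.g. VertexAI) rather than the minimal stand-ins defined below for self-containedness.

```python
from typing import ClassVar, List

# Minimal stand-ins so this sketch runs on its own; in Jupyter AI these
# come from jupyter_ai_magics.providers (BaseProvider) and LangChain
# (the LLM class for the service, e.g. VertexAI).
class BaseProvider:
    id: ClassVar[str]
    name: ClassVar[str]
    models: ClassVar[List[str]]

class FakeLangChainLLM:
    """Stand-in for the LangChain LLM class the provider would wrap."""
    pass

# Step 1: extend BaseProvider and the LangChain LLM class.
class ExampleProvider(BaseProvider, FakeLangChainLLM):
    # Step 2: declare the model list and any auth/config fields,
    # mirroring the existing implementations in providers.py.
    id = "example-provider"        # Step 3: this id also goes in pyproject.toml
    name = "Example Provider"
    models = ["model-a", "model-b"]
    auth_strategy = "EXAMPLE_API_KEY"  # illustrative; real providers use an auth strategy object
```

Step 4 then amounts to importing `ExampleProvider` where the other providers are imported, so the entry point declared in pyproject.toml resolves.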

@ishaan-jaff

Hi @richlysakowski @3coins, I believe we can make this easier.
I'm the maintainer of LiteLLM; we let you deploy an LLM proxy that calls 100+ LLMs in one format (PaLM, Bedrock, OpenAI, Anthropic, etc.): https://github.com/BerriAI/litellm/tree/main/openai-proxy

If this looks useful (we're used in production), please let me know how we can help.

Usage

PaLM request

curl http://0.0.0.0:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
     "model": "palm/chat-bison",
     "messages": [{"role": "user", "content": "Say this is a test!"}],
     "temperature": 0.7
   }'

gpt-3.5-turbo request

curl http://0.0.0.0:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
     "model": "gpt-3.5-turbo",
     "messages": [{"role": "user", "content": "Say this is a test!"}],
     "temperature": 0.7
   }'

claude-2 request

curl http://0.0.0.0:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
     "model": "claude-2",
     "messages": [{"role": "user", "content": "Say this is a test!"}],
     "temperature": 0.7
   }'
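For illustration, the same OpenAI-format request can be issued from Python. This is a sketch using only the standard library, assuming a LiteLLM proxy listening on 0.0.0.0:8000 as in the curl examples above; the helper name `chat_completion_request` is ours, not part of LiteLLM.

```python
import json
import urllib.request

def chat_completion_request(model: str, prompt: str,
                            temperature: float = 0.7) -> urllib.request.Request:
    """Build an OpenAI-format /v1/chat/completions request for the proxy."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }
    return urllib.request.Request(
        "http://0.0.0.0:8000/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = chat_completion_request("palm/chat-bison", "Say this is a test!")
# Sending it requires a running proxy:
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp))
```

Swapping the `model` string ("gpt-3.5-turbo", "claude-2", ...) is the only change needed between providers, which is the point of the proxy.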

@simonff

simonff commented Dec 29, 2023

The Gemini API is superseding the PaLM API.

I'll follow @3coins's advice to add a new provider, but I can't figure out how to use jupyter-ai from local sources, so I might need some help with testing.

@krassowski
Member

@simonff do you need help setting up jupyter-ai locally for development? Did you encounter any error, or failed at a particular step?

@JasonWeill JasonWeill changed the title Google PaLM support? when, how? Google Gemini API support Dec 29, 2023
@simonff

simonff commented Dec 29, 2023

@krassowski: I think various installed versions got mixed up; the last error I got is below. Yes, it would be great to have some self-contained instructions for running jupyter-ai locally. For now I'm just using GitHub Actions to run tests.

Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/notebook/traittypes.py", line 235, in _resolve_classes
klass = self._resolve_string(klass)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/google/home/simonf/.local/lib/python3.11/site-packages/traitlets/traitlets.py", line 2018, in _resolve_string
return import_item(string)
^^^^^^^^^^^^^^^^^^^
File "/usr/local/google/home/simonf/.local/lib/python3.11/site-packages/traitlets/utils/importstring.py", line 31, in import_item
module = __import__(package, fromlist=[obj])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ModuleNotFoundError: No module named 'jupyter_server.contents'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/usr/bin/jupyter-notebook", line 33, in <module>
sys.exit(load_entry_point('notebook==6.4.12', 'console_scripts', 'jupyter-notebook')())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3/dist-packages/jupyter_core/application.py", line 282, in launch_instance
super().launch_instance(argv=argv, **kwargs)
File "/usr/local/google/home/simonf/.local/lib/python3.11/site-packages/traitlets/config/application.py", line 1075, in launch_instance
app = cls.instance(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/google/home/simonf/.local/lib/python3.11/site-packages/traitlets/config/configurable.py", line 583, in instance
inst = cls(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^
File "/usr/local/google/home/simonf/.local/lib/python3.11/site-packages/traitlets/traitlets.py", line 1294, in __new__
inst.setup_instance(*args, **kwargs)
File "/usr/local/google/home/simonf/.local/lib/python3.11/site-packages/traitlets/traitlets.py", line 1337, in setup_instance
super(HasTraits, self).setup_instance(*args, **kwargs)
File "/usr/local/google/home/simonf/.local/lib/python3.11/site-packages/traitlets/traitlets.py", line 1313, in setup_instance
init(self)
File "/usr/lib/python3/dist-packages/notebook/traittypes.py", line 226, in instance_init
self._resolve_classes()
File "/usr/lib/python3/dist-packages/notebook/traittypes.py", line 238, in _resolve_classes
warn(f"{klass} is not importable. Is it installed?", ImportWarning)
TypeError: warn() missing 1 required keyword-only argument: 'stacklevel'

@krassowski
Member

Yes, it looks like it stems from a conflict between packages installed at the system level and a local version of Python. I would strongly suggest developing in a self-contained virtual environment.

@dlqqq
Member

dlqqq commented Dec 29, 2023

@simonff To do a development installation locally, run these in your terminal:

# verify that you are at the root of the repo
pwd

# create & activate a new isolated Python environment named `jupyter-ai-dev`
# I recommend using `micromamba`, but you can use `conda` or `venv`
micromamba create -yn jupyter-ai-dev python=3.11 jupyterlab
micromamba activate jupyter-ai-dev

# install JS dependencies
jlpm

# build JS code & install `jupyter-ai` locally in this environment
jlpm dev-install

@simonff

simonff commented Dec 29, 2023

I don't have conda or micromamba installed, so I'm trying venv. This worked:
./myenv/bin/pip install jupyterlab

But for the next step I get:
./myenv/bin/pip install jupyter-ai-dev
ERROR: Could not find a version that satisfies the requirement jupyter-ai-dev (from versions: none)
ERROR: No matching distribution found for jupyter-ai-dev

Sorry for beginner questions, I usually just run "pip install --break-system-packages". :)

@krassowski
Member

In the example above, jupyter-ai-dev is the name of the environment, not of an installable package. Once you activate the environment, you should have a script named jlpm, which is installed with the jupyterlab package. Running jlpm && jlpm dev-install should do the trick.

@simonff

simonff commented Dec 29, 2023

Ok, jlpm by itself runs (and I can see it comes from the myenv environment), but then I get (running from the root of the git repo):

jlpm dev-install

lerna notice cli v6.6.2

Lerna (powered by Nx) The following projects do not have a configuration for any of the provided targets ("dev-install")

  • @jupyter-ai/monorepo

Lerna (powered by Nx) Running target dev-install for 2 projects:

- @jupyter-ai/magics
- @jupyter-ai/core

——————————————————————————————————————————————————————————————————————————————————

@jupyter-ai/magics:dev-install

@jupyter-ai/magics: error: externally-managed-environment
@jupyter-ai/magics: × This environment is externally managed
@jupyter-ai/magics: ╰─> To install Python packages system-wide, try apt install
@jupyter-ai/magics: python3-xyz, where xyz is the package you are trying to
@jupyter-ai/magics: install.
@jupyter-ai/magics:
@jupyter-ai/magics: If you wish to install a non-Debian-packaged Python package,
@jupyter-ai/magics: create a virtual environment using python3 -m venv path/to/venv.
@jupyter-ai/magics: Then use path/to/venv/bin/python and path/to/venv/bin/pip. Make
@jupyter-ai/magics: sure you have python3-full installed.
@jupyter-ai/magics:
@jupyter-ai/magics: If you wish to install a non-Debian packaged Python application,
@jupyter-ai/magics: it may be easiest to use pipx install xyz, which will manage a
@jupyter-ai/magics: virtual environment for you. Make sure you have pipx installed.
@jupyter-ai/magics:
@jupyter-ai/magics: See /usr/share/doc/python3.11/README.venv for more information.
@jupyter-ai/magics: note: If you believe this is a mistake, please contact your Python installation or OS distribution provider. You can override this, at the risk of breaking your Python installation or OS, by passing --break-system-packages.
@jupyter-ai/magics: hint: See PEP 668 for the detailed specification.

——————————————————————————————————————————————————————————————————————————————————

Lerna (powered by Nx) Running target dev-install for 2 projects failed

Tasks not run because their dependencies failed or --nx-bail=true:

  • @jupyter-ai/core:dev-install

Failed tasks:

  • @jupyter-ai/magics:dev-install
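The PEP 668 failure above suggests that jlpm dev-install is still resolving the Debian-managed system pip rather than the venv's pip, i.e. the environment was created but not activated in the shell that runs jlpm. A minimal sketch of the venv-based setup, assuming a POSIX shell at the root of the jupyter-ai checkout:

```shell
# Create an isolated environment in the repo root
python3 -m venv .venv

# Activate it so python/pip resolve inside the venv rather than the
# externally managed system interpreter (the source of the PEP 668 error)
. .venv/bin/activate

# With the venv active, the dev install should use its pip:
#   pip install jupyterlab
#   jlpm && jlpm dev-install

# Confirm which interpreter is now on PATH
command -v python
```

The key point is that activation must happen in the same shell session that later runs `jlpm dev-install`, since that target shells out to whatever `pip` is first on PATH.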

@simonff

simonff commented Dec 29, 2023

Meanwhile, this is the new code so far:

main...simonff:jupyter-ai:main

@giswqs
Contributor

giswqs commented Mar 3, 2024

I tried to add the Gemini provider, but could not get Gemini to show up in the model list. Any help would be appreciated.
#666

@dlqqq
Member

dlqqq commented Mar 6, 2024

This issue should be resolved by #666, which adds support for Gemini. Thanks to @giswqs for working on this! Users will have access to this in the next release, which we are planning for sometime this week.

@dlqqq dlqqq added this to the v2.12.0 milestone Mar 6, 2024
@dlqqq dlqqq closed this as completed Mar 6, 2024
@giswqs
Contributor

giswqs commented Mar 6, 2024

@dlqqq Thank you for your help! I couldn't have done it without your guidance. Thank you for accepting the PR.
