
request to merge packages #8

Open
Fuco1 opened this issue Mar 3, 2023 · 3 comments

Comments

@Fuco1

Fuco1 commented Mar 3, 2023

Hey 😁

I started writing my own package following the release of the chat completion API a couple of days ago.

When I tried to create the org, it was already taken and that's how I got here. We need to market this a lot more; the GPT integration into Emacs is a game changer.

I would like to collaborate on this, soon I will submit PRs with some functions from my own library.

How do you feel about an EIEIO-based low-level library that just implements the API, with pcase/dash macros for destructuring the responses? I'm thinking of something like lsp-mode's internals.

Cheers!

@jcs090218
Member

Hi! 😁

> When I tried to create the org, it was already taken and that's how I got here. We need to market this a lot more; the GPT integration into Emacs is a game changer.

Oops, sorry. I can add you to the org if you want?

> How do you feel about an EIEIO-based low-level library that just implements the API, with pcase/dash macros for destructuring the responses? I'm thinking of something like lsp-mode's internals.

I haven't tried EIEIO in the Emacs ecosystem yet, but I would like to see how it works!

@Fuco1
Author

Fuco1 commented Mar 3, 2023

I find EIEIO nice for APIs because you can define the interfaces as "classes" which makes it a bit more discoverable.

For example, this is what I cooked up for the new chat/completion endpoint:

You have a base class with common properties, then various request classes, and a method that converts the objects into JSON (openapi-request-serialize). It's very easy to add additional messages: you just add the objects and the serialization function. In lsp-mode, they also somehow autogenerate pcase (and dash) destructuring macros for the responses, so you can "drill down" into them.

(defclass openapi-request ()
  ((content-type :initform "application/json")
   (endpoint :type string))
  :abstract t)

(defclass openapi-request-chat-completions-message ()
  ((role :initarg :role :type string)
   (content :initarg :content :type string)))

(cl-defmethod openapi-request-serialize ((this openapi-request-chat-completions-message))
  `((role . ,(oref this role)) (content . ,(oref this content))))

(defclass openapi-request-chat-completions (openapi-request)
  ((endpoint :initform "https://api.openai.com/v1/chat/completions")
   (model :initarg :model :initform "gpt-3.5-turbo")
   (messages :initarg :messages :initform nil)))

(cl-defmethod openapi-request-serialize ((this openapi-request-chat-completions))
  `((model . ,(oref this model))
    (messages . ,(apply #'vector (mapcar #'openapi-request-serialize (oref this messages))))))
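For illustration (this usage sketch is mine, not part of the snippet above): constructing a request from these classes and serializing it yields the alist that json-serialize expects.

;; Hypothetical usage of the classes above: build a chat request
;; and serialize it to an alist ready for json-serialize.
(openapi-request-serialize
 (openapi-request-chat-completions
  :messages (list (openapi-request-chat-completions-message
                   :role "user" :content "Hello!"))))
;; => ((model . "gpt-3.5-turbo")
;;     (messages . [((role . "user") (content . "Hello!"))]))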

Then there is one universal function to call the API (you pass it the request)

;; Named -send to avoid clashing with the `openapi-request' class defined
;; above; cl-defmethod (not cl-defgeneric) is needed to use specializers.
(cl-defmethod openapi-request-send ((this openapi-client) (req openapi-request))
  (let* ((prog-timer (run-with-timer 0 0.1 #'openai-thinking))
         (process
          (plz 'post (oref req endpoint)
            :headers `(("Authorization" . ,(format "Bearer %s" openapi-token))
                       ("Content-Type" . ,(oref req content-type)))
            :body (json-serialize (openapi-request-serialize req)
                                  :null-object nil
                                  :false-object :json-false)
            :as #'json-read
            ;; for now I'm using dumb alist destructuring, only works for chat completion requests.
            :then (-lambda ((&alist 'choices [(&alist 'message (&alist 'content))]))
                    (cancel-timer prog-timer)
                    (with-current-buffer (get-buffer-create "*openapi*")
                      (erase-buffer)
                      (insert content)
                      (markdown-mode)
                      (if (<= (count-lines (point-min) (point-max)) 1)
                          (message "%s" (buffer-string))
                        (pop-to-buffer (current-buffer)))))
            :else (lambda (_err) (cancel-timer prog-timer)))))
    process))

I'm using plz instead of request; I find it a very nice and light library, plus it uses curl instead of Emacs's built-in url machinery, so it's much faster (and it provides really cool concurrency features when you need to make hundreds of requests, like for mass embeddings).
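For anyone unfamiliar with plz, here's a minimal sketch (assuming the plz package is installed; the URL is just an illustrative placeholder): an asynchronous GET whose callback runs when the underlying curl process finishes, without blocking Emacs.

(require 'plz)

;; Asynchronous GET: `plz' returns immediately; :then receives the
;; response body as a string once curl completes, :else the error.
(plz 'get "https://api.github.com/zen"
  :as 'string
  :then (lambda (body) (message "%s" body))
  :else (lambda (err) (message "request failed: %S" err)))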

I'll go over your code in more detail and see how it's done. I feel like a low-level-ish SDK would be immensely useful for people to quickly start hacking on their own cool plugins. But we can also provide a comprehensive package (like codegpt, i.e. this package) with pre-built cool features.

@jcs090218
Member

I've never tried plz or EIEIO, so I don't have too many thoughts on them! 😓

The intention of the upstream openai.el is indeed to provide a low-level-ish SDK. Feel free to open issues or PRs; I'm always open to collaborating. :D
