I hope this message finds you well. I'm writing this issue to share a suggestion that could potentially enhance the performance and robustness of the Poe-OpenAI-Proxy project.
Currently, as I understand it, the tokens used to access the API are provided through server-side configuration. This approach could limit the project's resiliency and robustness, especially in scenarios where tokens need to be switched dynamically to support the project's self-recovery ability.
To address this, I propose changing the server-side codebase so that it reads tokens directly from the API key sent by the client in the HTTP request. This way, we can more closely mimic the standard OpenAI API and also allow more flexibility for users.
The proposed approach could take the form of an array (e.g. ["token1", "token2", ...]) passed in the Authorization header. With this in place, the server could retrieve the latest tokens directly from that array, with no manual server-side configuration required from the user.
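As a rough illustration of the idea, the server could accept either a single token or a JSON array of tokens in the Authorization header. The sketch below is only a minimal example of how such parsing might look; `parseTokens` is a hypothetical helper, not part of the current Poe-OpenAI-Proxy code, and the exact header format would be up to the maintainers:

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// parseTokens extracts Poe tokens from an Authorization header value.
// It accepts either a single token ("Bearer token1") or a JSON array
// of tokens ("Bearer [\"token1\",\"token2\"]"). This format is an
// assumption for illustration, not an existing convention of the project.
func parseTokens(header string) ([]string, error) {
	raw := strings.TrimSpace(strings.TrimPrefix(header, "Bearer"))
	if strings.HasPrefix(raw, "[") {
		var tokens []string
		if err := json.Unmarshal([]byte(raw), &tokens); err != nil {
			return nil, fmt.Errorf("invalid token array: %w", err)
		}
		return tokens, nil
	}
	if raw == "" {
		return nil, fmt.Errorf("no token provided")
	}
	// Fall back to treating the header as a single token, which keeps
	// compatibility with standard OpenAI-style clients.
	return []string{raw}, nil
}

func main() {
	tokens, _ := parseTokens(`Bearer ["token1","token2"]`)
	fmt.Println(tokens)
}
```

A client could then supply its own token pool per request, for example with `curl -H 'Authorization: Bearer ["token1","token2"]' ...`, while existing clients that send a single key would keep working unchanged.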
By incorporating this feature, the project would gain improved resilience and robustness, and offer an enhanced experience for its users.
Please consider this proposal and let me know your thoughts. I believe this change would bring significant benefits to the project and its users.
Thank you for your time and consideration.
It's unsafe to expose a route for this, and using a single client-supplied token to query Poe may cause problems in apps where the key is supposed to be ignored.
Actually, it strengthens security: you don't need to store tokens as plain text on the server, and TLS ensures point-to-point security while the tokens are passed over HTTP.