
Consider adding support for gpt-3.5-turbo-instruct model #84

Open
smuotoe opened this issue Oct 2, 2023 · 2 comments

smuotoe commented Oct 2, 2023

Using the instruct model raises an error.

ai = AIChat(console=False, model="gpt-3.5-turbo-instruct")
KeyError: "No AI generation: {'error': {'message': 'This is not a chat model and thus not supported in the v1/chat/completions endpoint. Did you mean to use v1/completions?', 'type': 'invalid_request_error', 'param': 'model', 'code': None}}"
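For reference, the error points at an endpoint mismatch: gpt-3.5-turbo-instruct is only served by v1/completions, while simpleaichat calls v1/chat/completions. A minimal sketch of hitting the completions endpoint directly with the pre-1.0 openai library (the prompt and max_tokens values are placeholders):

import openai

# Reads OPENAI_API_KEY from the environment.
# gpt-3.5-turbo-instruct is a completion model, so it must go through
# openai.Completion rather than openai.ChatCompletion.
response = openai.Completion.create(
    model="gpt-3.5-turbo-instruct",
    prompt="Write a haiku about Python.",
    max_tokens=64,
)
print(response["choices"][0]["text"])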
@minimaxir (Owner)

I won't add support for this because of why gpt-3.5-turbo-instruct exists: it was added to accommodate the soon-to-be-deprecated GPT-3 models. As a result, it has a different API that's orthogonal to the current API simpleaichat uses, so supporting it would increase scope for not much gain.

OpenAI doesn't intend it for day-to-day use the way it does its ChatGPT APIs.

@minimaxir (Owner)

Actually, it may be better to add an explicit assert enforcing this.
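A minimal sketch of what that guard could look like (hypothetical, not simpleaichat's actual code; the function name and message are illustrative):

# Hypothetical guard: fail fast on completion-only models instead of
# surfacing the raw KeyError from the chat completions response.
def _validate_chat_model(model: str) -> None:
    # Models like gpt-3.5-turbo-instruct use v1/completions, which
    # simpleaichat does not call.
    assert "instruct" not in model, (
        f"{model} is not a chat model and is not supported; "
        "use a chat completion model such as gpt-3.5-turbo."
    )

_validate_chat_model("gpt-3.5-turbo-instruct")  # raises AssertionError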

@minimaxir reopened this Nov 14, 2023