-
o1 doesn't support a lot of the parameters that are normally accepted by the chat completions endpoint. o1 will work with agents as long as you don't have any tools selected, or any other parameters it doesn't accept. It's also worth mentioning that o1 only works with Agents when used via OpenAI, not Azure, since o1 from the latter doesn't support streaming, which is a requirement here.
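For anyone working around this in their own client code, here's a minimal sketch (not LibreChat's implementation) of dropping the parameters o1 rejects before calling the chat completions endpoint; the buildO1Params helper and the exact list of stripped fields are assumptions based on the errors in this thread:

```typescript
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Hypothetical helper: build request params, omitting the sampling options and
// tool definitions that o1 models reject. The list of unsupported fields here
// is an assumption based on the errors discussed in this thread.
function buildO1Params(
  model: string,
  messages: OpenAI.Chat.ChatCompletionMessageParam[],
  options: { temperature?: number; tools?: OpenAI.Chat.ChatCompletionTool[] } = {},
): OpenAI.Chat.ChatCompletionCreateParamsStreaming {
  const isO1 = model.startsWith("o1");
  return {
    model,
    messages,
    stream: true, // Agents require streaming, which is why Azure-hosted o1 is out
    ...(isO1 ? {} : { temperature: options.temperature }), // o1 only accepts the default
    ...(!isO1 && options.tools ? { tools: options.tools } : {}), // o1 rejects tools
  };
}

const stream = await client.chat.completions.create(
  buildO1Params("o1-mini", [{ role: "user", content: "Hello" }], { temperature: 0.25 }),
);
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```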
-
@danny-avila - It might be worthwhile to maintain a table of supported features until everything reaches parity. The documentation doesn't warn users that this will happen, and I suspect folks will keep bringing up the issue because of it. Personally, using the code-interpreter with o1 is exactly what I wanted to do, but alas, I had no way to know it isn't supported at this time.
-
What happened?
I tested the latest version of Agents, which works fine with GPT-4o or Claude 3.5 v2, for instance, but I get an error when trying to use it with o1-mini or o1-preview.
Here is the error I get in the logs:
2024-12-11T17:17:22.705Z error: [api/server/controllers/agents/client.js #sendCompletion] Unhandled error type 400 Unsupported value: 'temperature' does not support 0.25 with this mo... [truncated]
I had indeed set the temperature to 0.25, but when I tried again with the temperature set to 1 (the default value), I got the same error.
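In case it helps with triage, here is a minimal sketch that reproduces the same 400 against OpenAI's chat completions endpoint directly with the Node SDK, outside of LibreChat; the model name and temperature value simply mirror the ones from my logs:

```typescript
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

try {
  // o1-mini only accepts the default temperature, so an explicit 0.25
  // triggers the same "Unsupported value: 'temperature'" 400 as in the log above.
  await client.chat.completions.create({
    model: "o1-mini",
    temperature: 0.25,
    messages: [{ role: "user", content: "Hello" }],
  });
} catch (err) {
  if (err instanceof OpenAI.APIError) {
    console.error(err.status, err.message);
  } else {
    throw err;
  }
}
```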
Steps to Reproduce
Create an agent with the o1-mini or o1-preview model.
What browsers are you seeing the problem on?
Chrome