[Bug]: "The model produces invalid content" #5876
Comments
The trajectory you linked doesn't show anything for me, except "agent loading". Is that normal? Was this the very first thing in that session? |
That's the thing, I don't know if the error in the logs has impacted the ability to generate LLM responses for that session |
Can you tell what happened before you got that error? |
The agent attempts to initialize; the message in the UI is "Waiting for client to become available", and the prompt is already entered so that it's used as soon as the agent has finished initializing |
You ran with the UI, right, with the command in the readme? It normally needs the user to say something first. Sorry, I can't tell whether you did. What was your prompt? |
the command is:

docker run
-p 3000:3000
--env LOG_ALL_EVENTS=true
--env SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:main-nikolaik
--env WORKSPACE_MOUNT_PATH=C:\repositories\extensions\youtube-time-manager
--name openhands-app
--pull always
--add-host host.docker.internal:host-gateway
-v C:/repositories/extensions/youtube-time-manager:/opt/workspace_base
-v /var/run/docker.sock:/var/run/docker.sock
-v ~/.openhands-state:/.openhands-state
--rm
docker.all-hands.dev/all-hands-ai/openhands:0.17

the prompt is
|
I'd prefer to use pnpm, but considering its symlink nature, it will be easier for the agent to deal with npm |
Probably unrelated, but from what I see, you're not running in WSL, but trying to run on Windows directly? If you run again with this prompt, does it still happen? If yes, could you please capture a little more of the logs, so we can see the full error and perhaps the step where it happened? |
Yes, works flawlessly
Sometimes it happens, sometimes it doesn't. I tried running GPT-4o, GPT-4o Turbo, and Claude 3.5 Sonnet with this prompt |
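Since the failure is intermittent across models, one common mitigation is simply to retry the completion call. Below is a minimal standalone sketch, not OpenHands' actual code path, using litellm (which OpenHands uses under the hood for LLM calls); the prompt text is a hypothetical placeholder:

```python
# Minimal standalone sketch (not OpenHands code): retrying a completion call
# that intermittently fails with "The model produced invalid content."
# The prompt string is a hypothetical placeholder.
from litellm import completion

response = completion(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Set this project up to use npm"}],
    num_retries=2,  # ask litellm to retry the call a couple of times on transient errors
)
print(response.choices[0].message.content)
```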
Please, if you try again, use |
Is there an existing issue for the same bug?
Describe the bug and reproduction steps
https://www.all-hands.dev/share?share_id=dab4a77e7d64e7a4dc6124dc672d3f4beb2d411a33155977425b821e292d4f4c
The LLM is gpt-4o
In the logs I got:
{'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}
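For context, this message is returned by the OpenAI API as a server-side error payload, so a client normally sees it as an exception rather than a chat response. Here is a minimal standalone sketch with the OpenAI Python SDK, not OpenHands' actual error handling, that surfaces the full error body (the prompt is a hypothetical placeholder and OPENAI_API_KEY is assumed to be set):

```python
# Minimal sketch (not OpenHands code): call gpt-4o directly and print the full
# error body when the API answers with "The model produced invalid content."
from openai import OpenAI, APIError

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

try:
    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "hypothetical placeholder prompt"}],
    )
    print(resp.choices[0].message.content)
except APIError as err:
    # err.body typically carries the {'error': {...}} payload quoted above
    print("API error:", err)
```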
OpenHands Installation
Docker command in README
OpenHands Version
0.17
Operating System
Windows
Logs, Errors, Screenshots, and Additional Context
No response