🚀 The feature
Error message:
PS E:\users\xxx\Desktop\xxx_code> cd .\study\llm\pandas-ai-main
PS E:\users\xxx\Desktop\xxx_code\study\llm\pandas-ai-main> docker-compose build
Failed to load E:\users\xxx\Desktop\xxx_code\study\llm\pandas-ai-main\server.env: open E:\users\xxx\Desktop\xxx_code\study\llm\pandas-ai-main\server.env: The system cannot find the file specified.
I want to use a locally deployed vLLM OpenAI-compatible interface for a large model. How do I configure it in the .env file?
Motivation, pitch
I want to use a locally deployed vLLM OpenAI-compatible interface for a large model. How do I configure it in the .env file?
Alternatives
No response
Additional context
No response
To configure the Azure OpenAI key and base URL in the .env file for a locally deployed vLLM OpenAI-compatible interface, set the environment variables that correspond to the api_token, azure_endpoint, and api_version parameters of the AzureOpenAI class [1][2].
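As a rough sketch of that mapping (the variable names below are assumptions, not copied from the repository's .env.example, so verify the exact keys there), the server/.env entries could look like:

# Assumed variable names; check server/.env.example for the exact keys
# api_token
AZURE_OPENAI_API_KEY=<your-api-key>
# azure_endpoint
AZURE_OPENAI_API_BASE=https://<your-resource>.openai.azure.com/
# api_version
AZURE_OPENAI_API_VERSION=2023-05-15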
Regarding the error message about the missing .env file, ensure that the .env file is present in the server directory. If it's missing, you can create it by copying the .env.example file provided in the repository:
1. Navigate to the root directory of the cloned PandasAI repository:
cd pandas-ai
2. Copy the .env.example file to .env in the server directory:
cp server/.env.example server/.env
3. Edit the server/.env file to update the necessary environment variables, including your Azure OpenAI configuration [3].
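For the original question about a locally deployed vLLM model: vLLM can expose an OpenAI-compatible HTTP server, so one option is to run that server and point OpenAI-style settings at it. This is only a sketch; the environment variable names are assumptions and should be checked against server/.env.example:

# Start vLLM's OpenAI-compatible server (listens on port 8000 by default)
python -m vllm.entrypoints.openai.api_server --model <your-model-name>

# Assumed entries in server/.env pointing at the local endpoint
# (vLLM accepts any placeholder key unless it was started with an API key)
OPENAI_API_KEY=EMPTY
OPENAI_API_BASE=http://localhost:8000/v1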