Search before asking
I have searched the Inference issues and found no similar bug report.
Bug
If you pass the wrong API key to /inference_pipelines/initialise, it will create a zombie process that spits out all sorts of errors indefinitely.
Logs:
[12/26/24 14:52:42] ERROR Could not handle Command. request_id=02820ce1-9dcb-4dd3-b233-f75c3a74f4a1, error=Could not find requested Roboflow resource. Check that the provided dataset and version are correct, and check that the provided Roboflow API key has the correct permissions., error_type=ErrorType.NOT_FOUND, public_error_message=Requested Roboflow resources (models / workflows etc.) not available or wrong API key used. inference_pipeline_manager.py:517
Traceback (most recent call last):
File "/Volumes/Code/inference/inference/core/roboflow_api.py", line 87, in wrapper
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Volumes/Code/inference/inference/core/roboflow_api.py", line 499, in get_workflow_specification
response = _get_from_url(url=api_url)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Volumes/Code/inference/inference/core/roboflow_api.py", line 599, in _get_from_url
api_key_safe_raise_for_status(response=response)
File "/Volumes/Code/inference/inference/core/utils/requests.py", line 15, in api_key_safe_raise_for_status
response.raise_for_status()
File "/Users/yeldarb/venvs/inference-devel/lib/python3.11/site-packages/requests/models.py", line 1024, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: Not Found for url: https://api.roboflow.com/roboflow-docs/workflows/clip-frames?api_key=*******
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/Volumes/Code/inference/inference/core/interfaces/stream_manager/manager_app/inference_pipeline_manager.py", line 175, in
_initialise_pipeline
self._inference_pipeline = InferencePipeline.init_with_workflow(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Volumes/Code/inference/inference/core/utils/function.py", line 35, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/Volumes/Code/inference/inference/core/interfaces/stream/inference_pipeline.py", line 596, in init_with_workflow
workflow_specification = get_workflow_specification(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Volumes/Code/inference/inference/core/roboflow_api.py", line 100, in wrapper
error_handler(error)
File "/Volumes/Code/inference/inference/core/roboflow_api.py", line 73, in <lambda>
404: lambda e: raise_from_lambda(
^^^^^^^^^^^^^^^^^^
File "/Volumes/Code/inference/inference/core/roboflow_api.py", line 63, in raise_from_lambda
raise exception_type(message) from inner_error
inference.core.exceptions.RoboflowAPINotNotFoundError: Could not find requested Roboflow resource. Check that the provided dataset and version are
correct, and check that the provided Roboflow API key has the correct permissions.
ERROR Could not handle Command. request_id=d0ab647dc91a42ef92faa560c88a5051, error=None, error_type=ErrorType.OPERATION_ERROR, public_error_message=Cannot retrieve InferencePipeline status. Try again later - Inference Pipeline not initialised. inference_pipeline_manager.py:517
NoneType: None
[12/26/24 14:52:42] ERROR Error with command handling raised by InferencePipeline Manager. error_type=not_found error_class=RoboflowAPINotNotFoundError error_message=Could not find requested Roboflow resource. Check that the provided dataset and version are correct, and check that the provided Roboflow API key has the correct permissions. stream_manager_client.py:342
Traceback (most recent call last):
File "/Volumes/Code/inference/inference/core/interfaces/http/http_api.py", line 265, in wrapped_route
return await route(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Volumes/Code/inference/inference/core/interfaces/http/http_api.py", line 1387, in initialise
return await self.stream_manager_client.initialise_pipeline(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Volumes/Code/inference/inference/core/interfaces/stream_manager/api/stream_manager_client.py", line 132, in initialise_pipeline
response = await self._handle_command(command=command)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Volumes/Code/inference/inference/core/interfaces/stream_manager/api/stream_manager_client.py", line 213, in _handle_command
dispatch_error(error_response=response)
File "/Volumes/Code/inference/inference/core/interfaces/stream_manager/api/stream_manager_client.py", line 348, in dispatch_error
raise ERRORS_MAPPING[error_type](
inference.core.interfaces.stream_manager.api.errors.ProcessesManagerNotFoundError: Error with command handling raised by InferencePipeline Manager. Error type: not_found. Details: Could not find requested Roboflow resource. Check that the provided dataset and version are correct, and check that the provided Roboflow API key has the correct permissions.
INFO: 127.0.0.1:59145 - "POST /inference_pipelines/initialise HTTP/1.1" 404 Not Found
[12/26/24 14:52:43] ERROR Could not handle Command. request_id=7ffd4a3afd21479098ee9a5fc5f5e882, error=None, error_type=ErrorType.OPERATION_ERROR, public_error_message=Cannot retrieve InferencePipeline status. Try again later - Inference Pipeline not initialised. inference_pipeline_manager.py:517
NoneType: None
[12/26/24 14:52:44] ERROR Could not handle Command. request_id=0f268db1ab6845d99055293492e6fb00, error=None, error_type=ErrorType.OPERATION_ERROR, public_error_message=Cannot retrieve InferencePipeline status. Try again later - Inference Pipeline not initialised. inference_pipeline_manager.py:517
NoneType: None
[12/26/24 14:52:45] ERROR Could not handle Command. request_id=f4788cc9e67a431292d3a056ae1d30ca, error=None, error_type=ErrorType.OPERATION_ERROR, public_error_message=Cannot retrieve InferencePipeline status. Try again later - Inference Pipeline not initialised. inference_pipeline_manager.py:517
NoneType: None
[12/26/24 14:52:46] ERROR Could not handle Command. request_id=6bb10427411245e5bd6f3c85bbdba14b, error=None, error_type=ErrorType.OPERATION_ERROR, public_error_message=Cannot retrieve InferencePipeline status. Try again later - Inference Pipeline not initialised. inference_pipeline_manager.py:517
NoneType: None
[12/26/24 14:52:47] ERROR Could not handle Command. request_id=3fa4944af1214028b175977df5a8e871, error=None, error_type=ErrorType.OPERATION_ERROR, public_error_message=Cannot retrieve InferencePipeline status. Try again later - Inference Pipeline not initialised. inference_pipeline_manager.py:517
NoneType: None
Environment
Inference installed from main branch
macOS
Minimal Reproducible Example
Using a public Workflow that doesn't require an API Key but passing an invalid one:
from inference_sdk import InferenceHTTPClient

client = InferenceHTTPClient(
    api_url="http://localhost:9001",  # use local inference server
    api_key="invalid"  # optional to access your private data and models
)

result = client.start_inference_pipeline_with_workflow(
    video_reference=[0],  # default webcam
    workspace_name="roboflow-docs",
    workflow_id="clip-frames",
    max_fps=30,  # any value; the request fails before this matters
    workflows_parameters={
        "prompt": "blurry",
        "threshold": 0.16
    }
)
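For completeness, this is roughly how the failure looks from the client side. It is only a sketch of how I observed it, not part of the minimal repro; the list_inference_pipelines() call is my assumption based on the SDK docs, and the broad except is a simplification:

import time
from inference_sdk import InferenceHTTPClient

client = InferenceHTTPClient(
    api_url="http://localhost:9001",
    api_key="invalid"
)

try:
    client.start_inference_pipeline_with_workflow(
        video_reference=[0],
        workspace_name="roboflow-docs",
        workflow_id="clip-frames",
        max_fps=30,
        workflows_parameters={"prompt": "blurry", "threshold": 0.16}
    )
except Exception as error:
    # The client gets the 404 / not-found error back as expected...
    print(f"initialise failed: {error}")

# ...but the spawned pipeline-manager process on the server side is not torn
# down: the server log keeps printing the OPERATION_ERROR entries shown above.
time.sleep(5)
print(client.list_inference_pipelines())  # probe the server after the failure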
Additional
No response
Are you willing to submit a PR?
Yes, I'd like to help by submitting a PR!