Inference API call returning "HTTP/1.1 400 Bad Request"

Hi,

We are using Lambda Labs for inference. For some requests I get an "HTTP/1.1 400 Bad Request" response from https://api.lambdalabs.com/v1/chat/completions.

Since the response does not include any specific indication of what is wrong with the request, it is difficult to find out where it is failing to serve the inference. Can anyone guide me on this? Thanks.

Can you try the cURL example under Using the Lambda Inference API > Creating chat completions and post the results?
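In the meantime, a 400 response usually carries a JSON body explaining what was rejected, so it helps to log the body rather than just the status line. A minimal Python sketch, assuming the API returns an OpenAI-style JSON `error` object with a `message` field (the sample body at the bottom is hypothetical, just to show the parsing):

```python
import json
import urllib.request
import urllib.error

API_URL = "https://api.lambdalabs.com/v1/chat/completions"

def chat_completion(api_key: str, payload: dict) -> dict:
    """POST a chat-completion request; on failure, raise with the server's error body."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)
    except urllib.error.HTTPError as e:
        # The body of a 400 usually names the offending field -- surface it.
        raise RuntimeError(f"{e.code} {e.reason}: {e.read().decode()}") from e

# Hypothetical 400 body for illustration; the real message will differ.
sample = '{"error": {"message": "model: unknown model name"}}'
print(json.loads(sample)["error"]["message"])
```

If your client library swallows the body, this kind of wrapper makes the actual validation message visible instead of the bare "400 Bad Request".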