Inference API error: "model ID was not provided"

I tried calling the inference API with the following curl command…

curl -sS https://api.lambda.ai/v1/completions \
  -H "Authorization: Bearer <REDACTED>" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama-4-scout-17b-16e-instruct",
        "prompt": "Computers are",
        "temperature": 0
      }'

…but this resulted in the following response:

{"message":"model ID was not provided","type":"invalid_request_error"}

The model ID is definitely right there in the request… so what should I do?

I’m not able to reproduce this failure. Are you still getting this error?

The only cause I can think of is an invalid API key surfacing as this misleading error. But I believe in that case you would get back an “unauthorized” error message instead.
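One other thing worth ruling out is shell quoting mangling the inline JSON before it ever reaches the API. A quick sketch that builds the same body programmatically and checks the model field survives serialization (the endpoint URL and headers are copied from the curl command above; the actual send is left commented out since it needs a real key):

```python
import json

# Build the same request body as the curl command, but in code,
# so no shell quoting is involved.
payload = {
    "model": "llama-4-scout-17b-16e-instruct",
    "prompt": "Computers are",
    "temperature": 0,
}
body = json.dumps(payload)

# Confirm the serialized body really carries the model ID.
assert json.loads(body)["model"] == "llama-4-scout-17b-16e-instruct"

# To actually send it (requires a valid key in place of <REDACTED>):
# import urllib.request
# req = urllib.request.Request(
#     "https://api.lambda.ai/v1/completions",
#     data=body.encode(),
#     headers={
#         "Authorization": "Bearer <REDACTED>",
#         "Content-Type": "application/json",
#     },
# )
# print(urllib.request.urlopen(req).read().decode())
```

If the assertion passes but the API still reports a missing model ID, the body is being altered somewhere between your client and the server rather than in your shell.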