I’m trying to use Llama 3.3 70B, but the API keeps responding with a "model not found" error even though the model shows up on the /models endpoint. Here’s the request log:
I, [2024-12-22T19:19:31.304996 #1571779] INFO -- [Langchain.rb]: request: POST https://api.lambdalabs.com/v1/chat/completions
I, [2024-12-22T19:19:31.305046 #1571779] INFO -- [Langchain.rb]: request: Content-Type: "application/json"
Authorization: "Bearer APIKEY"
I, [2024-12-22T19:19:31.305118 #1571779] INFO -- [Langchain.rb]: request: {"messages":[{"role":"user","content":"CONTENT if MPG"}],"model":"llama3.3-70b-instruct-fp8","temperature":0.3,"n":1}
I, [2024-12-22T19:19:31.315641 #1571779] INFO -- [Langchain.rb]: response: Status 400
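In case it helps, here’s the rough Ruby sketch I’m using to double-check that the exact model id I’m sending actually appears in the /models list (the helper name, the env var, and the assumption that the response has an OpenAI-style `"data"` array of `{"id": ...}` objects are all mine, not anything from Langchain.rb):

```ruby
require "json"
require "net/http"

# Hypothetical helper: given the parsed /models response and a model id,
# report whether that id is present with an exact string match
# ("model not found" 400s are often a near-miss id, e.g. an extra hyphen).
def model_available?(models_response, model_id)
  ids = models_response.fetch("data", []).map { |m| m["id"] }
  ids.include?(model_id)
end

# Against the live endpoint (needs a real key in LAMBDA_API_KEY):
# uri = URI("https://api.lambdalabs.com/v1/models")
# req = Net::HTTP::Get.new(uri)
# req["Authorization"] = "Bearer #{ENV["LAMBDA_API_KEY"]}"
# res = Net::HTTP.start(uri.hostname, uri.port, use_ssl: true) { |http| http.request(req) }
# puts model_available?(JSON.parse(res.body), "llama3.3-70b-instruct-fp8")
```

Running that confirms the id string I pass to the chat completion matches the listing character for character, so I don’t think it’s a typo on my end.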