Let me get this straight: there’s no way to control the cost of inference? Does this mean that if my API key gets compromised, I could end up paying thousands of dollars because there’s no way to establish spending limits and no system in place to warn about usage?
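For what it's worth, absent any server-side limit, the only safeguard is to meter spend locally before each request. Here is a minimal sketch of that idea — the `SpendGuard` class and the per-token price are purely illustrative assumptions, not part of any real SDK:

```python
# Hypothetical client-side spend guard. Since the API exposes no
# spending limit, we track cumulative estimated cost ourselves and
# refuse to continue once a local budget is exhausted.
# The price below is an illustrative placeholder, not a real rate.

class SpendLimitExceeded(RuntimeError):
    """Raised when the local budget has been used up."""

class SpendGuard:
    def __init__(self, limit_usd: float, price_per_1m_tokens: float):
        self.limit_usd = limit_usd
        self.price_per_1m = price_per_1m_tokens
        self.spent_usd = 0.0

    def track(self, tokens_used: int) -> None:
        """Record usage after a request; raise once the budget is hit."""
        self.spent_usd += tokens_used / 1_000_000 * self.price_per_1m
        if self.spent_usd >= self.limit_usd:
            raise SpendLimitExceeded(
                f"spent ${self.spent_usd:.2f} of ${self.limit_usd:.2f} budget"
            )

# Example: a $5 budget at an assumed $0.50 per million tokens.
guard = SpendGuard(limit_usd=5.00, price_per_1m_tokens=0.50)
guard.track(2_000_000)  # $1.00 estimated so far, still under budget
```

This obviously doesn't protect against a leaked key being used elsewhere — that still needs provider-side limits — but it at least catches runaway usage from your own code.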