Let me get this straight: there's no way to control the cost of inference? Does this mean that if my API key gets compromised, I could end up paying thousands of dollars, since there's no way to set spending limits and no system in place to warn me about unusual usage?
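For now, the only workaround I can think of is tracking spend on my own side before each call goes out. Here is a rough sketch of that idea; the endpoint URL, per-token price, and budget figure are placeholders I made up, not real published rates:

```python
import requests

# Placeholder values -- the real endpoint and per-token price depend on the provider.
ENDPOINT = "https://api.example.com/v1/chat/completions"  # hypothetical endpoint
PRICE_PER_1K_TOKENS = 0.002   # assumed price; check the provider's pricing page
MONTHLY_BUDGET_USD = 20.00    # hard cap I want to enforce myself


class BudgetExceeded(RuntimeError):
    pass


class BudgetGuard:
    """Client-side spend tracker: refuses to send a request once the cap is hit."""

    def __init__(self, budget_usd: float):
        self.budget_usd = budget_usd
        self.spent_usd = 0.0

    def charge(self, total_tokens: int) -> None:
        # Convert reported token usage into an estimated dollar cost.
        self.spent_usd += total_tokens / 1000 * PRICE_PER_1K_TOKENS

    def check(self) -> None:
        if self.spent_usd >= self.budget_usd:
            raise BudgetExceeded(
                f"Spent ${self.spent_usd:.2f} of ${self.budget_usd:.2f} budget"
            )


def guarded_completion(guard: BudgetGuard, api_key: str, payload: dict) -> dict:
    guard.check()  # stop before sending if we're already over budget
    resp = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {api_key}"},
        json=payload,
        timeout=30,
    )
    resp.raise_for_status()
    data = resp.json()
    # Many OpenAI-style APIs report token usage in the response; fall back to 0 if absent.
    guard.charge(data.get("usage", {}).get("total_tokens", 0))
    return data
```

Of course, this only protects the key while it is used through this wrapper; a leaked key used directly against the API bypasses it entirely, which is exactly why a provider-side spending limit is what I'm asking for.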