Store and update LLM prompts

In your experience, what’s the best way to store and update prompts for LLM APIs?
I’m building a job-hunting startup called interviuu (you can check it out if you want), and I’m using several prompts across different LLM providers (I’m also considering switching to OpenRouter, but that’s another topic).

Right now, I’m just managing prompts by passing them as parameters in the API call through the API Connector and marking them as private. But honestly, that doesn’t feel like a scalable or manageable solution to me.
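In code terms, my current setup is roughly equivalent to the sketch below (a simplified Python illustration, not my actual Bubble setup; the prompt text and model name are just placeholders): the prompt string lives inline next to the API call, so updating it means editing the app itself.

```python
# Simplified sketch of the current approach: the prompt is hardcoded
# right next to the API call instead of being stored somewhere it can
# be versioned or updated independently.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical prompt, just for illustration.
SYSTEM_PROMPT = (
    "You are an interview coach. Given a job description, "
    "generate five likely interview questions."
)

def generate_questions(job_description: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": job_description},
        ],
    )
    return response.choices[0].message.content
```

Multiply that by several prompts and several providers, and every prompt tweak turns into a code/workflow change, which is exactly what worries me.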

What’s your take on this? How do you handle prompts in your app?

Thank you :))
Francesco from interviuu