IMHO, this is how ChatGPT integrations should work. Default to a local model or a private cloud model, and if that doesn't work, ask the user if they want the query to go to another LLM. And let people turn off the external LLM prompts entirely.
Let's make it opt-in. And make opting out very prominent.
“Do you want me to use ChatGPT to do that?”
No. I don’t. I really really don’t.
You can turn off the ability for it to request ChatGPT if it can't resolve the request on its own.
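The flow described above (local first, external only with per-query consent, plus a hard off switch) can be sketched roughly like this. All names here are hypothetical illustrations, not any vendor's actual API:

```python
# Sketch of the "local first, ask before going external" flow.
# ExternalPolicy, answer(), and the model callables are all made up
# for illustration; they don't correspond to any real integration.
from enum import Enum

class ExternalPolicy(Enum):
    ASK = "ask"      # prompt the user before each external call
    NEVER = "never"  # external LLM prompts turned off entirely

def answer(query, local_model, external_model, policy, confirm):
    """Try the local/private model first; only go external with consent."""
    result = local_model(query)
    if result is not None:
        return result                      # local model handled it
    if policy is ExternalPolicy.NEVER:
        return None                        # user opted out of external LLMs
    if confirm(f"Do you want me to use the external LLM for {query!r}?"):
        return external_model(query)       # explicit per-query consent
    return None                            # user declined
```

The point of the `NEVER` policy is that the confirmation prompt itself never appears, which matches the "turn off the external LLM prompts entirely" behavior, rather than just defaulting the answer to no.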