Hey everyone,
We’re trying to build an internal chatbot (built mostly on Claude) to help our teams query internal data. Our data is scattered everywhere: customer data in a messy SQL database, docs in Notion, and a legacy on-prem inventory system.
Right now we’re writing a custom API wrapper and point-to-point integration for every single tool so the LLM can talk to it. It’s a nightmare to maintain: every time a schema changes or we add a new tool, the whole chain breaks. Has anyone dealt with this kind of enterprise spaghetti? Is there a cleaner way to connect AI models to a messy tech stack without building a hundred fragile, custom connectors?
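For context, here’s roughly what one of our per-tool wrappers looks like. All of the names, tables, and schemas below are made up to illustrate the pattern, not our real code:

```python
# Sketch of a point-to-point tool wrapper plus the dispatch glue.
# Every identifier here (customers table, tool names) is hypothetical.
import sqlite3


def query_customers(db_conn: sqlite3.Connection, customer_id: str) -> dict:
    # Schema knowledge is hardcoded: a renamed column breaks this,
    # and nobody notices until the chatbot starts erroring out.
    row = db_conn.execute(
        "SELECT name, tier FROM customers WHERE id = ?", (customer_id,)
    ).fetchone()
    if row is None:
        raise KeyError(f"No customer {customer_id}")
    return {"name": row[0], "tier": row[1]}


def dispatch_tool_call(db_conn: sqlite3.Connection, tool_name: str, args: dict) -> dict:
    # One branch per tool: adding the Notion or inventory wrapper means
    # editing this function, redeploying, and updating the model's tool specs.
    if tool_name == "query_customers":
        return query_customers(db_conn, customer_id=args["customer_id"])
    raise ValueError(f"Unknown tool: {tool_name}")
```

Multiply that dispatch function by every data source and every schema revision, and that’s the maintenance burden we’re trying to escape.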