MCP servers won’t be around long, imho. The field of LLM app development is new, and people are desperately grabbing for a railing to feel like they have a handle on all the coursework coming down in this field of study.
For example, everyone thought solving context and RAG meant Vector Databases. It’s analogous to things we used to understand (hey, we need a database, duh). Forget what you know, and be ready to throw it all out.
It’s silly to think we’ve agreed on anything other than the OpenAI API format, and even that is just a simple HTTP call with a simple, expected response format.
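To show just how thin that "standard" is, here's a rough sketch of the chat-completions shape (the endpoint path and model name are placeholders, not prescriptive):

```python
import json

# The OpenAI-compatible chat format: an HTTP POST with a JSON body like this.
url = "https://api.openai.com/v1/chat/completions"  # or any compatible server
request_body = {
    "model": "gpt-4o-mini",  # illustrative model name
    "messages": [
        {"role": "user", "content": "Say hello."},
    ],
}

# And the response is just as simple: a JSON object whose choices hold messages.
sample_response = json.loads("""
{
  "choices": [
    {"message": {"role": "assistant", "content": "Hello!"}}
  ]
}
""")
reply = sample_response["choices"][0]["message"]["content"]
print(reply)
```

That's the entire "agreement": a POST, a list of role/content messages, and a choices array back. Everything else is still up for grabs.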
Premature would be the word.