Dynamic Self-Discovery Is The Super Power of MCP
Dynamic Self-Discovery with MCP: Let Your LLM Find Its Own Tools
Large Language Model (LLM) based applications are evolving from passive assistants into systems that actively explore their digital environments to fulfil user intent.
Initially, LLMs only had access to their trained knowledge base. This changed with the discovery of In-Context Learning (ICL), which spawned the whole Retrieval-Augmented Generation (RAG) revolution.
Since then, the dynamic discovery capabilities of AI assistants have extended to web browsing, OS use and more. The newest integration touch point is the Model Context Protocol (MCP).
MCP acts as a connector from the AI Agent or assistant to an MCP server.
One critical capability they need is dynamic self-discovery: the ability to understand what tools are available at runtime and decide how to use them.
What Is MCP?
The MCP server acts as a registry of tools that an LLM can access, query and invoke. It exposes metadata about available capabilities — such as search engines, data analysers, or domain-specific APIs — in a structured format (often JSON).
This allows the LLM to ask:
“What tools do I have right now, and which one helps me complete my goal?”
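Concretely, MCP servers answer a tools/list JSON-RPC request with exactly this kind of structured metadata: each tool carries a name, a description, and a JSON Schema for its arguments. A minimal sketch of that exchange, shown here as Python dictionaries (the search_docs tool is illustrative, not from a real server):

```python
# JSON-RPC 2.0 request the client sends to enumerate tools
list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# Typical shape of the server's reply: every tool is described by a
# name, a human-readable description, and an input schema
list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "search_docs",  # illustrative tool
                "description": "Full-text search over the documentation",
                "inputSchema": {
                    "type": "object",
                    "properties": {"query": {"type": "string"}},
                    "required": ["query"],
                },
            }
        ]
    },
}

# The LLM application can now enumerate what is available right now
tool_names = [t["name"] for t in list_response["result"]["tools"]]
print(tool_names)
```

Because the reply is structured, the tool list can be fed straight into the model's context, which is what makes the question above answerable at runtime.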
What Is Dynamic Self-Discovery?
Dynamic self-discovery is when the LLM learns what tools are available on-the-fly — without hardcoding them. This enables flexible, extensible AI Agents that can reason about and invoke capabilities as needed.
Python Example: Discovering Tools from MCP
Below is a basic example where I query an MCP server, retrieve the available tools, and pass them to OpenAI’s LLM as context…
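A minimal sketch of that pattern follows. The discover_tools function is a stub standing in for a live tools/list call to an MCP server, and the tool entries are illustrative; a real implementation would fetch them over HTTP or stdio before building the prompt:

```python
def discover_tools():
    """Stub for a live MCP tools/list call; returns tool metadata."""
    return [
        {"name": "read_wiki_structure", "description": "List topics in a wiki"},
        {"name": "ask_question", "description": "Ask a question about a repo"},
    ]

def tools_as_context(tools):
    """Render discovered tool metadata as context for the model."""
    lines = ["You may call the following tools:"]
    for t in tools:
        lines.append(f"- {t['name']}: {t['description']}")
    return "\n".join(lines)

system_prompt = tools_as_context(discover_tools())
print(system_prompt)

# The resulting string would be sent as the system message, e.g.:
# client.chat.completions.create(model=..., messages=[
#     {"role": "system", "content": system_prompt},
#     {"role": "user", "content": user_query}])
```

Nothing about the tool set is hardcoded: if the server adds or removes a tool, the next discovery call changes the prompt automatically.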
Why It Matters
Dynamic self-discovery is a foundational piece of LLM autonomy.
It means:
- No static tool lists.
- Easier updates: tools come and go without rewriting code.
- Smarter agents that understand how to choose the tools they use.
This is the beginning of AI systems that learn, adapt and act — in real time.
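The “tools come and go” point can be made concrete with a tiny in-process registry (the registry and the tools in it are illustrative): the agent re-enumerates tools on every turn, so registering a new one requires no changes anywhere else.

```python
registry = {}

def register(name, fn, description):
    """Add a tool to the registry at runtime."""
    registry[name] = {"fn": fn, "description": description}

register("word_count", lambda text: len(text.split()), "Count words in text")

# The agent enumerates whatever is registered right now...
print(sorted(registry))

# ...and a tool added later is discoverable on the very next turn
register("shout", str.upper, "Upper-case the input")
print(sorted(registry))
```

An MCP server plays exactly this role, except the registry lives behind a protocol boundary instead of in the same process.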
Working Practical Example
Below is an MCP implementation using the OpenAI SDK…
The LLM is told to connect to an MCP server labelled deepwiki, with three tools allowed:
- read_wiki_structure
- read_wiki_contents
- ask_question
The model receives the user’s query (“What is the difference between LangChain and LangGraph?”).
Based on its understanding of the tools and the query, the model chooses whether and how to use one of the available MCP tools. For example, it might first call read_wiki_structure to navigate topics, then use read_wiki_contents to pull article data, and finally synthesise an answer.
The require_approval: "always" flag means tool usage is mediated, which is useful for logging or human-in-the-loop systems.
import os

# Set your API key (in production, load it from the environment instead)
os.environ["OPENAI_API_KEY"] = "<Your API Key>"

from openai import OpenAI

client = OpenAI()

response = client.responses.create(
    model="gpt-4.1-mini",
    input=[
        {
            "role": "user",
            "content": "What is the difference between LangChain and LangGraph?"
        }
    ],
    text={"format": {"type": "text"}},
    reasoning={},
    tools=[
        {
            "type": "mcp",
            "server_label": "deepwiki",
            "server_url": "https://mcp.deepwiki.com/mcp",
            "allowed_tools": [
                "read_wiki_structure",
                "read_wiki_contents",
                "ask_question"
            ],
            "require_approval": "always"
        }
    ],
    temperature=1,
    max_output_tokens=2048,
    top_p=1,
    store=True
)

# response.json is a method reference, not the payload;
# dump the full response as JSON instead
print(response.model_dump_json(indent=2))
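Because require_approval is set to "always", the first response will typically contain pending mcp_approval_request items rather than a final answer. A minimal sketch of collecting approvals for those items, assuming the output items are plain dictionaries in the shape the Responses API returns (the ids here are illustrative):

```python
def build_approvals(output_items):
    """Build an approval response for every pending MCP approval request.

    Items of type 'mcp_approval_request' carry the id the server
    expects back in the matching 'mcp_approval_response'.
    """
    return [
        {
            "type": "mcp_approval_response",
            "approve": True,
            "approval_request_id": item["id"],
        }
        for item in output_items
        if item.get("type") == "mcp_approval_request"
    ]

# Example output items: one pending approval request, one plain message
pending = [
    {"type": "mcp_approval_request", "id": "mcpr_123",
     "server_label": "deepwiki", "name": "ask_question"},
    {"type": "message", "id": "msg_456"},
]
approvals = build_approvals(pending)
print(approvals)
```

In a real run you would then send these approval items back as the input of a follow-up client.responses.create(...) call that references the previous response, letting the tool call proceed; this is also the natural hook for logging or a human sign-off step.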
Why This Is a Big Deal
Dynamic self-discovery means:
You can plug in new tools without retraining the model.
LLMs act more like AI Agents — figuring out how to solve a problem using what’s available.
Your architecture becomes modular and extensible, with LLMs reasoning across external systems in real time.
This is a foundational step toward true LLM autonomy.
Chief Evangelist @ Kore.ai | I’m passionate about exploring the intersection of AI and language. From Language Models, AI Agents to Agentic Applications, Development Frameworks & Data-Centric Productivity Tools, I share insights and ideas on how these technologies are shaping the future.