An Agent Framework agent that connects to a remote MCP server (GitHub) for tool discovery and is hosted using the Responses protocol. Instead of defining tools locally, the agent discovers and invokes tools at runtime from an MCP-compatible endpoint — in this case, the GitHub Copilot MCP server. This enables dynamic tool integration without redeployment.
The agent uses FoundryChatClient from the Agent Framework to create an OpenAI-compatible Responses client. It registers a remote MCP tool pointing at https://api.githubcopilot.com/mcp/, authenticating with a GitHub Personal Access Token (PAT). When the model decides to call a tool, the framework forwards the call to the MCP server and returns the result to the model for the final reply.
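The remote MCP tool registration can be pictured as a Responses-style tool entry. The sketch below is illustrative, not the Agent Framework's own wiring: `FoundryChatClient` handles this internally, and the exact field names here are an assumption based on the OpenAI Responses API's hosted MCP tool shape. The `GITHUB_PAT` environment variable name is likewise an assumption.

```python
import os

def github_mcp_tool(pat: str) -> dict:
    """Build a Responses-style tool entry pointing at the GitHub MCP server.

    Field names follow the OpenAI Responses API's hosted MCP tool format;
    the framework may represent this differently internally.
    """
    return {
        "type": "mcp",
        "server_label": "github",
        "server_url": "https://api.githubcopilot.com/mcp/",
        # The PAT authorizes the MCP server to act on the user's behalf.
        "headers": {"Authorization": f"Bearer {pat}"},
    }

tool = github_mcp_tool(os.environ.get("GITHUB_PAT", "<your-pat>"))
print(tool["server_url"])
```

With an entry like this attached to a model request, any tool call the model emits against the `github` server is forwarded to the MCP endpoint rather than executed locally.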
See main.py for the full implementation.
The agent is hosted using the Agent Framework with the ResponsesHostServer, which provisions a REST API endpoint compatible with the OpenAI Responses protocol.
Follow the instructions in the Running the Agent Host Locally section of the README in the parent directory to run the agent host.
Depending on how you run the agent host, you can invoke the agent using `curl` (`Invoke-WebRequest` in PowerShell) or `azd`. Please refer to the parent README for more details, and use this README for sample queries you can send to the agent.
Send a POST request to the server with a JSON body containing an "input" field to interact with the agent. For example:
```shell
curl -X POST http://localhost:8088/responses \
  -H "Content-Type: application/json" \
  -d '{"input": "List all the repositories I own on GitHub."}'
```

To host the agent on Foundry, follow the instructions in the Deploying the Agent to Foundry section of the README in the parent directory.
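The same POST request can also be issued from Python using only the standard library. This is a minimal sketch assuming the local endpoint from the curl example above; adjust the URL for other hosting setups.

```python
import json
import urllib.request

# Local Responses endpoint, as used in the curl example above.
ENDPOINT = "http://localhost:8088/responses"

def build_request(input_text: str) -> urllib.request.Request:
    """Construct a POST request with the JSON "input" payload."""
    body = json.dumps({"input": input_text}).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("List all the repositories I own on GitHub.")
# Uncomment to send the request against a running agent host:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```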