Shashi Kanth G S

Airline Domain · Solution Architect · Applied AI · Cloud · App Modernization

AI Tooling · February 27, 2026 · 5 min read

Build a Vendor-Neutral A2A Agent That Works With Any LLM Provider

A practical guide to exposing OpenCode behind A2A so orchestration logic stays stable even when the underlying model provider changes.

A2A · OpenCode · LLM Providers · Agent Systems

This article makes a sharp architectural point: direct provider coupling becomes technical debt the moment a team wants to benchmark, switch, or specialize models. The answer proposed here is to keep the orchestrator speaking A2A while OpenCode handles provider-specific execution underneath.

That shift turns model selection into configuration rather than integration work. It is a small implementation detail with large long-term consequences for portability and platform agility.
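To make the "configuration, not integration" claim concrete, a provider swap could reduce to editing one field in a config file. The shape below is an illustrative sketch only; the key names and model identifier are assumptions, not OpenCode's documented schema.

```json
{
  "model": "anthropic/claude-sonnet-4",
  "comment": "Switching providers means changing this value, e.g. to an OpenAI or Copilot-backed model; the orchestrator's A2A contract never changes."
}
```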

What the article demonstrates

  • OpenCode can sit behind an A2A wrapper as the execution runtime.
  • Anthropic, OpenAI, GitHub Copilot, and other providers can be swapped without changing the orchestration contract.
  • MCP tools remain part of the stack, so capability expansion still happens through standard interfaces.
  • Multi-agent routing becomes easier because each agent exposes the same protocol surface.
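The bullets above can be sketched as a minimal A2A-style wrapper around an execution runtime. Everything here is illustrative: the agent-card fields, the JSON-RPC request shape, and the `opencode run` invocation are assumptions standing in for the article's actual implementation, not the A2A spec or OpenCode's real CLI surface.

```python
# Hypothetical sketch: an A2A-style facade over OpenCode.
# The orchestrator only ever sees the agent card and the RPC handler;
# the provider choice lives entirely inside this process.
import subprocess

PROVIDER = "anthropic/claude-sonnet-4"  # swapped via configuration, not code


def build_agent_card() -> dict:
    """Advertise a stable protocol surface; the provider stays hidden."""
    return {
        "name": "opencode-agent",
        "description": "Code-generation agent backed by OpenCode",
        "capabilities": {"streaming": False},
        "skills": [{"id": "code", "name": "Code generation"}],
    }


def run_opencode(prompt: str) -> str:
    """Delegate execution to the OpenCode CLI (invocation is assumed)."""
    result = subprocess.run(
        ["opencode", "run", "--model", PROVIDER, prompt],
        capture_output=True,
        text=True,
    )
    return result.stdout


def handle_rpc(request: dict, execute=run_opencode) -> dict:
    """Map a JSON-RPC-style task request onto the runtime, provider-agnostically."""
    prompt = request["params"]["message"]["text"]
    return {
        "jsonrpc": "2.0",
        "id": request.get("id"),
        "result": {"status": "completed", "output": execute(prompt)},
    }
```

Because every agent exposes the same card-plus-handler surface, a multi-agent router can treat them interchangeably, which is the routing benefit the last bullet points to.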

Why it belongs in this portfolio

The article reinforces a consistent theme across this portfolio: prefer architecture that preserves optionality. Rather than betting on one provider-specific integration path, wrap strong runtimes behind stable protocol boundaries and let the system evolve underneath.