a2a-protocol-demo3

keiu-jiyu
This project is an in-depth, hands-on example of the A2A (Agent-to-Agent) protocol and the MCP (Model Context Protocol). Core features:
- Communication layer: standard JSON-RPC command interaction and SSE (Server-Sent Events) streaming responses, built on FastAPI.
- Data layer: structured Slots (semantic slots) that store conversational context.
- Logic layer: a Context Funnel engine that dynamically compresses the agent's full memory into a bounded prompt, addressing LLM token limits.
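As a sketch of what the communication layer exchanges, JSON-RPC 2.0 envelopes can be built and parsed with nothing but the standard library. The method name 'agent.chat' and the payload fields below are illustrative assumptions, not taken from the project's actual API:

```python
import json

def make_request(method, params, req_id):
    # Build a JSON-RPC 2.0 request envelope (hypothetical method name).
    return json.dumps({"jsonrpc": "2.0", "method": method,
                       "params": params, "id": req_id})

def make_response(result, req_id):
    # Build the matching JSON-RPC 2.0 success response, echoing the id.
    return json.dumps({"jsonrpc": "2.0", "result": result, "id": req_id})

# A round trip: the client sends a request, the server answers with the same id.
req = json.loads(make_request("agent.chat", {"text": "hello"}, 1))
resp = json.loads(make_response({"reply": "hi"}, req["id"]))
```

The 'id' field is what lets a client match asynchronous responses back to the commands that produced them.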

Overview

What is a2a-protocol-demo3

a2a-protocol-demo3 is a comprehensive demonstration project that showcases the A2A (Agent-to-Agent) protocol and the MCP (Model Context Protocol). It illustrates how these two core pillars of modern agent systems can be implemented, focusing on standardized communication and context management.

How to Use

To use a2a-protocol-demo3, first install the required dependencies under Python 3.8 or higher. Start the server with 'python server.py'; it listens on http://localhost:8000. Then run the client with 'python client.py' to simulate multi-turn conversations with the agent and observe its internal processing.
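The streamed responses the client observes arrive as SSE frames: 'data: ...' lines separated by blank lines. A minimal parser for that wire format can be written in plain Python; the JSON payload shape and the '[DONE]' sentinel below are assumptions for illustration, not the project's exact output:

```python
def parse_sse(stream_text):
    # Split a raw SSE body into event payloads. Events are separated by a
    # blank line; each data line carries a 'data: ' prefix (6 characters).
    events = []
    for block in stream_text.split("\n\n"):
        data_lines = [line[6:] for line in block.split("\n")
                      if line.startswith("data: ")]
        if data_lines:
            events.append("\n".join(data_lines))
    return events

# Example stream: two token chunks followed by a hypothetical end sentinel.
raw = 'data: {"token": "Hel"}\n\ndata: {"token": "lo"}\n\ndata: [DONE]\n\n'
chunks = parse_sse(raw)
```

In practice a real client would decode each payload as it arrives rather than buffering the whole stream, but the framing rules are the same.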

Key Features

Key features include: 1) A2A communication layer using JSON-RPC 2.0 for command interaction and SSE for streaming responses; 2) MCP data layer defining structured Slots for context storage; 3) Context Funnel engine that dynamically compresses memory into limited prompts to address LLM token constraints.
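The Context Funnel idea from feature 3 can be sketched as a greedy packer: high-priority slots are admitted first, then the most recent history turns fill whatever token budget remains. The Slot fields, the priority scheme, and the whitespace token counter are all simplifying assumptions, not the project's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Slot:
    name: str
    value: str
    priority: int  # higher = more important, admitted to the prompt first

def context_funnel(slots, history, token_budget,
                   count_tokens=lambda s: len(s.split())):
    # Compress slots + full dialogue history into a prompt within token_budget.
    parts, used = [], 0
    for slot in sorted(slots, key=lambda s: -s.priority):
        line = f"{slot.name}: {slot.value}"
        cost = count_tokens(line)
        if used + cost <= token_budget:
            parts.append(line)
            used += cost
    kept = []
    for turn in reversed(history):  # walk newest-first; stop when budget is hit
        cost = count_tokens(turn)
        if used + cost > token_budget:
            break
        kept.append(turn)
        used += cost
    return "\n".join(parts + list(reversed(kept)))  # restore chronological order

slots = [Slot("user_name", "Alice", 2), Slot("topic", "A2A protocol", 1)]
history = ["user: hi", "agent: hello Alice", "user: explain slots"]
prompt = context_funnel(slots, history, token_budget=10)
```

With a budget of 10 whitespace tokens, both slots and only the newest turn fit; the older turns are dropped, which is exactly the trade-off the funnel is meant to make explicit.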

Where to Use

a2a-protocol-demo3 can be used in fields such as AI development and conversational agents, and in any application that needs efficient context management and communication between agents.

Use Cases

Use cases include simulating conversations with agents, demonstrating memory compression in long dialogues, and exploring the interaction between user inputs and agent responses in real-time.

Content