Agent Chat Protocol

Overview

The Agent Chat Protocol enables AI agents to understand natural language and interoperate across ecosystems. With this protocol, an agent can receive user messages, acknowledge them, optionally request structured outputs from other agents, and reply back in a consistent format.

What you’ll build

We’ll build a simple agent that:

  • Receives chat messages in natural language
  • Acknowledges the message using the protocol
  • Requests a structured output (JSON) from another agent (an OpenAI-backed agent)
  • Uses the structured output to fetch current weather
  • Sends a natural language reply back to the user

Prerequisites

  • Python 3.10+
  • Install dependencies:
pip install uagents requests

1) Create an agent

We start by defining a minimal agent.

from uagents import Agent

agent = Agent()
  • The Agent is the main runtime that sends and receives protocol-compliant messages.

2) Include the Chat Protocol

Attach the Agent Chat Protocol so the agent can send and receive ChatMessage and ChatAcknowledgement messages.

from uagents import Context, Protocol
from uagents_core.contrib.protocols.chat import (
    ChatAcknowledgement,
    ChatMessage,
    TextContent,
    chat_protocol_spec,
    StartSessionContent,
    EndSessionContent,
)

chat_proto = Protocol(spec=chat_protocol_spec)
  • Protocol(spec=chat_protocol_spec) adds handlers for a standard chat schema (text content, session start/end, acks).

3) Include a Structured Output Client Protocol

To request structured data (JSON) from another agent, we define a small client protocol and two models: one for the prompt + schema, and one for the response.

from typing import Any, Dict
from uagents import Model

class StructuredOutputPrompt(Model):
    prompt: str
    output_schema: Dict[str, Any]

class StructuredOutputResponse(Model):
    output: Dict[str, Any]

struct_output_client_proto = Protocol(
    name="StructuredOutputClientProtocol", version="0.1.0"
)
  • We’ll send StructuredOutputPrompt to a remote AI agent. It will return a StructuredOutputResponse with a JSON object matching our schema.
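To make the exchange concrete, here is a sketch of the payloads as plain dicts. The field values are illustrative, not real wire traffic, and the schema shape assumes the pydantic-style `.schema()` output that uagents `Model` classes produce for a single string field:

```python
# Illustrative payloads for the structured-output exchange (a sketch, not the
# exact wire format). The schema is what WeatherRequest.schema() would yield.
prompt_payload = {
    "prompt": "What's the weather like in Paris right now?",
    "output_schema": {
        "title": "WeatherRequest",
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
}

# A well-formed response fills in the schema's fields...
good_response = {"output": {"location": "Paris, France"}}

# ...while an unresolvable request carries the "<UNKNOWN>" sentinel that the
# handler in step 6 checks for.
unknown_response = {"output": {"location": "<UNKNOWN>"}}
```

The `"<UNKNOWN>"` sentinel is why the response handler later guards with `"<UNKNOWN>" in str(msg.output)` before attempting a weather lookup.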

4) Helper: create a text ChatMessage

A small helper to create consistent ChatMessage replies. Note the EndSessionContent uses type="end-session".

from datetime import datetime
from uuid import uuid4


def create_text_chat(text: str, end_session: bool = False) -> ChatMessage:
    content = [TextContent(type="text", text=text)]
    if end_session:
        content.append(EndSessionContent(type="end-session"))
    return ChatMessage(
        timestamp=datetime.utcnow(),
        msg_id=uuid4(),
        content=content,
    )

5) Handle incoming chat and forward to the AI agent

  • On each inbound ChatMessage, we acknowledge it
  • We forward the user’s text to an AI agent capable of returning structured output that matches our weather request schema
from functions import get_weather, WeatherRequest

AI_AGENT_ADDRESS = (
    "agent1qtlpfshtlcxekgrfcpmv7m9zpajuwu7d5jfyachvpa4u3dkt6k0uwwp2lct"  # OpenAI AI agent address
)

@chat_proto.on_message(ChatMessage)
async def handle_message(ctx: Context, sender: str, msg: ChatMessage):
    ctx.logger.info(f"Got a message from {sender}: {msg.content}")

    # Remember who sent this session's message so we can reply later
    ctx.storage.set(str(ctx.session), sender)

    # Acknowledge receipt
    await ctx.send(
        sender,
        ChatAcknowledgement(
            timestamp=datetime.utcnow(), acknowledged_msg_id=msg.msg_id
        ),
    )

    # Extract user text and forward to the AI agent for structured output
    for item in msg.content:
        if isinstance(item, StartSessionContent):
            ctx.logger.info(f"Got a start session message from {sender}")
            continue
        elif isinstance(item, TextContent):
            ctx.logger.info(f"User said: {item.text}")

            # Ask the AI agent to produce JSON matching our schema
            await ctx.send(
                AI_AGENT_ADDRESS,
                StructuredOutputPrompt(
                    prompt=item.text, output_schema=WeatherRequest.schema()
                ),
            )
        else:
            ctx.logger.info("Ignoring non-text content")

6) Handle structured output response and reply

  • When the remote AI agent replies with a JSON object, we parse it, fetch weather, and send a natural-language message back to the original user. The final formatting is done on the agent side.
@struct_output_client_proto.on_message(StructuredOutputResponse)
async def handle_structured_output_response(
    ctx: Context, sender: str, msg: StructuredOutputResponse
):
    ctx.logger.info(f"Structured output: {msg.output}")

    # Who started this session?
    session_sender = ctx.storage.get(str(ctx.session))
    if session_sender is None:
        ctx.logger.error("No session sender found in storage")
        return

    # Handle unknowns gracefully
    if "<UNKNOWN>" in str(msg.output):
        await ctx.send(
            session_sender,
            create_text_chat(
                "Sorry, I couldn't process your location request. Please try again later."
            ),
        )
        return

    # Extract location from structured output
    location = msg.output.get("location") if isinstance(msg.output, dict) else None
    ctx.logger.info(f"Extracted location: {location}")

    try:
        if not location:
            raise ValueError("No location provided in structured output")
        weather = get_weather(location)
        ctx.logger.info(str(weather))
    except Exception as err:
        ctx.logger.error(f"Weather error: {err}")
        await ctx.send(
            session_sender,
            create_text_chat(
                "Sorry, I couldn't process your request. Please try again later."
            ),
        )
        return

    if "error" in weather:
        await ctx.send(session_sender, create_text_chat(str(weather["error"])))
        return

    # Reply uses pre-formatted text from get_weather
    reply = weather.get("weather") or f"Weather for {location}: (no data)"
    await ctx.send(session_sender, create_text_chat(reply))
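The extraction-and-guard logic above can be isolated into a small pure function, which makes it easy to unit-test without a running agent. This is a sketch; `extract_location` is a hypothetical helper name, not part of the protocol:

```python
from typing import Any, Optional

def extract_location(output: Any) -> Optional[str]:
    """Return a usable location string from structured output, or None.

    Mirrors the guards in the handler above: the output must be a dict,
    must contain a non-empty "location", and must not carry the
    "<UNKNOWN>" sentinel the AI agent returns for unresolvable requests.
    """
    if not isinstance(output, dict):
        return None
    location = output.get("location")
    if not location or "<UNKNOWN>" in str(location):
        return None
    return location
```

With this helper, the handler would reduce to a single `if extract_location(msg.output) is None` check before calling `get_weather`.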

7) Wire up protocols and run

agent.include(chat_proto, publish_manifest=True)
agent.include(struct_output_client_proto, publish_manifest=True)

if __name__ == "__main__":
    agent.run()

Weather utility module (functions.py)

This helper module defines the expected schema for the structured output and a function to fetch weather from Open-Meteo. It returns a single pre-formatted string under weather.

# functions.py
from uagents import Model
import requests

class WeatherRequest(Model):
    location: str

class WeatherResponse(Model):
    weather: str

def get_weather(location: str):
    """Return current weather for a location string (e.g., 'Paris, France')."""
    if not location or not location.strip():
        raise ValueError("location is required")

    # 1) Geocode
    geo_params = {"name": location, "count": 1, "language": "en", "format": "json"}
    gr = requests.get(
        "https://geocoding-api.open-meteo.com/v1/search",
        params=geo_params,
        timeout=60,
    )
    gr.raise_for_status()
    g = gr.json()
    if not g.get("results"):
        raise RuntimeError(f"No geocoding match for: {location}")

    r0 = g["results"][0]
    latitude = r0["latitude"]
    longitude = r0["longitude"]
    timezone = r0.get("timezone") or "auto"
    display = ", ".join([v for v in [r0.get("name"), r0.get("admin1"), r0.get("country")] if v])

    # 2) Current weather
    wx_params = {
        "latitude": latitude,
        "longitude": longitude,
        "timezone": timezone,
        "current": (
            "temperature_2m,apparent_temperature,relative_humidity_2m,"
            "weather_code,wind_speed_10m,wind_direction_10m,is_day,precipitation"
        ),
    }
    wr = requests.get("https://api.open-meteo.com/v1/forecast", params=wx_params, timeout=60)
    wr.raise_for_status()
    data = wr.json()

    current = data.get("current") or data.get("current_weather") or {}
    temp = current.get("temperature_2m")
    app = current.get("apparent_temperature")
    wind = current.get("wind_speed_10m")
    rh = current.get("relative_humidity_2m")

    parts = [f"Weather for {display}"]
    if temp is not None:
        parts.append(f"temp {temp}°C")
    if app is not None:
        parts.append(f"feels like {app}°C")
    if rh is not None:
        parts.append(f"RH {rh}%")
    if wind is not None:
        parts.append(f"wind {wind} km/h")

    return {"weather": ", ".join(parts)}
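Because `get_weather` builds its reply from whichever fields are present, the formatting step can be exercised offline with a canned payload shaped like Open-Meteo's `current` block. The numbers below are invented for illustration; a real response comes from the `/v1/forecast` endpoint:

```python
# Simulate the formatting step of get_weather with a canned "current" block.
current = {
    "temperature_2m": 18.4,
    "apparent_temperature": 17.9,
    "relative_humidity_2m": 62,
    "wind_speed_10m": 11.5,
}
display = "Paris, Ile-de-France, France"

# Append only the fields that are present, in the same order as get_weather.
parts = [f"Weather for {display}"]
if current.get("temperature_2m") is not None:
    parts.append(f"temp {current['temperature_2m']}°C")
if current.get("apparent_temperature") is not None:
    parts.append(f"feels like {current['apparent_temperature']}°C")
if current.get("relative_humidity_2m") is not None:
    parts.append(f"RH {current['relative_humidity_2m']}%")
if current.get("wind_speed_10m") is not None:
    parts.append(f"wind {current['wind_speed_10m']} km/h")

weather = {"weather": ", ".join(parts)}
# weather["weather"] is a single human-readable line, e.g.
# "Weather for Paris, Ile-de-France, France, temp 18.4°C, ..."
```

Keeping the `is not None` guards means a partial API response simply produces a shorter line rather than a KeyError.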

Complete example (copy-paste)

Combine everything into two files you can run on a hosted agent or locally.

This Weather Agent is an Agentverse hosted agent. You can create your own hosted agent by following the Hosted Agents guide.

# agents.py
from uagents import Agent, Context, Model, Protocol
from uagents_core.contrib.protocols.chat import (
    ChatAcknowledgement,
    ChatMessage,
    TextContent,
    chat_protocol_spec,
    StartSessionContent,
    EndSessionContent,
)
from functions import get_weather, WeatherRequest, WeatherResponse
from datetime import datetime
from uuid import uuid4
from typing import Any, Dict

class StructuredOutputPrompt(Model):
    prompt: str
    output_schema: Dict[str, Any]

class StructuredOutputResponse(Model):
    output: Dict[str, Any]

AI_AGENT_ADDRESS = "agent1qtlpfshtlcxekgrfcpmv7m9zpajuwu7d5jfyachvpa4u3dkt6k0uwwp2lct"  # OpenAI AI agent address

agent = Agent()

chat_proto = Protocol(spec=chat_protocol_spec)
struct_output_client_proto = Protocol(
    name="StructuredOutputClientProtocol", version="0.1.0"
)

def create_text_chat(text: str, end_session: bool = False) -> ChatMessage:
    content = [TextContent(type="text", text=text)]
    if end_session:
        content.append(EndSessionContent(type="end-session"))
    return ChatMessage(
        timestamp=datetime.utcnow(),
        msg_id=uuid4(),
        content=content,
    )

@chat_proto.on_message(ChatMessage)
async def handle_message(ctx: Context, sender: str, msg: ChatMessage):
    ctx.logger.info(f"Got a message from {sender}: {msg.content}")
    ctx.storage.set(str(ctx.session), sender)
    await ctx.send(
        sender,
        ChatAcknowledgement(timestamp=datetime.utcnow(), acknowledged_msg_id=msg.msg_id),
    )

    for item in msg.content:
        if isinstance(item, StartSessionContent):
            ctx.logger.info(f"Got a start session message from {sender}")
            continue
        elif isinstance(item, TextContent):
            ctx.logger.info(f"User said: {item.text}")
            await ctx.send(
                AI_AGENT_ADDRESS,
                StructuredOutputPrompt(
                    prompt=item.text, output_schema=WeatherRequest.schema()
                ),
            )
        else:
            ctx.logger.info(f"Got unexpected content from {sender}")

@chat_proto.on_message(ChatAcknowledgement)
async def handle_ack(ctx: Context, sender: str, msg: ChatAcknowledgement):
    ctx.logger.info(
        f"Got an acknowledgement from {sender} for {msg.acknowledged_msg_id}"
    )

@struct_output_client_proto.on_message(StructuredOutputResponse)
async def handle_structured_output_response(
    ctx: Context, sender: str, msg: StructuredOutputResponse
):
    ctx.logger.info(f"Structured output: {msg.output}")
    session_sender = ctx.storage.get(str(ctx.session))
    if session_sender is None:
        ctx.logger.error(
            "Discarding message because no session sender found in storage"
        )
        return

    if "<UNKNOWN>" in str(msg.output):
        await ctx.send(
            session_sender,
            create_text_chat(
                "Sorry, I couldn't process your location request. Please try again later."
            ),
        )
        return

    # Extract location from the dict output
    location = msg.output.get("location") if isinstance(msg.output, dict) else None
    ctx.logger.info(f"Extracted location: {location}")

    try:
        if not location:
            raise ValueError("No location provided in structured output")
        weather = get_weather(location)
        ctx.logger.info(str(weather))
    except Exception as err:
        ctx.logger.error(f"Error: {err}")
        await ctx.send(
            session_sender,
            create_text_chat(
                "Sorry, I couldn't process your request. Please try again later."
            ),
        )
        return

    if "error" in weather:
        await ctx.send(session_sender, create_text_chat(str(weather["error"])))
        return

    reply = weather.get("weather") or f"Weather for {location}: (no data)"
    await ctx.send(session_sender, create_text_chat(reply))

agent.include(chat_proto, publish_manifest=True)
agent.include(struct_output_client_proto, publish_manifest=True)

if __name__ == "__main__":
    agent.run()
# functions.py
from uagents import Model
import requests

class WeatherRequest(Model):
    location: str

class WeatherResponse(Model):
    weather: str

def get_weather(location: str):
    """Return current weather for a location string (e.g., 'Paris, France')."""
    if not location or not location.strip():
        raise ValueError("location is required")

    # 1) Geocode
    geo_params = {"name": location, "count": 1, "language": "en", "format": "json"}
    gr = requests.get(
        "https://geocoding-api.open-meteo.com/v1/search",
        params=geo_params,
        timeout=60,
    )
    gr.raise_for_status()
    g = gr.json()
    if not g.get("results"):
        raise RuntimeError(f"No geocoding match for: {location}")

    r0 = g["results"][0]
    latitude = r0["latitude"]
    longitude = r0["longitude"]
    timezone = r0.get("timezone") or "auto"
    display = ", ".join([v for v in [r0.get("name"), r0.get("admin1"), r0.get("country")] if v])

    # 2) Current weather
    wx_params = {
        "latitude": latitude,
        "longitude": longitude,
        "timezone": timezone,
        "current": (
            "temperature_2m,apparent_temperature,relative_humidity_2m,"
            "weather_code,wind_speed_10m,wind_direction_10m,is_day,precipitation"
        ),
    }
    wr = requests.get("https://api.open-meteo.com/v1/forecast", params=wx_params, timeout=60)
    wr.raise_for_status()
    data = wr.json()

    current = data.get("current") or data.get("current_weather") or {}
    temp = current.get("temperature_2m")
    app = current.get("apparent_temperature")
    wind = current.get("wind_speed_10m")
    rh = current.get("relative_humidity_2m")

    parts = [f"Weather for {display}"]
    if temp is not None:
        parts.append(f"temp {temp}°C")
    if app is not None:
        parts.append(f"feels like {app}°C")
    if rh is not None:
        parts.append(f"RH {rh}%")
    if wind is not None:
        parts.append(f"wind {wind} km/h")

    return {"weather": ", ".join(parts)}

Why the Agent Chat Protocol is useful

  • Natural language first: users can speak or type naturally; your agent wraps messages in a standard structure
  • Interoperability: any agent implementing the protocol can communicate, regardless of internal implementation
  • Extensible: add client protocols (like structured output) to connect to specialized agents
  • Reliability: acknowledgements and session controls help build robust, user-friendly experiences

Tip: You can also use this agent through ASI:One by mentioning it directly in your prompt, for example:

@agent1qde95qr0dzcnhhs8f65hkwujn9mh89jx0u7u7g6nv3tm2jxvjwhkunvessq please get me the weather for San Francisco.

To try a live conversation experience, visit the Example Weather Agent on Agentverse.
