Chat with Agent (Streaming)

Start a streaming chat conversation with an agent. Unlike the Send Message endpoint, this endpoint streams the response in real time, providing a more interactive experience.

Endpoint

POST /api/v1/agents/chat/{agentId}

Authentication

This endpoint requires API key authentication. Include your API key in the Authorization header using the Bearer scheme.

Header Format:

Authorization: Bearer {PUBLIC_KEY}:{SECRET_KEY}
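The credential is the public and secret key joined with a colon. A minimal sketch of assembling that header (the key values below are placeholders, not real credentials):

```python
def auth_header(public_key: str, secret_key: str) -> dict:
    """Return headers carrying the Bearer credential pair."""
    return {"Authorization": f"Bearer {public_key}:{secret_key}"}

headers = auth_header("pk_1234567890abcdef", "sk_abcdef1234567890")
print(headers["Authorization"])
# Bearer pk_1234567890abcdef:sk_abcdef1234567890
```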

Request

Path Parameters

Parameter | Type | Required | Description
agentId | string (UUID) | Yes | The unique identifier of the agent

Headers

Header | Type | Required | Description
Authorization | string | Yes | Your API key in the format Bearer pk_xxx:sk_xxx
Content-Type | string | Yes | Must be application/json
X-Agent-Identifier | string | No | External user identifier for conversation tracking

Request Body

The request body should contain a messages array in the format expected by the Vercel AI SDK.

{
  "messages": [
    {
      "role": "user",
      "content": "What are your business hours?"
    }
  ]
}

Response

Success Response

Status Code: 200 OK

The response is streamed using Server-Sent Events (SSE). The content arrives in chunks as the agent generates its response.

Content-Type: text/event-stream
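The exact chunk format on the wire depends on the Vercel AI SDK's stream protocol; as a generic illustration, `text/event-stream` payloads use `data:`-prefixed lines separated by blank lines, which can be parsed like this:

```python
def parse_sse(raw: str) -> list:
    """Extract the payload of each 'data:' line from an SSE-style stream."""
    events = []
    for line in raw.splitlines():
        if line.startswith("data:"):
            events.append(line[len("data:"):].strip())
    return events

sample = "data: Hello\n\ndata: world\n\n"
print(parse_sse(sample))  # ['Hello', 'world']
```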

Error Responses

401 Unauthorized

Status: 401 Unauthorized
Body: "Unauthorized"

402 Payment Required

Returned when the organization has no active subscription or no remaining credits.

404 Not Found

Status: 404 Not Found
Body: "Agent not found"

500 Internal Server Error

Status: 500 Internal Server Error
Body: "Internal server error"
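Before consuming the stream, a client should check the status code against the cases above. A sketch of mapping them to actions (the messages here are illustrative, not part of the API):

```python
# Client-side guidance for each documented status code.
ERROR_ACTIONS = {
    401: "Check the Authorization header format (Bearer pk_xxx:sk_xxx).",
    402: "No active subscription or remaining credits; top up before retrying.",
    404: "Verify the agentId path parameter.",
    500: "Server-side failure; retry with backoff.",
}

def describe_error(status: int) -> str:
    """Return a human-readable action for a non-200 response."""
    return ERROR_ACTIONS.get(status, f"Unexpected status {status}")
```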

Example Usage

cURL

curl -X POST https://agentsgt.com/api/v1/agents/chat/550e8400-e29b-41d4-a716-446655440000 \
  -H "Authorization: Bearer pk_1234567890abcdef:sk_abcdef1234567890" \
  -H "Content-Type: application/json" \
  -H "X-Agent-Identifier: user-123" \
  -d '{
    "messages": [
      {
        "role": "user",
        "content": "What are your business hours?"
      }
    ]
  }'

JavaScript (Fetch API)

const publicKey = 'pk_1234567890abcdef';
const secretKey = 'sk_abcdef1234567890';
const agentId = '550e8400-e29b-41d4-a716-446655440000';

const response = await fetch(
  `https://agentsgt.com/api/v1/agents/chat/${agentId}`,
  {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${publicKey}:${secretKey}`,
      'Content-Type': 'application/json',
      'X-Agent-Identifier': 'user-123'
    },
    body: JSON.stringify({
      messages: [
        { role: 'user', content: 'What are your business hours?' }
      ]
    })
  }
);

// Read the streamed body chunk by chunk as the agent responds
const reader = response.body.getReader();
const decoder = new TextDecoder();

while (true) {
  const { done, value } = await reader.read();
  if (done) break;

  const chunk = decoder.decode(value);
  process.stdout.write(chunk); // In a browser, append to the UI instead
}

Python (Requests)

import requests

public_key = 'pk_1234567890abcdef'
secret_key = 'sk_abcdef1234567890'
agent_id = '550e8400-e29b-41d4-a716-446655440000'

headers = {
    'Authorization': f'Bearer {public_key}:{secret_key}',
    'Content-Type': 'application/json',
    'X-Agent-Identifier': 'user-123'
}

response = requests.post(
    f'https://agentsgt.com/api/v1/agents/chat/{agent_id}',
    headers=headers,
    json={
        'messages': [
            {'role': 'user', 'content': 'What are your business hours?'}
        ]
    },
    stream=True
)

# Print chunks as they arrive instead of waiting for the full response
for chunk in response.iter_content(chunk_size=None, decode_unicode=True):
    print(chunk, end='', flush=True)

Node.js (Axios)

const axios = require('axios');

const publicKey = 'pk_1234567890abcdef';
const secretKey = 'sk_abcdef1234567890';
const agentId = '550e8400-e29b-41d4-a716-446655440000';

// CommonJS modules do not support top-level await, so wrap the call
async function main() {
  const response = await axios.post(
    `https://agentsgt.com/api/v1/agents/chat/${agentId}`,
    {
      messages: [
        { role: 'user', content: 'What are your business hours?' }
      ]
    },
    {
      headers: {
        'Authorization': `Bearer ${publicKey}:${secretKey}`,
        'Content-Type': 'application/json',
        'X-Agent-Identifier': 'user-123'
      },
      responseType: 'stream'
    }
  );

  response.data.on('data', (chunk) => {
    process.stdout.write(chunk.toString());
  });

  response.data.on('end', () => {
    console.log('\nStream ended');
  });
}

main();

Notes

  • This endpoint streams the response in real time, which provides a better user experience for chat interfaces
  • The X-Agent-Identifier header is optional but recommended for tracking conversations per user
  • Each message consumes credits based on the agent's model and token usage
  • The agent checks for an active subscription and sufficient credits before processing
  • Chat history is automatically logged for later retrieval via the Chat History endpoints
  • For simple request-response interactions without streaming, use the Send Message endpoint instead