API REFERENCE v1.0

API Documentation

The LongMemory API allows you to add infinite semantic memory to your AI agents. We provide a unified layer that handles vector embeddings, storage, and retrieval automatically.

Base URL: https://api.longmemory.io/v4
Version: v1.0

Authentication

Authentication is handled via Bearer tokens. You can obtain your API key from the dashboard. Always keep your keys secure and never expose them in client-side code.

Python SDK

The client automatically looks for this environment variable.

```bash
# In your terminal or .env file
export LONGMEMORY_API_KEY="sk_live_12345..."
```
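The key-resolution order can be illustrated with a short sketch. This is hypothetical glue code, not part of the SDK: it mimics the common convention that an explicitly passed key takes precedence over the `LONGMEMORY_API_KEY` environment variable.

```python
import os

# Placeholder key for illustration only
os.environ["LONGMEMORY_API_KEY"] = "sk_live_12345..."

def resolve_api_key(explicit_key=None):
    """Mimic the lookup order an SDK client typically uses:
    an explicitly passed key wins, otherwise fall back to the env var."""
    key = explicit_key or os.getenv("LONGMEMORY_API_KEY")
    if key is None:
        raise RuntimeError(
            "No API key: set LONGMEMORY_API_KEY or pass one explicitly"
        )
    return key

print(resolve_api_key())           # falls back to the environment variable
print(resolve_api_key("sk_test"))  # an explicit key takes precedence
```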

POST

/ingest

Stores a new memory chunk. The system automatically processes the text, generates vector embeddings using our SOTA models, and indexes it for low latency retrieval.

Parameters

| Property | Type | Description |
| --- | --- | --- |
| `user_id` (required) | string | Unique identifier for the user or agent owner. |
| `text` (required) | string | The conversation content or fact to be stored. |
| `speaker` (optional) | string | Name of the speaker (e.g., 'Bob', 'Alice'). |
| `created_at` (optional) | iso8601 | Timestamp for historical backfilling (e.g., '2023-10-01T12:00:00Z'). |
Python Client
```python
# Assumes `client` is an initialized LongMemory instance
client.add(
    user_id="user_123",
    text="My favorite food is Pizza.",
    speaker="Bob"
)
```
cURL Request
```bash
curl -X POST https://api.longmemory.io/v4/ingest \
  -H "Authorization: Bearer sk_live_..." \
  -H "Content-Type: application/json" \
  -d '{
    "user_id": "user_123",
    "text": "My favorite food is Pizza.",
    "speaker": "Bob"
  }'
```
Response
```json
{
  "id": "user_123...",
  "status": "success",
  "token_usage": 12
}
```

POST

/query

Retrieves the stored memories most relevant to a query string.

Parameters

| Property | Type | Description |
| --- | --- | --- |
| `user_id` (required) | string | Unique identifier for the user. |
| `query` (required) | string | The question or statement to retrieve context for. |
Python Client
```python
# Retrieve the memories for Bob
results = client.get(
    user_id="user_123",
    query="What is Bob's favorite food?"
)
```
cURL Request
```bash
curl -X POST https://api.longmemory.io/v4/query \
  -H "Authorization: Bearer sk_live_..." \
  -H "Content-Type: application/json" \
  -d '{
    "user_id": "user_123",
    "query": "What is Bob'\''s favorite food?"
  }'
```
Response
```json
{
  "matches": [
    {
      "text": "[Bob] My favorite food is Pizza."
    }
  ]
}
```
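A common pattern is to inject the retrieved `matches` into an LLM prompt yourself. The snippet below is illustrative glue code under that assumption (the response schema is taken from the docs above; the prompt format is hypothetical):

```python
def format_context(matches):
    """Join retrieved memory texts into a single context block
    for prompting (illustrative helper, not part of the SDK)."""
    return "\n".join(m["text"] for m in matches)

# Example /query response, as documented above
response = {"matches": [{"text": "[Bob] My favorite food is Pizza."}]}

context = format_context(response["matches"])
prompt = f"Context:\n{context}\n\nQuestion: What is Bob's favorite food?"
print(prompt)
```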

POST

/ask (Coming soon)

Let LongMemory handle the LLM generation for you. We securely proxy the request, retrieve the relevant context, and prompt the LLM of your choice in one step.

Parameters

| Property | Type | Description |
| --- | --- | --- |
| `user_id` (required) | string | Unique identifier for the user. |
| `query` (required) | string | The question to ask the agent. |
| `llm_model` (required) | string | The model ID (e.g., 'gpt-4o', 'claude-3'). |
| `llm_api_key` (required) | string | Your LLM provider's API key (we do not store this). |
Python Client
```python
# 1. Initialize with your LLM keys
client = LongMemory(
    api_key="lm_live_...",
    llm_api_key="sk-openai-...",
    llm_model="gpt-4o"
)

# 2. Ask the agent directly
response = client.ask(
    user_id="user_123",
    query="What is the name of Alice's pet?"
)
```
cURL Request
```bash
curl -X POST https://api.longmemory.io/v4/ask \
  -H "Authorization: Bearer sk_live_..." \
  -H "Content-Type: application/json" \
  -d '{
    "user_id": "user_123",
    "query": "What is the name of Alice'\''s pet?",
    "llm_model": "gpt-4o",
    "llm_api_key": "sk-openai-..."
  }'
```
Response
```json
{
  "answer": "Alice's pet is named Fluffy.",
  "context_used": [
    "[Alice] I have a cat named Fluffy."
  ]
}
```