Developer API

REST + WebSocket reference for SynapCores v1.5.0-ce. Works with any HTTP client in any language, with cURL, Node, Python, and PHP samples for each endpoint.

Before you start

  • All endpoints below are under /v1. Replace localhost:8080 with your own host:port.
  • Auth: JWT via Authorization: Bearer … or API key via X-API-Key: aidb_sk_….
  • Full machine-readable spec at /v1/openapi.json on every install — feed it to any OpenAPI client generator.
  • Don't have a CE instance yet? Download the free Community Edition.

1. Authentication

Every request needs either a JWT (from /auth/login) or an API key (X-API-Key header). API keys with prefix ak_ or aidb_ are accepted; the prefix tells the server which key store to look up.

POST /v1/auth/login

Returns a JWT. If MFA is enabled, the response includes mfa_required and you complete login by exchanging the one-time code via /v1/auth/mfa/verify.

curl -s -X POST http://localhost:8080/v1/auth/login \
  -H "Content-Type: application/json" \
  -d '{"username":"admin","password":"changeme"}'
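The two-leg MFA flow can be sketched in Python. This is a minimal sketch over an injected `post(path, body)` callable; the field names `mfa_token` and `totp_code` are assumptions, so confirm the exact shapes against /v1/openapi.json.

```python
def login(post, username, password, totp_code=None):
    """Log in, completing the MFA exchange when the server requests it.

    `post(path, body)` is any callable that POSTs JSON and returns the
    decoded JSON response. The `mfa_token` and `totp_code` field names
    are assumptions -- check /v1/openapi.json for your install.
    """
    resp = post("/v1/auth/login", {"username": username, "password": password})
    if resp.get("mfa_required"):
        if totp_code is None:
            raise ValueError("MFA is enabled for this account; a TOTP code is required")
        # Second leg: trade the MFA challenge plus the one-time code for a JWT.
        resp = post("/v1/auth/mfa/verify",
                    {"mfa_token": resp["mfa_token"], "totp_code": totp_code})
    return resp["token"]
```

Injecting `post` keeps the sketch transport-agnostic: wire it to `requests`, `httpx`, or a test stub.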

GET /v1/users/me

Both auth schemes shown. Pick one — JWT for interactive sessions, API key for service-to-service.

# JWT
curl -s http://localhost:8080/v1/users/me -H "Authorization: Bearer $TOKEN"

# API key (preferred for backend services)
curl -s http://localhost:8080/v1/users/me -H "X-API-Key: aidb_sk_..."

2. SQL execution

Run parameterised SQL. Placeholders are PostgreSQL-style ($1, $2, $3) in the SQL string, with parameters passed as a positional array. Returns column metadata plus rows.

POST /v1/query/execute

curl -s -X POST http://localhost:8080/v1/query/execute \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "sql": "SELECT name, price FROM products WHERE category = $1 LIMIT 5",
    "parameters": ["Electronics"]
  }'
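Because placeholders are positional, a mismatch between the highest `$n` in the SQL and the length of the parameters array only surfaces at the server. A naive client-side check (it does not skip `$n` inside string literals) can catch this before the round trip; a sketch:

```python
import re

def check_params(sql, parameters):
    """Verify the positional parameter list covers every $n placeholder.

    Client-side convenience only -- the server performs its own validation.
    Naive: a $n occurring inside a string literal would also be counted.
    """
    placeholders = {int(m) for m in re.findall(r"\$(\d+)", sql)}
    highest = max(placeholders, default=0)
    if len(parameters) < highest:
        raise ValueError(f"SQL references ${highest} but only "
                         f"{len(parameters)} parameter(s) were supplied")
    return True
```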

POST /v1/query/execute/batch

Multiple SQL statements in one round trip. Each statement runs in its own implicit transaction unless explicitly wrapped.

curl -s -X POST http://localhost:8080/v1/query/execute/batch \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "statements": [
      {"sql":"INSERT INTO logs (msg) VALUES ($1)","parameters":["hello"]},
      {"sql":"INSERT INTO logs (msg) VALUES ($1)","parameters":["world"]}
    ]
  }'

3. Natural-language to SQL

Send a plain-English question, get back an executed query with rows. Schema-aware: the planner injects your live table catalog into the LLM prompt. Falls back to a deterministic pattern matcher when no LLM provider is configured.

POST /v1/nl2sql/query

curl -s -X POST http://localhost:8080/v1/nl2sql/query \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "question": "Show me top 5 products by revenue last quarter",
    "execute": true
  }'

4. Document collections

Schema-flexible JSON document stores with built-in vector + full-text indexing. Drop a JSON blob, add a vector field, do similarity search inline.

POST /v1/collections

Create a collection with typed fields and indexes.

curl -s -X POST http://localhost:8080/v1/collections \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "products",
    "schema": {
      "fields": [
        {"name":"name","type":"string","required":true},
        {"name":"description","type":"string"},
        {"name":"price","type":"number"},
        {"name":"category","type":"string"},
        {"name":"embedding","type":"vector"}
      ],
      "indexes": [
        {"name":"category_idx","fields":["category"],"type":"btree"},
        {"name":"vector_idx","fields":["embedding"],"type":"vector"}
      ]
    }
  }'

POST /v1/collections/products/documents

Insert documents. Use /documents/bulk for batched ingest.

curl -s -X POST http://localhost:8080/v1/collections/products/documents \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Wireless Mouse",
    "description": "Ergonomic wireless mouse",
    "price": 29.99,
    "category": "Electronics"
  }'
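For batched ingest via /documents/bulk, chunking client-side keeps request bodies bounded. A sketch, assuming a `{"documents": [...]}` envelope (check /v1/openapi.json for the exact bulk request shape):

```python
def bulk_batches(documents, batch_size=500):
    """Split a document list into request bodies for /documents/bulk.

    The {"documents": [...]} envelope is an assumption -- confirm it
    against /v1/openapi.json before relying on it.
    """
    if batch_size < 1:
        raise ValueError("batch_size must be >= 1")
    # One request body per slice; the final batch may be short.
    return [{"documents": documents[i:i + batch_size]}
            for i in range(0, len(documents), batch_size)]
```

POST each returned body to /v1/collections/products/documents/bulk in turn.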

POST /v1/collections/products/search

Hybrid search: vector similarity + structured filters in one call.

curl -s -X POST http://localhost:8080/v1/collections/products/search \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "ergonomic wireless input device",
    "filters": {"category": "Electronics"},
    "limit": 10
  }'

5. Vector collections + similarity search

Standalone vector store backed by HNSW. For pure-vector workloads (semantic search over embeddings, recommendation, RAG retrieval) without the document overhead.

POST /v1/vectors/collections

Create a vector collection.

curl -s -X POST http://localhost:8080/v1/vectors/collections \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"name":"text_embeddings","dimensions":384,"metric":"cosine"}'

POST /v1/vectors/collections/text_embeddings/vectors

Insert vectors with optional metadata. Use /vectors/batch for bulk.

curl -s -X POST http://localhost:8080/v1/vectors/collections/text_embeddings/vectors \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "id": "doc-42",
    "vector": [0.013, -0.214, 0.087, ...],
    "metadata": {"source": "kb-article-42", "lang": "en"}
  }'

POST /v1/vectors/collections/text_embeddings/search

k-NN search with optional metadata filter and pre-filter / post-filter modes.

curl -s -X POST http://localhost:8080/v1/vectors/collections/text_embeddings/search \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "vector": [0.013, -0.214, 0.087, ...],
    "limit": 10,
    "filter": {"lang": "en"}
  }'
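The collection above was created with `"metric":"cosine"`, so scores are cosine similarities. To sanity-check a returned score locally against two embeddings, a minimal sketch:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity: dot(a, b) / (|a| * |b|), in [-1, 1].

    This is the metric a collection created with "metric":"cosine" uses,
    so it can reproduce search scores from raw embeddings locally.
    """
    if len(a) != len(b):
        raise ValueError("vectors must share dimensionality")
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    if norm == 0.0:
        raise ValueError("cosine similarity is undefined for zero vectors")
    return dot / norm
```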

6. Graph + Cypher

Run Cypher queries against the property graph. Mix structural patterns, vector hops via SIMILAR_TO, and LLM-graded scoring with llm_score() in a single MATCH.

POST /v1/graph/match

Execute a Cypher query. Parameters use $name placeholders.

curl -s -X POST http://localhost:8080/v1/graph/match \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "cypher": "MATCH (seed:Complaint {id:$id})-[:SIMILAR_TO > 0.85]->(c:Complaint) MATCH (c)-[:DESCRIBES]->(p:Product)<-[:SUPPLIES]-(s:Supplier) WHERE s.audited_year = 2025 RETURN s.name, p.name, c.id LIMIT 50",
    "parameters": {"id": "CX-9001"}
  }'

POST /v1/graph/extract

LLM-driven knowledge-graph extraction — send unstructured text, get back nodes and edges.

curl -s -X POST http://localhost:8080/v1/graph/extract \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "text": "In Q3 our CFO Susan Wong joined the call with CEO Tom Chen and confirmed Acme Corp acquired RegionTech for $40M.",
    "graph": "earnings_calls"
  }'

POST /v1/graph/algorithms

Run PageRank, Louvain (community detection), Label Propagation, or Triangle Count via CALL.

curl -s -X POST http://localhost:8080/v1/graph/algorithms \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "algorithm": "page_rank",
    "graph": "social",
    "options": {"max_iterations": 20, "damping_factor": 0.85}
  }'

7. AI Chat

Multi-turn conversational interface. Bundled local LLM (Llama-3.2-1B GGUF) is the default; configurable to OpenAI / Anthropic / Ollama / Cohere / HuggingFace.

POST /v1/ai/sessions

Create a chat session, get back a session_id you reuse for all messages.

curl -s -X POST http://localhost:8080/v1/ai/sessions \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"system_prompt":"sales-coach","model":"native"}'

POST /v1/ai/chat

Send a message, get a non-streaming reply.

curl -s -X POST http://localhost:8080/v1/ai/chat \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "session_id": "sess_abc",
    "content": "Summarize the sales-pipeline movement this week"
  }'

POST /v1/ai/chat/stream

Server-Sent Events (SSE) streaming. Tokens arrive incrementally; flush them straight to your UI.

curl -N -X POST http://localhost:8080/v1/ai/chat/stream \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"session_id":"sess_abc","content":"Walk me through ..."}'
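SSE frames are newline-delimited `data:` lines. A minimal parser for the stream body, assuming one JSON payload per `data:` line and a `[DONE]` terminator (both common conventions for token streams, but assumptions here; check the actual frames your install emits):

```python
import json

def parse_sse(stream_lines):
    """Yield decoded JSON payloads from Server-Sent Events `data:` lines.

    Assumes one JSON object per `data:` line; blank lines, comments
    (lines starting with ':'), and a trailing `data: [DONE]` sentinel
    are skipped. Both conventions are assumptions about this stream.
    """
    for line in stream_lines:
        line = line.strip()
        if line.startswith("data:"):
            payload = line[len("data:"):].strip()
            if payload and payload != "[DONE]":
                yield json.loads(payload)
```

Feed it `response.iter_lines()` (decoded to str) from any streaming HTTP client.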

8. Embeddings + NLP

The bundled MiniLM model serves embeddings in-process; no external API key needed. Sentiment, entity-extraction, summarization, classification, and question-answering helpers are also included.

POST /v1/ai/embeddings

Single text in, vector out. Use /v1/ai/embeddings/batch for arrays.

curl -s -X POST http://localhost:8080/v1/ai/embeddings \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"text":"wireless ergonomic mouse"}'

POST /v1/ai/sentiment | /entities | /summarize | /classify | /qa

NLP helpers — same request envelope, different endpoint per task.

# sentiment
curl -s -X POST http://localhost:8080/v1/ai/sentiment \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"text":"the support was slow and the product never worked"}'

# summarize
curl -s -X POST http://localhost:8080/v1/ai/summarize \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"text":"...long article...","max_length":120}'

9. Recipes

161 hand-curated recipes ship inside every CE binary. List, fetch, and execute them via REST, binding your own parameters at execution time.

GET /v1/recipes

curl -s http://localhost:8080/v1/recipes -H "X-API-Key: $AIDB_API_KEY"

POST /v1/recipes/:id/execute

Run a recipe by id, supplying any required parameters. Returns the SQL/Cypher that ran plus the result rows.

curl -s -X POST http://localhost:8080/v1/recipes/016_graphrag_qna/execute \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"parameters":{"id":"CX-9001"}}'

10. Filesystem Collections

Drop a file into a watched folder and SynapCores automatically chunks + embeds + OCRs + transcribes. Manage collections via REST and stream live progress over WebSocket.

POST /v1/filesystem-collections

Create a watched folder. Returns a collection id and the inferred ingest plan.

curl -s -X POST http://localhost:8080/v1/filesystem-collections \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "research_papers",
    "watch_dir": "/data/papers",
    "embed_text": true,
    "ocr_images": true
  }'

GET /ws/filesystem-collections/:id/progress

WebSocket — live ingest events. Auth via short-lived ticket from POST /v1/ws/ticket.

// 1. exchange JWT/API key for a ticket
const { token } = await fetch('http://localhost:8080/v1/ws/ticket', {
  method: 'POST',
  headers: { 'X-API-Key': key },
}).then((r) => r.json());

// 2. open the websocket with the ticket on the URL
const ws = new WebSocket(
  `ws://localhost:8080/ws/filesystem-collections/${id}/progress?token=${token}`,
);
ws.onmessage = (e) => console.log(JSON.parse(e.data));

11. Multimodal

Cross-modal similarity, search, and embeddings — text↔image↔audio. Configurable to OpenAI GPT-4o, Anthropic Claude, or local Ollama LLaVA via /v1/system/vision.

POST /v1/multimodal/search

Cross-modal search — query in any modality, results in any modality.

curl -s -X POST http://localhost:8080/v1/multimodal/search \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "query": {"text": "a red sports car at sunset"},
    "modalities": ["image"],
    "limit": 10
  }'

POST /v1/multimodal/embed

Get an embedding for a text/image/audio/video input.

curl -s -X POST http://localhost:8080/v1/multimodal/embed \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"input":{"image_url":"https://..."}, "model":"clip-vit-base"}'

12. AutoML

Train and deploy ML models without writing Python. Submit a dataset, specify the target column, get back a trained model behind a /predict endpoint.

POST /v1/automl/train

Kick off a training job. Returns a job id you poll via /v1/automl/jobs/:id.

curl -s -X POST http://localhost:8080/v1/automl/train \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "dataset_id": "ds_churn_q1",
    "task": "binary_classification",
    "target_column": "churned",
    "max_trials": 50,
    "algorithms": ["random_forest","xgboost","neural_network"]
  }'
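Polling /v1/automl/jobs/:id until the job settles is the usual next step. A sketch with injected `get_job` and `sleep` callables so the loop is testable; the `"completed"` / `"failed"` status values are assumptions about the job envelope:

```python
def wait_for_job(get_job, sleep, job_id, interval=2.0, timeout=600.0):
    """Poll /v1/automl/jobs/:id until the job leaves its running states.

    `get_job(job_id)` returns the decoded job JSON; `sleep(seconds)` is
    injected (e.g. time.sleep) so tests can run instantly. The terminal
    status values "completed"/"failed" are assumptions about the envelope.
    """
    waited = 0.0
    while waited < timeout:
        job = get_job(job_id)
        if job.get("status") in ("completed", "failed"):
            return job
        sleep(interval)
        waited += interval
    raise TimeoutError(f"job {job_id} still running after {timeout}s")
```

In production, pass `time.sleep` and consider widening `interval` for long-running trials.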

POST /v1/automl/models/:id/predict

Score new examples against a trained model.

curl -s -X POST http://localhost:8080/v1/automl/models/m_churn_v1/predict \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "rows": [
      {"age": 41, "tenure": 18, "monthly_charges": 79.5}
    ]
  }'

13. Transactions

ACID transactions over REST. Begin → execute → commit/rollback, with optional named savepoints.

POST /v1/transactions

Begin a transaction, get back a transaction id.

# 1. begin
TX=$(curl -s -X POST http://localhost:8080/v1/transactions \
  -H "X-API-Key: $AIDB_API_KEY" | jq -r .id)

# 2. execute one or more statements in the tx
curl -s -X POST http://localhost:8080/v1/transactions/$TX/execute \
  -H "X-API-Key: $AIDB_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"sql":"UPDATE accounts SET balance = balance - $1 WHERE id = $2","parameters":[100,"acc_1"]}'

# 3. commit
curl -s -X POST http://localhost:8080/v1/transactions/$TX/commit \
  -H "X-API-Key: $AIDB_API_KEY"
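The begin → execute → commit/rollback shape maps naturally onto a context manager. A sketch over an injected `post(path, body=None)` callable; the /commit path comes from this section, while the mirrored /rollback path is an assumption:

```python
from contextlib import contextmanager

@contextmanager
def transaction(post):
    """Run statements inside one transaction, rolling back on any error.

    `post(path, body=None)` POSTs JSON and returns the decoded response.
    The /rollback path mirrors /commit and is an assumption -- confirm
    it against /v1/openapi.json.
    """
    tx_id = post("/v1/transactions")["id"]
    try:
        # Hand the caller an execute() bound to this transaction id.
        yield lambda sql, parameters=(): post(
            f"/v1/transactions/{tx_id}/execute",
            {"sql": sql, "parameters": list(parameters)})
        post(f"/v1/transactions/{tx_id}/commit")
    except Exception:
        post(f"/v1/transactions/{tx_id}/rollback")
        raise
```

Any exception inside the `with` block triggers the rollback leg before propagating, so partial updates never commit.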

14. WebSocket

The WebSocket endpoint at /ws handles streaming SQL execution, AI chat streaming, and live subscriptions. Auth uses a short-lived ticket exchange — never put a long-lived JWT or API key directly on the WS URL. Get a ticket from POST /v1/ws/ticket, then connect with ?token=<ticket>.

// Browser
const { token } = await fetch('http://localhost:8080/v1/ws/ticket', {
  method: 'POST',
  headers: { 'X-API-Key': apiKey },
}).then((r) => r.json());

const ws = new WebSocket(`ws://localhost:8080/ws?token=${token}`);
ws.onmessage = (e) => console.log(JSON.parse(e.data));

// Send a streaming SQL query once the socket is open
// (sending before the 'open' event throws InvalidStateError)
ws.onopen = () => {
  ws.send(JSON.stringify({
    type: 'sql.execute.stream',
    sql: 'SELECT * FROM logs ORDER BY created_at DESC',
    parameters: [],
  }));
};

15. Errors + conventions

  • Status codes follow standard REST: 2xx success, 4xx user error, 5xx server error. 401 = bad/expired auth, 403 = authenticated but not authorized, 404 = not found, 409 = conflict (duplicate key), 429 = rate-limited.
  • Error envelope: { "error": { "code": "...", "message": "...", "details": {} } }
  • Pagination: limit + offset on list endpoints. Cursor-based pagination on hot paths returns next_cursor in the body.
  • Rate limits: per-tenant default 60 RPS, burst 120. Response headers include X-RateLimit-Remaining and X-RateLimit-Reset.
  • Idempotency: send Idempotency-Key header on POST/PUT to make retries safe.
  • Versioning: every endpoint is namespaced under /v1. Breaking changes ship under /v2 with a deprecation window.
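The limit + offset convention above can be wrapped in a generator that walks a list endpoint to exhaustion. A sketch over an injected `list_page(limit, offset)` callable (a short page signals the end; swap in cursor handling where an endpoint returns next_cursor):

```python
def paginate(list_page, limit=100):
    """Iterate a limit/offset list endpoint until a short page ends it.

    `list_page(limit, offset)` returns one page's items as a list.
    A page shorter than `limit` means there is nothing left to fetch.
    """
    offset = 0
    while True:
        page = list_page(limit, offset)
        yield from page
        if len(page) < limit:
            return
        offset += limit
```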

Prefer a typed driver?

Native SDKs are in active development. The Node.js, Python, and Rust SDKs catch up to the full v1.5.0-ce surface in v0.2.0; request early access from the contact form.