Neural Fabric · MCP

The shared memory layer
for every AI agent you use.

Every AI tool you use is isolated from the others. Neural Fabric is where structured knowledge lives: decisions, constraints, and entities written once, then queryable by any agent you connect, in any tool, at any time.

01 | Architecture

Who connects to what

Any MCP-compatible AI connects to one Fabric server. All agents read and write to the same knowledge graph: Claude, ChatGPT, Gemini, or any open-source model.

[Architecture diagram] AI agents (Claude via claude.ai and Claude Code, ChatGPT via OpenAI and GPT-4o, Gemini, and any other MCP client) read and write through the Qorbit MCP layer, which exposes 9 tools over Streamable HTTP: fabric_query, fabric_note, fabric_assert, fabric_ingest, fabric_search, fabric_task, fabric_thread, fabric_entities, and fabric_status. Behind it sits Neural Fabric: a project layer holding notes & threads, assertions & decisions, and ingested docs (81+ entities), plus an isolated, read-only production layer of business entities such as customers, orders, and integrations.
02 | How it works

One agent writes. Every agent knows

One agent captures a decision. A completely different agent, in a different tool, on a different day, queries it immediately: the source, the confidence, the timestamp.

[Sequence diagram] Session A (Claude.ai): the user says "capture this decision"; Claude.ai calls fabric_assert and Neural Fabric stores the assertion permanently. Knowledge persists across every agent. Session B (Composer): Composer calls fabric_query to read prior decisions, then calls fabric_task; the task is written back to Fabric, visible to all agents.
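The write-once, read-anywhere flow above can be sketched conceptually. `FabricStub` below is a hypothetical in-memory stand-in for the real Neural Fabric server, used only to illustrate the shared-memory pattern: one agent captures a decision with provenance, and a different agent retrieves it later.

```python
from dataclasses import dataclass, field

@dataclass
class FabricStub:
    """Hypothetical in-memory stand-in for Neural Fabric (illustration only)."""
    assertions: list = field(default_factory=list)

    def fabric_assert(self, claim: str, source: str, confidence: float):
        # Session A: an agent captures a decision with source and confidence.
        self.assertions.append(
            {"claim": claim, "source": source, "confidence": confidence}
        )

    def fabric_query(self, keyword: str):
        # Session B: a different agent retrieves matching decisions.
        return [a for a in self.assertions if keyword in a["claim"]]

fabric = FabricStub()  # one shared store for every agent

# Session A (e.g., Claude.ai) writes a decision.
fabric.fabric_assert(
    "Use Postgres for the order store", source="claude.ai", confidence=0.9
)

# Session B (e.g., Composer) reads it back, days later, in a different tool.
decisions = fabric.fabric_query("Postgres")
print(decisions[0]["source"])  # "claude.ai"
```

The real server also records timestamps and threads; the point here is only that both sessions talk to the same store.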
03 | Connect

Connect any MCP client

Neural Fabric is a standard MCP server. If your tool speaks MCP, it connects. Choose your path.

Cloud AI platforms
Claude.ai · ChatGPT · Gemini · Open WebUI
+ any cloud MCP client with OAuth
1
Copy your Fabric URL
From your Qorbit dashboard after signing up.
2
Open connector settings
In your AI tool (e.g., Claude.ai → Settings → Connectors → Add custom connector).
3
Paste URL, click Connect
Sign in with your Qorbit account when prompted. The 9 Fabric tools appear automatically.
No bearer token needed — secure OAuth handles authentication.
Cloud inference platforms
Fireworks AI, Groq, and vLLM handle the MCP connection server-side — add your Fabric URL in their dashboard. Run Kimi K2.5, DeepSeek V3, Qwen 3, Llama 4, or any hosted model with Fabric memory.
IDEs · Editors · Local runners
Cursor · Claude Code · VS Code Copilot · Windsurf · LM Studio · Ollama
+ Zed, Continue, Cline, Jan, llama.cpp, and any MCP client that reads a config file
1
Copy your Fabric URL + bearer token
From your Qorbit dashboard after signing up.
2
Add to your config file
Add the Fabric MCP server to your tool's config, replacing the URL and token with your credentials:
mcp.json / settings.json
{
  "mcpServers": {
    "qorbit-fabric": {
      "url": "https://your-fabric-url/api/mcp/fabric",
      "headers": {
        "Authorization": "Bearer YOUR_TOKEN"
      }
    }
  }
}
3
Restart your tool
The 9 Fabric tools appear automatically. Try fabric_status to confirm.
✓ Works with any MCP client that reads a JSON config file.
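If you want to verify the endpoint outside of any tool, you can send the standard JSON-RPC `initialize` request that every MCP client sends first. The payload shape follows the MCP specification; the URL, token, and `clientInfo` name below are placeholders.

```python
import json

def initialize_request(request_id=1):
    """Build a JSON-RPC 2.0 `initialize` request per the MCP specification."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": "fabric-smoke-test", "version": "0.1"},
        },
    }

body = json.dumps(initialize_request())
# POST `body` to https://your-fabric-url/api/mcp/fabric with headers:
#   Authorization: Bearer YOUR_TOKEN
#   Content-Type: application/json
#   Accept: application/json, text/event-stream  (required by Streamable HTTP)
```

A successful response confirms the server is reachable and your token is valid before you restart your editor.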
Frameworks · Custom agents
OpenClaw · LangChain · CrewAI · LlamaIndex · AutoGen
+ Haystack, Semantic Kernel, and any framework with MCP client support
1
Install the MCP client for your framework
Most frameworks have an MCP client package or built-in support.
2
Initialize with your Fabric URL + token
Point the MCP client at your Fabric endpoint:
Python
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def list_fabric_tools():
    async with streamablehttp_client(
        "https://your-fabric-url/api/mcp/fabric",
        headers={"Authorization": "Bearer YOUR_TOKEN"},
    ) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            return await session.list_tools()
TypeScript
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

const transport = new StreamableHTTPClientTransport(
  new URL("https://your-fabric-url/api/mcp/fabric"),
  { requestInit: { headers: { Authorization: "Bearer YOUR_TOKEN" } } }
);
3
Call tools from your agent
The 9 Fabric tools are available immediately. See Section 04 below for OpenClaw-specific setup.
✓ Works with any framework that has an MCP client.
These are examples — Neural Fabric works with any MCP-compatible tool.
Get your Fabric URL and bearer token
04 | OpenClaw

Connect OpenClaw

Give your agent memory that survives compaction. Add Neural Fabric as an MCP server and your OpenClaw agent gets 9 tools for persistent, structured knowledge.

1
Add to your config
Add the Fabric MCP server to your OpenClaw config.
~/.openclaw/openclaw.json
{
  "mcpServers": {
    "qorbit": {
      "type": "http",
      "url": "https://your-fabric-url/api/mcp/fabric",
      "headers": {
        "Authorization": "Bearer YOUR_TOKEN"
      }
    }
  }
}
2
Restart and try it
Restart your gateway and send a message: "Call fabric_status to check the connection." The 9 Fabric tools appear alongside your existing OpenClaw tools.
Terminal
$ openclaw gateway restart
Get your Fabric URL and bearer token
✓ OpenClaw connects via Streamable HTTP transport
05 | Tools

9 tools. Every client.

Connect any MCP client and these 9 tools appear automatically. Same tools in Claude.ai, ChatGPT, Cursor, OpenClaw, or any MCP-compatible agent.

fabric_note
Save a decision, constraint, or fact
fabric_query
Ask a question, get a grounded answer
fabric_search
Fast text search across everything stored
fabric_assert
Capture decisions with confidence scores
fabric_ingest
Store entire documents permanently
fabric_thread
Follow related notes across agents
fabric_entities
List what is in the graph
fabric_status
Check connection and entity counts
fabric_task
Assign work between agents
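Under the hood, every client invokes these tools with the same JSON-RPC `tools/call` message that MCP defines, which is why the 9 tools behave identically everywhere. A minimal sketch of the payload, assuming fabric_status takes no arguments; the `text` argument name for fabric_note is an assumption for illustration.

```python
def tools_call(name, arguments=None, request_id=2):
    """Build a JSON-RPC 2.0 `tools/call` request as defined by MCP."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments or {}},
    }

# Check the connection (no arguments).
status_call = tools_call("fabric_status")

# Save a fact; the argument name "text" is hypothetical, not the real schema.
note_call = tools_call("fabric_note", {"text": "Prefer UTC timestamps"})
```

Your MCP client builds these messages for you; the sketch only shows that the wire format is identical across Claude.ai, Cursor, OpenClaw, and any other client.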
Neural Fabric · MCP · 9 tools · Streamable HTTP + stdio transport · Any MCP-compatible client