---

title: Concepts
subtitle: Runtime architecture and contract boundaries
slug: npm/concepts
---

## Core Runtime Model

The SDK is implemented as a set of Redux Toolkit slices and thunks.

* State: `npc`, `directive`, `memory`, `cortex`, `bridge`, `ghost`, `soul`
* API transport: `sdkApi` RTK Query endpoints
* Protocol execution: `processNPC` thunk
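The slices above compose into one root state. The field contents below are illustrative assumptions, not the SDK's actual slice shapes; only the slice names come from the list above:

```typescript
// Hypothetical root state assembled from the seven slices listed above.
// Field contents are assumptions for illustration only.
interface SDKState {
  npc: { id: string | null; persona: string | null };
  directive: { current: string | null };
  memory: { entries: unknown[] };
  cortex: { ready: boolean };
  bridge: Record<string, unknown>;
  ghost: Record<string, unknown>;
  soul: Record<string, unknown>;
}

const initialState: SDKState = {
  npc: { id: null, persona: null },
  directive: { current: null },
  memory: { entries: [] },
  cortex: { ready: false },
  bridge: {},
  ghost: {},
  soul: {},
};
```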

## Body vs Mind Boundary

* SDK = execution layer (the body)
* API = decision layer (the mind)

The SDK never synthesizes directives, constructs prompts, or performs final action validation on its own; those decisions belong to the API.

## Multi-Round Protocol (Implemented Sequence)

Primary path (current SDK runtime):

1. `POST /npcs/{npcId}/process` with `{ tape, lastResult }`
2. local instruction execution (`IdentifyActor`, `QueryVector`, `ExecuteInference`)
3. repeat `/process` until finalize instruction
4. local state/memory application from finalize payload
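The primary path is a loop: call `/process`, execute the returned instruction locally, and feed the result back until a finalize instruction arrives. The sketch below stubs the transport and local executors so the control flow is self-contained; instruction names follow the list above, but the response shape is an assumption:

```typescript
// Minimal sketch of the multi-round /process loop. `postProcess` is a
// stub standing in for POST /npcs/{npcId}/process with { tape, lastResult }.
type Instruction =
  | { op: 'IdentifyActor' }
  | { op: 'QueryVector'; query: string }
  | { op: 'ExecuteInference'; prompt: string }
  | { op: 'Finalize'; payload: unknown };

async function postProcess(round: number, _lastResult: unknown): Promise<Instruction> {
  // Scripted responses so the example runs without a live API.
  const script: Instruction[] = [
    { op: 'IdentifyActor' },
    { op: 'ExecuteInference', prompt: 'reply' },
    { op: 'Finalize', payload: { state: 'done' } },
  ];
  return script[round];
}

async function runProtocol(): Promise<{ rounds: number; payload: unknown }> {
  let lastResult: unknown = null;
  for (let round = 0; ; round++) {
    const instruction = await postProcess(round, lastResult);
    switch (instruction.op) {
      case 'Finalize':
        // Step 4: apply state/memory from the finalize payload, then stop.
        return { rounds: round + 1, payload: instruction.payload };
      case 'IdentifyActor':
        lastResult = { actor: 'player' };    // local instruction execution
        break;
      case 'QueryVector':
        lastResult = [];                     // local memory recall
        break;
      case 'ExecuteInference':
        lastResult = 'completion text';      // local cortex inference
        break;
    }
  }
}
```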

Compatibility path (also supported by API):

1. `POST /npcs/{npcId}/directive`
2. local memory recall (if instructed)
3. `POST /npcs/{npcId}/context`
4. local completion (`ICortex.complete`)
5. `POST /npcs/{npcId}/verdict`
6. local state/memory application
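The compatibility path is a fixed sequence rather than a loop. The sketch below records the ordering with stubbed API calls; `ICortex` is reduced to just its `complete` method, and the endpoint labels mirror the numbered steps above:

```typescript
// Sketch of the compatibility path. `api` stands in for the three
// POST /npcs/{npcId}/... calls; local steps are recorded inline.
interface ICortexLike {
  complete(prompt: string): Promise<string>;
}

async function runCompatPath(cortex: ICortexLike): Promise<string[]> {
  const steps: string[] = [];
  const api = async (endpoint: string) => { steps.push(endpoint); };

  await api('directive');                 // 1. POST .../directive
  steps.push('memory-recall');            // 2. local memory recall (if instructed)
  await api('context');                   // 3. POST .../context
  await cortex.complete('prompt');        // 4. local completion
  steps.push('complete');
  await api('verdict');                   // 5. POST .../verdict
  steps.push('apply');                    // 6. local state/memory application
  return steps;
}
```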

## Required Inputs for `processNPC`

* `apiUrl`
* `text`
* `cortex` (local runtime cortex)
* `npcId` or active NPC in state
* `persona` (passed explicitly, or taken from the active NPC)

If API instructions require memory operations, a memory engine must be provided.
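A guard over these inputs might look like the following. This is a hypothetical sketch mirroring the list above; `processNPC`'s actual validation may differ:

```typescript
// Hypothetical pre-flight check for processNPC arguments. Names mirror
// the required-inputs list; the function itself is an illustration.
interface ProcessArgs {
  apiUrl?: string;
  text?: string;
  cortex?: unknown;
  npcId?: string;
  persona?: string;
}

interface ActiveNPC {
  id: string;
  persona: string;
}

function validateArgs(args: ProcessArgs, activeNPC: ActiveNPC | null): string[] {
  const missing: string[] = [];
  if (!args.apiUrl) missing.push('apiUrl');
  if (!args.text) missing.push('text');
  if (!args.cortex) missing.push('cortex');
  // npcId and persona may come from the argument or from the active NPC in state.
  if (!args.npcId && !activeNPC) missing.push('npcId');
  if (!args.persona && !activeNPC?.persona) missing.push('persona');
  return missing;
}
```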

## Node vs Browser Runtimes

Both `@forbocai/node` and `@forbocai/browser` provide the same protocol contract — they differ only in how local side-effects are executed.

**Node** (`@forbocai/node`):

* Cortex: `node-llama-cpp` running GGUF models on CPU/GPU
* Memory: LanceDB file-based vector DB (persists across process restarts)
* CLI: `npx forbocai <command>` for terminal workflows

**Browser** (`@forbocai/browser`):

* Cortex: WebLLM running in a WebGPU-capable browser tab
* Memory: Orama in-memory vector search (ephemeral per session)
* No CLI

Both environments execute the same multi-round protocol against the same API. The SDK is the "body" — it runs inference and stores vectors locally regardless of platform.
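Because the contract is identical, picking a runtime reduces to environment detection. The sketch below is an assumption about how an application might branch; the backend descriptions come from the lists above, but the detection helper is not part of the SDK:

```typescript
// Hypothetical runtime selection. The detection logic is an assumption;
// only the backend names are taken from the documentation above.
type Runtime = 'node' | 'browser';

function detectRuntime(): Runtime {
  // globalThis avoids referencing `window` directly in a Node build.
  return typeof (globalThis as { window?: unknown }).window === 'undefined'
    ? 'node'
    : 'browser';
}

// Both runtimes satisfy the same local-execution contract; only the
// backing implementations differ.
interface LocalRuntime {
  cortexBackend: string;
  memoryBackend: string;
  persistent: boolean;
}

function describeRuntime(rt: Runtime): LocalRuntime {
  return rt === 'node'
    ? { cortexBackend: 'node-llama-cpp (GGUF)', memoryBackend: 'LanceDB', persistent: true }
    : { cortexBackend: 'WebLLM (WebGPU)', memoryBackend: 'Orama', persistent: false };
}
```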

## Store-First Integration Pattern

```typescript
import { createSDKStore, setNPCInfo, processNPC } from '@forbocai/core';
import { createNodeCortex } from '@forbocai/node';

const store = createSDKStore();
store.dispatch(setNPCInfo({ id: 'npc_1', persona: 'Guard captain' }));

const cortex = createNodeCortex('smollm2-135m');
await cortex.init();

await store.dispatch(processNPC({
  npcId: 'npc_1',
  text: 'State your business.',
  apiUrl: 'https://api.forboc.ai',
  apiKey: process.env.FORBOCAI_API_KEY,
  cortex
})).unwrap();
```
