Meechi Core
Meechi Core is the reference implementation of the Meechi Protocol. It is the raw cognitive engine, stripped of the Meechi App UI layer, designed to be the "white box" of your digital mind.
If you are a developer, this is where you can see exactly how the magic happens.
"Trust is not given. It is audited."
Open Source
I decided to open-source the core logic because of a simple principle: if you claim to be private, you must prove it. You cannot have a "Black Box" that claims to respect your secrets while running proprietary, obfuscated code for its most critical operations.
The Core handles the heavy lifting of the Cognitive Engine. It bridges the gap between the browser's sandbox and the raw power of Large Language Models.
It is designed to be Local-First. This means:
- Zero Latency: Thoughts happen at the speed of your RAM, not your Wi-Fi.
- Zero Leakage: No telemetry. No "training data" uploads.
- Zero Rent: The intelligence runs on your hardware, forever.
Architecture
The @meechi-ai/core library is designed to be the invisible orchestration layer for local-first AI applications. It abstracts the complexity of loading models, managing context, and syncing data, so you can focus on building the experience.
Intelligence Layer
The brain of the operation. We use WebLLM (powered by MLC-AI) to run quantized Large Language Models directly in the browser via WebGPU. This allows us to run powerful models like Llama 3.2 or Phi 3 with token speeds comparable to server-side APIs, but without the data risk.
For lighter tasks, we leverage Transformers.js for in-browser embeddings and feature extraction.
Memory Layer
Context is king. Meechi Core implements a robust memory system using IndexedDB (via Dexie). This creates a persistent, encrypted-at-rest database inside your browser.
- Vector Store: Semantic search running locally.
- Chat History: Infinite scroll, instantly retrievable.
- Graph Relations: Connecting thoughts, not just storing them.
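The vector-store idea above can be sketched in a few lines: rank stored embeddings by cosine similarity to a query vector. The record shape and function names here are illustrative, not Meechi Core's actual Dexie schema.

```typescript
// Illustrative record shape for a locally stored memory with its embedding.
export type MemoryRecord = { id: string; text: string; embedding: number[] };

// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  // Guard against zero vectors to avoid NaN.
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

// Top-k semantic search over an in-memory slice of the store.
export function topK(records: MemoryRecord[], query: number[], k: number): MemoryRecord[] {
  return [...records]
    .sort((x, y) => cosine(y.embedding, query) - cosine(x.embedding, query))
    .slice(0, k);
}
```

In practice the embeddings would come from the Transformers.js feature-extraction step described above, and the records from an IndexedDB table rather than an in-memory array.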
Protocol Layer (MCP)
Meechi is an early adopter and implementer of the Model Context Protocol (MCP). This allows the Core to connect with external tools and data sources in a standardized way. Whether it's reading a local file or querying a remote API, MCP provides the interface.
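For a feel of what "standardized" means here: MCP messages are JSON-RPC 2.0, and tool invocation uses the `tools/call` method. The sketch below builds such a request envelope; the tool name and arguments are hypothetical examples, not tools Meechi ships.

```typescript
// Shape of a JSON-RPC 2.0 request for MCP tool invocation.
export type McpToolCall = {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
};

// Build a tools/call request envelope ready to send over an MCP transport.
export function makeToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
): McpToolCall {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}
```

The transport (stdio, HTTP, etc.) and the server's `tools/list` handshake are separate concerns; this only shows the request shape the Core would emit once a tool is known.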
Sync Layer
We use Yjs for conflict-free replicated data types (CRDTs). This ensures that if you choose to sync your data between devices, it happens seamlessly and without conflict, while remaining end-to-end encrypted.
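The key property a CRDT buys you is that merging is commutative: both devices can apply each other's changes in any order and converge to the same state. The toy below demonstrates that property with a last-writer-wins map. This is emphatically not Yjs's algorithm (Yjs uses much richer sequence CRDTs); it only illustrates the "merge without conflict" idea.

```typescript
// One entry in a last-writer-wins (LWW) map: the value plus enough metadata
// to decide deterministically which concurrent write wins.
export type Entry = { value: string; timestamp: number; clientId: string };
export type LwwMap = Map<string, Entry>;

function newer(a: Entry, b: Entry): boolean {
  // Tie-break on clientId so both devices pick the same winner even when
  // timestamps collide.
  return (
    a.timestamp > b.timestamp ||
    (a.timestamp === b.timestamp && a.clientId > b.clientId)
  );
}

// Commutative merge: merge(a, b) and merge(b, a) yield the same state.
export function merge(local: LwwMap, remote: LwwMap): LwwMap {
  const out = new Map(local);
  for (const [key, entry] of remote) {
    const mine = out.get(key);
    if (!mine || newer(entry, mine)) out.set(key, entry);
  }
  return out;
}
```

Because the merge depends only on the entries themselves, sync order and network timing stop mattering, which is exactly why the real Yjs-based layer can stay end-to-end encrypted: the relay never needs to understand or arbitrate the data.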
Developer Usage
You can install the core package to build your own local-first AI applications.
```bash
npm install @meechi-ai/core
# or
yarn add @meechi-ai/core
```

Basic Hook
```jsx
import { useMeechi } from '@meechi-ai/core';

function App() {
  const { chat, isReady, localAIStatus } = useMeechi();

  const handleSend = async (msg) => {
    // 1. History: previous conversation messages
    const history = [];

    // 2. Context: relevant data or instructions
    const context = "User is asking about Meechi.";

    // 3. Stream the response chunk by chunk
    await chat(msg, history, context, (chunk) => {
      console.log("Stream:", chunk);
    });
  };

  // Show loading progress until the local model is ready
  if (!isReady) return <div>Status: {localAIStatus}</div>;

  return <button onClick={() => handleSend("Hello World")}>Chat</button>;
}
```