# MDDB Chat Widget
Embed an AI-powered chatbot on any website with a single script tag. RAG-powered answers from your MDDB knowledge base. Supports OpenAI, Claude, Ollama, Bielik, Groq, Mistral, and more.
## Features
- **RAG-Powered** – answers from your MDDB knowledge base via hybrid BM25 + vector search
- **10+ LLM Providers** – OpenAI, Claude, Ollama, Bielik, Groq, Mistral, DeepSeek, Gemini
- **WebSocket Streaming** – real-time streaming responses with typing indicators
- **Shadow DOM** – fully isolated CSS, no conflicts with your site's styles
- **Session Persistence** – localStorage-based session continuity across page loads
- **< 30KB gzipped** – lightweight IIFE bundle, zero framework dependencies
- **Queue System** – graceful handling of multiple concurrent users
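The queue system lives in the Rust server and its internals are not shown here. As a rough sketch of the idea only, a FIFO that caps concurrent in-flight chat requests could look like this in TypeScript (`ChatQueue` and `maxConcurrent` are illustrative names, not the real API):

```typescript
// Sketch: cap concurrent in-flight requests; extra callers wait in FIFO order.
class ChatQueue {
  private running = 0;
  private waiting: Array<() => void> = [];

  constructor(private maxConcurrent: number) {}

  async run<T>(task: () => Promise<T>): Promise<T> {
    if (this.running >= this.maxConcurrent) {
      // No free slot: wait until a finishing task hands one over.
      await new Promise<void>(resolve => this.waiting.push(resolve));
    } else {
      this.running++;
    }
    try {
      return await task();
    } finally {
      const next = this.waiting.shift();
      if (next) next(); // pass our slot straight to the next waiter
      else this.running--;
    }
  }
}
```

Handing the slot directly to the next waiter (instead of decrementing and re-checking) avoids a newly arrived caller stealing the slot between wake-up and resume.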
## Architecture
Three components work together:
| Component | Description | Port |
|---|---|---|
| `mddb-chat-widget` | TypeScript widget (Shadow DOM, streaming) | CDN / self-hosted |
| `mddb-chat` | Rust/Axum WebSocket server, tool-calling loop | 11030 |
| `mddbd` | MDDB knowledge base, hybrid search | 11023 (HTTP), 11024 (gRPC) |
```
  Your Website                          MDDB Infrastructure

+--------------------+              +-------------+        +---------------+
| <script>           |  WebSocket   |  mddb-chat  |  gRPC  |     mddbd     |
|  mddb-chat.min.js  |------------->|  :11030     |------->|  :11024 gRPC  |
+--------------------+              |  (Rust)     |        +---------------+
                                    +-------------+
                                           |
                                           +---- HTTP ---> LLM Provider
                                                           (OpenAI / Claude / Ollama / ...)
```
## Quick Start

### Docker Compose (Recommended)
```bash
git clone https://github.com/tradik/mddb.git
cd mddb
make docker-up
```

### Embed on Your Website
```html
<!-- Add to your HTML before </body> -->
<script>
  window.MDDB_CHAT_CONFIG = {
    serverUrl: 'ws://localhost:11030/ws',
    title: 'Documentation Assistant',
    primaryColor: '#2563eb'
  };
</script>
<script src="http://localhost:11032/mddb-chat.min.js" defer></script>
```
### Self-Hosted Widget

```bash
cd services/mddb-chat-widget
npm install
npm run build
```
## Configuration

### Chat Server Environment Variables
```bash
MDDB_CHAT_LLM_PROVIDER=openai      # openai | anthropic | ollama | groq | mistral
MDDB_CHAT_LLM_API_KEY=sk-...       # API key (not needed for Ollama)
MDDB_CHAT_LLM_MODEL=gpt-4o-mini    # Model name
MDDB_CHAT_LLM_BASE_URL=            # Custom base URL (optional)
MDDB_GRPC_ADDRESS=localhost:11024
MDDB_COLLECTION=docs               # Default collection to search
MDDB_CHAT_PORT=11030
MDDB_CHAT_MAX_SESSIONS=100
MDDB_CHAT_RATE_LIMIT=10            # Messages per minute per session
```
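`MDDB_CHAT_RATE_LIMIT` caps messages per minute per session. The server's actual limiter is not shown here; a minimal sliding-window sketch of that policy in TypeScript (the class and method names are assumptions):

```typescript
// Sketch: per-session sliding-window rate limit, matching the
// "messages per minute per session" semantics of MDDB_CHAT_RATE_LIMIT.
class RateLimiter {
  private hits = new Map<string, number[]>(); // session id -> send timestamps (ms)

  constructor(private limit: number, private windowMs = 60_000) {}

  // Returns true if the session may send another message right now.
  allow(sessionId: string, now = Date.now()): boolean {
    const cutoff = now - this.windowMs;
    const recent = (this.hits.get(sessionId) ?? []).filter(t => t > cutoff);
    if (recent.length >= this.limit) {
      this.hits.set(sessionId, recent);
      return false; // over the limit within the window
    }
    recent.push(now);
    this.hits.set(sessionId, recent);
    return true;
  }
}
```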
### Widget Configuration
```js
window.MDDB_CHAT_CONFIG = {
  serverUrl: 'ws://localhost:11030/ws',  // Chat server WebSocket URL
  title: 'Documentation Assistant',      // Widget header title
  subtitle: 'Powered by MDDB',           // Widget header subtitle
  primaryColor: '#2563eb',               // Accent color (hex)
  position: 'bottom-right',              // bottom-right | bottom-left
  welcomeMessage: 'Hello! How can I help you today?'
};
```
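If you want to sanity-check the config before the widget script loads, a small TypeScript sketch can model the options above. The option names mirror the example; the chosen defaults and the `resolveConfig` helper are assumptions, not part of the widget's API:

```typescript
// Sketch of the widget config shape plus a defaults-merging validator.
// Defaults and the helper itself are illustrative assumptions.
interface MddbChatConfig {
  serverUrl: string;                         // Chat server WebSocket URL (required)
  title?: string;
  subtitle?: string;
  primaryColor?: string;                     // Accent color (hex)
  position?: 'bottom-right' | 'bottom-left';
  welcomeMessage?: string;
}

function resolveConfig(user: Partial<MddbChatConfig>): MddbChatConfig {
  if (!user.serverUrl || !/^wss?:\/\//.test(user.serverUrl)) {
    throw new Error('MDDB_CHAT_CONFIG.serverUrl must be a ws:// or wss:// URL');
  }
  return {
    title: 'Documentation Assistant',
    subtitle: 'Powered by MDDB',
    primaryColor: '#2563eb',
    position: 'bottom-right',
    welcomeMessage: 'Hello! How can I help you today?',
    ...user,                    // user-supplied keys override the defaults
    serverUrl: user.serverUrl,  // required field, validated above
  };
}
```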
## LLM Provider Examples

### OpenAI

```bash
MDDB_CHAT_LLM_PROVIDER=openai
MDDB_CHAT_LLM_API_KEY=sk-...
MDDB_CHAT_LLM_MODEL=gpt-4o-mini
```

### Anthropic Claude

```bash
MDDB_CHAT_LLM_PROVIDER=anthropic
MDDB_CHAT_LLM_API_KEY=sk-ant-...
MDDB_CHAT_LLM_MODEL=claude-haiku-4-5-20251001
```

### Ollama (Local, Free)

```bash
MDDB_CHAT_LLM_PROVIDER=ollama
MDDB_CHAT_LLM_BASE_URL=http://localhost:11434
MDDB_CHAT_LLM_MODEL=llama3.2
```

### Bielik (Polish)

```bash
MDDB_CHAT_LLM_PROVIDER=ollama
MDDB_CHAT_LLM_BASE_URL=http://localhost:11434
MDDB_CHAT_LLM_MODEL=bielik-11b-v2.3-instruct
```
## How RAG Works

1. **User sends a message** via the chat widget
2. **Chat server receives it** via WebSocket
3. **Hybrid search** – MDDB performs BM25 + vector search on your knowledge base
4. **Context injection** – top results are injected into the LLM prompt
5. **Streaming response** – the LLM streams its answer back through the chat server
6. **Widget renders** – the response appears with markdown formatting in the widget
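The context-injection step can be sketched in TypeScript. This is a hypothetical illustration of folding top search hits into the LLM prompt; `SearchHit`, the field names, and the prompt template are assumptions, not mddbd's actual response shape or the server's real prompt:

```typescript
// Sketch of context injection: rank hybrid-search hits, keep the top K,
// and splice them into the prompt ahead of the user's question.
interface SearchHit {
  source: string; // e.g. a document path in the collection (illustrative)
  text: string;   // the matched chunk
  score: number;  // combined BM25 + vector score
}

function buildPrompt(question: string, hits: SearchHit[], topK = 3): string {
  const context = hits
    .slice()                              // don't mutate the caller's array
    .sort((a, b) => b.score - a.score)    // best-scoring chunks first
    .slice(0, topK)
    .map(h => `[${h.source}]\n${h.text}`)
    .join('\n\n');
  return (
    'Answer using only the context below. Cite sources in brackets.\n\n' +
    `Context:\n${context}\n\nQuestion: ${question}`
  );
}
```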
## Step-by-Step Guide
See the Website Chat Guide for a complete walkthrough with Docker and Ollama.
## Source Code

- Widget: `services/mddb-chat-widget`
- Server: `services/mddb-chat`