
LocalLLM Chat Interface

Local AI chat application with custom agent personas, knowledge base upload, and web search — all running privately on your machine.

December 1, 2025
LLMs · FastAPI · Python · AI Agents

Overview

A personal project: a fully local AI chat command center. It connects to LM Studio to run large language models privately on your own hardware, with a web-based UI for conversations.


Features

  • Persistent Chats: Chat history saved across sessions
  • Custom Agent Personas: Define AI agents with custom system prompts for different tasks (coding assistant, writing helper, researcher, etc.)
  • Knowledge Base: Upload PDFs to give the AI context about your documents — text is extracted and used for informed responses
  • File Attachments: Attach files to conversations for analysis
  • Web Search: DuckDuckGo integration for real-time web lookups during conversations
  • Privacy: Everything runs locally — no data leaves your machine
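As a rough sketch of how a custom agent persona can be applied (the function and persona text here are illustrative, not the project's actual code), the backend can prepend the agent's system prompt to the conversation before forwarding it to the model:

```python
def build_messages(persona_prompt, history, user_message):
    """Assemble an OpenAI-style message list: the persona's system prompt
    first, then prior turns, then the new user message."""
    messages = [{"role": "system", "content": persona_prompt}]
    messages.extend(history)  # prior turns: [{"role": "user"/"assistant", "content": ...}, ...]
    messages.append({"role": "user", "content": user_message})
    return messages

# Example persona: a coding assistant
coding_assistant = "You are a concise coding assistant. Prefer short, runnable examples."
msgs = build_messages(coding_assistant, [], "How do I read a file in Python?")
```

Because the persona lives entirely in the system prompt, switching agents is just a matter of swapping that first message.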

Architecture

  • Backend: FastAPI server connecting to LM Studio's OpenAI-compatible API
  • Frontend: Clean web UI (HTML/CSS/JS)
  • Document Processing: pypdf for PDF text extraction into the knowledge base
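A minimal sketch of the backend's call to LM Studio's OpenAI-compatible API, using only the standard library. LM Studio's local server listens on `http://localhost:1234/v1` by default; the model name and helper functions here are assumptions for illustration, not the project's actual code:

```python
import json
import urllib.request

# LM Studio's default local server endpoint (OpenAI-compatible)
LM_STUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_payload(model, messages, temperature=0.7):
    """Build the JSON body for an OpenAI-style chat completion request."""
    return {"model": model, "messages": messages, "temperature": temperature}

def chat(model, messages):
    """POST a chat request to the local LM Studio server and return the reply text."""
    body = json.dumps(build_payload(model, messages)).encode("utf-8")
    req = urllib.request.Request(
        LM_STUDIO_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]

# Usage (requires LM Studio running locally with a model loaded):
#   reply = chat("local-model", [{"role": "user", "content": "Hello"}])
```

In the real project the FastAPI server would wrap `chat` in an endpoint, but the request/response shape against LM Studio is the same.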

Technologies

Python, FastAPI, OpenAI API (LM Studio compatible), HTML/CSS/JS, DuckDuckGo Search, pypdf