# Local AI Chatbot (Ollama)
Offline-first AI chatbot powered by local LLM models via Ollama.
## 📋 Project Overview
The chatbot runs entirely offline, powered by local LLM models served through Ollama. It features conversational memory, embedding-based retrieval, secure on-device processing, and low-latency inference with no cloud dependencies.
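A minimal sketch of that loop is shown below, assuming a default Ollama instance on `localhost:11434`. The model names (`llama3`, `nomic-embed-text`), the in-memory document list, and the helper functions are illustrative placeholders, not the project's actual configuration or code.

```python
"""Sketch: local chat with conversational memory and embedding-based retrieval."""
import requests

OLLAMA = "http://localhost:11434"
CHAT_MODEL = "llama3"             # assumed chat model
EMBED_MODEL = "nomic-embed-text"  # assumed embedding model


def embed(text: str) -> list[float]:
    """Fetch an embedding vector from the local Ollama embeddings endpoint."""
    r = requests.post(f"{OLLAMA}/api/embeddings",
                      json={"model": EMBED_MODEL, "prompt": text})
    r.raise_for_status()
    return r.json()["embedding"]


def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0


def retrieve(query: str, corpus: list[tuple[str, list[float]]], k: int = 2) -> list[str]:
    """Embedding-based retrieval: return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(corpus, key=lambda doc: cosine(q, doc[1]), reverse=True)
    return [text for text, _ in ranked[:k]]


def chat(messages: list[dict]) -> str:
    """One non-streaming turn against the local /api/chat endpoint."""
    r = requests.post(f"{OLLAMA}/api/chat",
                      json={"model": CHAT_MODEL, "messages": messages, "stream": False})
    r.raise_for_status()
    return r.json()["message"]["content"]


if __name__ == "__main__":
    # Tiny placeholder knowledge base; a real setup would index actual documents.
    docs = ["Ollama serves LLMs on localhost without any cloud calls.",
            "Conversation history is kept in memory between turns."]
    corpus = [(d, embed(d)) for d in docs]

    history = []  # conversational memory: grows with every turn
    while True:
        user = input("you> ").strip()
        if not user:
            break
        context = "\n".join(retrieve(user, corpus))
        history.append({"role": "user",
                        "content": f"Context:\n{context}\n\nQuestion: {user}"})
        answer = chat(history)
        history.append({"role": "assistant", "content": answer})
        print("bot>", answer)
```

Because everything goes through the local Ollama API, no request ever leaves the machine, which is what gives the offline-first and low-latency properties described above.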