Local AI Chatbot (Ollama)

📋 Project Overview

Offline-first AI chatbot powered by local LLMs served through Ollama. Features conversational memory, embedding-based retrieval, secure local processing, and low-latency inference, all without cloud dependencies.
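
The sketch below illustrates how such a loop can be wired together with the Ollama Python client. It is a minimal, assumption-laden example, not the project's actual implementation: it assumes an Ollama server running on its default port, and the model names `llama3` and `nomic-embed-text` are placeholders for whichever chat and embedding models you have pulled locally. Conversational memory is kept as a running message list, and embedding-based retrieval runs over a tiny in-memory document store.

```python
# Minimal sketch: one chat turn with conversational memory and
# embedding-based retrieval against a local Ollama server.
# Model names below are assumptions; substitute the models you use.
import ollama

CHAT_MODEL = "llama3"             # placeholder chat model
EMBED_MODEL = "nomic-embed-text"  # placeholder embedding model

# Toy in-memory "knowledge base" used for retrieval.
documents = [
    "Ollama serves models on http://localhost:11434 by default.",
    "All inference runs locally; no data leaves the machine.",
]
doc_embeddings = [
    ollama.embeddings(model=EMBED_MODEL, prompt=doc)["embedding"]
    for doc in documents
]

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

def retrieve(query: str) -> str:
    """Return the stored document most similar to the query."""
    q = ollama.embeddings(model=EMBED_MODEL, prompt=query)["embedding"]
    scores = [cosine(q, emb) for emb in doc_embeddings]
    return documents[scores.index(max(scores))]

history = []  # conversational memory: prior user/assistant turns

def ask(user_input: str) -> str:
    """Answer one user turn using retrieved context plus chat history."""
    context = retrieve(user_input)
    history.append({
        "role": "user",
        "content": f"Context: {context}\n\nQuestion: {user_input}",
    })
    reply = ollama.chat(model=CHAT_MODEL, messages=history)["message"]["content"]
    history.append({"role": "assistant", "content": reply})
    return reply

if __name__ == "__main__":
    print(ask("Where does Ollama listen by default?"))
```

Because the message history is replayed on every call, the model sees earlier turns; a production setup would typically persist the history and swap the in-memory store for a proper vector index.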

🛠️ Technologies Used