Offline Desktop AI App (Electron + React + Local LLM + OCR)
Upwork · DE · Budget: not specified · Experience level: intermediate
JavaScript · React · API · Desktop Application · Electron · Node.js
I am building a cross-platform desktop application (Windows/macOS/Linux), similar in concept to NotebookLM but fully offline.
The app should:
• Import documents (PDFs + scanned images)
• Run OCR locally (Tesseract)
• Split text into chunks
• Generate embeddings locally
• Store embeddings in SQLite (or local vector DB)
• Use llama.cpp for local LLM inference
• Enable chat with documents (RAG architecture)
• Show source citations in answers
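The chunking step above could be sketched as a simple fixed-size window with overlap. This is only an illustration, not part of the spec: the function name, chunk size, and overlap are assumptions, and a real implementation might split on sentence or paragraph boundaries instead.

```typescript
// Split document text into overlapping chunks for embedding.
// chunkSize/overlap defaults are illustrative, not from the job spec.
interface Chunk {
  docId: string; // source document, kept so answers can cite it
  index: number; // chunk position within the document
  text: string;
}

function chunkText(
  docId: string,
  text: string,
  chunkSize = 800,
  overlap = 100,
): Chunk[] {
  const chunks: Chunk[] = [];
  const step = chunkSize - overlap;
  for (let start = 0, index = 0; start < text.length; start += step, index++) {
    chunks.push({ docId, index, text: text.slice(start, start + chunkSize) });
  }
  return chunks;
}
```

Keeping `docId` and `index` on every chunk is what later makes source citations cheap: the retrieval step can hand back `docId#index` alongside each passage.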
Stack requirements:
• Electron + React + TypeScript
• SQLite (with vector search support preferred)
• Local llama.cpp integration
• Tesseract OCR integration
• Fully offline (no cloud APIs)
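For the RAG retrieval step, a brute-force cosine-similarity search over locally stored embeddings is a reasonable baseline sketch; the posting prefers SQLite with vector search, which would replace the in-memory scan below. The types, names, and citation format here are assumptions, and the embedding vectors are stand-ins for output from the local embedding model:

```typescript
// Brute-force top-k retrieval over locally stored chunk embeddings.
// In the real app the vectors would come from a local embedding model
// and live in SQLite; plain arrays are used here for illustration.
interface StoredChunk {
  docId: string;
  index: number;
  text: string;
  embedding: number[];
}

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the k most similar chunks, each with a citation tag
// like [report.pdf#3] that the chat UI can render next to the answer.
function retrieve(
  query: number[],
  store: StoredChunk[],
  k = 3,
): { citation: string; text: string }[] {
  return store
    .map((c) => ({ score: cosine(query, c.embedding), c }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k)
    .map(({ c }) => ({ citation: `[${c.docId}#${c.index}]`, text: c.text }));
}
```

The retrieved passages would then be concatenated into the llama.cpp prompt, with the citation tags passed through so the model's answer can point back at its sources.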
Client: Spent $42,688.37 · Rating: 4.9 · Verified