Fully In-Browser Chat Over Documents with AI-Mask
- Yes, it's another LLM-powered chat over documents implementation... but this one runs entirely locally in your browser!
- The vector store (Voy) and embeddings (Transformers.js) are served via a Vercel Edge Function and run fully in the browser with no setup required.
- The default LLM is Phi-2, run using AI-Mask. You must have the AI-Mask extension installed for this to work.
- The first time you start a chat, the model provider will automatically download and cache the weights. They are several GB in size, so the download may take a while. Make sure you have a good internet connection!
- The default embedding model is `Xenova/all-MiniLM-L6-v2`. For higher-quality embeddings on machines that can handle it, switch to `nomic-ai/nomic-embed-text-v1` in `app/worker.ts`.
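Under the hood, "chat over documents" means comparing a query embedding against stored chunk embeddings. Here is a minimal, self-contained sketch of the cosine-similarity search a vector store like Voy performs; `Doc`, `cosineSimilarity`, and `topK` are illustrative names, not the actual Voy or Transformers.js API:

```typescript
// A stored chunk: its text plus the embedding vector produced by the
// embedding model (e.g. Xenova/all-MiniLM-L6-v2 in this template).
type Doc = { text: string; vector: number[] };

// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the k documents whose embeddings are most similar to the query.
function topK(query: number[], docs: Doc[], k: number): Doc[] {
  return [...docs]
    .sort((x, y) =>
      cosineSimilarity(query, y.vector) - cosineSimilarity(query, x.vector))
    .slice(0, k);
}
```

The real store indexes vectors for fast lookup rather than sorting every document, but the similarity measure is the same idea.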
- LangChain.js handles orchestration and ties everything together!
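The orchestrated flow is: split the uploaded document into overlapping chunks, retrieve the chunks relevant to a question, and assemble a prompt for the local LLM. A hedged sketch of those steps, with illustrative function names rather than LangChain.js's actual API:

```typescript
// Split text into fixed-size chunks with some overlap so sentences that
// straddle a boundary still appear whole in at least one chunk.
// chunkSize must be greater than overlap.
function splitIntoChunks(text: string, chunkSize: number, overlap: number): string[] {
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += chunkSize - overlap) {
    chunks.push(text.slice(start, start + chunkSize));
  }
  return chunks;
}

// Stuff the retrieved chunks into a prompt for the local model.
function buildPrompt(question: string, retrieved: string[]): string {
  return `Answer using only this context:\n${retrieved.join("\n---\n")}\n\nQuestion: ${question}`;
}
```

LangChain.js provides ready-made text splitters, retrievers, and prompt templates for these steps, so the template doesn't hand-roll them like this.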
- This template is open source: you can see the source code and deploy your own version from the GitHub repo!
- Try embedding a PDF below, then asking questions about it! You can even turn off your WiFi after the initial model download.