🦝 Fully In-Browser Chat Over Documents with AI-Mask 🦝

  • 🏑Yes, it's another LLM-powered chat-over-documents implementation... but this one runs entirely locally in your browser!
  • βš™οΈThe default LLM is Phi-2 run using AI-Mask. You must have the extension installed for this to work.
  • πŸ•‘The first time you start a chat, the model provider will automatically download and cache the model weights. They are several GB in size, so the download may take a while. Make sure you have a good internet connection!
  • πŸ—ΊοΈThe default embeddings are
    "Xenova/all-MiniLM-L6-v2"
    . For higher-quality embeddings on machines that can handle it, switch to
    nomic-ai/nomic-embed-text-v1
    in
    app/worker.ts
    .
  • πŸ™This template is open source - you can see the source code and deploy your own version from the GitHub repo!
  • πŸ‘‡Try embedding a PDF below, then asking questions! You can even turn off your WiFi after the initial model download.