Chatty


An open-source, Gemini/ChatGPT-like interface for running open-source models locally in the browser.

Collection time:
2024-02-24

What is Chatty?

Chatty is an open-source, feature-rich interface similar to Gemini/ChatGPT, designed to run open-source models (Gemma, Mistral, Llama 3, etc.) locally in the browser using WebGPU. There is no server-side processing, so user data never leaves the user's own computer.
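Because Chatty depends on WebGPU for in-browser inference, the browser must expose a WebGPU adapter before any model can be loaded. The sketch below is a generic availability check, not code from the Chatty project: only navigator.gpu and requestAdapter() are standard WebGPU API; the helper function and messages are illustrative assumptions (type definitions from @webgpu/types are assumed for TypeScript).

```ts
// Generic WebGPU capability probe (illustrative, not from the Chatty codebase).
// Assumes @webgpu/types is installed so navigator.gpu is typed.

async function hasWebGPU(): Promise<boolean> {
  // navigator.gpu is undefined in browsers without WebGPU support.
  if (!("gpu" in navigator) || !navigator.gpu) {
    return false;
  }
  // requestAdapter() resolves to null when no suitable GPU adapter exists.
  const adapter = await navigator.gpu.requestAdapter();
  return adapter !== null;
}

hasWebGPU().then((supported) => {
  console.log(
    supported
      ? "WebGPU available: models can run locally in this browser."
      : "WebGPU not available: a recent WebGPU-capable browser is required."
  );
});
```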


How to use Chatty?

To use Chatty, select a model (Gemma, Mistral, Llama 3, etc.) and start a new chat. Type your query in the chat interface and receive responses directly in your browser; no server-side processing is involved.


Chatty’s Core Features

  • Local, browser-based execution of open-source models

  • WebGPU utilization for efficient processing

  • Privacy-focused data handling (no server-side processing)

  • Gemini/ChatGPT-like interface


Chatty’s Use Cases

  • Chatting with open-source LLMs locally without data leaving your computer.
