DeepCo

MCP.Pizza Chef: succlz123

DeepCo is a versatile chat client designed for interacting with large language models (LLMs). Built using Compose Multiplatform, it supports Windows, MacOS, Linux, Android, and iOS, enabling seamless cross-device conversations. DeepCo offers a modern, user-friendly interface for managing LLM chats, making it ideal for developers and users seeking a unified client experience across platforms.

Use This MCP Client To

  • Engage in real-time conversations with multiple LLMs
  • Manage LLM chat sessions across desktop and mobile devices
  • Test and debug LLM responses in a unified client environment
  • Integrate with MCP hosts to streamline LLM interactions
  • Use as a development tool for building LLM-powered chat applications

README

Deep-Co


A Chat Client for LLMs, written in Compose Multiplatform. It supports API providers such as OpenRouter, Anthropic, Grok, OpenAI, DeepSeek, Coze, Dify, and Google Gemini. You can also configure any OpenAI-compatible API, or run local models via LM Studio/Ollama.
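For reference, an OpenAI-compatible provider expects a chat request of roughly this shape (a sketch of the standard `/v1/chat/completions` request body; the model name and message content are illustrative, not DeepCo defaults):

```json
{
  "model": "llama3.1",
  "stream": true,
  "messages": [
    { "role": "system", "content": "You are a helpful assistant." },
    { "role": "user", "content": "Hello!" }
  ]
}
```

Any endpoint that accepts this shape, including the OpenAI-compatible servers exposed by LM Studio and Ollama, should work as a custom provider.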

Release

v1.0.6

Feature

  • Desktop Platform Support (Windows/MacOS/Linux)
  • Mobile Platform Support (Android/iOS)
  • Chat (Stream & Complete) / Chat History
  • Chat Messages Export / Chat Translate Server
  • Prompt Management / User-Defined Prompts
  • SillyTavern Character Adaptation (PNG & JSON)
  • DeepSeek LLM / Grok LLM / Google Gemini LLM
  • Claude LLM / OpenAI LLM / Ollama LLM
  • Online API Polling
  • MCP Support
  • MCP Server Market
  • RAG
  • TTS (Edge API)
  • i18n (Chinese/English) / App Color Theme / App Dark & Light Theme
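As context for the SillyTavern adaptation above, a Tavern character card in its JSON form (the PNG form embeds the same data in an image text chunk) conventionally carries fields like these; the values below are purely illustrative:

```json
{
  "name": "Aria",
  "description": "A cheerful travel guide.",
  "personality": "curious, upbeat",
  "scenario": "Meeting a traveler at the station.",
  "first_mes": "Welcome! Where are we headed today?",
  "mes_example": "<START>\n{{user}}: Hi!\n{{char}}: Hello there!"
}
```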

Screenshots

  • Chat With LLMs
  • Config Your LLMs API Key
  • Prompt Management
  • Chat With Tavern Character
  • User Management
  • Config MCP Servers
  • Setting

Model Context Protocol (MCP) ENV

MacOS

brew install uv
brew install node

Windows

winget install --id=astral-sh.uv -e
winget install OpenJS.NodeJS.LTS
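With uv and Node.js in place, MCP servers launched via `uvx` or `npx` can be registered. DeepCo's exact configuration format isn't shown here, but MCP clients conventionally take a command-plus-args definition along these lines (the server names and packages below are common examples from the MCP ecosystem, used as an assumption):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
```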

Build

Run desktop via Gradle

./gradlew :desktopApp:run

Building desktop distribution

./gradlew :desktopApp:packageDistributionForCurrentOS
# outputs are written to desktopApp/build/compose/binaries

Run Android via Gradle

./gradlew :androidApp:installDebug

Building Android distribution

./gradlew clean :androidApp:assembleRelease
# outputs are written to androidApp/build/outputs/apk/release

Thanks

DeepCo FAQ

How do I install DeepCo on different platforms?
DeepCo supports Windows, MacOS, Linux, Android, and iOS with installation instructions available on its GitHub repository.
Can DeepCo connect to multiple LLM providers?
Yes, DeepCo is designed to work with various LLM providers including OpenAI, Anthropic Claude, and Google Gemini.
Is DeepCo open source?
Yes, DeepCo is licensed under GPL-3.0 and its source code is available on GitHub.
What programming language is DeepCo built with?
DeepCo is developed using Kotlin and Compose Multiplatform for cross-platform compatibility.
Does DeepCo support offline usage?
DeepCo requires internet connectivity to interact with LLM APIs and does not support offline LLM inference.
How can I contribute to DeepCo?
Contributions are welcome via GitHub by submitting issues, feature requests, or pull requests.
Can DeepCo be customized for specific LLM workflows?
Yes, its open architecture allows developers to extend and customize chat interactions.
Does DeepCo support secure communication with LLMs?
DeepCo communicates with LLM providers over their secure (HTTPS) APIs; chat content is sent to whichever provider you configure, so data privacy ultimately depends on that provider's policies.