Hello CryptPad team,
I’d like to propose integrating a connector for OpenAI-compatible APIs (via LiteLLM) to bring optional, privacy-conscious generative AI capabilities to CryptPad.
Motivation
CryptPad is a pioneer in privacy-first collaboration. Many organizations and users are now exploring generative AI to boost productivity. By adding a modular AI connector compatible with LiteLLM — which supports 100+ LLM providers including on-premise and sovereign models — CryptPad can offer smart features without compromising user privacy.
Proposal
Introduce a configurable AI backend (LiteLLM or any OpenAI-compatible API) that can be self-hosted and optionally enabled by administrators; a possible configuration shape is sketched after the list below. This would allow:
Email app enhancements:
- Draft assistance for email composition
- Thread summarization for long conversations
- etc.
Spreadsheet (Sheets) integration:
- Chat assistant to help with formula generation and data manipulation
- etc.
Optional in-app chat assistant:
- Context-aware help across CryptPad modules (notes, kanban, etc.)
Prompts and responses can stay on infrastructure the administrator controls when processed through local or self-hosted models (e.g. Mistral, LLaMA); remote providers (e.g. Claude) would remain an explicit, clearly signposted opt-in.
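To make the "optionally enabled by administrators" part concrete, here is a rough sketch of what such an instance-level setting could look like. This is purely illustrative: the option names, the `AiConnectorConfig` shape and the example endpoint are my assumptions, not existing CryptPad configuration.

```ts
// Hypothetical, illustrative only: none of these option names exist in
// CryptPad today. The idea is a single instance-level setting, off by
// default, pointing at a self-hosted LiteLLM proxy (or any
// OpenAI-compatible endpoint).
interface AiConnectorConfig {
    enabled: boolean;              // disabled by default; admins opt in explicitly
    baseUrl: string;               // e.g. a self-hosted LiteLLM proxy
    model: string;                 // model alias exposed by that proxy
    apiKey?: string;               // only if the proxy enforces authentication
    warnOnRemoteEndpoint: boolean; // show a disclaimer when the endpoint is not on-premise
}

const exampleConfig: AiConnectorConfig = {
    enabled: false,
    baseUrl: "https://litellm.example.org",
    model: "mistral-small",
    warnOnRemoteEndpoint: true
};
```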
Benefits
- Enhances productivity with intelligent features
- Fully optional
- Compatible with the project’s privacy principles when paired with on-premise LLMs
- Makes CryptPad more competitive against commercial AI-powered suites (while remaining FOSS)
Implementation Notes
- Leverage LiteLLM’s abstraction layer to avoid tight coupling to a specific provider (a minimal connector sketch follows these notes)
UI
- Integration should respect CryptPad's design language and minimalism
- Clear warnings/disclaimers for users when AI is used, especially if remote endpoints are configured
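For illustration, here is a minimal sketch of what the connector call could look like, assuming a self-hosted LiteLLM proxy (or any OpenAI-compatible server) exposing the standard /v1/chat/completions route. The function name, parameters and example URL are mine; streaming and retries are left out.

```ts
// Minimal sketch only: assumes an OpenAI-compatible endpoint such as a
// self-hosted LiteLLM proxy. All names below are illustrative.
async function complete(
    baseUrl: string,
    model: string,
    prompt: string,
    apiKey?: string
): Promise<string> {
    const response = await fetch(`${baseUrl}/v1/chat/completions`, {
        method: "POST",
        headers: {
            "Content-Type": "application/json",
            ...(apiKey ? { Authorization: `Bearer ${apiKey}` } : {})
        },
        body: JSON.stringify({
            model,
            messages: [{ role: "user", content: prompt }]
        })
    });
    if (!response.ok) {
        throw new Error(`AI backend returned HTTP ${response.status}`);
    }
    const data = await response.json();
    // OpenAI-compatible responses carry the generated text here
    return data.choices[0].message.content;
}

// Example use: summarizing a long thread with a local model
// const summary = await complete("https://litellm.example.org", "mistral-small",
//     `Summarize this thread:\n${threadText}`);
```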
Would the team be open to discussing a possible prototype or roadmap for this feature?
Thank you for your time and for your work on this project.
Best regards,
François from France