Zep: A long-term memory store for LLM applications
Quick Start | Documentation | LangChain Support | Discord
www.getzep.com
Easily add relevant documents, chat history memory & rich user data to your LLM app's prompts.
Vector Database with Hybrid Search
Populate your prompts with relevant documents and chat history. Rich metadata and JSONPath query filters offer a powerful hybrid search over texts.
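Zep performs hybrid search server-side; the standalone sketch below only illustrates the idea of combining a metadata filter with vector ranking. The corpus, embeddings, and filter keys here are invented for illustration and are not Zep's API.

```python
import math

# Toy corpus: each document pairs an embedding with metadata,
# mirroring the idea behind hybrid (metadata + vector) search.
docs = [
    {"text": "refund policy", "vec": [1.0, 0.0], "meta": {"source": "faq"}},
    {"text": "release notes", "vec": [0.9, 0.1], "meta": {"source": "blog"}},
    {"text": "pricing page",  "vec": [0.0, 1.0], "meta": {"source": "faq"}},
]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def hybrid_search(query_vec, metadata_filter):
    # Filter on metadata first, then rank the survivors by vector similarity.
    candidates = [
        d for d in docs
        if all(d["meta"].get(k) == v for k, v in metadata_filter.items())
    ]
    return sorted(candidates, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)

results = hybrid_search([1.0, 0.0], {"source": "faq"})
print([d["text"] for d in results])  # most similar "faq" documents first
```

In Zep, the filtering step is expressed with JSONPath queries over document metadata rather than the dictionary-equality check shown here.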
Batteries Included Embedding & Enrichment
- Automatically embed texts, or bring your own vectors.
- Chat histories are enriched with summaries, named entities, and token counts. Use these as search filters.
- Associate your own metadata with documents & chat histories.
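To make the enrichment idea concrete, here is a minimal sketch of filtering a chat history by the kinds of fields Zep attaches (entities, token counts) plus custom metadata. The field names and records are invented for this example and do not reflect Zep's schema.

```python
# Illustrative chat messages carrying enrichment-style fields
# (token counts, extracted entities) and user-supplied metadata.
history = [
    {"role": "user", "content": "Tell me about Paris.",
     "tokens": 6, "entities": ["Paris"], "meta": {"topic": "travel"}},
    {"role": "assistant", "content": "Paris is the capital of France.",
     "tokens": 42, "entities": ["Paris", "France"], "meta": {"topic": "travel"}},
    {"role": "user", "content": "Thanks!",
     "tokens": 2, "entities": [], "meta": {"topic": "chitchat"}},
]

def filter_history(entity=None, min_tokens=0, **meta):
    # Keep messages matching the named entity, token floor, and metadata keys.
    return [
        m for m in history
        if (entity is None or entity in m["entities"])
        and m["tokens"] >= min_tokens
        and all(m["meta"].get(k) == v for k, v in meta.items())
    ]

print([m["content"] for m in filter_history(entity="France")])
print(len(filter_history(min_tokens=5, topic="travel")))
```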
Fast, low-latency APIs and stateless deployments
- Zep’s local embedding models and async enrichment ensure a snappy user experience.
- Storing documents and history in Zep and not in memory enables stateless deployment.
Python & TypeScript/JS SDKs, LlamaIndex and LangChain Support, Edge Deployment
- Python & TypeScript/JS SDKs for easy integration with your LLM app.
- LangChain and LangChain.js integration.
- LlamaIndex VectorStore and Reader.
- TypeScript/JS SDK supports edge deployment.
Get Started
Install Server
Please see the Zep Quick Start Guide for important configuration information.
```shell
docker compose up
```
Looking for other deployment options?
Install SDK
Please see the Zep Development Guide for important beta information and usage instructions.
```shell
pip install zep-python
```
or
```shell
npm i @getzep/zep-js
```