feat(chat): add LangChain memory service for conversation history
## Summary

- Implement LangChain-based conversation memory with a Summary + Buffer strategy
- Add `/memory` endpoints for n8n workflow integration
- Externalize all configuration to the `.env` file
- Switch the chatbot-rag workflow to call the OpenAI API directly
## Changes

### New Features

#### 1. LangChain Memory Service

- **Strategy:** Summary + Buffer
- **Buffer:** keep the last 10 messages in memory
- **Summary:** auto-generate a summary when more than 6 new messages accumulate
- **Storage:** summary stored in the `conversations.summary` column
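The interplay between the buffer and the summary can be sketched as follows. This is a minimal illustration with hypothetical names, not the service's actual API; the real logic lives in `Integration/services/langchain_memory_service.py` and uses LangChain:

```python
# Sketch of the Summary + Buffer strategy. All names are illustrative.
BUFFER_SIZE = 10        # MEMORY_BUFFER_SIZE: messages kept verbatim
SUMMARY_THRESHOLD = 6   # MEMORY_SUMMARY_THRESHOLD: new messages before re-summarizing

def build_memory(messages, summary, summarized_count, summarize):
    """Return (summary, summarized_count, buffer) for the next LLM call.

    messages:         full history, oldest first
    summary:          text stored in conversations.summary
    summarized_count: value of conversations.summary_message_count
    summarize:        callable that folds old messages into the summary
                      (in practice, an LLM call)
    """
    # Messages that no longer fit in the buffer are candidates for the summary.
    overflow = messages[:-BUFFER_SIZE] if len(messages) > BUFFER_SIZE else []
    # Re-summarize only once more than SUMMARY_THRESHOLD new messages have
    # accumulated beyond those already covered by the stored summary.
    if len(overflow) - summarized_count > SUMMARY_THRESHOLD:
        summary = summarize(summary, overflow[summarized_count:])
        summarized_count = len(overflow)
    return summary, summarized_count, messages[-BUFFER_SIZE:]
```

The threshold keeps summarization cheap: the summary LLM call runs once per batch of new overflow messages instead of on every turn.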
#### 2. Memory API Endpoints

| Method | Endpoint | Description |
|---|---|---|
| GET | `/chat/conversations/{id}/memory` | Get conversation memory for the LLM |
| POST | `/chat/conversations/{id}/memory` | Get memory with a request body |
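From a client's perspective, the two endpoints can be exercised like this (a standard-library sketch; the base URL and the POST body's fields are assumptions, not defined by this PR):

```python
import json
import urllib.request

API_BASE = "http://localhost:8000"  # assumed local deployment; adjust as needed

def get_memory(conversation_id):
    """GET variant: fetch the assembled memory for a conversation."""
    url = f"{API_BASE}/chat/conversations/{conversation_id}/memory"
    return urllib.request.Request(url, method="GET")

def post_memory(conversation_id, body):
    """POST variant: same endpoint, with options passed in the request body."""
    url = f"{API_BASE}/chat/conversations/{conversation_id}/memory"
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Example (requires a running server):
# with urllib.request.urlopen(get_memory("abc-123")) as resp:
#     memory = json.load(resp)
```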
#### 3. Configuration Externalization

All constants moved to `.env`:

```bash
# LiteLLM
LITELLM_URL=https://litellm-proxy.mc-intern.com/v1/chat/completions
LITELLM_MODEL=gpt-5
LITELLM_API_KEY=sk-...

# Memory
MEMORY_BUFFER_SIZE=10
MEMORY_MAX_MESSAGE_LENGTH=4000
MEMORY_SUMMARY_THRESHOLD=6

# RAG
RAG_KVEC=5
RAG_KLEX=3
```
#### 4. OpenAI Direct Integration

- Workflow `chatbot-rag.json` now calls the OpenAI API directly
- No dependency on shared n8n credentials
- API key passed in the `Authorization` header
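The direct call is an ordinary Chat Completions request with the key as a Bearer token. Sketched in Python for clarity (the workflow itself does this through an n8n HTTP Request node; `OPENAI_API_KEY` is an assumed variable name):

```python
import json
import os
import urllib.request

def openai_chat_request(messages, model="gpt-5"):
    """Build a Chat Completions request with Bearer auth in the
    Authorization header, mirroring what the workflow sends."""
    payload = {"model": model, "messages": messages}
    return urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.getenv('OPENAI_API_KEY', '')}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```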
## Database Migration

New columns in the `conversations` table:

| Column | Type | Description |
|---|---|---|
| `summary` | TEXT | Summary of old messages |
| `summary_message_count` | INTEGER | Number of summarized messages |
Run the migration:

```bash
python Agent/scripts/migrate_conversation_summary.py
```
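The core of such a migration is an additive, idempotent schema change, roughly as below. This sketch uses SQLite for illustration; the actual script may target a different engine, so treat names and SQL dialect as assumptions:

```python
import sqlite3

def add_summary_columns(conn):
    """Add the two summary columns if absent; safe to run repeatedly."""
    existing = {row[1] for row in conn.execute("PRAGMA table_info(conversations)")}
    if "summary" not in existing:
        conn.execute("ALTER TABLE conversations ADD COLUMN summary TEXT")
    if "summary_message_count" not in existing:
        conn.execute(
            "ALTER TABLE conversations "
            "ADD COLUMN summary_message_count INTEGER DEFAULT 0"
        )
    conn.commit()
```

Checking the existing columns first makes the script safe to re-run on an already-migrated database.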
## Files Changed

| File | Change |
|---|---|
| `Agent/scripts/migrate_conversation_summary.py` | New: migration script |
| `Integration/config/constants.py` | New: externalized config |
| `Integration/services/langchain_memory_service.py` | New: memory service |
| `Integration/models/conversation.py` | Modified: add summary columns |
| `Integration/schemas/chat.py` | Modified: add memory schemas |
| `Integration/chat_api.py` | Modified: add `/memory` endpoints |
| `Integration/workflows/chatbot-rag.json` | Modified: call OpenAI directly |
| `Agent/.env` | Modified: add new variables |
## Test Plan

- Run the migration script
- Test the `/memory` endpoint (GET)
- Test the `/memory` endpoint (POST)
- Test a conversation with 5+ messages
- Verify history is passed to the LLM
- Verify the AI responds with context awareness
- Verify the database structure (user/agent alternation)
- Run `Integration/tests/test_chat_memory.sh` (all 8 tests pass)
## Architecture

```
User Message
     |
     v
POST /chat/conversations/{id}/messages
     |
     v
n8n Workflow (chatbot-rag)
     |
     +---> GET /chat/conversations/{id}/memory
     |            |
     |            v
     |     LangChainMemoryService
     |            |
     |            +---> Summary (if available)
     |            +---> Buffer (last 10 messages)
     |
     +---> POST /search/mixed (RAG)
     |
     v
Build Context: [Summary] + [Buffer] + [RAG Docs]
     |
     v
OpenAI API (gpt-5)
     |
     v
Response saved to DB
```
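The "Build Context" step in the diagram amounts to concatenating the three sources into one message list in the order shown. A minimal sketch, with illustrative role names and formatting:

```python
def build_context(summary, buffer, rag_docs):
    """Assemble [Summary] + [Buffer] + [RAG Docs] into chat messages.

    summary:  text from conversations.summary (may be empty)
    buffer:   recent messages as {"role": ..., "content": ...} dicts
    rag_docs: document snippets returned by POST /search/mixed
    """
    messages = []
    if summary:
        messages.append(
            {"role": "system", "content": f"Conversation summary:\n{summary}"}
        )
    messages.extend(buffer)  # the last 10 user/agent turns, verbatim
    if rag_docs:
        joined = "\n\n".join(rag_docs)
        messages.append(
            {"role": "system", "content": f"Relevant documents:\n{joined}"}
        )
    return messages
```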