Ollama_Agents uses JSON files to store memories, which include conversation history, document chunks, and other relevant information. These files are crucial for the AI's long-term memory and context understanding.
Each memory is stored as a separate JSON file in the `data/json_history` directory. The filename format is:

`YYYYMMDD_HHMMSS_<memory_type>.json`

For example: `20230515_143022_interaction.json`
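The timestamp prefix maps directly onto Python's strftime codes. As a minimal illustration (the `memory_filename` helper below is hypothetical, not necessarily what Ollama_Agents itself uses):

```python
from datetime import datetime

def memory_filename(memory_type: str) -> str:
    # Hypothetical helper: produces e.g. "20230515_143022_interaction.json"
    return f"{datetime.now().strftime('%Y%m%d_%H%M%S')}_{memory_type}.json"

print(memory_filename("interaction"))
```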
Here's a typical structure of a memory JSON file:
    {
      "timestamp": "2023-05-15T14:30:22.123456",
      "username": "User123",
      "model_name": "llama3.1:latest",
      "type": "interaction",
      "content": {
        "prompt": "What is the capital of France?",
        "response": "The capital of France is Paris."
      },
      "access_count": 3,
      "permanent_marker": 0,
      "embedding": [0.1, 0.2, 0.3, ..., 0.9]
    }
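Writing such a record to disk is straightforward. The sketch below uses a hypothetical `save_memory` helper (not the project's actual API) and leaves the embedding empty, since that value would normally be produced by an embedding model:

```python
import json
from datetime import datetime
from pathlib import Path

def save_memory(memory: dict, directory: str = "data/json_history") -> Path:
    # Hypothetical helper: writes one memory dict to its own timestamped JSON file.
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    path = Path(directory) / f"{stamp}_{memory['type']}.json"
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(memory, indent=2), encoding="utf-8")
    return path

memory = {
    "timestamp": datetime.now().isoformat(),
    "username": "User123",
    "model_name": "llama3.1:latest",
    "type": "interaction",
    "content": {
        "prompt": "What is the capital of France?",
        "response": "The capital of France is Paris.",
    },
    "access_count": 0,
    "permanent_marker": 0,
    "embedding": [],  # normally filled in by an embedding model
}
save_memory(memory)
```

Each field is described in more detail below.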
`timestamp`
- Format: ISO 8601 (YYYY-MM-DDTHH:MM:SS.mmmmmm)
- Purpose: Used for chronological ordering and age-based pruning

`username`
- Purpose: Identifies the user associated with the memory

`model_name`
- Purpose: Specifies the AI model used for this interaction

`type`
- Possible values: "interaction", "document_chunk"
- Purpose: Distinguishes between different types of memories

`content`
- For interactions: Contains "prompt" and "response"
- For document chunks: Contains the text content of the chunk

`access_count`
- Purpose: Tracks how often this memory has been accessed
- Usage: Can be used for relevance-based pruning (higher count = more relevant)

`permanent_marker`
- Values: 0 (not permanent) or 1 (permanent)
- Purpose: Flags memories that should never be pruned

`embedding`
- Purpose: Vector representation of the memory content for similarity search
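The embedding is what makes similarity search possible. As a rough sketch (not the project's actual retrieval code), two stored embeddings could be compared with cosine similarity:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # Higher values mean the two memories carry more similar content.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

print(cosine_similarity([0.1, 0.2, 0.3], [0.1, 0.25, 0.28]))
```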
Additional notes:
- The `access_count` is incremented each time the memory is retrieved or used in a search (a small sketch follows this list)
- Important memories can be marked as permanent (e.g., user preferences, critical information)
- Set `permanent_marker` to 1 for these memories
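A minimal sketch of the access-count bookkeeping, assuming a hypothetical `touch_memory` helper rather than the project's actual retrieval code:

```python
import json
from pathlib import Path

def touch_memory(path: Path) -> dict:
    # Load a memory file, increment its access_count, and write it back.
    memory = json.loads(path.read_text(encoding="utf-8"))
    memory["access_count"] = memory.get("access_count", 0) + 1
    path.write_text(json.dumps(memory, indent=2), encoding="utf-8")
    return memory
```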
These fields can be used to implement various pruning strategies (combined in the sketch after this list):
- Age-based pruning: Remove memories older than a certain date using the `timestamp`
- Relevance-based pruning: Remove memories with a low `access_count`
- Type-based pruning: Prune certain types of memories based on the `type` field
- Model-specific pruning: Remove memories associated with deprecated models using `model_name`
- Selective preservation: Never prune memories with `permanent_marker` set to 1
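A sketch combining the age, relevance, and preservation rules; the thresholds (`max_age_days`, `min_access_count`) are assumptions for illustration, not values used by Ollama_Agents:

```python
import json
from datetime import datetime, timedelta
from pathlib import Path

def prune_memories(directory: str = "data/json_history",
                   max_age_days: int = 90,
                   min_access_count: int = 2) -> int:
    # Delete old, rarely accessed memories; never touch permanent ones.
    cutoff = datetime.now() - timedelta(days=max_age_days)
    removed = 0
    for path in Path(directory).glob("*.json"):
        memory = json.loads(path.read_text(encoding="utf-8"))
        if memory.get("permanent_marker", 0) == 1:
            continue  # selective preservation
        too_old = datetime.fromisoformat(memory["timestamp"]) < cutoff
        rarely_used = memory.get("access_count", 0) < min_access_count
        if too_old and rarely_used:
            path.unlink()
            removed += 1
    return removed
```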
The JSON structure allows for easy analytics (a short sketch follows the list):
- Track user activity by analyzing memories per `username`
- Measure model usage patterns using the `model_name` field
- Identify frequently accessed information via `access_count`
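For example, per-user and per-model counts could be gathered with a small script like this (a sketch, not part of the project's code):

```python
import json
from collections import Counter
from pathlib import Path

def memory_stats(directory: str = "data/json_history") -> dict:
    # Count memories per username and per model_name across the store.
    by_user, by_model = Counter(), Counter()
    for path in Path(directory).glob("*.json"):
        memory = json.loads(path.read_text(encoding="utf-8"))
        by_user[memory.get("username", "unknown")] += 1
        by_model[memory.get("model_name", "unknown")] += 1
    return {"per_username": dict(by_user), "per_model_name": dict(by_model)}
```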
Security considerations:
- Ensure that sensitive information is not stored in plain text in the `content` field
- Implement access controls to protect user-specific memories
Possible future enhancements:
- Implement a scoring system combining `timestamp` and `access_count` for more nuanced pruning (see the sketch after this list)
- Add a `last_accessed` field to track recency of memory usage
- Introduce a `relevance_score` field updated by the AI based on context relevance
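One possible shape for such a scoring system, using an assumed exponential age decay with a hypothetical `half_life_days` parameter:

```python
from datetime import datetime

def memory_score(memory: dict, half_life_days: float = 30.0) -> float:
    # Hypothetical score: access_count weighted down as the memory ages.
    age_days = (datetime.now() - datetime.fromisoformat(memory["timestamp"])).days
    decay = 0.5 ** (max(age_days, 0) / half_life_days)
    return (1 + memory.get("access_count", 0)) * decay
```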
Remember to update this guide as new fields or management strategies are implemented in the Ollama_Agents system.