Eternal Recall is built on the mathematical framework of ACT-R, a cognitive architecture that researchers at Carnegie Mellon University have developed and refined for roughly 50 years.

If you haven't read the Science Overview yet, start there for the foundational concepts.

The Mathematical Foundation

Eternal Recall implements the original ACT-R formulas from Carnegie Mellon research:

Base-Level Learning Equation

B(t) = ln(Σₖ tₖ^(-d)) + β

This equation calculates how accessible a memory is based on when and how often it was accessed: tₖ is the time since the k-th access, d is the decay rate, and β is a constant offset. More recent and frequent access means higher accessibility.

— Anderson & Lebiere, 1998
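To make the formula concrete, here is a minimal sketch of the textbook equation in Python. It is illustrative only, not Eternal Recall's proprietary adaptation, and uses ACT-R's conventional decay rate d = 0.5:

  import math

  # Textbook base-level learning: B = ln(sum_k t_k^(-d)) + beta
  def base_level_activation(access_times, now, decay=0.5, beta=0.0):
      # Each past access contributes (age)^(-d), so older accesses
      # matter less and frequent access raises the total
      total = sum((now - t) ** -decay for t in access_times if t < now)
      return math.log(total) + beta

  # A memory accessed 1000s, 100s, and 10s ago:
  print(base_level_activation([0, 900, 990], now=1000))  # ≈ -0.80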

Activation Equation

A(t) = B(t) + Σⱼ wⱼsⱼᵢ

Total activation combines base-level learning with spreading activation from related context: each active context element j contributes its attentional weight wⱼ times its associative strength sⱼᵢ to memory i. This is how thinking about one thing makes related things more accessible.

— Anderson et al., 2004
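As a rough sketch (again, the textbook equation rather than the proprietary variant), spreading activation just adds a weighted sum of associative strengths to the base level; the weights and strengths below are invented for illustration:

  # Textbook activation: A = B + sum_j( w_j * s_ji )
  def total_activation(base_level, sources):
      # sources: (w_j, s_ji) pairs, one per active context element
      return base_level + sum(w * s for w, s in sources)

  # Base level -0.8, lifted to threshold by two context sources:
  print(total_activation(-0.8, [(0.5, 1.2), (0.5, 0.4)]))  # 0.0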

Retrieval Probability

P(t) = 1 / (1 + e^(-(A(t) - τ)/s))

The probability of successfully recalling a memory follows a sigmoid function of its activation level: τ is the retrieval threshold, and s sets how sharply the probability falls off around that threshold.

— Anderson & Lebiere, 1998
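The sigmoid is easy to sanity-check numerically. In this sketch, τ and s take common illustrative values, not Eternal Recall's tuning:

  import math

  # Textbook retrieval probability: P = 1 / (1 + e^(-(A - tau)/s))
  def retrieval_probability(activation, tau=0.0, s=0.4):
      return 1.0 / (1.0 + math.exp(-(activation - tau) / s))

  print(retrieval_probability(0.0))   # at threshold: 0.50
  print(retrieval_probability(-0.8))  # well below threshold: ≈ 0.12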

Our Implementation: Eternal Recall uses these foundational equations as a starting point, with our own proprietary adaptations optimized for AI memory systems. The specific implementation details remain confidential.

The Memory Lifecycle

Every memory in Eternal Recall follows a natural lifecycle inspired by human cognition:

  1. Creation: New memories are stored with high accessibility, automatically categorized, and linked to related concepts.
  2. Natural Decay: Following ACT-R's power-law decay, memories gradually become less accessible over time, just like human memory.
  3. Retrieval & Strengthening: When a memory is accessed, its accessibility increases, so frequently used memories stay highly accessible.
  4. Deep Memory: Memories that fade become "deep": harder to recall spontaneously, but never deleted, and always retrievable with effort.
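One way to read this lifecycle in code is to classify a memory by its current retrieval probability. This is a hypothetical sketch: the stage names and thresholds are invented for illustration, not Eternal Recall's actual values:

  # Hypothetical lifecycle staging by retrieval probability
  def lifecycle_stage(p_retrieval):
      if p_retrieval >= 0.8:
          return "highly accessible"  # fresh or frequently retrieved
      if p_retrieval >= 0.3:
          return "fading"             # normal decay in progress
      return "deep"                   # stored, but needs effortful retrieval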

Spreading Activation: The Knowledge Graph

How Context Flows Through Memory

In ACT-R, memories don't exist in isolation. When you think about one concept, related concepts become more accessible. This is called spreading activation.

Eternal Recall implements this with a Knowledge Graph—a network of connected entities including people, projects, concepts, files, and locations.

How It Works

  1. You mention a concept (e.g., "Project Alpha")
  2. That concept receives activation
  3. Activation spreads to connected concepts
  4. Related memories become more accessible
  5. Your AI has better context for the conversation
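A minimal sketch of one-hop spreading over a toy graph follows. The nodes, edges, and weights are invented for illustration; the real Knowledge Graph structure is not public:

  # Toy knowledge graph: node -> {neighbor: associative strength}
  graph = {
      "Python": {"Flask": 0.9, "debugging": 0.6, "scripting": 0.7},
      "Flask": {"web_dev": 0.8},
  }

  def spread(source, amount=1.0):
      # Inject activation at the source, then pass a weighted share
      # to each directly connected node
      activation = {source: amount}
      for neighbor, strength in graph.get(source, {}).items():
          activation[neighbor] = amount * strength
      return activation

  print(spread("Python"))
  # {'Python': 1.0, 'Flask': 0.9, 'debugging': 0.6, 'scripting': 0.7}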

Example: Mentioning "Python"

When "Python" receives activation, it spreads along the graph's edges to connected nodes such as Flask, debugging, automation, web_dev, error_handling, and scripting. Related concepts become more accessible through network connections.

Buffer Decay: Extended Context Windows

Based on research by Thomson, Bennati & Lebiere (Carnegie Mellon), Eternal Recall implements buffer decay: recently discussed topics continue to influence context even after the conversation has moved on.

The Research Foundation

Cognitive science research shows that short-term memory influence doesn't disappear instantly. Peterson & Peterson (1959) and Waugh & Norman (1965) demonstrated that recently processed information continues to affect cognition for several seconds.

Eternal Recall implements this insight: when a topic leaves active discussion, it doesn't immediately lose all influence. Instead, its contextual priming fades gradually over a short window.
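A sketch of the idea follows. The exponential curve and the 10-second half-life are assumptions chosen for illustration; the cited research motivates the behavior, not this exact code:

  import math

  # Recency boost that fades after a topic leaves active discussion
  def buffer_boost(seconds_since_active, half_life=10.0):
      return math.exp(-math.log(2) * seconds_since_active / half_life)

  print(buffer_boost(0))   # 1.000: still fully priming context
  print(buffer_boost(10))  # 0.500: half strength after one half-life
  print(buffer_boost(30))  # 0.125: mostly faded after 30 seconds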

The Result

More natural conversations where context from moments ago still informs current responses—just like talking to a human who remembers what you just said.

Memory Consolidation

As memories age, Eternal Recall can consolidate them—creating compressed summaries that preserve essential information.

What Consolidation Does

  • Generates compressed "gist" summary
  • Preserves original content (never deleted)
  • Gist used for quick retrieval
  • Full original available on demand

Benefits

  • Faster context loading
  • Reduced memory footprint
  • Essential info preserved
  • Full detail always available

Example

Original Memory:

"User asked about implementing a REST API in Flask. Discussed route decorators, request/response handling, JSON serialization, and error handling patterns. Recommended using Flask-RESTful for larger projects. User mentioned they're working on a project called 'DataSync' that needs authentication."

Consolidated Gist:

"Flask REST API discussion for DataSync project. Topics: routes, JSON, errors. Recommended Flask-RESTful."

Project-Based Memory Boosting

When you're working on a specific project, all related memories should be more accessible. Eternal Recall implements project boosting to make this happen.

How It Works

  1. Register a Project: Mark a project as active
  2. Automatic Discovery: System finds all related memories
  3. Accessibility Boost: Related memories become more accessible
  4. Decay Protection: Project memories stay accessible longer
  5. Completion: When done, memories return to normal decay

This mirrors how human memory works—when you're focused on a project, everything related to it comes to mind more easily.
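A minimal sketch of the mechanism, with invented names and numbers rather than the product's actual tuning:

  active_projects = {"DataSync"}

  # Additive activation boost for memories linked to an active project
  def effective_activation(base_activation, linked_projects, boost=1.0):
      if linked_projects & active_projects:
          return base_activation + boost  # surfaces more easily
      return base_activation

  print(effective_activation(-0.5, {"DataSync"}))  # 0.5: boosted
  print(effective_activation(-0.5, {"Archive"}))   # -0.5: normal decay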

Automatic Disaster Recovery

AI can make mistakes. Eternal Recall automatically creates recovery points—you can always roll back to a previous state without remembering to save.

📸 Automatic Snapshots

The system automatically captures your memory state as you work. No manual saves required.

  • Continuous background protection
  • Zero user intervention needed
  • Always have a recovery point
  • Never lose your data

⏪ Instant Rollback

Made a mistake? Roll back to any previous state instantly.

  • Undo recent changes
  • Restore any point in time
  • Find last working state
  • Selective restoration
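Because memories live on disk, snapshot-and-rollback can be as simple as copying directories. This is a hypothetical sketch: the paths, naming, and mechanism are assumptions, not the documented snapshot system:

  import shutil, time
  from pathlib import Path

  MEMORY_DIR = Path("memory")        # assumed location of memory files
  SNAPSHOT_DIR = Path("snapshots")   # assumed location of recovery points

  def snapshot():
      # Copy the current memory state into a timestamped folder
      target = SNAPSHOT_DIR / time.strftime("%Y%m%d-%H%M%S")
      shutil.copytree(MEMORY_DIR, target)
      return target

  def rollback(snapshot_path):
      # Replace the current state with a chosen recovery point
      shutil.rmtree(MEMORY_DIR)
      shutil.copytree(snapshot_path, MEMORY_DIR)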

✅ The Power of Persistent Memory

Because Eternal Recall uses persistent local storage, rollbacks are always possible. Your data exists as real files on your computer—not ephemeral cloud data that disappears.

During development, a bug corrupted thousands of memories. Full recovery took under 5 minutes—no manual backups needed.


Experience Cognitive Science in Action

Eternal Recall launches on Kickstarter in January 2026

Get Notified at Launch