Not Marketing Hype. Real Science.

Eternal Recall isn't built on buzzwords or Silicon Valley trends. It's built on ACT-R—one of the most thoroughly validated theories of human cognition ever developed.

While other AI memory systems use simple database timers ("delete after 30 days") or opaque algorithms you can't inspect, Eternal Recall implements the actual mathematical models that cognitive scientists use to understand human memory.

The result? An AI memory system that behaves like human memory—because it's based on the same principles.

What is ACT-R?

ACT-R (Adaptive Control of Thought—Rational) is a cognitive architecture developed principally by Dr. John R. Anderson at Carnegie Mellon University. It represents over 50 years of research into how humans think, learn, remember, and forget.

The Core Insight

ACT-R models how humans recall "chunks" of information from memory and how they solve problems. The key insight is that memory strength isn't binary (remembered vs. forgotten)—it exists on a continuous spectrum of "activation" that changes over time based on use.

ACT-R has been used to model hundreds of psychological experiments and has been validated across domains including:

  • Memory and learning
  • Problem-solving and decision-making
  • Language comprehension
  • Skill acquisition
  • Attention and perception

🏛️ Carnegie Mellon University

ACT-R was developed in the Psychology Department at Carnegie Mellon University, one of the world's leading research institutions for cognitive science and artificial intelligence.

🎖️ Real-World Applications

ACT-R has been used by the US Military, NASA, and research labs worldwide to model human cognition for training systems, interface design, and human-AI interaction.

How ACT-R Models the Mind

ACT-R divides human knowledge into two fundamental types:

📚

Declarative Knowledge

"Knowing That"

Facts you can consciously recall and articulate. Stored as "chunks" of information.

Examples:

  • "Paris is the capital of France"
  • "My meeting is at 3pm"
  • "The user prefers detailed explanations"
  • "We discussed Python debugging yesterday"

In Eternal Recall: Every memory stored—your preferences, projects, conversations—is a declarative chunk with its own activation level.
⚙️

Procedural Knowledge

"Knowing How"

Skills and processes you execute without conscious thought. Stored as "production rules."

Examples:

  • How to ride a bike
  • How to parse a sentence
  • How to respond to a greeting
  • How to connect related concepts

In Eternal Recall: The knowledge graph uses production-like rules to connect memories and spread activation between related concepts.
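The condition-action shape of a production rule can be sketched in a few lines of Python. The rule format and names below are purely illustrative, not Eternal Recall's internal API:

```python
# Minimal sketch of a production rule: IF the current context matches
# a condition, THEN fire an action. All names here are hypothetical.

def make_rule(condition, action):
    """A production rule is just a (condition, action) pair."""
    return {"condition": condition, "action": action}

# Rule: when the user greets, respond with a greeting.
greet_rule = make_rule(
    condition=lambda context: context.get("intent") == "greeting",
    action=lambda context: "Hello! How can I help?",
)

def run_rules(rules, context):
    """Fire the first rule whose condition matches the context."""
    for rule in rules:
        if rule["condition"](context):
            return rule["action"](context)
    return None

print(run_rules([greet_rule], {"intent": "greeting"}))
```

The key property is that a production fires automatically whenever its condition matches, with no deliberate recall step, which is what distinguishes "knowing how" from "knowing that."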

The Heart of ACT-R: Activation

The central concept in ACT-R is activation—a numerical measure of how accessible a memory is at any given moment. Higher activation means the memory is more likely to be recalled.

The Activation Equation

Aᵢ(t) = Bᵢ(t) + Σⱼ Wⱼ Sⱼᵢ

Aᵢ(t) = Total activation of memory i at time t

Bᵢ(t) = Base-level activation (recency + frequency)

Σⱼ Wⱼ Sⱼᵢ = Spreading activation: for each element j of the current context, its attention weight Wⱼ times its association strength Sⱼᵢ to memory i

This equation captures two fundamental truths about human memory:

  1. Recent and frequent memories are stronger. If you just thought about something, or think about it often, it's easier to recall.
  2. Context matters. Thinking about related concepts makes associated memories easier to recall (spreading activation).
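The two truths above can be made concrete in a few lines. This is a minimal sketch assuming times measured in seconds and the conventional decay rate d = 0.5; the weight and strength values are illustrative numbers, not calibrated parameters:

```python
import math

def base_level(times_since_access, d=0.5, beta=0.0):
    # B(t) = ln(sum over past accesses of t_k^-d) + beta
    return math.log(sum(t ** -d for t in times_since_access)) + beta

def total_activation(base, sources):
    # A = B + sum_j W_j * S_ji: each context element j contributes
    # its attention weight W_j times its association strength S_ji.
    return base + sum(w * s for w, s in sources)

# A memory accessed 60s and 3600s ago, with one strongly related
# context element currently in focus (W=0.5, S=1.2, both illustrative):
b = base_level([60.0, 3600.0])
a = total_activation(b, [(0.5, 1.2)])
```

Note that the context term only raises accessibility while the related concept is in focus; the base level is what persists between contexts.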

Why This Matters for AI

Other AI memory systems treat memories as binary: either you have access or you don't. ACT-R—and Eternal Recall—treats memory as a spectrum.

Important, frequently-used memories stay accessible. Rarely-used memories fade but are never deleted. Just like your brain.

Base-Level Learning: The Decay Equation

The base-level activation of a memory reflects its history of use. Every time a memory is accessed, it gets a boost. Over time, that boost decays—but never completely disappears.

The Base-Level Equation

Bᵢ(t) = ln(Σₖ tₖ⁻ᵈ) + βᵢ

tₖ = Time since the k-th access of the memory

d = Decay rate (typically 0.5, the standard ACT-R value)

βᵢ = Base constant for memory i

What This Means in Practice

  • Recency: A memory accessed 1 minute ago is more accessible than one accessed 1 hour ago.
  • Frequency: A memory accessed 100 times is more accessible than one accessed once.
  • Power Law: Decay follows a power function, not an exponential one. Memories fade quickly at first, then stabilize.
  • Cumulative: Each access adds to the total, so well-used memories remain accessible even without recent use.
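A short sketch makes the recency and frequency effects concrete. Times are in seconds and d = 0.5; the access histories are illustrative:

```python
import math

def base_level(times_since_access, d=0.5):
    # B = ln(sum of t_k^-d): each past access contributes t^-d,
    # so old accesses still contribute and B never drops to -infinity.
    return math.log(sum(t ** -d for t in times_since_access))

MIN, HOUR, DAY = 60.0, 3600.0, 86400.0

# Recency: the same single access, viewed at increasing ages.
recent = base_level([1 * MIN])   # accessed 1 minute ago
old = base_level([1 * HOUR])     # accessed 1 hour ago

# Frequency: ten accesses spread over the last ten days beat
# a single access one day ago.
frequent = base_level([k * DAY for k in range(1, 11)])
once = base_level([1 * DAY])
```

Running this shows `recent > old` and `frequent > once`: recency and frequency both raise accessibility, and the cumulative sum means a well-used memory stays reachable long after its last access.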

📈 The Decay Curve

Memory accessibility over time follows a predictable curve:

[Decay curve: activation plotted from now through 1 day, 1 week, 1 month, and 1 year. The curve drops steeply at first, then flattens, staying above zero.]

Memories fade but never fully disappear. Each use boosts them back up.

Spreading Activation: Context is Everything

When you think about one thing, related things become easier to recall. ACT-R models this with spreading activation—activation flows through connections between memories.

How It Works

Imagine you're thinking about "Python." Activation spreads to connected concepts:

Python → debugging (strong)
Python → Flask (moderate)
Python → data science (weak)

If you've often discussed Python debugging together, that connection is strong. Concepts you rarely connect have weaker associations.
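Spreading activation over such a graph can be sketched as a weighted lookup. The strengths below are illustrative numbers, not values ACT-R would actually derive from usage history:

```python
# Tiny association graph: focus concept -> {related concept: strength}.
# Strength values are illustrative only.
associations = {
    "python": {"debugging": 1.5, "flask": 0.8, "data science": 0.3},
}

def spread(focus, associations, attention_weight=1.0):
    """Return the spreading-activation boost each related concept
    receives while `focus` is in the current context."""
    return {
        concept: attention_weight * strength
        for concept, strength in associations.get(focus, {}).items()
    }

boosts = spread("python", associations)
# "debugging" receives the largest boost, "data science" the smallest.
```

In a full implementation the attention weight would be divided among everything currently in focus, so a crowded context spreads less activation to each neighbor.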

In Eternal Recall

Our Knowledge Graph implements spreading activation:

  • Memories connect to entities (people, projects, concepts)
  • When you mention a project, related memories "light up"
  • Connections strengthen through repeated co-activation
  • Your AI naturally recalls relevant context
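One plausible way to sketch the "connections strengthen through repeated co-activation" step, using a simple fixed increment. The increment rule is an assumption for illustration, not necessarily the rule Eternal Recall uses internally:

```python
# Sketch: strengthen an association each time two concepts appear
# together. The fixed-increment rule here is a simplification.
from collections import defaultdict

edge_strength = defaultdict(float)

def co_activate(a, b, increment=0.1):
    """Strengthen the (undirected) association between a and b."""
    key = tuple(sorted((a, b)))
    edge_strength[key] += increment
    return edge_strength[key]

for _ in range(3):
    co_activate("python", "debugging")
co_activate("python", "flask")
# ("debugging", "python") is now three times stronger than
# ("flask", "python"), so debugging memories light up more readily.
```

Pairing this with spreading activation closes the loop: concepts you mention together gain stronger edges, and stronger edges carry more activation next time.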

This is why Eternal Recall feels intuitive—it retrieves context the same way your brain does.

The Innovation: ACT-R for AI Memory

What ACT-R Was Built For

ACT-R was developed to model human cognition—to understand and predict how humans think, learn, and remember. It's been used for:

  • Cognitive tutoring systems
  • Human performance modeling
  • Interface usability research
  • Training simulation

What I Did Differently

I'm the first to apply ACT-R's memory equations to AI persistent memory.

Instead of modeling how a human remembers, I used these equations to give an AI a human-like memory system. The activation, decay, and spreading activation that cognitive scientists validated over 50 years—now powering how your AI remembers you.

Why It Works

ACT-R equations were optimized for human memory. They capture:

  • The balance between remembering and forgetting
  • How context aids recall
  • Why some memories persist while others fade
  • How importance affects retention

These are exactly the properties you want in an AI memory system.

Dive Deeper

Ready for AI Memory Based on Real Science?

Eternal Recall launches on Kickstarter in January 2026

Get Notified at Launch Read the Research