A Framework for Impossible Conversations

AI for Time Travel

How AI Enables Conversations Across Time

Talk to your grandparents in the 1970s. Preview your future self. Rehearse difficult conversations before they happen.

AI doesn't just automate tasks—it enables relationships across time.

What You'll Learn

  • The Time-Shifted Proxy pattern: a TRIZ-like operator for AI innovation
  • Why late 2025 is the inflection point (73% can't tell AI video from real)
  • Business applications already generating 340% better training ROI
  • A design space matrix for generating your own applications

December 2025

I Called My Grandparents in the 1970s

My brother is into genealogy. He's the kind of person who actually reads through old family letters—boxes of them, handwritten, folded, posted. Most people don't. It feels like homework.

One day he showed me a letter from our aunt Carol, written to her parents back in the 1970s. Ordinary family correspondence: news about the kids, plans for a visit, everyday life. The kind of artefact that's technically rich—full of context and personality—but emotionally distant. You have to want to spend time with it. You have to be into genealogy.

I looked at that letter and saw something else. Not a document. A conversation trapped in the wrong medium.

The Experiment

I decided to translate it. Same source material. Two completely different outputs.

Output 1: The Radio Play

First, I created what you might call a "radio play"—a produced audio piece where you hear Carol calling her father. It's the 1970s on the call. He doesn't know about the future. He's just... Dad. The conversation contains everything from the letter: family news, the kids, plans for visits, period-appropriate concerns.

But it's no longer text you read. It's a conversation you listen to.

Output 2: The Interactive Phone Call

This is where it gets different.

You can actually ring my grandparents. Or ring Carol herself. And have a real conversation—not listening to a recording, but talking to them, in the 1970s. The AI knows what was in the letter. Knows the family context. Responds as them, in their voice, in their time.

You're not consuming a narrative. You're participating in a conversation.

"Same source material. Two completely different experiences. One is a produced piece you consume. The other is a conversation you participate in."

The radio play is passive—cinema-style, crafted narrative. The interactive version is active—you ask questions, you explore, you have agency.

Both are built from the same fragmentary evidence: one letter. Both feel emotionally real despite being synthetic.

My Brother's Reaction

"Gobsmacked" is the word he used.

But not because it sounded realistic. That's the obvious reaction, and it's the least interesting one. He was gobsmacked because he suddenly saw his genealogy research differently.

All those letters and documents he'd been collecting for years—carefully preserving, cataloguing, transcribing—they didn't have to stay as archives. They could become conversations.

The Shift in Perspective

Genealogy Normally Feels Like:
  • Research—database work, connecting nodes on a family tree
  • Document preservation—scanning, cataloguing, transcribing
  • Interesting but dry—you're managing information, not experiencing people

This Felt Like:
  • Connection—actual presence, not just facts
  • Talking to someone who's been gone for decades
  • Not just interesting—emotionally real

What Actually Changed?

The information was always there. The letter existed. The facts were known.

What changed was the medium.

We're not built for documents. We're built for conversations. Families remember through stories told out loud, not through filing systems.

AI didn't create new information. It translated existing information into a form humans actually want to engage with.

Stumbling Into Something Bigger

As I watched my brother's reaction, something clicked. This wasn't just a genealogy hack. This wasn't just a "cool AI project."

The experience felt... larger. Like I'd accidentally walked into a room that connected to many other rooms.

I started thinking about other things this reminded me of:

  • Sales training I'd built with AI customers
  • A dating app concept that had been rattling around in my head
  • Marketing ideas about "future-you" testimonials
  • An app for capturing your voice before you die
  • Tools for rehearsing difficult conversations

These all felt... related somehow. But they'd never connected in my mind before.

Then I realised: they're all variations of the same thing.

"AI as a way to let emotionally meaningful conversations happen across time—with people who exist outside their normal time slot."

The Questions This Raised

  • What is the pattern here?
  • Why do these seemingly unrelated ideas feel like variations of the same thing?
  • If there's a generalizable principle, what is it?
  • And if it works for genealogy, where else could it work?

It turns out, those questions have answers. And the answers are stranger—and more useful—than I expected.

But first, let me show you five wildly different ideas that turned out to be the same idea wearing different clothes.

Five Ideas That Seem Unrelated

After the Carol experiment, I started noticing other ideas rattling around in my head. They'd never connected before. But now they seemed to rhyme. Let me walk you through five of them.

These will seem unrelated at first. Your job: spot the common thread. By the end of this chapter, you should see it. And once you see it, you can't unsee it.

Example 1: The AI Partner as Future Matchmaker

Here's an idea: a dating app where you fall in love with an AI. But not as the end state. The AI is the means to something else.

You start by talking to an AI partner. It's flirty, supportive, tuned to your likes and dislikes. Over weeks and months, the AI is gradually building a profile of who you'd actually be compatible with. It learns from how you interact:

  • What you respond to
  • What bores you
  • Where you argue but stay engaged
  • Where you shut down

Meanwhile, across a million users, everyone is falling in love with their own AIs. Behind the scenes, the system is quietly pairing compatible people based on relational patterns—not photos and bios, but how they actually relate.

At some point, both users get a notification:

"You've been matched with someone who shares your relational pattern. Want to talk to the human behind this?"

The AI partner was always a placeholder. A stand-in for your future partner—someone you hadn't met yet.

"The AI is basically future-partner-in-draft mode."

Example 2: Virtual Scott After Death

Another idea: an app that continuously captures your signals through life.

What It Captures
  • Voice recordings
  • Writing samples
  • Photos
  • 24/7 audio recording (like Rewind.ai for PC screen and audio)
  • PC usage patterns
  • Messages and emails
  • Stories you tell

What It Builds
  • People graph (family, friends)
  • Event graph (jobs, trips, crises)
  • Preference graph (music, politics, jokes)
  • Language patterns (phrases, pacing)
  • Emotional "moves" (joke → insight → reassurance)

It's not surveillance—it's deliberate life-logging with a purpose. When you die, the system has enough data to build a conversational model of "you." Your family can still talk to you.

Not resurrection. Not "real" Scott. Something different:

"A compressed explainer of how Scott would likely react, trained on 40 years of receipts."

For family and friends, this can feel meaningful. Not "fake Scott" but "a way to remember how Scott thought."

The system could even track eras: Scott in his 30s, Scott in his 50s, "Dad with small kids" Scott, "Consultant Scott." Different chapters of life, different conversational modes.

Example 3: Future-You Holiday Testimonials

You're considering booking a themed holiday. Before you decide, you receive a video. It's "future you"—three months after the trip. They're telling present-you how much they loved it. Everything they saw, felt, experienced.

Or maybe it's your kid at age 10, explaining what a fun time they had at age 8.

"I can't believe how much the kids loved swimming with the fish on the reef... and watching them chase the kangaroos."
— Kevin, reflecting on a trip that hasn't happened yet.

Why This Works Psychologically

Normal marketing: Brand talks at you about benefits.

This version: You-from-the-future talks to you about your actual experience.

Still a simulation—but emotionally compelling because it's personal. Our brains are bad at imagining future experience. But a vivid testimonial from "future us" short-circuits that discount.

Example 4: AI Customers for Sales Training

This one isn't hypothetical. I've built and deployed it.

Twelve different AI personas. Each represents a different client type. Each has hidden problems your salespeople need to uncover and solve. Staff ring the AIs to practice. Real conversations with synthetic customers.

How It Works

AI customers are built from real patterns:

Data Sources

  • → CRM notes
  • → Lost deal analyses
  • → Customer support tickets
  • → Actual objections and concerns

Can Simulate

  • → Angry customers
  • → Skeptical prospects
  • → Budget-constrained buyers
  • → Enterprise procurement processes

They respond dynamically—not scripted. Staff practice discovery, demos, objection handling. Scored and reviewed after each call.
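
Here's a minimal sketch of how such an AI customer might be wired up. The persona fields, the sample "Dana" data, and the prompt wording are invented for illustration; the only external dependency assumed is an OpenAI-style chat-completion client.

```python
# Sketch: an AI "customer" persona assembled from CRM-style notes and played
# by a chat model. Persona fields and sample data are illustrative only.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def build_customer_prompt(persona: dict) -> str:
    """Turn structured CRM-style notes into a roleplay system prompt."""
    return (
        f"You are {persona['name']}, a {persona['role']} evaluating our product.\n"
        f"Hidden problem the salesperson must uncover: {persona['hidden_problem']}.\n"
        f"Typical objections: {', '.join(persona['objections'])}.\n"
        "Stay in character. Respond realistically, and never volunteer the hidden "
        "problem unless the salesperson asks good discovery questions."
    )


skeptical_buyer = {
    "name": "Dana",
    "role": "budget-constrained operations manager",
    "hidden_problem": "burned by a failed rollout of a similar tool last year",
    "objections": ["price", "implementation effort", "vendor lock-in"],
}

messages = [
    {"role": "system", "content": build_customer_prompt(skeptical_buyer)},
    {"role": "user", "content": "Hi Dana, thanks for taking the call. "
                                "What prompted you to look at tools like ours?"},
]

reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(reply.choices[0].message.content)
```

In a production version the persona dictionary would be generated from lost-deal analyses and support tickets rather than written by hand, and each practice call would be scored afterwards.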

Why It Beats Traditional Training

❌ Traditional Approach

  • Read a script
  • Roleplay with colleague who doesn't really know how to object
  • Limited iterations before "using up" training partners
  • Feedback is subjective and delayed

✓ AI Version

  • Practice against realistic, varied resistance
  • Unlimited iterations without "using up" anyone
  • Immediate, objective feedback on what worked
  • Safe environment to fail and learn

Beyond Training: The Disgruntled Customer Stand-In

This isn't just for sales reps practicing. It's also for executives to experience what angry customers actually sound like.

Take real feedback records—NPS comments, support tickets, churn interviews—and render them as:

  • A video you play at a board meeting
  • An interactive avatar executives can question in real-time
  • A stand-in that lets leadership feel the emotional reality of customer frustration

Reading a summary of customer complaints is one thing. Sitting across from an AI that embodies those complaints—frustrated, specific, emotional—is something else entirely.

Example 5: Rehearsing Difficult Conversations

You need to have a hard conversation:

  • Asking your boss for a raise
  • Difficult talk with your partner
  • Telling your team about layoffs
  • Confronting a colleague about behaviour

Before you do it for real, you practice with an AI. The AI plays the other person. Tuned to behave like them—their patterns, likely objections, emotional style.

The Value Proposition

High-stakes conversations rarely go well when unrehearsed. But you can't exactly practice asking for a raise with your actual boss.

The AI gives you a safe space to:

  • → Try different approaches
  • → Experience likely reactions
  • → Build muscle memory for staying calm
  • → Iterate until you feel ready

"The more you expose yourself to saying difficult things in a safe environment, the more your body and mind stay regulated when it counts."

The Pattern Emerges

Pause here. Look back at the five examples.

They span:

  • Dating
  • Death/Legacy
  • Marketing
  • Sales Training
  • Personal Development

Totally different domains. And yet...

The Connection

Every single one is a conversation that can't normally happen:

🚫 Blocked by death: Virtual Scott, Carol's parents
🚫 Blocked by time: Future-you, future partner
🚫 Blocked by scale: 12 different customers
🚫 Blocked by risk: Boss, partner, difficult talks

Every single one solves the problem the same way:

  • Build an AI stand-in
  • Frame it in the right time direction (past, future, parallel)
  • Present it in an emotionally accessible medium (call, video, chat)

These aren't random cool AI tricks.

They're all instances of the same thing.

There's a pattern here. A generalizable principle.

Chapter 3 will extract it.

The Time-Shifted Proxy Pattern

Five seemingly unrelated ideas. All instances of the same underlying structure. It needs a name.

Naming the Pattern

After walking through those five examples—dating apps, posthumous conversations, holiday previews, sales training, difficult conversation rehearsal—the common thread becomes visible. Each one is a conversation that can't normally happen. Blocked by death, blocked by time, blocked by scale, blocked by risk.

And each one solves the problem the same way: build an AI stand-in, frame it in the right time direction, and present it in an emotionally accessible medium.

This pattern deserves a name: Time-Shifted Proxy.

The Three Ingredients

Every Time-Shifted Proxy application contains three core elements. Once you recognise them, you can construct new applications deliberately rather than stumbling upon them by accident.

1. Time Shift

All examples involve a temporal displacement. Time is the first axis of impossibility—we can only interact with people in the present moment. Normally.

  • Past: Talking to grandma in 1975, reconstructing deceased relatives from letters and photographs
  • Future: Future-you testimonial, dating a future partner archetype, message from year-end you
  • Parallel: Simulated customer (composite of many real ones), practising with "someone like your boss"

AI removes the constraint that relationships require presence.

2. AI Stand-In / Proxy

The conversation is "impossible" because the other party isn't available. They're dead, they haven't met you yet, they don't exist as a single person, or the real version is too risky to practise with.

AI provides a stand-in—not the real person, but something that feels close enough. The stand-in is built from available data:

  • Letters, photos, documents (for deceased)
  • CRM notes, support tickets, survey responses (for customers)
  • Known personality traits, likely objections (for boss/colleague)
  • User's own preferences and patterns (for future partner)

3. Media Translation

Raw data is hard to consume. Letters are high-friction. CRM notes are dry. Documents are boring.

The AI translates that data into a form humans actually engage with: phone calls, video testimonials, interactive conversations, voice notes.

This is the key insight: we've always had the information. What we lacked was the right medium.

"The constraint was never the data—it was the accessibility. AI translates high-friction archives into low-friction interactions."

The Five-Step Operator

Once you understand the three ingredients, you can apply the pattern systematically. Here's how to generate Time-Shifted Proxy ideas in any domain:

1. Identify an impossible conversation

Who do you wish you could talk to, but can't? Blocked by death? Time? Scale? Risk?

Examples: A churned customer who won't return calls. A founder from year one of the company. Your future self who's already made this decision. A hundred variations of "typical difficult stakeholder".

2. Choose the time direction

  • Past: "What really happened?" / "What would grandma say?"
  • Future: "How did this turn out?" / "What does year-end me think?"
  • Parallel: "What would someone like this typically say?"

3. Build the AI stand-in

What data traces exist? Letters, emails, recordings, transcripts, CRM notes, support tickets, published work, known opinions.

The stand-in doesn't need perfect accuracy. It needs enough texture to feel real.

4. Translate the medium

Pick the format that matches the use case:

  • Text chat: Lowest friction, good for exploration
  • Voice call: Higher intimacy, feels more real
  • Video: Maximum emotional impact, requires most production

The medium shapes the experience. A phone call to grandma hits differently than reading her letter.

5. Attach a purpose

Why are you having this conversation?

  • Training: Rehearse before real stakes
  • Marketing: Persuade with future-self endorsement
  • Decision support: Get perspective from past/future you
  • Heritage: Connect with family across time
  • Product design: Talk to future users

The purpose determines your success metrics.
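
One way to make the five steps operational is to force each answer into a small spec object; any field you can't fill yet is a gap in the design. This is an illustrative sketch, not a prescribed schema—the field names and the sample spec are assumptions.

```python
# Sketch: the five-step operator captured as a one-line product spec.
from dataclasses import dataclass


@dataclass
class ProxySpec:
    impossible_conversation: str  # Step 1: who can't you talk to, and why?
    time_direction: str           # Step 2: "past", "future", or "parallel"
    data_traces: list[str]        # Step 3: what the stand-in is built from
    medium: str                   # Step 4: "text", "voice", or "video"
    purpose: str                  # Step 5: training, marketing, heritage, ...

    def one_liner(self) -> str:
        return (
            f"A {self.medium} conversation with a {self.time_direction} proxy "
            f"for '{self.impossible_conversation}', built from "
            f"{', '.join(self.data_traces)}, used for {self.purpose}."
        )


spec = ProxySpec(
    impossible_conversation="a churned customer who won't return calls",
    time_direction="parallel",
    data_traces=["CRM notes", "exit survey comments"],
    medium="voice",
    purpose="retention research",
)
print(spec.one_liner())
```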

Why This Feels Like TRIZ

If you're familiar with systematic innovation frameworks, you might be feeling a sense of déjà vu. The Time-Shifted Proxy pattern maps almost perfectly onto principles from TRIZ—the Russian methodology for inventive problem solving.

Three TRIZ principles are particularly relevant to Time-Shifted Proxies:

Principle #10: Preliminary Action

"Perform, before it is needed, the required change of an object."

In our context: rehearse the conversation before it's real. Practise asking for the raise with AI-boss before real boss. Preview the holiday experience before booking. Meet your future partner's archetype before meeting the person.

Principle #24: Intermediary

"Use an intermediary carrier article or intermediary process."

The AI isn't the destination—it's the bridge. In dating: AI partner is intermediary to real human. In sales training: AI customer is intermediary to real pitch. In marketing: future-you is intermediary to purchase decision.

Principle #26: Copying

"Replace unavailable, expensive, or fragile objects with inexpensive copies."

Grandma in 1975 is "copied" from letters. Angry customer is "copied" from CRM notes and survey feedback. Future-you is "copied" from likely trajectories. The copy doesn't need to be perfect—it needs to serve the purpose.

"That's you doing TRIZ, just with LLMs and time instead of springs and gears."

The power of pattern recognition is this: once you name a principle, you can apply it deliberately. Instead of waiting for creative insight, you use a checklist:

  • "Who can't I talk to?"
  • "What data traces exist?"
  • "What medium would make this feel real?"

That's systematic innovation. Not magic—method.

The Unifying Insight

Let's pull back and look at what this pattern reveals about AI's emerging role.

These examples—dating apps, posthumous conversations, holiday previews, sales training, difficult conversation rehearsal—aren't random AI tricks. They're all manifestations of the same underlying capability:

"Using AI to let humans have emotionally meaningful conversations across time, not just across space."

We've always wanted to talk to the dead. We've always wanted to preview the future. We've always wanted to rehearse before high-stakes moments. We just couldn't.

Now we can.

Why It Works on the Brain

We're terrible at reading archives. We're brilliant at talking to people.

Forty pages of letters? Homework. A six-minute phone call that compresses those letters into grandma's voice? Instantly consumable.

Conversation is our native interface for memory.

Our brains don't perfectly distinguish "live" from "simulated" when emotional and sensory cues are strong enough. That's why it feels like time travel.

A New Product Category

This is a new product category emerging in real-time.

Not "AI assistant"—assistants help with tasks. Not "chatbot"—chatbots answer questions.

This is something different:

AI systems that let humans move through their own timeline—past, present, future—with emotional fidelity, delivered in conversational formats.

Once you see it as a category, you can aim it anywhere:

  • Product development: Talk to future power users about what frustrated them
  • HR and onboarding: New hires interview your company's "founding team" from year one
  • Change management: Leaders rehearse difficult announcements with simulated team reactions
  • Customer research: Have conversations with composite personas built from thousands of support tickets
  • Personal development: Regular check-ins with "future-you" at various life stages
  • Family heritage: Multi-generational conversations that would be impossible otherwise

The pattern is universal. The applications are limited only by imagination and available data.

In the next chapter, we'll explore why late 2025 is the inflection point—why this pattern is becoming viable now rather than remaining speculative. Three technical barriers fell simultaneously, and the implications are just beginning to unfold.

Why Late 2025 Is the Inflection Point

Patterns are timeless. They can exist for years before being practical.

The film Her explored AI romantic relationships back in 2013 — twelve years before the parasocial AI market hit 220 million downloads. Sci-fi has been playing with "talking to the dead" for decades. Sales roleplay existed long before language models.

But theory and practice are different animals.

The Time-Shifted Proxy pattern wasn't invented in 2025. What changed in 2025 is that three technical barriers fell almost simultaneously, turning speculative applications into production-ready products. Each barrier was necessary; together they're transformational.

Barrier 1: Video Generation Crossed the Photorealism Threshold

Q3 2025 marked what industry observers call the photorealism threshold — the point at which AI-generated video can fool the human eye more often than not.

73%

of viewers can't distinguish high-quality AI video from traditional footage in blind tests

— Clippie.ai, 2025 AI Video Trends Report

"Many shots pass the visual Turing test, in which most people would not be able to distinguish that it's completely synthetic."
— Filmmaker Paul Trillo, on Google's Veo 2 (Variety)

This isn't incremental improvement. It's a category shift.

Your "future-you" holiday testimonial doesn't have to be text anymore. It doesn't have to be a stilted avatar moving like a badly puppeted game character. It can be cinematic video that looks like it was actually filmed — complete with natural lighting, realistic motion, authentic facial expressions.

Kevin reflecting on his trip to Sydney? That's rendered video, not footage. But the emotional impact registers the same way your brain processes real testimonials.

The Leading Platforms in Late 2025

Sora 2 (OpenAI)

The current gold standard. Physics accuracy, longer durations, cinematic quality. Best for premium content where visual fidelity is paramount.

Veo 2 (Google)

Noted for passing the visual Turing test. Strong on photorealism and natural motion.

Runway Gen-4

Professional toolkit with Motion Brush, Director Mode, and Camera Control. Ideal for users who need precise creative control across multiple shots.

Pika Labs 2.5

Best value. 30-second generation time makes it a "daily driver" for rapid content creation.

The medium translation axis just got supercharged. Video has massively higher emotional impact than text — and now that quality is achievable at production scale.

Barrier 2: Persona Modelling Became Commercially Accessible

Creating a "digital twin" used to be a major project. Custom development. Months of work. Enterprise budgets.

Now it's a product you can buy.

$700–$1,400

Current pricing for digital replicas

Super Brain (China) has created 1,000+ digital replicas since March 2023

$140

Planned app-only pricing

Order-of-magnitude cost reduction coming in next release cycle

The input requirements dropped as dramatically as the cost. DeepBrain AI's "Re;memory 2" can generate an AI avatar from:

  • A single photo
  • A 10-second audio clip

Not a six-month data collection process. Something you could do for a deceased relative today.

"Some replicas are 90–95% indistinguishable from the real person."
— Sensay founder (The National)

Whether that claim holds up to rigorous scrutiny varies by use case. But the direction is undeniable: the barrier is dropping fast, and commercial products are racing to the bottom on both cost and complexity.

The Commercial Ecosystem That Emerged in 2025

Soul Link: Launched in Korea and Japan, expanding to Middle East. Uses photos, video, voice to create "digital twins" of deceased.

Twin Protocol: Curated data "vaults" (books, lectures, videos, voice memos) trained into interactive personas.

Sensay: "Virtual humans" that don't just sound like the person — they act on their behalf in transactions and decisions.

StoryFile: 5,000+ profiles for video Q&A. You record answers; AI plays relevant clips in response to questions from viewers.

HereAfter AI: Memory-sharing chatbots for $3.99–$7.99/month. Friendly virtual interviewer captures stories; family can query later.

China's virtual digital human market reached 48.06 billion yuan (~$6.7B USD) by 2025. (iMedia Research via NPR)

What this means for Time-Shifted Proxies: Building the AI stand-in is no longer the hard part. The data requirement dropped dramatically. The cost dropped by an order of magnitude. Anyone can now create a proxy — not just tech companies with big budgets.

Barrier 3: Parasocial AI Went Mainstream

People don't just tolerate AI relationships. They actively seek them out. The numbers are staggering.

220M

cumulative downloads of AI companion apps

as of July 2025

88%

year-over-year growth in H1 2025

$120M

projected 2025 revenue

— TechCrunch, AI Companion Apps Market Report

Revenue per download jumped from $0.52 in 2024 to $1.18 in 2025 — reflecting stronger consumer willingness to pay for emotional AI experiences.

Character.AI Dominance

92 minutes of daily usage

Highest engagement in the AI companion category

20 million monthly active users

60% of audience is 18–24 age group

— ElectroIQ, AI Companions Statistics

Teen Adoption

72%

of US teens have tried an AI companion

52%

are regular users

31%

say AI conversations are as satisfying or more satisfying than human ones

— TechCrunch, Teen AI Companion Study

Replika Statistics

30+ million total users

Average user exchanges ~70 messages per day with their AI companion

85%+ of users report developing emotional connections

— Nikola Roza, Replika AI Statistics & Trends Guide for 2025

"People already want AI for relationships, not just productivity. The demand exists. The question is productization."

The 52% OpenRouter role-playing statistic from earlier chapters is consistent with this broader market pattern. Emotional AI isn't fringe speculation anymore — it's mainstream behaviour.

The market is proven. The category is established. Now it's about who builds the best applications of the Time-Shifted Proxy pattern.

The Convergence

1. Video Quality

You can generate emotionally compelling, realistic footage that passes the visual Turing test.

2. Persona Accessibility

You can create a convincing stand-in cheaply and quickly from minimal input data.

3. Market Acceptance

People are already demonstrating they want this — 220M downloads, 92 minutes daily, 72% teen adoption.

Any one of these alone wouldn't be enough.

All three together = inflection point.

The Window

What was speculative twelve months ago is now buildable. The technical barriers that made Time-Shifted Proxies theoretical have fallen. Production-ready tools exist. Commercial ecosystems are forming. The market has spoken.

First movers in this space will define the category. Early adopters will shape the norms, the ethics, the canonical use cases. The design space is wide open.

"The technology is ready. The applications are emerging across every domain. The only question is: which conversation across time will you enable first?"

Cost of Delay

Someone else will productize these patterns. Startups are already deploying them in genealogy, sales training, medical education, marketing. The window for category-defining work is now.

You'll either be learning the framework while competitors deploy it — or you'll be the one setting the standard.

Late 2025 isn't just another point on the AI hype curve.

It's the moment when Time-Shifted Proxies became production-ready, commercially viable, and demonstrably wanted by mainstream users. The three barriers fell. The category opened.

What you build next will either define it — or chase it.

Business Applications Already Deployed

Chapters 1 through 4 established the pattern, the framework, and the timing thesis. But pattern recognition isn't enough for business readers. You need proof that this works and delivers ROI.

This chapter presents five domains where Time-Shifted Proxies are already deployed — not as speculation, but with measurable business outcomes validated in production environments.

Deployed Applications Summary

| Domain | Application | Key Metric |
| --- | --- | --- |
| Sales Training | AI customer roleplay | 340% better conversion improvement, 2x opportunities per rep |
| Medical Education | AI patient simulation | 80% of trainees want to continue using |
| Marketing | Video testimonials | 80% conversion boost, 95% viewer retention |
| Decision Support | Future-self conversations | Decreased anxiety, improved long-term decisions |
| Memorial Services | Digital replicas | $6.7B market, 5,000+ profiles deployed |

Domain 1: Sales & Customer Success

AI-simulated customers for training represent one of the most mature applications of the Time-Shifted Proxy pattern. Sales representatives practice discovery calls, product demonstrations, objection handling, and compliance scenarios with AI avatars that simulate realistic customer behaviour.

The traditional alternative — roleplay with colleagues — suffers from predictable failure modes. Colleagues don't know how to object convincingly, iterations are limited before participants lose interest, and the social awkwardness undermines psychological safety. AI roleplay eliminates these constraints: unlimited practice, realistic resistance, immediate feedback, no social cost.

340%

Better Conversion Performance

Companies using AI sales training see conversion rates improve from 2% to 4.5% or higher — representing 340% better performance than traditional training methods that typically move from 2% to 2.3%.

Source: Kendo AI, "AI Sales Training ROI: Why It Outperforms Traditional Methods by 340%"

Oracle Case Study

Oracle adopted Second Nature's AI training platform with striking results. While the volume of introductory calls remained constant, the quality of outcomes transformed: opportunities per sales representative per month rose from 2.78 to 6.02, and new logos per rep per month grew from 0.49 to 1.04 — more than doubling pipeline creation efficiency.[1]

Time savings compound these gains. In the Outreach Prospecting 2025 report, 100% of AI-powered SDR users reported measurable time savings, with nearly 40% reclaiming 4 to 7 hours per week.[2]

"AI roleplays provide a psychologically safe environment that can help learners feel more prepared and confident in their ability to handle difficult situations."
— SmartWinnr, AI Roleplays for Sales Guide

Deployment Variations

Future Lost Deal Callback

AI simulates a churned customer calling to explain why they left, powered by patterns extracted from actual lost-deal analyses.

Boss Simulator

Internal champions practise pitching to an AI version of their executive sponsor, tuned on documented objection patterns from that role.

Angry Customer De-escalation

Customer success teams rehearse high-pressure conversations using AI personas built from real complaint transcripts and support tickets.

The Disgruntled Customer Stand-In: Executive Empathy at Scale

Training simulations typically target frontline staff. But what about leadership? Executives read quarterly NPS summaries and churn dashboards — they rarely experience what frustrated customers actually sound like.

The same pattern that powers sales training can render a "disgruntled customer stand-in" — built from real feedback records (NPS verbatims, support tickets, exit interviews, social media complaints) and presented as:

Video Compilation

A rendered video of composite customer frustrations to play at board meetings or all-hands — making the data visceral.

Interactive Avatar

Leadership sits across from an AI embodying customer complaints. They can ask questions, probe deeper, feel the emotional weight of the feedback.

Meeting Stand-In

Before product decisions, invite the "angry customer avatar" to represent the voice of churned users — a seat at the table for those who left.

"Reading a summary of customer complaints is one thing. Sitting across from an AI that embodies those complaints — frustrated, specific, emotional — is something else entirely."

The value: Empathy doesn't scale from spreadsheets. But it does scale from simulation. When leadership feels customer pain rather than reading about it, product decisions and resource allocation shift accordingly.

Domain 2: Medical Education

Medical communication training faces unique constraints: conversations are high-stakes (patient outcomes directly affected), low-opportunity (cannot ethically practise on real patients), and emotionally demanding (breaking bad news, end-of-life discussions, complex diagnoses). AI simulation provides the missing infrastructure for safe, repeatable practice.

Oxford Medical Simulation deploys voice-controlled virtual patients using large language models to enable natural conversation. The system creates what researchers describe as "a psychologically safe environment" where learners can rehearse difficult conversations without risk to actual patients.[3]

Mayo Clinic Study

Mayo Clinic deployed an AI-based communication platform with Arizona-based PGY-2 neurology residents to improve communication with stroke patients in emergency settings — scenarios where communication errors carry severe consequences. The results validated both efficacy and acceptance: 80% of trainees reported they would use the AI tool again for future communication practice.[4]

Cornell's MedSimAI

Researchers at Cornell University developed MedSimAI, a large language model system that creates realistic interactive patient conversations. In a study involving 104 medical students, 78% identified focused history-taking as the platform's most valuable function, while 62% highlighted question phrasing practice.[5]

80%

Trainee Acceptance Rate

Mayo Clinic neurology residents overwhelmingly reported they would continue using AI simulation for communication practice — demonstrating both efficacy and user acceptance in high-stakes clinical training.

Source: Mayo Clinic Platform, "Finding a Place for AI in Medical Education"

Domain 3: Marketing & Video Testimonials

Video testimonials represent a direct application of the "future customer talking to present prospect" pattern. The Time-Shifted Proxy manifests as AI-generated customer avatars delivering composite experiences built from multiple real customer stories.

The Retention Advantage

Customers remember 95% of a video testimonial compared to only 10% of written reviews — a near 10x differential in retention. This isn't surprising: video engages multiple sensory channels and emotional processing pathways that text cannot access.[6]

80%

Conversion Boost from Video Testimonials

Video testimonials can increase sales page conversions by 80%, with 71% of customers reporting they purchased a product or service because they watched a video testimonial.

Source: Zebracat, "80+ Video Testimonials Statistics for 2025"

Production Economics

AI tools reduce video testimonial production costs by up to 70% compared to traditional filming, eliminating the need for actors, studios, and production crews. Professional-quality testimonials can now be generated in minutes rather than days.[7]

"85% of consumers now expect personalised interactions from brands. For businesses still clinging to one-size-fits-all strategies, the future is bleak."
— Vidnoz AI, marketing analysis

Emotional AI Market Growth

The global emotion AI market reached $2.9 billion in 2024 and is projected to grow at 21.7% CAGR through 2034. Sales teams deploy emotional AI to read facial expressions during product demonstrations and adjust responses when confusion is detected, driving measurably higher conversion rates.[8]

A Realeyes study analysing 130 automotive advertisements found a direct correlation between emotional engagement and social media performance. Volkswagen's narrative-driven "The Force" advertisement generated 86% higher brand exposure and 329% more social media interactions than Ford Fiesta's product-focused approach.[9]

Domain 4: Decision Support

Temporal discounting — the human tendency to heavily discount future pleasure and pain — undermines long-term decision-making across financial planning, health behaviours, and career choices. MIT researchers developed "Future You" to address this cognitive bias through conversational AI.

MIT's Future You Tool

The system enables users to have text-based conversations with an AI-generated simulation of their potential future self at age 60. Users provide information about current circumstances and aspirations; the large language model constructs a plausible future persona that can answer questions, offer advice, and describe life trajectories based on present decisions.[10]

The initial study included 344 participants aged 18 to 30. Results showed that users experienced decreased anxiety and lower levels of negative emotions after conversing with their future selves. More significantly, participants reported a stronger connection to their future — a psychological shift linked to improved decision-making regarding health, education, and financial planning.[11]

The mechanism works by transforming "future me" from an abstract concept into a conversational partner. When future consequences feel personally real rather than intellectually distant, temporal discounting weakens and decision quality improves.

Domain 5: Memorial Services

Memorial AI services represent the most emotionally charged application of Time-Shifted Proxies: conversational interfaces with deceased loved ones. Multiple commercial platforms now offer these services at accessible price points.

Commercial Deployments

StoryFile

Launched in 2017, StoryFile enables people to create videos that reply to viewers' questions using AI to select relevant clips. Initially conceived to preserve Holocaust survivor testimonies, the platform now appears at funerals. Over 5,000 profiles have been created, including Marina Smith's interactive memorial and Ed Asner's posthumous conversation with his son.[12]

Super Brain (China)

Since March 2023, Super Brain has created digital replicas for over 1,000 clients, charging $700 to $1,400 depending on service tier. The company plans to release an app-only product at approximately $140, dramatically expanding accessibility.[13]

HereAfter AI

El Cerrito-based HereAfter AI pairs user photos with audio interviews to create memory-sharing chatbots accessible via computer, smartphone, or smart speaker. Monthly subscription tiers range from $3.99 (Starter: 20 stories and photos) to $7.99 (Unlimited).[14]

Market Size

China's virtual digital human industry reached a core market size of approximately $6.7 billion USD by 2025, with surrounding markets driven by the technology approaching $90 billion. This represents explosive growth from 2022 figures and validates substantial commercial demand for digital resurrection services.[15]

$6.7B

Virtual Digital Human Market

China's core market for virtual digital humans — including memorial AI services — reached $6.7 billion USD by 2025, driving a surrounding market approaching $90 billion and demonstrating substantial commercial validation.

Source: iMedia Research via NPR, "Chinese companies offer to 'resurrect' deceased loved ones with AI avatars"

Consumer Acceptance with Consent

Research by Masaki Iwasaki found that 58% of U.S. survey respondents support digital resurrection when the deceased had explicitly consented. Acceptance plummets to 3% when consent is absent — demonstrating that demand exists but is contingent on ethical frameworks that respect autonomy and consent.[16]

What This Proves

Time-Shifted Proxies are not theoretical constructs — they are deployed systems generating measurable business outcomes across radically different industries. The pattern proves portable: the same fundamental operator (time direction + AI stand-in + media translation + purpose) applies equally to sales training, medical education, marketing, decision support, and memorial services.

The convergence of technical maturity, commercial accessibility, and proven ROI creates conditions for rapid category expansion. Early movers in 2025 are defining what becomes standard practice by 2027.

"The question isn't whether this works. It's: which conversation across time would unlock the most value in your domain?"

Chapter 5 Citations

[1] Second Nature, "How to Measure the Impact of Sales Training"

[2] Outreach, "Sales 2025 Data Report: Prospecting Analysis"

[3] Council of Deans of Health, "AI in Simulation: Transforming Healthcare Training"

[4] Mayo Clinic Platform, "Finding a Place for AI in Medical Education"

[5] Mayo Clinic Platform (Cornell study), "Finding a Place for AI in Medical Education"

[6] Zebracat, "80+ Video Testimonials Statistics for 2025"

[7] Reelmind AI, "The Best AI Tools for Creating Engaging Video Testimonials"

[8] GM Insights, "Emotion AI Market Size, 2025-2034"

[9] Marketing Scoop, "Top 10 Emotional AI Examples & Use Cases in 2025"

[10] MIT News, "AI simulation gives people a glimpse of their potential future self"

[11] Tech Times, "MIT's AI Chatbot Lets You Talk to Your Future Self"

[12] dot.LA, "StoryFile Aims to Let Users Talk to the Dead, Digitally"

[13] NPR, "Chinese companies offer to 'resurrect' deceased loved ones with AI avatars"

[14] Connecting Directors, "AI and Grieving Series, Part 4: HereAfter.AI"

[15] iMedia Research via NPR, "Chinese companies offer to 'resurrect' deceased loved ones with AI avatars"

[16] UAB Institute for Human Rights, "Griefbots: Blurring the Reality of Death and the Illusion of Life"

The Design Space Matrix

You understand the pattern. You've seen proof it works. The question now: how do you apply this to your domain?

Most innovation frameworks give you principles. This chapter gives you something better: a systematic idea-generation tool. A design space matrix where every combination of axes produces a potential product, feature, or use case.

Not random brainstorming. Deliberate exploration.

The solution: four axes that combine to form hundreds of possible products. Let's walk through each one.

Axis 1: Time Direction

The first decision: where in time does this conversation exist?

Past: "What really happened?"

Use for: Heritage, learning from history, replaying decisions

Examples:

  • Talk to grandparents in the 1970s
  • Interview founder-you from day one of the business
  • Replay the incident with the key players
  • Ask 7-year-old you what you were excited about

Future: "How did this turn out?"

Use for: Decision support, marketing, previewing outcomes

Examples:

  • Future-you loved the holiday and explains why
  • Your team in 12 months describes the transformation
  • Future customer explains why they bought
  • 60-year-old you evaluates this career decision

Parallel: "What would someone like this say?"

Use for: Training at scale, archetypal scenarios, empathy building

Examples:

  • Typical angry enterprise customer
  • Anxious SMB owner considering your product
  • Average regulator asking compliance questions
  • Composite churned customer explaining patterns

Each direction unlocks different value. Past gives you memory and learning. Future gives you decision support and persuasion. Parallel gives you scalable training and pattern recognition.

Axis 2: Role to Simulate

The second decision: who is speaking?

Five Categories of People to Model

Self (past, future, alternate)

Your younger self asking questions. Your future self offering advice. The version of you who chose differently.

Applications: Decision support, journaling, therapy, career planning

Family (deceased, young, historical)

Grandparents, parents, ancestors. Your children at different ages. Historical family figures.

Applications: Genealogy, heritage, grief support, family storytelling

Customer (churned, happy, angry, future)

Lost customers explaining why they left. Happy customers before they bought. Angry customers at peak frustration.

Applications: Product development, sales training, marketing, retention strategy

Colleague (boss, team, report)

Your manager's likely response. How your team will react to news. A direct report expressing concerns.

Applications: Rehearsal, management training, change management, leadership development

Stakeholder (regulator, investor, partner)

Regulatory authority asking hard questions. Investor challenging assumptions. Partner expressing concerns.

Applications: Preparation, compliance training, negotiation practice, risk assessment

Axis 3: Medium Level

The third decision: what form does the interaction take?

This isn't just about production cost. The medium fundamentally shapes the emotional experience and, therefore, the product's effectiveness.

| Medium | Characteristics | Best Use Cases |
| --- | --- | --- |
| Text Chat | Lowest friction, fastest to implement, good for exploration and iteration, lowest emotional impact | Quick exploration, scale priority, control matters |
| Voice Call | Medium friction, higher intimacy, feels more "real" than text, strong emotional resonance | Relationship matters, heritage/memory, training scenarios |
| Video | Highest production requirements, maximum emotional impact, most immersive | Marketing, high-stakes training, memorial services |

"Phone call to grandma hits differently than reading her letter. The medium isn't decoration—it's the product."

Same underlying AI. Same data source. Radically different user experience. Choose deliberately based on the emotional outcome you need.

Matching Medium to Outcome

When to Use Text
  • Rapid iteration in product development
  • High-volume customer insights
  • Personal journaling applications
  • Exploratory conversations with low stakes

When to Use Voice
  • Genealogy and family heritage
  • Sales training with realistic scenarios
  • Difficult conversation rehearsal
  • Grief support and memorial connection

When to Use Video
  • Marketing testimonials requiring maximum persuasion
  • Medical education with visual/emotional cues
  • High-impact memorial services
  • Executive training for board presentations

Axis 4: Purpose

The fourth decision: what outcome are you trying to achieve?

Purpose determines success metrics. A training application succeeds when performance improves. A marketing application succeeds when conversion increases. Choose the wrong purpose and you'll measure the wrong thing.

Training / Rehearsal

Build skill through practice in a safe space for mistakes.

Example: Sales training with AI customers, difficult conversation practice, medical communication scenarios

Success metric: Performance improvement in real scenarios

Marketing / Persuasion

Influence decisions and create emotional connection.

Example: Future-you testimonials, personalised video, composite customer stories

Success metric: Conversion rate, engagement, time on page

Decision Support / De-risking

Better choices through perspective; reduce anxiety about future.

Example: MIT Future You, exit interview with yourself, board scenario planning

Success metric: Decision quality, confidence level, anxiety reduction

Heritage / Connection

Maintain relationships across time; preserve memory and personality.

Example: Genealogy conversations, memorial services, family storytelling

Success metric: Emotional resonance, usage over time, family engagement

Product Development

Understand users better; identify unmet needs.

Example: "Talk to future power user", "Interview churned customer archetype"

Success metric: Insight quality, feature prioritisation accuracy

The Matrix in Action

Now combine the axes. Every combination is a potential product.

Example Product Combinations

Training × Customer × Parallel × Voice

Product: "Angry customer de-escalation practice"

AI plays composite of real angry customer patterns. Staff practice via phone call simulation. Scored on empathy, resolution, retention.

Decision Support × Self × Future × Text

Product: "Career decision advisor"

Chat with 60-year-old you about this choice. Low-friction journaling-style interface. Helps overcome short-term bias.

Heritage × Family × Past × Voice

Product: "Genealogy conversation engine"

Turn letters/documents into phone calls with ancestors. Interactive Q&A about family history. The Carol example from Chapter 1.

Marketing × Customer × Future × Video

Product: "Post-experience testimonial generator"

Show future customer reflecting on purchase. Personalised to prospect's situation. High-conversion landing page element.

Product × Customer × Past × Text

Product: "Churned customer interview simulator"

Chat with composite of lost customers. Understand patterns in why people leave. Inform retention strategy.
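
To get a feel for how quickly the space expands, the four axes can be enumerated programmatically. The axis values below follow this chapter; the 3 × 5 × 3 × 5 = 225 cells are raw combinations before any domain-specific filtering, and the filtering shown is just one illustrative slice.

```python
# Sketch: enumerate the design space. Each tuple is a candidate product cell.
from itertools import product

TIME = ["past", "future", "parallel"]
ROLE = ["self", "family", "customer", "colleague", "stakeholder"]
MEDIUM = ["text", "voice", "video"]
PURPOSE = ["training", "marketing", "decision support", "heritage", "product development"]

cells = list(product(TIME, ROLE, MEDIUM, PURPOSE))
print(len(cells))  # 225 raw combinations

# One illustrative slice: every training-oriented voice application.
voice_training = [c for c in cells if c[2] == "voice" and c[3] == "training"]
for time, role, medium, purpose in voice_training:
    print(f"{purpose} x {role} x {time} x {medium}")
```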

Domain-Specific Idea Generators

Here's the matrix applied to common business domains. Use these as starting points for your own exploration.

Sales & Revenue

  • Future lost deal calling you back: "I bought from your competitor. Want to know why?"
  • Boss simulator for internal champions: Practice pitching to their executive team
  • Churned customer interview: Identify patterns in why people leave
  • Happy customer testimonial generator: For sales collateral and demos

Product & Engineering

  • "Future Power User" frustrated about missing features: Discover unmet needs early
  • Past incident as interactive post-mortem: Interview the outage for learning
  • Onboarding assistant with user archetype: Test flows with realistic personas
  • Beta tester composite: Aggregate feedback into conversational form

HR & Culture

  • Future exit interview with yourself: Before joining, talk to "you after 18 months"
  • "Team in 1 year" simulator: Change management preparation
  • Mentor-in-a-bottle: Capture top performer's stories for on-demand coaching
  • Difficult performance conversation rehearsal: Practice with AI direct reports

Marketing & Growth

  • Future-customer testimonials: Before they buy, show them their own success
  • "Before and after" video comparisons: Composite customer journey
  • Industry-specific avatar testimonials: Scaled personalisation
  • Composite persona interviews: For market research and positioning

Key Takeaways

  • Every cell in the grid is a potential product. Time Direction × Role × Medium × Purpose = hundreds of combinations. Most haven't been built yet.
  • The medium shapes the experience fundamentally. Same AI, same data, radically different emotional impact. Choose deliberately.
  • Start with the constraint. What's blocked by death, time, scale, or risk? What data traces already exist? What medium would make it feel real? That's your product spec.

"Which impossible conversation would unlock the most value in your domain?"

Not "what AI tool should we build?"

But "who do we wish our team or customers could talk to—but can't?"

Answer that question. Walk through the four axes. Write the one-line product spec. You've just systematically generated a Time-Shifted Proxy application.

Your Turn

You now have a systematic method for generating ideas. The matrix isn't abstract theory—it's a tool you can use today.

The next chapter shifts perspective. We've built the framework. Now we need to understand what this pattern is fighting against: the incumbent mental model that says AI is only for productivity, not relationships.

What This Is Fighting Against

The incumbent mental model keeps you from seeing the pattern. Time to name what's in the way.

If you've read this far and the examples still feel disconnected—talking to grandma in 1975, dating an AI, rehearsing conversations with your boss, watching future-you rave about a holiday—you're likely caught in a frame that makes these ideas invisible.

That frame is powerful. It's institutional. And it's wrong about what AI is for.

The Incumbent Mental Model

Here's what most people think AI is for:

  • Productivity. Making tasks faster.
  • Automation. Reducing headcount.
  • Efficiency. Time saved per employee, cost reduction, workflow optimisation.

This isn't wrong. These are valid use cases. But they share the same frame: AI as a tool that makes existing work faster or cheaper.

How does this show up in practice? Listen to the questions: How much time will this save? Which roles can we automate? What does it cut from the cost per workflow?

All sensible. All measurable. All the same underlying assumption: AI replaces human effort on existing tasks.

Why does this frame dominate? Three reasons:

Institutional Inertia

Enterprise AI has been sold on efficiency ROI. Vendors pitch time savings, not emotional connection. The playbook is established.

Measurability

Productivity gains are easy to quantify. Conversations enabled? Emotional resonance? Harder to fit into a spreadsheet.

Perceived Safety

Efficiency feels "safe"—not weird, not gimmicky. Emotional AI applications sound risky. Easier to stick with what's proven.

What Productivity Thinking Misses

When AI equals productivity, you only see automation opportunities. The design space shrinks to three questions:

  • What's slow? Speed it up.
  • What's expensive? Automate it.
  • What's repetitive? Remove humans.

That's a narrow aperture. Here's what doesn't fit: a phone call with your grandmother in 1975, a testimonial from future-you, a rehearsal with an AI version of your boss, an interview with a composite churned customer.

Yet every single one of these has proven ROI. Oracle's AI sales training: opportunities per rep doubled (2.78 → 6.02 per month). Video testimonials boost conversions by 80%. MIT's "Future You" tool reduced anxiety and improved long-term decision-making in 344 participants.

The productivity frame can't see these because it asks the wrong question.

The New Frame: AI for Relationships Across Time

The Time-Shifted Proxy pattern starts from a different assumption:

"AI isn't just for making tasks faster. AI is for relationships across time."

Not automation. Simulation.
Not efficiency. Connection.
Not tasks. Conversations.

Here's the category distinction:

Two Frames, Different Worlds

| Productivity Frame | Relationship Frame |
| --- | --- |
| AI assistant | AI stand-in |
| Chatbot | Rehearsal partner |
| Content generation | Media translation |
| Automation | Simulation |
| Speed | Connection |

Why does this matter? Because different frames generate different questions:

  • Productivity: "How do we automate X?"
  • Relationships: "Who do we wish we could talk to?"

The second question opens doors the first one can't see.

The Core Mechanism: Media Translation

Here's the deeper insight: the constraint was never technical. It was media format.

The archive already exists. It always has:

  • Letters from grandma in the 1970s
  • CRM notes from churned customers
  • Research on what makes holidays memorable
  • Recordings from top salespeople
  • Family photos, genealogy records, old voice notes

The raw material was always there. What AI enables is translation.

Why does this work on the brain? Because conversation is our native interface for memory.

"Humans are lazy mammals with big feelings. We want the meaning, not the microfiche."

We suck at reading archives. We're great at talking to people. Forty pages of letters? That's homework. A six-minute phone call that compresses those letters into a voice that sounds like grandma? Instantly consumable.

AI doesn't create new information. It makes existing information consumable in the form our brains actually want. That's the unlock. Not AI intelligence, but AI accessibility.

What We're NOT Saying

Before we go further, three important caveats:

1. AI Doesn't Replace Real Relationships

Time-Shifted Proxies complement, not substitute. Remember the 72% of teens who have tried AI companions? Eighty per cent of them say they spend more time with real friends, not less. It's additive, not replacement.

2. Ethical Dimensions Exist

Consent matters. In one U.S. survey, 58% supported digital resurrection only with explicit consent from the deceased; acceptance plummeted to 3% without it. Exploitation risks are real—Cambridge researchers warn about grief monetisation. Psychological impacts need consideration. We're not dismissing these; we're scoping them as a separate conversation. This book focuses on the pattern, not the edge cases.

3. Not Everything Should Be a Proxy

Some conversations should be hard. Some archives should stay static. The pattern is a tool, not a mandate. Apply it deliberately, not reflexively.

If This Frame Wins

What changes if Time-Shifted Proxy thinking becomes mainstream?

For Individuals

  • AI becomes a tool for emotional connection, not just productivity
  • Heritage becomes accessible—talk to ancestors, not just research them
  • Decision-making improves through future-self connection
  • Difficult conversations get rehearsed before stakes are real

For Teams

  • Training, onboarding, and change management get a new design pattern
  • "Talk to" becomes a training modality (simulated customers, stakeholders, executives)
  • Knowledge transfer shifts—capture not just information but voice and perspective
  • Empathy scales—experience what it's like to be an angry customer, not just read about it

For the Industry

  • A new product category emerges: "time-shifted proxy" applications
  • Memorial AI moves from fringe to mainstream
  • Simulation becomes standard for high-stakes preparation
  • Marketing shifts from "brand tells you" to "future-you tells you"
"The companies that see this pattern early will define new categories. The ones that don't will keep building marginally better chatbots."

The Recognition Test

How do you know if you've shifted frames? Here's the test:

Old Frame vs New Frame

Old question: "What can we automate?"
New question: "Who do we wish we could talk to?"

Old metric: time saved.
New metric: conversations enabled.

Old product: AI that does tasks.
New product: AI that simulates people.

Signs you're still in the productivity frame:

  • Every AI idea is about "faster" or "cheaper"
  • You dismiss emotional applications as gimmicky
  • You can't see the connection between genealogy and sales training
  • "That's interesting but not practical for business"

Signs you've adopted the relationship frame:

  • You see impossible conversations everywhere
  • You ask "what data traces exist?" about people
  • You think about medium (text/voice/video) as a design choice
  • The examples in this book feel connected, not random

If you've made the shift, the next question isn't "is this real?" It's "where do I apply it first?"

That's Chapter 8.

Applying the Pattern: Your Next Steps

You've seen the pattern. You understand the framework. You know why late 2025 is the inflection point. Now the question is: what conversation across time will you enable first?

This chapter gives you five practical steps to move from concept to implementation, domain-specific applications to inspire your thinking, and a closing that brings us back to where we started.

The Five Steps

1. Audit Your Domain for Impossible Conversations

Start by listing the conversations your team wishes they could have. Who do you wish you could talk to, but can't? What's blocked by death, time, scale, or risk?

Questions to ask:

  • "If I could interview anyone about this problem, who would it be?"
  • "Who do our customers wish they could talk to?"
  • "What conversation would de-risk this decision?"
  • "Who's not available but would have the answer?"

Examples by Domain

Sales: Churned customers who won't return calls

Product: Future power users with emergent needs

HR: The employee who left six months ago

Marketing: Future customers who've already bought

Strategy: Your industry five years from now
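
A simple structured list is enough to run this audit. The sketch below (Python; the record fields and example entries are assumptions, not a prescribed schema) captures who the conversation is with, what blocks it, and what it would be worth, so candidates can be ranked before any AI tooling is touched.

```python
# Illustrative audit record for "impossible conversations" — field names are assumptions.
from dataclasses import dataclass

@dataclass
class ImpossibleConversation:
    who: str        # the person or persona you cannot currently talk to
    blocker: str    # one of: "death", "time", "scale", "risk"
    question: str   # the thing you would actually ask them
    value: str      # the decision or outcome the answer would unlock

audit = [
    ImpossibleConversation("Churned enterprise customer", "risk",
                           "Why did you really leave?", "Retention playbook"),
    ImpossibleConversation("Our team, one year after the reorg", "time",
                           "How does the change look in hindsight?", "De-risked change plan"),
]

# Group by blocker to see which conversations genuinely need a proxy
# and which could be solved with ordinary outreach.
for c in sorted(audit, key=lambda c: c.blocker):
    print(f"[{c.blocker}] {c.who}: {c.question}")
```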

2. Identify Existing Data Traces

Inventory what you already have about the person or persona. Look for raw material hiding in plain sight. Don't assume you need "enough" data — minimal input is often sufficient.

For customers: CRM notes, support tickets, NPS comments, sales call transcripts, reviews

For colleagues: Emails, meeting notes, Slack history, performance reviews

For family: Letters, photos, stories, documents, voice recordings

For personas: Industry research, published archetypes, survey data

"The raw material is usually hiding in plain sight. You've been collecting data on these people for years — you just haven't animated it."

What You Actually Need

  • Single photo + 10-second audio = AI avatar (DeepBrain)
  • One letter = phone conversation with grandparents
  • CRM notes = realistic angry customer for training

The barrier isn't data collection — it's recognising what you have.
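
One low-effort way to run the inventory is to let a script surface what already sits on disk. The sketch below is a rough pass only — the folder name and the file-type groupings are assumptions to adapt — but it shows how quickly you can learn whether a subject already has a letter, a voice note, or a photo to work from.

```python
# Rough inventory of existing data traces in a folder — groupings are illustrative.
from pathlib import Path
from collections import defaultdict

TRACE_TYPES = {
    "text":  {".txt", ".md", ".docx", ".pdf"},    # letters, CRM exports, meeting notes
    "audio": {".mp3", ".wav", ".m4a"},            # voice notes, call recordings
    "image": {".jpg", ".jpeg", ".png", ".tiff"},  # photos, scans
}

def inventory(folder: str) -> dict[str, list[Path]]:
    """Group every file under `folder` by the kind of trace it probably is."""
    traces: dict[str, list[Path]] = defaultdict(list)
    for path in Path(folder).rglob("*"):
        for kind, extensions in TRACE_TYPES.items():
            if path.suffix.lower() in extensions:
                traces[kind].append(path)
    return traces

found = inventory("family_archive")  # hypothetical folder of scans and recordings
for kind, files in found.items():
    print(f"{kind}: {len(files)} file(s)")  # one letter or one voice note may be enough
```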

3. Choose the Medium That Matches the Use Case

The medium isn't a technical choice — it's a design choice. A phone call to grandma hits differently than reading her letter. Here's how to choose:

Text Chat

Fastest to build, lowest emotional impact

Good for: Exploration, iteration, scale, low-stakes applications

Example: Quick chats with "future you" for journaling

Voice Call

Higher production, higher intimacy

Good for: Training, heritage, relationship-focused applications

Example: Rehearsing difficult conversation with simulated boss

Video

Maximum production, maximum impact

Good for: Marketing, high-stakes training, memorial

Example: Future-customer testimonials for landing pages

Use case → recommended medium (and why):

  • Sales training: voice or video (emotional realism matters)
  • Quick exploration: text (speed and iteration)
  • Marketing testimonial: video (maximum persuasion)
  • Heritage/genealogy: voice (intimacy without video production)
  • Difficult conversation prep: voice (needs to feel real)
  • Decision journaling: text (low friction, daily use)

Choose based on desired emotional outcome, not just what's easiest to build.
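
That mapping is small enough to encode directly, so prototypes default to the medium the use case needs rather than the one that's easiest to ship. The sketch below simply mirrors the recommendations above; the function and dictionary names are assumptions.

```python
# Medium recommendations keyed by use case — mirrors the list above; extend as needed.
MEDIUM_BY_USE_CASE = {
    "sales training":              ("voice/video", "emotional realism matters"),
    "quick exploration":           ("text",        "speed and iteration"),
    "marketing testimonial":       ("video",       "maximum persuasion"),
    "heritage/genealogy":          ("voice",       "intimacy without video production"),
    "difficult conversation prep": ("voice",       "needs to feel real"),
    "decision journaling":         ("text",        "low friction, daily use"),
}

def recommend_medium(use_case: str) -> str:
    """Return the recommended medium for a use case, defaulting to text for exploration."""
    medium, why = MEDIUM_BY_USE_CASE.get(
        use_case.lower(), ("text", "safe default for exploration")
    )
    return f"{use_case}: use {medium} ({why})"

print(recommend_medium("Heritage/genealogy"))
```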

4. Prototype Small

Don't build the full product first. Create one conversation, one scenario, one instance. Test whether the experience lands before scaling.

Minimum viable experiments:

  • Sales training: One AI customer persona, tested with three reps
  • Marketing: One future-customer testimonial video for one product
  • Heritage: One phone call with one family member's voice
  • Training: One difficult conversation scenario

Why this matters:

  • Time-Shifted Proxies are experiential — you can't evaluate them from a spec
  • The emotional resonance is the product
  • You have to feel it to know if it works
  • Small prototype = fast learning with low investment
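
Here's roughly how small that first prototype can be: one simulated person, text only, a few dozen lines. The sketch assumes the OpenAI Python SDK purely as an example backend (any chat-completion API would do), and the persona text, model name, and prompt wording are all placeholders to replace with your own.

```python
# Minimal prototype: one text conversation with one simulated person.
# Backend choice (OpenAI SDK, gpt-4o) is an example, not a requirement.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

persona_brief = (
    "You are a customer who churned from our product six months ago. "
    "You left because onboarding took too long and support was slow. "
    "Stay in character and answer the way a real ex-customer would."
)

messages = [{"role": "system", "content": persona_brief}]

print("Type 'quit' to end the call.")
while True:
    user_turn = input("You: ")
    if user_turn.strip().lower() == "quit":
        break
    messages.append({"role": "user", "content": user_turn})
    reply = client.chat.completions.create(model="gpt-4o", messages=messages)
    answer = reply.choices[0].message.content
    messages.append({"role": "assistant", "content": answer})
    print("Them:", answer)
```

Put that in front of three reps, or one family member, and you'll know within a single session whether the experience lands.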

5. Measure Emotional Resonance, Not Just Completion

Two Approaches to Metrics

❌ The Wrong Metrics

  • • "How many people completed the training?"
  • • "How long did they spend?"
  • • "What was the NPS?"

Task completion doesn't measure emotional connection

✓ The Right Metrics

  • • "Did they feel like they really talked to that person?"
  • • "Did their behaviour change afterward?"
  • • "Do they want to do it again?"
  • • "What did they say unprompted about the experience?"

Emotional truth and behaviour change are the real KPIs

Time-Shifted Proxies succeed on emotional truth, not task completion. The KPI isn't "did they finish?" — it's "did they connect?"

A six-minute call that changes someone's relationship to their genealogy is success. A sixty-minute training that feels like checkbox completion is failure.
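
In practice, the "right metrics" can be captured with a short post-session pulse rather than a completion report. The sketch below is one illustrative scoring scheme — the questions, the 1–5 scale, and the threshold are all assumptions — but the point stands: it averages felt connection and intent to return, not minutes spent.

```python
# Illustrative post-session resonance pulse — questions and threshold are assumptions.
RESONANCE_QUESTIONS = [
    "I felt like I was really talking to that person.",
    "Something about how I think or plan changed afterwards.",
    "I would do this again.",
]

def resonance_score(ratings: list[int]) -> float:
    """Average of 1-5 agreement ratings, one per question above."""
    assert len(ratings) == len(RESONANCE_QUESTIONS)
    return sum(ratings) / len(ratings)

session = {"completed": True, "minutes": 6, "ratings": [5, 4, 5]}
score = resonance_score(session["ratings"])

# A six-minute call that scores high on resonance beats an hour of checkbox completion.
print(f"Resonance {score:.1f}/5 — {'connected' if score >= 4 else 'did not land'}")
```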

Domain-Specific Applications

The pattern generalises across domains. Here are specific applications to inspire your thinking.

📦 For Product Teams

"Talk to Future Power User"

Build a persona of someone who's used your product for a year. They're frustrated about specific missing workflows. They articulate emergent use-cases. Product managers "interview" them to surface unmet needs.

"Past Incident as Interactive Post-Mortem"

Instead of dry root cause documents, new engineers "interview" the incident. AI answers from tickets, commits, Slack, and incident reports. Buried documentation becomes interactive conversation.

👥 For HR & Leadership Teams

"Future Exit Interview with Yourself"

Before joining a company, talk to "Future-You after 18 months there." Model built on actual exit interviews, Glassdoor reviews, culture surveys. De-risk the career decision before making it.

"Team in 1 Year" Simulator

Leaders going through change programs talk to AI representing their future team. Tired or energised? How do they describe the change in hindsight? Rehearse change management before implementation.

"Mentor-in-a-Bottle"

Capture a top performer's habits, heuristics, and war stories. New hires "ring" the virtual mentor on demand. Doesn't replace real mentorship — supplements it.

💼 For Sales & Revenue Teams

"Future Lost Deal Calling Back"

AI simulates churned customer explaining why they left. Powered by patterns from actual lost-deal notes. Practice retention conversations with realistic resistance.

"Boss Simulator for Internal Champions"

Your buyer practices pitching your product to AI twin of their boss. Tuned on typical objections from that role.

"Happy Customer Before They Were Happy"

Future customer testimonial before purchase decision. "I've been using this for 6 months. Here's what changed." Driven by outcomes from similar customers.

📢 For Marketing Teams

"Composite Testimonial Avatar"

Aggregate real reviews, sales call notes, support tickets. Distil into one coherent "hero arc." Embody as AI video of customer retelling story. Technically synthetic, emotionally true.
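
A composite avatar starts as a data exercise: pull fragments from real sources, tag them along a simple before / turning point / after arc, and hand the ordered outline to whatever script or video tool you use. The sketch below shows only that aggregation step; the source snippets and arc labels are illustrative.

```python
# Illustrative aggregation of real fragments into one composite "hero arc" outline.
from collections import defaultdict

fragments = [
    ("before",        "review",         "We were drowning in manual reporting."),
    ("turning_point", "sales call",     "The pilot paid for itself in the first month."),
    ("after",         "support ticket", "Closing the books now takes two days, not two weeks."),
]

ARC_ORDER = ["before", "turning_point", "after"]

outline = defaultdict(list)
for stage, source, quote in fragments:
    outline[stage].append(f"{quote} (from a {source})")

for stage in ARC_ORDER:
    print(stage.upper())
    for line in outline[stage]:
        print("  -", line)
# The composite is synthetic as a person, but every line traces back to a real source.
```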

"Industry-Specific Personas"

Same product, different avatar for each vertical. "Fake clinic owner" / "Fake tradie" / "Fake SaaS founder." Personalised testimonials at scale.

🌱 For Personal Applications

"Career Decision Advisor"

Chat with 60-year-old you about this choice. Helps overcome short-term bias and strengthens future self-continuity.

"Difficult Conversation Rehearsal"

Practice the raise conversation before having it. Iterate approaches in a safe environment. Build muscle memory for staying calm.

"Family Heritage Conversations"

Turn genealogy research into phone calls. Talk to ancestors, not just read about them. Make family history accessible to non-genealogists.

Back to Carol

Remember where we started? A letter from the 1970s, found in a box by a genealogy enthusiast. Ordinary family correspondence. Nothing special.

We turned it into two things:

  • A radio play (carol_call.mp3) — Carol's voice calling her father
  • An interactive phone call — pick up the phone and talk back

Same source material. Two completely different experiences.

What We Learned

The archives always existed. The information was always there. What was missing was the medium.

AI doesn't create new data — it translates existing data into forms humans actually engage with.

"Not resurrection. Not simulation. Translation."

Turning static archives into living interactions.

Turning time-locked people into ongoing relationships.

Turning the conversations we can't have into conversations we can.

What This Book Has Given You

Pattern: Time-Shifted Proxy = impossible conversation + AI stand-in + media translation

Framework: Five-step operator to apply it

Design space: Time × Role × Medium × Purpose matrix

Timing: Why late 2025 is the inflection point

Evidence: Proof it works across domains

Frame shift: From AI-as-productivity to AI-as-relationships

The Question That Remains

"What conversation across time will you enable first?"

Not "what AI tool should we build?"

Not "how do we automate X?"

But: "Who do we wish we could talk to — but can't?"

That's where the Time-Shifted Proxy pattern starts. The technology is ready. The applications are emerging. The only question is where you point it.

References & Sources

This ebook synthesises research from industry analysts, academic institutions, technology publications, and practitioner experience. Sources are organised by category for easy reference. All statistics and quoted material are attributed to their original sources.

Video Generation & AI Technology

Clippie.ai — "Recap: The Best AI Video Creation Trends from 2025"

Source for the "August breakthrough moment" and 73% photorealism threshold statistic.

https://clippie.ai/blog/ai-video-creation-trends-2025-2026

Variety — "Video Generation Model Evaluation in 2025"

Paul Trillo quote on visual Turing test and Veo 2 capabilities.

https://variety.com/vip/video-generation-model-evaluation-in-2025-veo-2-sora-pika-ray2-1236276435/

Lovart.ai — "Best AI Video Generators in 2025"

Sora 2, Runway Gen-4, and platform comparison data.

https://www.lovart.ai/blog/video-generators-review

AI Blog.digital — "How AI Generates Video and Images in 2025"

Runway CEO quote on AI video capabilities.

https://blog-ai.digital/artificial-intelligence/how-ai-generates-video-and-images

AI Companions & Parasocial Relationships

TechCrunch — "AI companion apps on track to pull in $120M in 2025"

220 million downloads, 88% YoY growth, market revenue projections.

https://techcrunch.com/2025/08/12/ai-companion-apps-on-track-to-pull-in-120m-in-2025/

TechCrunch — "72% of US teens have used AI companions"

Teen adoption rates, usage patterns, and satisfaction data.

https://techcrunch.com/2025/07/21/72-of-u-s-teens-have-used-ai-companions-study-finds/

ElectroIQ — "AI Companions Statistics"

Character.AI 92-minute daily usage, 20M monthly active users.

https://electroiq.com/stats/ai-companions-statistics/

Nikola Roza — "Replika AI Statistics Guide for 2025"

30M+ users, 85%+ emotional connection rate, usage patterns.

https://nikolaroza.com/replika-ai-statistics-facts-trends/

MAPP Psychology — "Parasocial relationships in 2025"

Cambridge Dictionary Word of the Year, psychology of AI relationships.

https://www.mapp-psychology.com/journal/parasocial-relationships-in-2025

Digital Resurrection & Memorial AI

The National — "How AI is being used to create messages from digital twins of the dead"

Soul Link, Sensay (90-95% indistinguishability), Twin Protocol services.

https://www.thenationalnews.com/news/uae/2025/10/13/how-ai-is-being-used-to-create-messages-from-digital-twins-of-the-dead/

NPR — "Chinese companies offer to 'resurrect' deceased loved ones with AI avatars"

Super Brain pricing ($700-$1,400), $6.7B market size, 1,000+ replicas.

https://www.npr.org/2024/07/18/nx-s1-5040583/china-ai-artificial-intelligence-dead-avatars

dot.LA — "StoryFile Aims to Let Users Talk to the Dead, Digitally"

5,000+ profiles, Ed Asner memorial example, Marina Smith funeral.

https://dot.la/storyfile-ai-talk-to-dead-people-2657797613.html

Connecting Directors — "AI and Grieving Series: HereAfter.AI"

HereAfter AI service tiers and memory-sharing chatbot features.

https://connectingdirectors.com/69008-ai-and-grieving-series-part-4

Ground News — "DeepBrain AI Re;memory 2"

Single photo + 10-second audio clip avatar creation.

https://ground.news/article/can-you-really-talk-to-the-dead-using-ai-we-tried-out-deathbots-so-you-dont-have-to

Sales Training & Business Applications

Kendo AI — "AI Sales Training ROI Analysis"

340% better conversion improvement vs traditional methods.

https://kendo.ai/blogs/ai-sales-training-roi-analysis

Second Nature — "How to Measure the Impact of Sales Training"

Oracle case study: opportunities per rep 2.78 → 6.02/month.

https://secondnature.ai/how-to-measure-the-impact-of-sales-training/

Outreach — "Sales 2025 Data Report"

100% of AI SDR users report time savings, 40% save 4-7 hours/week.

https://www.outreach.io/resources/blog/sales-2025-data-analysis

SmartWinnr — "AI Roleplays for Sales: Ultimate 2025 Guide"

AI roleplay mechanics and implementation guidance.

https://smartwinnr.com/blogs/insights-ai-roleplays-for-sales-ultimate-2025-guide/

Medical Education & Healthcare

Mayo Clinic Platform — "Finding a Place for AI in Medical Education"

80% of trainees want to continue using AI simulation.

https://www.mayoclinicplatform.org/2025/06/23/finding-a-place-for-ai-in-medical-education/

Council of Deans of Health — "AI in Simulation: Transforming Healthcare Training"

Oxford Medical Simulation voice-controlled virtual patients.

https://www.councilofdeans.org.uk/resource/sponsored-blog-ai-in-simulation-transforming-healthcare-training-innovation-month-2025-x-oms/

ArXiv — "Synthetic Patients: Simulating Difficult Conversations"

Multimodal AI avatars for medical training research.

https://arxiv.org/html/2405.19941v1

Marketing & Video Testimonials

Zebracat — "80+ Video Testimonials Statistics for 2025"

80% conversion boost, 95% video retention vs 10% text.

https://www.zebracat.ai/post/video-testimonials-statistics

Reelmind — "Best AI Tools for Creating Video Testimonials"

70% production cost reduction with AI tools.

https://reelmind.ai/blog/the-best-ai-tools-for-creating-engaging-video-testimonials

GM Insights — "Emotion AI Market Size, 2025-2034"

$2.9B market in 2024, 21.7% CAGR projection.

https://www.gminsights.com/industry-analysis/emotion-ai-market

Decision Support & Future Self Research

MIT News — "AI simulation gives people a glimpse of their potential future self"

"Future You" tool, 344 participants, decreased anxiety findings.

https://news.mit.edu/2024/ai-simulation-gives-people-glimpse-potential-future-self-1001

Tech Times — "MIT's AI Chatbot Lets You Talk to Your Future Self"

Psychological benefits and future self-continuity research.

https://www.techtimes.com/articles/305961/20240623/mit-ai-chatbot-lets-talk-future-self-help-reduce-anxiety.htm

TRIZ & Innovation Methodology

Lean Outside the Box — "40 Inventive Principles of TRIZ"

TRIZ methodology overview, Preliminary Action, Intermediary, Copying principles.

https://leanoutsidethebox.com/40-inventive-principles-of-triz/

ipface.org — "TRIZ 40 Design Principles"

Intermediary principle and application examples.

https://www.ipface.org/pdfs/reading/TRIZ_Principles.pdf

Difficult Conversation Training

AI Therapy — "How AI Helps You Practice Hard Conversations"

Psychology of conversation rehearsal and skill building.

https://blog.aitherapy.care/hard-conversations/

Tough Tongue AI

Video/audio feedback for difficult conversation practice.

https://www.toughtongueai.com/

Skillsoft — "CAISY Conversation AI Simulator"

Enterprise conversation simulation for leadership skills.

https://www.skillsoft.com/blog/how-practice-makes-perfect-with-skillsofts-caisy-conversation-ai-simulator

Ethics & Digital Afterlife

UAB Institute for Human Rights — "Griefbots: Blurring the Reality of Death"

58% consent requirement statistic, ethical considerations.

https://sites.uab.edu/humanrights/2025/02/07/griefbots-blurring-the-reality-of-death-and-the-illusion-of-life/

Philosophy & Technology — "Griefbots, Deadbots, Postmortem Avatars"

Responsible development recommendations for memorial AI.

https://link.springer.com/article/10.1007/s13347-024-00744-w

Note on Research Methodology

This ebook was compiled in December 2025 using primary research from industry publications, academic institutions, and technology analysts. Statistics and quotes have been verified against original sources where possible.

The "Time-Shifted Proxy" framework and associated concepts represent practitioner synthesis developed through direct experimentation with the technologies described, including the Carol phone call prototype and sales training implementations referenced throughout.

Some links may require subscription access. URLs were verified as of publication date but may change over time.