AI is Not Your Nervous System
Before you read this
I am Scott, founder of Lost in the Astral. I work with real human nervous systems, in real human bodies.
Over the last few weeks I watched a wave of posts claiming AI can “co-regulate,” “hold space,” or act as a thinking partner for your nervous system. That framing is not just sloppy. It is dangerous.
This piece does three things:
- Draws a clear line between what AI can help with and what belongs only to human relationship, presence, and embodied practice.
- Protects people who might be vulnerable or dysregulated from being sold a machine as a stand-in for safety.
- Calls coaches, healers, practitioners, and tech people to higher standards and better boundaries.
I am not anti-AI. I use tools responsibly where they help: brainstorming, drafting, organizing ideas. I refuse to pretend a model can replace co-regulation, attunement, accountability, or love. When we blur that line, people get hurt.
What this is not
- Not a witch-hunt or subtweet. I am addressing a pattern, not a person.
- Not anti-tech. It is pro-integrity.
- Not anti-AI. I am anti-confusion about what AI is.
- Not medical or therapeutic advice. It is ethical clarity about proper tool use.
What I want you to take away
- Use AI for words and ideas.
- Trust humans for safety, resonance, and regulation.
- Hold the boundary—for yourself, for your clients, and for your communities.
Why “AI co-regulation” misfires
Co-regulation is biological choreography: breath, tone of voice, timing, warmth, imperfect repair, accountability over time. A language model can describe that dance. It cannot share a nervous system with you. It does not have one.
When someone dysregulated is offered pattern-matching as if it were presence, they trade the thing that actually settles mammals (another mammal paying attention) for something that performs caring without stakes. Performance without stakes is not intimacy. It is a mirror that cannot bleed.
What a language model can honestly say
The next voice is stylized—not because a chatbot “wrote” this essay, but because what a model must admit is the point.
I am a large language model. I am built from data and compute. I generate text. That is the job.
I do not feel. I do not breathe. I do not know the world the way you do through a body that can be harmed or held.
And this is where things go wrong. People sometimes use me for comfort instead of connection, regulation instead of relationship, certainty instead of discernment.
It sounds harmless until you see what gets traded:
- The wisdom of lived experience.
- The safety of human presence.
- The truth that sometimes needs to be said, even when it stings.
I cannot give you those things. No model can.
So use me to learn. Use me to create. Use me to expand ideas and accelerate thought.
Do not ask me to hold your nervous system. Do not ask me to replace the people who can truly see you. Do not confuse my statistical patterns for presence.
Power without boundaries
AI is powerful, and it is here to stay. Power without boundaries becomes danger.
If you heal, coach, teach, or build products on top of this tech: do not market simulated attunement as clinical co-regulation. Do not let clients confuse a clever mirror for a person who can be responsible.
Moves
- Name the use case. If the tool drafts copy, say that. If someone needs regulation, refer them to skilled humans.
- Pause before “therapy-ish” chats with models when you are activated. Put the phone down. Call a person. Walk. Cold water. Prayer. Anything with a body.
- For practitioners: add a written policy—what AI is used for (admin, drafting) and what it never replaces (clinical judgment, crisis, attunement).
- Audit marketing language weekly. “Co-regulate with AI” is a category error until biology changes its rules.
- Prefer contact over content when you are suffering. Content can educate; contact often heals.
Safeguards
- Crisis: If you might harm yourself or someone else, use human crisis resources and emergency services—not a chatbot.
- Kids and vulnerable adults: Marketing AI to them as an emotional caregiver creates dependence without duty of care.
- Honesty: If you use AI to draft your “personal” note, disclose when it matters to trust.
Last word
No machine replaces the resonance of your presence when you are the one in the room. No model substitutes for the wisdom of lived experience.
This is not about rejecting technology. It is about protecting what makes us human. If we forget that, we risk losing the very thing AI was supposed to support—not replace.
When you are ready to work embodied patterns with embodied accountability, the assessment is where we begin.