When her 4-year-old son, Alex, started showing strange symptoms during the COVID-19 lockdown, Courtney never imagined it would take three years, countless specialists, and an unexpected turn with artificial intelligence to finally get answers.
What began as some odd behavior after a day in a bounce house spiraled into a frustrating medical mystery. While many parents might turn to Google for help, Courtney turned to something a little more advanced, ChatGPT, and surprisingly, it made all the difference.
A Mother’s Instinct vs. a Medical Maze
It started with pain. Alex, who was usually a cheerful, active little boy, suddenly began having what his caregiver described as extreme meltdowns unless given Motrin (a pain reliever). With the medicine, he was fine. Without it, his behavior changed drastically.
Then came the chewing—Alex began gnawing on random things. Courtney thought maybe his teeth were coming in, or he had a cavity. But after a visit to the dentist, no real dental issues were found. Instead, the dentist suspected teeth grinding and suggested they visit an orthodontist who specialized in breathing-related issues. It turned out Alex’s palate was too narrow, which might’ve made it harder for him to breathe during sleep. An expander was placed in his mouth, and for a brief moment, things seemed to improve.
“We thought we were in the home stretch,” Courtney said.
But then Alex stopped growing.
A Puzzle With Too Many Missing Pieces
Concerned, Courtney took Alex to the pediatrician, who brushed it off as a possible result of the pandemic. Still, she persisted. Later visits revealed differences in how Alex used the left and right sides of his body. He would lead with his right leg and just drag the left along.
Then came the headaches—severe ones—and intense fatigue. One neurologist said they were migraines. An ENT (ear, nose, and throat specialist) thought it could be related to his sinuses. Yet every expert seemed focused only on their piece of the puzzle, never looking at the bigger picture.
“No one was willing to solve the whole mystery,” Courtney recalled.
A physical therapist suggested something called Chiari malformation, a brain condition involving the base of the skull and spine. That led to even more doctors—17 in total—including pediatricians, internists, and musculoskeletal experts. Still, no clear diagnosis.
When AI Became the Unexpected Hero
Out of sheer exhaustion and desperation, Courtney turned to ChatGPT. She created an account and, line by line, began entering everything she knew about Alex’s medical history and symptoms. She even included small but telling details, like how he couldn’t sit “crisscross applesauce”—a clue she felt might be more meaningful than doctors realized.
That’s when ChatGPT suggested tethered cord syndrome.
Curious, Courtney dug deeper. She joined an online group for parents of kids with the condition and noticed the stories there sounded eerily similar to her son’s. Encouraged, she took Alex to a new neurosurgeon and told the doctor what she suspected. After reviewing the MRIs, the doctor confirmed it: Alex had spina bifida occulta and his spinal cord was tethered.
What Is Tethered Cord Syndrome?
Tethered cord syndrome occurs when the spinal cord is abnormally attached to the surrounding tissue, which limits its ability to move freely. As a child grows, this tension can stretch the spinal cord and cause problems.
Dr. Holly Gilmer, a pediatric neurosurgeon who treated Alex, explained it this way: “The cord gets stuck. It might be tethered by a tumor, bone, or just fatty tissue. As the child grows, it stretches and pulls the cord.”
In many cases, tethered cord syndrome is linked to spina bifida, a birth defect where the spinal cord doesn’t fully close during development. But in Alex’s case, it was the hidden type—spina bifida occulta—which doesn’t leave a visible opening in the back. Instead, there was just a faint mark at the base of his spine that had gone unnoticed.
Signs of tethered cord syndrome can include leg weakness, bladder issues, scoliosis, constipation, abnormal walking, or delays in sitting and walking. But in younger children, who can't explain what's wrong, these signs often go unrecognized.
“This is just how they’ve always been, so no one realizes it’s not normal,” said Dr. Gilmer.
Surgery and a Path Forward
Once diagnosed, Alex underwent surgery to release his tethered spinal cord. The goal? To stop the damage from getting worse.
“We essentially detach the cord from where it’s stuck,” Gilmer explained. “That releases the tension.”
Today, Alex is still healing, but he’s already showing signs of improvement. Though some sports like hockey are too painful for him, he’s adapted in creative ways—often acting as a coach or cheerleader from the sidelines.
“He’s so smart and adaptable,” Courtney said proudly. “He finds a way to stay in the game.”
Can ChatGPT Really Diagnose Illness?
Technically, ChatGPT isn’t a doctor. It’s a type of AI designed to predict text based on patterns it’s learned from massive amounts of online data. It doesn’t “think” or “know” things the way a human does, but it’s remarkably good at connecting dots.
Dr. Andrew Beam, an assistant professor at Harvard who studies AI in medicine, likens ChatGPT to a “supercharged medical search engine.” It doesn’t know your diagnosis—it just predicts what diagnosis might fit the input based on all the information it has seen across the internet.
For patients on long, confusing diagnostic journeys, tools like ChatGPT can sometimes be a surprisingly helpful sidekick. “It may not have the same blind spots as human doctors,” Beam noted.
Still, there’s a catch.
ChatGPT can also make things up—a phenomenon known as “hallucination.” It might invent studies or suggest medical advice that sounds convincing but isn’t backed by real data. That’s why experts stress that while AI can be a useful aid, it should never replace professional medical advice.
The American Medical Association (AMA) has cautioned against relying too heavily on AI in healthcare until it’s better regulated and proven safe. “AI-generated errors can harm patients,” said AMA president Dr. Jesse Ehrenfeld. “We need clinical evidence before fully integrating AI into medical care.”
The Bigger Picture: Trust Your Gut and Be Persistent
Courtney shared her family's story not just to highlight the unexpected role AI played in her son's diagnosis, but to encourage other parents: Don't give up.
“There’s nobody that connects the dots for you,” she said. “You have to be your child’s advocate.”
And in this case, a mother’s determination—and a bit of help from artificial intelligence—finally brought her son the answers he needed.