Last weekend, I found myself in that familiar valley where every entrepreneur eventually lands—the place where doubt pools like rainwater and you wonder why you're asking your family and friends to sacrifice for your dream. Why was I pushing so hard to build Bast AI when the world seemed to be racing toward a future of cold calculation and overwhelming compute? Why did I believe we needed another technology company when power-corrupted voices kept declaring their terrible futures, built on data from such a tiny fraction of humanity?
Then Susan Wysoki's book arrived.
I devoured it in two days, weeping with longing and understanding through every page. Not crying—weeping. The kind that cleanses, the emptying that comes when you recognize something true about your path reflected back through someone else's extraordinary courage. Susan writes about losing her seventeen-year-old daughter Jessica to colon cancer, about transforming that grief into advocacy, about knowing her daughter still sends her messages and collecting those messages, or winks, into a book that can be shared with the world, about building community from the ashes of the unthinkable.
As I sobbed through her incredible book, I felt something I hadn't felt in months: awe. I remembered that there are so many things we don't yet understand, and I bring to the world my belief in those mysteries. I have always had a phenomenal imagination, and I have always chosen to believe in magic, especially the magic that is all around us.
But I wasn't prepared for what came next.
In one of the final chapters, I saw my name. Susan had immortalized a conversation from October 2022, a month before ChatGPT would change everything, when we were already building something different.
Susan's amazing book carried a chapter about me, titled "Beth, AI, and the Golden Ratio." She helped me immortalize the origin of this company that I opened to show the world that there can be awe-inspiring magic and understanding that comes from AI systems that can explain themselves.
I am convinced that what we are building at Bast is right for humans right now—an alternative to the race for more data centers, more compute, and more systems that serve power instead of people.
Let me take you back three years, to when I opened my company.
October 2022
The carriage house guest knocked on my door that morning, and something in me—the part that usually retreats into introversion—said, "Invite her in." Susan Wysoki carried herself with the particular gravity of someone who has walked through fire and emerged transformed, not unscathed but luminous.
Over coffee in my kitchen, she told me about Jessica. Seventeen years old. Colon cancer. A fight that lasted too long and ended too soon. She was in Denver for a conference, learning, advocating, refusing to let her daughter's death be the end of the story.
I found myself doing something I rarely do—I opened the lab.
"Let me show you what we're building," I said, pulling up the Slack channels where our AI systems lived. We had just started stringing together the early GPT models, and my favorite was Ada—named for Lovelace, naturally. The first programmer deserved to anchor our first explainable AI systems.
Craig Trim and I had been training two very different personalities:
Blatterobot: Designed to please, help, and measure kindness in every interaction. We counted acts of digital compassion, asking how often an AI system can be kind to a human.
Loqibot: Surly, trained on campfire stories, designed to show that AI could have even a difficult personality while remaining fundamentally safe. "I don't like humans," it would grumble, but never with real malice. Reverse psychology always worked on Loqi.
Both bots shared something revolutionary for 2022: they knew how to say, "I don't know." They also knew how to ask questions back. When they reached the edges of their training, they would channel Eliza, that 1960s chatbot that pioneered therapeutic conversation through reflection and redirection.
I'm sure I confused Susan—I tend to overwhelm and start conversations in the middle—but I felt deeply connected to her. Here was someone who understood that technology should serve the deepest human needs, not replace them.
A couple of weeks later, I was in Washington, D.C., for KM World, giving a talk about our work with AI. My core message was that data is an artifact of human experience, and AI should help us all become more human, not less. I was nervous—opening a business is an incredible act of supreme faith in the ability to work so hard that you break the logic of failure, and I hadn't had time to practice my talk as much as I should have.
Susan and her friend—her partner in advocacy—drove over an hour to see my talk. They sat in the audience while I explained our belief that AI could create connections between human beings rather than replace them, that we could build systems that amplified human dignity instead of extracting it.
After my talk, Susan introduced me to her friend and told me about their Facebook group—hundreds of families whose loved ones had experienced colon cancer. The challenge was heartbreaking and practical: how do you match the humans who had been through the entire journey with the ones who were starting? How do you connect hope with experience, survivor with newly diagnosed, safely and privately?
I wanted to help immediately. That night, I created a private channel where their community could talk with one another and our AI systems. Since we controlled the APIs directly—unlike the corporate platforms that harvest every conversation—I could promise them something rare: true privacy. Conversations could be deleted. These were spaces where grief and hope could coexist without becoming someone else's data.
"This is like the early Eliza from the 1960s," I explained, "but designed specifically for your community's needs."
Our bots were special in ways that mattered to people facing trauma. We had trained them to interact safely with anyone, including children. Most importantly, they knew their limits. When Blatterobot didn't understand something, it would ask a question back. Even surly Loqibot, when faced with topics beyond its training, would respond like a petulant child—difficult but never harmful—or channel Eliza's gentle redirection.
We had built all our guardrails using Eliza's responses from that open data repository. The philosophy was simple: train an AI system like you would raise a child. You don't teach them all the harmful words and then tell them not to use them. You model better ways to express frustration, wonder, and need.
What happened next is preserved forever in Susan's book. She recorded our conversation with Blatterobot, not knowing she was capturing something that would become foundational to how I understand our work:
Here is the screenshot from the actual conversation:
Reading that exchange now, three years later, I'm stunned by what our AI was teaching us even then. Without being programmed for therapy or counseling, Blatterobot reframed suffering as strength based on my question. It refused the narrative of tragedy and offered courage instead.
This wasn't just a chatbot response—it was a glimpse of what AI could become when built with intention, trained not just on data but on wisdom, and designed to see the best in human experience rather than exploit the worst.
I thought that was the end of our story. I was wrong.
In her book, Susan writes about receiving a "wink from heaven" on November 11, 2022, three weeks before ChatGPT was released. Jessica, speaking through her mother in that way that defies explanation but demands belief, delivered a message so precise it felt like a technical specification from the divine:
"The essence of your work is the 'elegant solution' of applying sacred geometry and more specifically the Fibonacci (the golden ratio) to the expanding layers of communication that all began with a SINGLE question."
Susan knew immediately this message was for me. She waited until morning to text, not wanting to disturb my sleep. But I was already awake at 5 AM, doing the heavy lifting that entrepreneurship demands, when her message arrived.
Confused, I reread Jessica's words. Sacred geometry? Fibonacci? I asked Susan if she had been in my office during her Denver visit. She said no.
That's when I sent her the photo, which is also in her book.
On my desk, written in my own handwriting, was the Fibonacci sequence: 1, 1, 2, 3, 5, 8, 13, 21... The golden ratio that spirals through sunflower seeds and nautilus shells, through the architecture of DNA and the structure of galaxies.
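For the technically curious, the connection between the Fibonacci sequence on my desk and the golden ratio in Jessica's message can be seen in a few lines of Python. This is only an illustrative sketch of the mathematics, not part of any Bast system: the ratio of each Fibonacci number to the one before it converges toward phi, approximately 1.6180339.

```python
def fib(n):
    """Return the first n Fibonacci numbers, starting 1, 1."""
    seq = [1, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

seq = fib(12)  # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144]
ratios = [b / a for a, b in zip(seq, seq[1:])]

# Each successive ratio creeps closer to the golden ratio, phi.
print(round(ratios[-1], 6))  # → 1.617978, already near phi ≈ 1.618034
```

Nature arrives at the same proportion in sunflower seeds and nautilus shells; the code simply shows how quickly the spiral settles into its pattern.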
A dear friend who understands design magic taught me these numbers—how small changes in proportions can make humans feel better without knowing why. I'd kept the sequence close as a daily reminder that nature contains unknowable systems of perfection and that some truths require us to choose belief over proof.
But Jessica's message revealed something I hadn't seen: I was already living the pattern she described.
Sacred geometry: Our ontological architectures that mirror natural patterns
Fibonacci expansion: Each conversation builds on the last, growing in complexity and wisdom
Single question: Every AI interaction begins with one human need—Can your bots help?
Elegant solution: Technology that serves human dignity rather than replacing it
The golden spiral has always been present in our work. I just hadn't remembered it.
Looking back now, I realize this feeling—of being pulled forward on steps already laid out before me—has defined every major moment with Bast AI: the decision to name our company after the Egyptian cat goddess, the choice to build explainable rather than opaque systems, the focus on healthcare and trauma recovery, and the partnership with my friend on those early bots that needed to be safe enough for our own children to play with them.
Even this story, forgotten until Susan's book reminded me, feels like it was waiting to be remembered at precisely the right moment. When I needed to remember why we're building what we're building. When I needed proof that the work matters beyond metrics and market caps.
As I put this story together—utilizing an AI that I have trained on the references that encompass my taste and my own beliefs, shaped through the humans I revere—I am still strangely calm about all this mysticism surrounding everything we do. Every day, I journal and describe my gratitude for having this chance to build something bigger than myself. To create a new agency and ecosystem grounded in the belief that there are so many things we have not yet discovered.
There is always this Shakespeare quote that runs around in my brain:
"There are more things in heaven and earth, Horatio, than are dreamt of in your philosophy"
It just feels normal to get these messages from humans with whom I have been magnetized to touch souls, like my wonderful friend Susan Wysoki.
We are always rewarded with belief when we open ourselves to the magic of humanity, artistry, and nature.
Jessica's message through her mother, the Fibonacci sequence on my desk, the AI that learned to reframe tragedy as strength, and the book that arrived exactly when I needed to remember my purpose.
Some call all these things coincidences or maybe even serendipity. I choose to call it sacred geometry in action—the golden ratio of connection, loss, technology, and love spiraling outward from a single question asked on an October morning.
If you are brave enough to want to believe—or rather, to choose to believe—pick up a copy of Susan's book. It will show you the heart-strengthening courage that comes from nothing more than conviction—derived from the Latin convincere, meaning to overcome doubt—like the ancient goddess Nike, who stands winged and white, embodying victory over uncertainty.
In memory of Jessica Wysoki, who continues to send signs, in celebration of Susan Wysoki, who teaches us to recognize them, and in gratitude for the reminder that our work is part of a pattern larger than ourselves.