A Founder Reportedly Built a Custom Cancer Vaccine for His Dog With ChatGPT and AlphaFold. That Should Wake Up Every Small Business.
Some AI stories are impressive. This one is unsettling in a more useful way.
Australian entrepreneur Paul Conyngham has been describing how he used consumer AI tools, public research, and a relatively small budget to help design a personalized mRNA vaccine for his dog Rosie, who had been diagnosed with an aggressive cancer. The reported out-of-pocket cost was about $3,000. The reported result, based on social posts and media coverage, was visible tumor shrinkage after treatment.
That does not mean anyone with ChatGPT can now replace an oncology lab. It does mean the barrier between "I am not an expert" and "I can participate in expert work" is collapsing much faster than most business owners realize.
That is the story worth paying attention to.
What appears to have happened
Here is the evidence trail as it stands.
A widely shared X post from Trung Phan highlighted Conyngham as an Australian founder who used ChatGPT and AlphaFold to help create a custom vaccine for his dog's cancer despite having no formal biology training. The post spread partly because the claimed spend was so low and the before-and-after images appeared to show a real change in the tumor on Rosie's thigh.
The story did not start on X. In 2024, the University of New South Wales profiled Conyngham while he was working with researchers at UNSW's RNA Institute after Rosie was diagnosed with a soft tissue sarcoma. According to the university, he used AI tools to help process research papers, understand cancer biology, and narrow possible treatment paths. The article says he identified mRNA as the most promising route and connected with researchers who could help move the idea forward.
An ABC News report from June 18, 2025 adds more detail. ABC reported that Rosie received three doses of a personalized mRNA vaccine, that the project cost around A$3,000, and that Conyngham used ChatGPT heavily for literature review and paired it with AlphaFold to evaluate protein structures and antigen candidates. ABC also reported that a tumor on Rosie's leg had shrunk by roughly 50 percent after treatment.
That is enough to take the story seriously. It is not enough to treat it as settled clinical proof. This is still one case, reported through media coverage and social amplification, not a peer-reviewed outcome study.
Why this matters more than the dog story
The obvious reaction is, "Holy shit, AI helped a non-biologist work on a cancer vaccine."
The more useful reaction is, "What expert bottleneck in my business just got weaker?"
For decades, a lot of valuable work sat behind three gates:
- access to specialized knowledge
- access to research and synthesis capacity
- access to tools that were too expensive or too hard to use without formal training
AI is weakening all three at once.
ChatGPT did not run a lab. AlphaFold did not manufacture a drug. Human experts still mattered. Real-world biology still mattered. But the founder in the middle was able to do something that would have been functionally impossible for most non-specialists even a few years ago: get far enough into a highly technical domain to become a useful operator instead of an outside observer.
That pattern is not limited to biotech.
The SMB takeaway is brutally practical
Most small businesses are still using AI like an intern that writes faster. That is underselling what is available right now.
The real shift is that AI can help non-experts operate inside expert domains if the work can be broken into research, pattern recognition, drafting, scenario testing, or decision support.
If you run a small business, ask a harder question than "How do I save time writing emails?"
Ask this instead: Which part of my business still depends on expensive specialist interpretation, and how much of that interpretation can AI now compress?
Some current examples:
- A contractor can use ChatGPT, Claude, and visual estimation tools to scope jobs, draft proposals, and pressure-test change orders with far less dependence on back-office admin help.
- A law firm can use AI to summarize discovery, compare clauses across contracts, and prepare first-pass issue spotting before attorney review.
- A manufacturer can use AI copilots plus CAD and simulation tools to speed quoting, fixture design, documentation, and supplier communication.
- A marketing agency can use LLMs, image models, and analytics tools to do research, audience segmentation, landing-page iteration, and reporting work that used to require more specialists.
- A medical or dental practice can use AI scribes, coding assistants, and workflow tools to reduce administrative drag that once demanded more trained staff hours.
That does not erase expertise. It changes where expertise enters the process.
You no longer need senior-level knowledge at minute one for every task. In many cases, you need AI to get you 60 to 80 percent of the way to a credible first pass, then expert review at the points that truly matter.
That is a very different cost structure.
What tools are already demolishing the expertise barrier
This is not a future-tense argument. The tool stack is already here.
- LLMs like ChatGPT, Claude, and Gemini can explain dense material, compare options, draft workflows, and surface blind spots fast enough to change how one person works.
- Specialized AI tools in fields like legal tech, coding, design, accounting, sales, and medicine are packaging domain expertise into usable software instead of billable hours.
- Research tools such as Perplexity, Elicit, and domain-specific search systems reduce the time needed to gather and synthesize evidence.
- Generative design and simulation tools help non-specialists explore options that once required deeper technical fluency just to begin.
- Agentic workflows are starting to connect all of this so the same business owner can research, draft, analyze, revise, and ship in one working session.
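That research-draft-analyze-revise loop is simpler than the word "agentic" makes it sound. Here is a minimal sketch of the structure in Python. `call_model` is a placeholder for whatever LLM API you actually use (OpenAI, Anthropic, or anything else); here it just echoes its prompt so the shape of the pipeline is visible and runnable without keys or network access.

```python
# Sketch of a one-person "agentic" working session: each stage feeds
# the next. `call_model` is a stand-in, NOT a real API call.

def call_model(prompt: str) -> str:
    # Replace this stub with a real LLM call in practice.
    return f"[model output for: {prompt}]"

def run_session(task: str) -> str:
    notes = call_model(f"Research background for: {task}")
    draft = call_model(f"Draft a first pass using these notes: {notes}")
    critique = call_model(f"List weaknesses in this draft: {draft}")
    return call_model(f"Revise the draft to address: {critique}")

result = run_session("quote a custom fixture job")
print(result)
```

The point of the sketch is the wiring, not the stub: once each stage is a function, the same owner can swap models, insert a human-review step, or log intermediate outputs without redesigning the whole process.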
The companies that win from this will not be the ones with the biggest prompt libraries. They will be the ones that identify where expert scarcity is slowing the business down, then redesign those workflows around AI-assisted first passes.
The risk is not hype. It is overconfidence.
There is a wrong lesson here, and plenty of people will take it.
The wrong lesson is that expertise no longer matters.
That is nonsense.
In the Rosie case, credible institutions and researchers still played a role. Even if the social-media version of the story is directionally true, it does not follow that medicine has become a DIY hobby. Biology is still real. Failure modes are still real. Safety is still real.
The right lesson is narrower and more powerful: AI lets motivated non-experts get much closer to expert territory before they need expensive human intervention.
For a small business, that means you should be rethinking when you hire specialists, how you brief them, and which work they should actually spend time on.
If your accountant is still cleaning raw books instead of advising on tax strategy, AI should change that. If your lawyer is still spending hours on first-pass review, AI should change that. If your operations lead is still building reports by hand, AI should change that.
What to do on Monday
Do not try to copy a vaccine story. Copy the operating model.
Pick one area of your business where expertise is expensive, slow, or hard to access. Then map the workflow into four buckets:
- research and fact gathering
- first-pass analysis
- draft output
- expert review and approval
Now test which of the first three buckets can be compressed with the AI tools already on the market.
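The audit above is mostly arithmetic, and it helps to make it concrete. The sketch below uses made-up hours for a hypothetical proposal-writing workflow; the buckets and the "compressible" flag are the only structure that matters, so substitute your own steps and numbers.

```python
# Four-bucket workflow audit with illustrative, made-up numbers.
# The first three buckets are candidates for AI compression; expert
# review stays human.

AI_COMPRESSIBLE = {"research", "first_pass_analysis", "draft_output"}

workflow = [
    {"step": "gather client and market facts", "bucket": "research",            "hours": 4},
    {"step": "rough scope and cost analysis",  "bucket": "first_pass_analysis", "hours": 3},
    {"step": "write proposal draft",           "bucket": "draft_output",        "hours": 3},
    {"step": "partner review and sign-off",    "bucket": "expert_review",       "hours": 2},
]

compressible = sum(s["hours"] for s in workflow if s["bucket"] in AI_COMPRESSIBLE)
total = sum(s["hours"] for s in workflow)
print(f"{compressible} of {total} hours sit in AI-compressible buckets "
      f"({compressible / total:.0%})")
```

Even with rough numbers, this kind of tally shows where the leverage is: in this hypothetical, most of the billable hours sit in buckets AI can already shorten, while the expensive human judgment is concentrated at the end.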
That is where the near-term upside is.
My take: the most important part of this story is not whether one dog's tumor shrank. It is that a founder with no formal biology background could get far enough into an advanced scientific workflow to matter. Once that becomes normal, every service business, agency, practice, and operations-heavy company has to rethink what "expert work" actually costs.
