I Want AI to Cure Rare Disease. I Just Don’t Want It to Break Everything Else

As a parent of two children with a rare condition, I’m excited about what AI can do for healthcare. I believe it has the potential to accelerate rare disease research, improve clinical trial matching, and even change how we approach drug discovery and gene-based treatments.

These are the kinds of breakthroughs many of us have hoped and prayed for. But contrary to what Silicon Valley has touted for years, you do not have to break things to move fast.

We can pursue AI advancement while still protecting:

  • workers from exploitative deployment,

  • communities from environmental strain,

  • and the public from unaccountable high-impact systems.

In early 2025, President Trump revoked the 2023 executive order focused on “safe, secure, and trustworthy” AI. The shift was framed as a necessary move to remove barriers to innovation and stay competitive in the global landscape, especially against China.

That argument isn’t new, and yet history shows that when powerful technologies scale without thoughtful oversight, the consequences show up later. Social media is one example. It connected the world in ways we never imagined, but it also introduced challenges around mental health, misinformation, and privacy that we are still grappling with.

AI is already raising similar concerns. In a growing number of cases, users of AI companions and chatbots are forming alarmingly intense emotional dependencies on them. One family has linked their teenage son’s death to his interactions with an AI chatbot, sparking debate about safety, boundaries, and responsibility.

I want this technology to advance quickly for the betterment of our society, and I do believe that is possible. However, AI shouldn’t be an open, ungoverned system. If it’s going to reshape healthcare, work, and information, it needs clear standards, accountability, and oversight. Right now, Congress is still working through what that should look like—and until that’s defined, we’re operating in a gray area that deserves more attention.
