AI shouldn’t decide who dies. It’s neither human nor humane
Artificial intelligence (AI) will certainly change the practice of medicine. As we write this, PubMed (the online repository of biomedical research) indexes 4,018 publications with the keyword "ChatGPT." Indeed, researchers have been using AI and large language models (LLMs) for everything from reading pathology slides to answering patient messages. However, a recent paper in the Journal of the American Medical Association suggests that AI could act as a surrogate in end-of-life discussions. This goes too far.
The authors of the paper propose creating an AI “chatbot” to speak for an otherwise incapacitated patient. To quote, “Combining individual-level behavioral data—inputs such as social media posts, church attendance, donations, travel records, and historical health care decisions—AI could learn what is important to patients and predict what they might choose in a specific circumstance.” Then, the