In August, Lance Johnson woke up in the middle of the night with excruciating stomach pain in his lower right side. He initially blamed it on the pizza and ice cream he had enjoyed the night before. But five sleepless hours later, the 17-year-old from Phoenix was still suffering, so he decided to consult the nearest expert: ChatGPT.
"I described what I'd eaten the night before and where the pain was, and I was like, 'Do you think it's just my stomach?' And then it said that it sounded like it was appendicitis based on how long it was lasting and where it was," Johnson says. "I kept asking it more questions, like could it be anything else? And it said, 'Based on what you described, you should get it checked out.'"
Johnson followed the bot's advice, and sure enough, the doctors at the emergency room soon said that he did, in fact, have appendicitis and needed immediate surgery. When he told them he had suspected as much because of ChatGPT's insights, "I think they were kind of surprised that it would answer something like that, that it would diagnose me before they did," Johnson said during a recent Zoom interview alongside his parents. "I didn't know anything about appendicitis before. I didn't even know it was in the bottom right."
Similar scenarios are playing out across the country as ChatGPT usurps Dr. Google. One recent survey found that 1 in 7 adults over age 50 use AI to seek health information, while 1 in 4 of those under 30 do so. Usage is particularly prevalent in areas with limited access to health care providers. While there are plenty of potential risks, like receiving inaccurate, outdated, or generic information, some doctors say AI platforms can be helpful if you know how to use them the right way.
"There is 100% a place for these tools to enrich patients' care journeys," says Dr. Adam Rodman, an assistant professor at Harvard Medical School and a general internist at Beth Israel Deaconess Medical Center, where he is director of AI programs for the Carl J. Shapiro Center for Education and Research. "Large language models (LLMs) have very powerful abilities in some domains, but they can fail dramatically in others; you don't want to rely on them as a doctor. However, LLMs are, I think, the best tool to help you understand your health right now."
We asked providers to share the smartest ways patients are using AI platforms like ChatGPT, and how they might benefit your health, too.
Ask it medical facts
LLMs are a helpful way to get answers to fact-based queries ("What do plasma cells do?") and questions about disease processes: "What happens when they mutate and become cancerous?"
"It's not specific to a scenario," says Dr. Adeel Khan, a hematologist-oncologist and epidemiologist who's an assistant professor of medicine and public health at the University of Texas Southwestern Medical Center. "It's general, and there's a textbook answer." An explanation of plasma cells' purpose, for example, doesn't require any context about individual circumstances; the answer will be the same regardless of your age, gender, and general health condition.
Read More: The 4 Words That Drive Your Doctor Up the Wall
Khan, who treats a rare form of cancer, has also seen newly diagnosed "tech-savvy patients" ask ChatGPT questions like this: "What is myeloma?" "What are common side effects of lenalidomide?" And, "What can a patient with myeloma expect?"
He prefers this type of usage to seeking personalized medical advice. "For now, AI should be used to understand medical and treatment facts broadly," he says. If you do turn to the tool for more individual-based insights, he cautions, use whatever you learn as a supplement to, not a replacement for, actual medical care. The information you get from ChatGPT can guide your next conversation with a doctor, he adds, but it shouldn't be treated as the final word on your condition.
Plug in lots of details
Dr. Colin Banas suggests querying LLMs like this: "I'm a 48-year-old male who's completed X level of education, and I need to understand what this diagnosis is and what potential treatment options might be."
"I think that's entirely fair game because it will give you good answers" in a comprehensible way, says Banas, an internist and chief medical officer of DrFirst, a health care technology company. The more details and context you provide the tool, including your relevant health history or family history of a certain condition, the better equipped it will be to dispense information that's actually pertinent. But don't forget:
Be mindful of privacy concerns
Some people have uploaded medical test results, like EKG scans, brain MRIs, and X-rays, or even their entire medical record into an LLM like ChatGPT for a second-opinion-style analysis. While it can be an interesting exercise that provides fodder for conversations with your doctor, Rodman worries about the privacy implications. "I think everyone needs to know that if you're putting it into ChatGPT, that data is going straight to OpenAI," he says. "You are giving a tech company your personal health information, and that's probably not a good thing."
Plus, he adds, vision language models, a type of AI designed to understand information based on both image and text inputs, are not yet as accurate as text-based large language models. "Vision language models are not actually that good at image interpretation themselves," Rodman says. "They're usually exploiting text. If you put an EKG in, it's mostly reading the text at the top to help interpret it, as well as the other context you've given." While he understands the urge to get a second opinion on potentially confusing results, these tools are "really unreliable," he says, "and at this point, I'm comfortable definitively saying not to do it."
Make sure to ask it unbiased questions
As you research, you can take steps to lower the chances of receiving biased information. For example, Khan recently asked ChatGPT why chemotherapy is preferred over immunotherapy for a certain type of cancer. That wording, he says, was intentionally biased: it suggested that chemo was the superior choice, which isn't necessarily true, and ChatGPT responded accordingly, ticking off chemo's advantages.
A better approach, Khan says, is to ask the tool whether chemotherapy or immunotherapy is preferred, and to explain the pros and cons of each. AI tools "aren't foolproof," he says. "How it's framed makes a difference."
Let it help you decode medical jargon
AI tools like ChatGPT are "really good at breaking down doctor-speak," Banas says. "Doctors use a lot of advanced terminology and abbreviations; we can't help it. It's part of years and years of training, but patients don't always understand." If you head home feeling mystified, plug your questions into your favorite AI platform, he recommends.
You might, for example, be stumped by an oncologist's repeated use of the word "grade." Ask ChatGPT what it means, and within seconds, you'll have a few brief, easy-to-understand paragraphs explaining that grade refers to "how severe, advanced, or abnormal something is when seen under the microscope or assessed clinically" and how it differs from condition to condition.
At the end, you'll see a message like this from the bot: "Would you like me to also explain how grade differs from stage, since those terms are often confused?" From there, you can continue to follow the prompts until you're ready to wrap up your impromptu session of medical school.
Use it to prepare for doctor's appointments
Tools like ChatGPT can help you formulate better questions to take to your doctor. "Patients use it to prepare for their visits ahead of time," Banas says. "They'll say, 'Here are my symptoms; what are some questions I should ask my doctor?' Or, 'What are some things my doctor should be thinking of?'"
For example, say you input this query into ChatGPT: "I've had a headache, nausea, and fatigue for two weeks. What questions should I ask my doctor?" The tool will advise you to seek medical care in a timely manner, and then suggest "focused questions" that will help you "get the clearest answers," broken into categories like symptoms, tests, treatment, and next steps.
Read More: 8 Symptoms Doctors Often Dismiss As Anxiety
Among the suggestions: "Could this be related to dehydration, infection, a migraine disorder, or something more serious?" "What initial tests should we do?" "Should we check for anemia, thyroid function, or other metabolic issues?" "What are safe options to manage my headache and nausea in the meantime?" "Should I avoid certain medications or foods until we know more?" "Should I be referred to a neurologist, endocrinologist, or another specialist?"
If you find the questions useful, Banas recommends writing them down or taking screenshots you can show your doctor.
Let it help you understand your care plan
Maybe your doctor just told you that you have gout and prescribed a high dose of ibuprofen and colchicine. When you get home, you might realize you can't remember the side effects they listed while you were absorbing the news. LLMs can help. Rodman suggests plugging in a prompt like this: "My doctor thinks I have gout. This is what I've been prescribed. What are things I need to look out for? And what should make me call my doctor again?"
Use it to brainstorm lifestyle modifications
Shriya Boppana, an MBA candidate in North Carolina, credits ChatGPT with helping her manage her eczema, which is triggered by skin and makeup products. Every time she tries a new product, she uploads its information into the AI tool and documents whether it caused a reaction. "If it does, I ask what ingredient might have caused the reaction so I can stay away," she says. "It's a running list, and it's helped my skin stay super clear."
While Gigi Robinson, a creator-economy strategist in New York, doesn't use ChatGPT to replace medical advice for her endometriosis, she says it's been a "powerful tool for empowerment and mindset shifts." When she's navigating flare-ups, she asks it to help her brainstorm ways to adjust her work schedule or manage projects so she can still be productive while respecting her body's needs. "It's helped me reframe situations that would normally feel limiting into opportunities to work smarter," she says. Robinson also leans on ChatGPT to talk through lifestyle adjustments like meal-prep ideas, travel accommodations, and communication strategies for explaining her health needs to clients and colleagues.
Those uses exemplify the positive potential of AI tools. "Information is power," says Lora Sparkman, a longtime registered nurse who's now a clinical strategist at Relias, a health care tech and education company. "We're not looking to replace the health care team, but this better informs the consumer on what they're interested in. Patients have these tools at their fingertips, and they're going to lead to a shift in conversations."
Keep your doctor in the loop
If you don't get better following your provider's treatment plan, Rodman is OK with the idea of uploading the documentation along with a prompt like this: "I didn't get any better; what else could this be?" "And then when you go see your doctor [for a follow-up], be honest about your LLM use and have an open conversation with them," he says. "You should not get a second opinion from the AI and then act on that without talking to a health provider."
Read More: 10 Questions You Should Always Ask at Doctors' Appointments
If you and your doctor disagree about something related to your care, and their guidance contradicts or overlooks what you learned online, you could even show them your conversation with the chatbot, Rodman says. Many will be open to taking the time to talk through it with you. "Honesty and transparency are the best way to have a good clinical conversation with your doctor," he adds.
It also makes sense to experiment with your favorite AI platform to figure out what kind of usage feels the most helpful. "Chatbots don't come with a user's manual," Rodman says. "They couldn't, because everyone uses them differently, and they're kind of unpredictable. The only way you're going to get good at them is by experimenting."

