TikTok Influencers Miss Dream Trip After ChatGPT’s Bad Advice

Travelers walking with rolling suitcases in an airport terminal. GRAHAM HUGHES/EPA

Summary:

  • Spanish content creators Mery Caldass and Alejandro Cid miss flight to Puerto Rico after ChatGPT error on visa requirements.

  • Airport staff denies boarding; couple obtains ESTA and arrives in time for Bad Bunny concert.

  • 60-year-old man hospitalized for bromism after following ChatGPT dietary advice; experts warn against relying solely on AI.

Spanish content creators Mery Caldass and Alejandro Cid went viral recently after sharing a moment of heartbreak at the airport on TikTok. The couple said they missed a flight to Puerto Rico after relying on ChatGPT, which incorrectly told them that Spanish citizens didn’t need a visa to enter U.S. territories.

In reality, travelers from Visa Waiver Program countries—including Spain—must obtain an Electronic System for Travel Authorization (ESTA) ahead of departure.

@merycaldass: “If there’s an AI revolution, I’ll be the first one annihilated” 🧚‍♀️ ♬ original sound – Mery Caldass

Airport staff denied them boarding due to the missing ESTA. The pair ultimately acquired the required document and arrived in Puerto Rico in time for the Bad Bunny concert.

U.S. Customs and Border Protection states clearly: visitors from Visa Waiver Program countries must have a valid ESTA to enter the continental United States and its territories, including Puerto Rico. Without one, boarding is denied.

In another recent incident, a 60‑year‑old man sought dietary advice from ChatGPT. The AI suggested replacing regular table salt (sodium chloride) with sodium bromide. Taking the recommendation seriously, he consumed the industrial chemical for three months. He began experiencing severe symptoms, including paranoia and hallucinations, and was hospitalized—ultimately diagnosed with bromism, a toxic condition from excessive bromide exposure.

Medical professionals treated him with fluids, electrolyte therapy, and antipsychotic medication. The case, published in Annals of Internal Medicine Clinical Cases, underscores the dangers of depending on AI tools for medical guidance without professional oversight. OpenAI reaffirmed its guidance that ChatGPT is not a substitute for medical advice, and training updates are underway to reinforce that message.


These cases highlight a broader reality: not everything ChatGPT or other AI tools produce is accurate. From missed international flights to dangerous health outcomes, relying solely on AI for critical decisions can carry real consequences. Experts stress that information from AI should be verified against official or authoritative sources before acting on it.
