A bizarre travel mishap unfolded in Tasmania, Australia, after an AI-generated tourism article on the Tasmania Tours website promoted a fictional destination called Weldborough Hot Springs as a “peaceful escape” and one of the state’s top natural attractions. Trusting the description — complete with imagined mineral-rich pools and forest scenery — visitors began turning up in the real rural town of Weldborough, located in northeast Tasmania, only to find that no such hot springs exist. Locals were left fielding daily inquiries from confused travelers who had planned entire trips around this fabricated location.
The Weldborough Hot Springs feature first appeared in a blog post on the tour operator’s site, listed in a collection of “must-visit” hot springs for 2026. The content, since deleted, had been produced using AI and published without sufficient review by the company. According to the business owner, the outsourced, AI-generated piece slipped through the editorial process while he was out of the country, allowing the misinformation to go live and mislead readers.
Visitors who made the trip expecting warm springs found only the cold waters of the Weld River, prompting bemused reactions from locals like Kristy Probert, owner of the nearby Weldborough Hotel. One group of 24 tourists reportedly detoured from mainland Australia just to search for the nonexistent attraction, only to be told by Probert that they were “more likely to find a sapphire than hot springs.”
Experts describe the incident as a classic case of AI “hallucination,” in which generative models confidently produce plausible-sounding but entirely fabricated information. Similar cases have emerged globally, including AI-generated travel guides in other countries that have sent tourists chasing nonexistent attractions or following incorrect itineraries. The episode underscores the growing importance of fact-checking and human oversight when using AI for travel content, as travelers increasingly rely on automated sources for trip planning.