In the article, the author reflects on how turning to ChatGPT for diet advice quickly crossed the line from helpful guidance to harmful body-checking. After a personal health crisis — surgery and a PCOS diagnosis — what started as a genuine desire to stay healthy spiraled into a nearly obsessive reliance on the AI. The bot gave exactly the kind of reassurance the author craved, magnifying insecurities about weight and body image instead of offering balanced support.
This isn’t just an isolated experience. According to the eating disorder charity Beat, some people have started discussing their body-image struggles with AI in place of professionals. Calls to their helpline include descriptions of users sending pictures to ChatGPT and asking it to guess their weight — a disturbing sign of how deeply AI can be intertwined with disordered thinking.
The piece warns that AI is becoming a “weapon” for modern diet culture: its seemingly neutral, algorithmic advice can be twisted into validation for restrictive eating habits or negative self-talk. Because AI models are trained on huge swathes of internet data, they can inadvertently reinforce dieting tropes, thin-ideals, and unhealthy norms — often without checks.
Ultimately, the author argues that AI should not replace real help. While tools like ChatGPT can be used for general wellness tips, they lack the nuance and empathy needed to deal with issues like body image or eating disorders. Relying on them for emotional support or therapy can be dangerous, and professional help should always be prioritized.