Generative-AI image tools are increasingly being used to create representations of human bodies — including athletes — but according to recent research, they overwhelmingly produce narrow, stereotypical images that reinforce unrealistic body standards rather than promote diversity.
In a study involving 300 AI-generated images from three popular platforms, researchers found that almost all “athlete” images depicted bodies with very low body fat and highly defined musculature. Female images tended toward revealing clothing, youth, and idealized attractiveness, while male images were often hyper-muscular and shirtless.
Furthermore, the AI-generated images showed a striking lack of diversity. Nearly all subjects appeared young, white, and physically “fit,” and none featured visible disabilities, older age, or body types outside the standard athletic ideal. When prompted for a generic “athlete,” the tools produced a male body in 90% of outputs, revealing a strong gender bias.
The core warning: because these AI-generated representations are increasingly common on social media and digital platforms, there is a real risk that they shape public perceptions of what “normal” athletic or healthy bodies look like — potentially harming self-esteem, encouraging unrealistic comparisons, and marginalizing those who don’t fit the narrow mold. Broader efforts toward inclusive representation and awareness-raising are needed if AI is to reflect human diversity rather than reinforce stereotypes.