According to a new report from ManpowerGroup, the relationship between workers and artificial intelligence in the workplace is becoming strained, marked by declining trust, confidence, and morale rather than smooth cooperation. Far from being universally welcomed as a productivity booster, AI is drawing anxiety and skepticism from many employees over how it is being implemented, especially when it touches job roles, performance evaluation, and expectations around workload. This growing mistrust is reshaping workplace dynamics, often in ways that undercut the very productivity gains the technology was meant to deliver.
One major issue identified is that workers often don’t understand how AI tools make decisions, which fuels fear and resentment. When AI systems are used for tasks like performance monitoring, resume screening, or task assignment without clear explanation, employees report feeling watched, misunderstood, or unfairly judged. That lack of transparency about how the algorithms operate and how outcomes are determined deepens anxiety and undermines confidence both in the technology and in the leadership teams that deploy it.
The report also highlights that confidence in AI’s usefulness has dropped among many workers. Initially, AI was pitched internally as a tool to enhance productivity and free employees from routine work. But in practice, some workers find that they spend more time correcting AI mistakes, explaining context AI doesn’t understand, or compensating for automated systems that prioritize speed over quality. Instead of empowering workers, poorly integrated AI can feel like an added burden, creating friction rather than assistance.
Finally, the growing disconnect between AI and human workers underscores the importance of intentional adoption strategies. ManpowerGroup suggests that organizations should prioritize training, transparent communication, and employee involvement in AI deployment decisions. When workers feel excluded from conversations about how AI will affect their roles, distrust grows. Conversely, when employees are educated about AI’s capabilities and limitations — and have a voice in how it’s used — they are more likely to embrace the technology as a collaborative partner rather than view it as a threat.