In Southeast Asia, organized crime rings have long forced trafficked workers — often from dozens of countries — to operate in scam compounds, carrying out online fraud. Experts now warn that artificial intelligence could dramatically change this landscape by automating key parts of these operations.
According to researchers, some scam centers are already using AI to generate the opening messages in “pig butchering” scams, a long-con fraud in which scammers build trust with victims before stealing their money. As large language models improve, there is a real fear that AI could take over even more of the scamming workflow, reducing the demand for trafficked human labor.
This shift could have unintended consequences: governments and international bodies have long used the risk of human trafficking as leverage to pressure Southeast Asian nations into cracking down on scam compounds. If AI reduces the reliance on trafficked workers, that moral and political pressure may weaken. Without human insiders to serve as informants, or visible human victims inside the compounds, these operations could become harder to track and prosecute.
At the same time, crime networks are reportedly adopting other high-tech tools — including stablecoins for moving money and deepfake technology for impersonation — to bolster their scams. This makes the problem not just a human-trafficking issue but a growing tech-enabled crime challenge, one that may demand new forms of global law-enforcement and regulatory cooperation.