AI-Driven Scams: A Kenyan’s Ordeal in Myanmar
When 26-year-old Kenyan national Duncan Okindo boarded a plane to Thailand in 2023, he believed he was on the verge of a new beginning. A local recruiter had promised him a customer service job in Bangkok, a chance to support his family back home. Instead, he was deceived, abducted upon arrival, and transported across the border into Myanmar’s infamous KK Park compound—a fortress-like hub of organized cybercrime.
Okindo spent four harrowing months inside the compound, where heavily armed guards oversaw a massive operation that exploited forced laborers to carry out sophisticated online fraud. His experience sheds light on the growing role of AI-driven deception in global scams and the scale at which human trafficking feeds these networks.
Life Inside the Scam Compound
Once inside KK Park, Okindo joined hundreds of others forced to work in rows of computer-filled rooms. Their mission was clear: impersonate wealthy investors and lure victims—often U.S. real estate agents—into fake cryptocurrency schemes. The workers used AI tools, especially free versions of ChatGPT, to craft convincing messages that imitated native English speakers.
Okindo said the bosses gave workers daily scripts outlining when to discuss property investments and when to pivot to crypto. Success was measured by how many victims could be persuaded to deposit money into fraudulent platforms, and failure carried dire consequences: those who missed quotas faced beatings, humiliation, or even electric shocks. Meanwhile, “big wins” were celebrated with drumbeats, a cruel reminder of how human suffering was commodified into profit.
AI as a Tool for Exploitation
ChatGPT played a central role in enabling the scam. Okindo explained that AI-generated text helped scammers sound authentic, whether posing as a Texas cattle rancher or an Alabama soybean farmer. It also served as an instant research tool, allowing scammers to answer detailed questions about cryptocurrency or local housing markets in real time.
The misuse of AI made the scams highly effective, even ensnaring individuals who had previously fallen victim to fraud. “ChatGPT was the most-used tool,” Okindo said. “If you miss any point, the realtor will know you are a scam.”
OpenAI, the company behind ChatGPT, has said it actively works to detect and prevent misuse of its technology, cutting off violators where possible. Yet the same tool designed to assist users can be weaponized in the wrong hands.
Romance Scams and Global Reach
Okindo’s story echoes accounts from other survivors of Southeast Asian scam compounds. A Burmese man who worked in a different operation in 2022 told Reuters that ChatGPT revolutionized romance scams, enabling him to simultaneously court dozens of targets with AI-generated love poems and flirtatious exchanges. With the bot’s help, he won victims’ trust more deeply before defrauding them of their savings.
These scams—known as pig-butchering schemes—require workers to slowly build trust before pushing victims toward fraudulent investments. The blend of human manipulation and AI assistance has created a powerful, global threat.
Escape and Aftermath
Okindo’s ordeal ended only when Thai authorities cut power to KK Park, forcing captors to release some of the workers. He returned to Kenya in April, physically free but emotionally scarred. He now faces stigma, financial strain, and fears for his safety after receiving threats he believes are linked to the cartels that trafficked him.
His story highlights the human cost behind online fraud. For every fake message received by a potential victim, there is often someone like Okindo—trapped, coerced, and stripped of dignity. As AI tools grow more advanced, the urgent challenge remains: how to harness their potential for good while preventing their exploitation by criminal networks.