
ChatGPT: Navigating the rising financial crime landscape in the digital age

Beta News

In-depth discussions with financial crime compliance decision makers from 10 leading U.S. financial institutions reveal that real-time digital payments, digital fraud, and cybercrime are the primary concerns for compliance teams in 2023. That said, a new player has entered the scene and demands attention: ChatGPT, which has the dual ability to help or hurt compliance and security teams.

While this cutting-edge technology presents an opportunity for financial institutions to detect and mitigate fraud and financial crime, it also gives criminals an avenue to commit those same acts more easily.

The Expanding Influence of ChatGPT

Since its introduction in November 2022, ChatGPT has experienced exponential growth, boasting over 1 billion users by March 2023. Unsurprisingly, criminals have swiftly embraced it for illicit activities, leveraging its capabilities to create convincing fake profiles, documents, and transactions that can bypass even the most well-trained compliance personnel. ChatGPT also serves as a breeding ground for the development of bots and malware used to execute cybercrime schemes and perpetrate scams aimed at obtaining sensitive financial information.

The use of AI-generated messages further compounds the threat: it enhances the realism of impersonations, making scams more difficult to detect. Instances of ChatGPT being employed to create legitimate-looking social media personas for data theft, and even to monitor cryptocurrency prices and payments, have already come to light. The potential for fraud and scams to be "turbocharged" by ChatGPT, which FTC chair Lina Khan has highlighted, poses a significant challenge for compliance teams striving to differentiate between criminal and legitimate transactions.
