
09/26/2024

Operation AI Comply: Detecting AI-infused frauds and deceptions

FTC

There’s been a lot of hype and excitement about artificial intelligence and all the amazing things it can, or one day might, do. Some companies are developing and selling AI tools, while others are touting the benefits of incorporating AI into their existing business models. And some businesses aren’t being truthful when it comes to AI.

With the announcement of Operation AI Comply, the FTC is cracking down on AI-infused frauds and deception, including chatbots supposedly giving “legal advice,” AI software that lets people create fake online reviews, and false claims of huge earnings from AI-powered business opportunities.

Chatbots — a type of AI that creates humanlike “answers” in response to a user’s prompt — might be useful when the stakes are low, like getting ideas for a new game or finding a recipe. But AI responses can be inaccurate, inadequate, misleading, or made up. The FTC just sued U.K.-based DoNotPay for falsely claiming its chatbot could act like a “robot lawyer” and produce “ironclad” legal documents for people.
