
06/14/2024

Deploy This Software Trick to Reduce AI Hallucinations

A process called retrieval augmented generation is hitting Silicon Valley

If you have ever used a generative artificial intelligence (AI) tool, it’s lied to you. Probably multiple times.

These recurring fabrications are often called AI hallucinations, and developers are working feverishly to make generative AI tools more reliable by reining in these unfortunate fibs. One of the most popular approaches to reducing AI hallucinations, and one that is quickly gaining ground in Silicon Valley, is called retrieval augmented generation (RAG).

The RAG process is quite complicated, but on a basic level it augments your prompt by retrieving relevant information from a custom database, and the large language model then generates an answer grounded in that data. For example, a company could upload all of its HR policies and benefits documents to a RAG database and restrict the AI chatbot to answers that can be found in those documents.
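To make that retrieve-then-generate flow concrete, here is a minimal Python sketch. Everything in it is illustrative rather than drawn from the article: the POLICY_DOCS list, the retrieve and build_prompt helpers, and the prompt template are all hypothetical, the toy word-overlap scorer stands in for a real vector database, and the actual LLM call is left out.

```python
# Minimal sketch of the RAG pattern described above. The retriever
# below is a toy word-overlap scorer; a production system would embed
# documents and queries as vectors and search by similarity.

# Hypothetical custom database: a company's HR policy documents.
POLICY_DOCS = [
    "Employees accrue 1.5 vacation days per month of service.",
    "Health insurance enrollment opens each November.",
    "Remote work requires written manager approval.",
]

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by word overlap with the query (toy retriever)."""
    q_words = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Augment the user's question with the retrieved documents."""
    context = "\n".join(f"- {d}" for d in docs)
    return (
        "Answer using ONLY the documents below. If the answer is not "
        f"in them, say you don't know.\n\nDocuments:\n{context}\n\n"
        f"Question: {query}"
    )

query = "How many vacation days do I earn?"
prompt = build_prompt(query, retrieve(query, POLICY_DOCS))
print(prompt)  # This augmented prompt would then be sent to the LLM.
```

However the retrieval step is implemented, the structure is the same: fetch the most relevant documents first, then hand them to the model alongside the question so its answer stays anchored to that source material.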

Read the complete article at WIRED.
