
A New Study Reveals the Risks of the "Intention Economy" and the Role of Artificial Intelligence in Shaping Public Decisions

The Guardian published a report by Dan Milmo reviewing a recent study by researchers at the University of Cambridge. The study suggests that artificial intelligence tools could be used to manipulate online audiences, influencing decisions ranging from what people buy to how they vote.

The report explained that the study highlights an emerging market known as the "intention economy," in which AI is used to understand, predict, and manipulate human motivations, and that information is then sold to companies positioned to profit from it. According to the researchers, this market is the next evolution of the "attention economy," which relies on keeping users engaged with social media platforms so they can be served targeted advertising.

The study reviews how AI technology companies are already promoting the intention economy, selling knowledge of individuals' motives ranging from their travel plans to their political opinions.

Dr. Jonnie Penn, a historian of technology at the Leverhulme Centre for the Future of Intelligence, was quoted in the report: "For decades, user attention has been the currency online, and sharing these interests with platforms like Facebook and Instagram has driven the digital economy." He added, "But if left unregulated, the intention economy will treat human motivations as the new currency, and there will be a frantic race in the market to target, steer, and sell human intentions."

Dr. Penn warned about the potential effects of this market on critical aspects of public life, such as free and fair elections, press freedom, and fair market competition, and stressed the importance of weighing these consequences before we become victims of them.

The report also noted the study's claim that large language models, such as those underpinning chatbots like ChatGPT, will be used to predict and steer user behavior based on a range of data, including psychological and social signals. The study suggested that these tools could be deployed at low cost to steer conversations toward the interests of advertisers and companies, opening a wide door to influencing human decisions in subtle and opaque ways.
