
Can AI change people's opinions? (Photo: Freepik)
Lausanne – People's arguments in online debates are less convincing than those of large language models (LLMs) such as GPT-4 when the models tailor their contributions using personalized information about their opponents. That is the finding of a study by a team led by Francesco Salvi of the Swiss Federal Institute of Technology in Lausanne (EPFL).
Individually tailored arguments
Salvi and his colleagues paired 900 people in the USA with either another person or GPT-4 to debate various sociopolitical topics, for example whether the United States should ban fossil fuels.
In some cases, the discussion partner received demographic information about their counterpart – including gender, age, ethnicity, level of education, employment status and political orientation. This allowed the arguments to be tailored more closely to each opponent.
The debates took place in a controlled online environment, with participants recruited via a crowdsourcing platform set up specifically for the study.
Salvi and his colleagues found that GPT-4 was 64.4 percent more convincing than human debaters when it was given personal information about the participants. Without access to personal data, however, GPT-4's persuasiveness was indistinguishable from that of humans.
Targeted influence by AI
“The results underline GPT-4's ability to deliver convincing arguments and suggest that further research is needed to mitigate the risks associated with its use in persuasion,” said Salvi.
As conversations between humans and LLMs become increasingly common, so does the risk that such bots could be used to change people's beliefs or opinions. Until now, however, it was unclear whether these models could use personalized information to tailor their arguments to specific discussion partners.
Source: www.pressetext.com
(PTE003/21.05.2025/06:00)