
The Role of Quality and Similarity of Advice on Epistemic Trust Towards AI

Student: Liudmila Nezdoimyshapko

Supervisor: Larisa Mararitsa

Faculty: Saint-Petersburg School of Social Sciences

Educational Programme: Modern Social Analysis (Master)

Year of Graduation: 2024

Trust is a complex and multifaceted concept, with different dimensions and implications. One specific type, known as epistemic trust, refers to the degree to which a person relies on another's opinion when forming their own judgments (Echterhoff et al., 2005, 2008). The aim of this study was to investigate what matters more for the formation of epistemic trust in AI: the objective quality of the AI's advice or the similarity between that advice and the person's own opinion. A distinctive feature of this study is that, in typical settings, opinion similarity is inseparable from decision quality, which makes it impossible to say which of these factors is more important; our experiment therefore decoupled them. We used a between-subjects design in which participants were randomly assigned to one of two groups (Accurate Dissimilar vs. Similar Inaccurate). In a learning phase with feedback, participants received advice that was either 75% accurate but only 50% similar to their initial opinion (Accurate Dissimilar group) or 75% similar to their initial opinion but only 50% accurate (Similar Inaccurate group). In a testing phase without feedback, used to check the manipulation, both groups received advice that was 50% similar and 50% accurate. On each trial, participants saw a photo of a person that was either real or AI-generated and had to decide whether it was real or fake. Epistemic trust was measured as self-reported trust and advice utilization. We recruited 115 participants through convenience (self-selection) sampling.

Results show that participants in the Accurate Dissimilar group reported higher levels of trust than those in the Similar Inaccurate group, indicating that the objective quality of advice is more important than the similarity of opinions. This result supports the "perfection schema" framework, which proposes that people tend to form expectations of perfection when interacting with automated systems such as AI (Madhavan & Wiegmann, 2007). Similarity nevertheless remained a significant, though secondary, predictor of epistemic trust: the Accurate Dissimilar group continued to report higher epistemic trust even when similarity with the initial opinion was taken into account. Participants with lower confidence used the advice more often and reported higher trust. We also found support for the "agreement-in-confidence" hypothesis (Pescetelli & Yeung, 2021), as confident matching responses resulted in higher self-reported trust. These insights offer both theoretical and practical contributions to the fields of epistemic trust, decision-making, and human-AI interaction.
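How accuracy and similarity can be decoupled in a binary real/fake task may be easier to see with a small simulation. The sketch below is purely illustrative and is not taken from the thesis: it assumes a hypothetical greedy scheduler (choose_advice) that, on each trial, picks whichever advice option keeps the running accuracy and similarity rates closest to the condition's targets (for example, 75% accurate and 50% similar in the Accurate Dissimilar condition), given the participant's initial answer and the ground truth.

```python
import random

def choose_advice(participant_answer, ground_truth,
                  target_accuracy, target_similarity, history):
    """Pick 'real' or 'fake' advice for one trial, nudging the running
    accuracy and similarity rates toward the condition's targets.
    history holds (advice, participant_answer, ground_truth) tuples."""
    options = ["real", "fake"]

    def deficit(candidate):
        # Distance of the running rates from the targets if we showed
        # this candidate advice on the current trial.
        acc = [a == t for a, _, t in history] + [candidate == ground_truth]
        sim = [a == p for a, p, _ in history] + [candidate == participant_answer]
        return (abs(sum(acc) / len(acc) - target_accuracy)
                + abs(sum(sim) / len(sim) - target_similarity))

    best = min(options, key=deficit)
    history.append((best, participant_answer, ground_truth))
    return best

# Example run: Accurate Dissimilar condition (75% accurate, 50% similar),
# with a simulated participant who is correct on about 60% of trials.
random.seed(0)
history = []
for _ in range(40):
    truth = random.choice(["real", "fake"])
    answer = truth if random.random() < 0.6 else ("fake" if truth == "real" else "real")
    choose_advice(answer, truth, target_accuracy=0.75,
                  target_similarity=0.50, history=history)

accuracy = sum(a == t for a, _, t in history) / len(history)
similarity = sum(a == p for a, p, _ in history) / len(history)
print(f"realized accuracy={accuracy:.2f}, similarity={similarity:.2f}")
```

Because the task is binary, agreeing advice is correct only when the participant's own answer is correct, so whether both target rates are reachable depends on the participant's accuracy; the scheduler above simply approximates them trial by trial.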
