Trust decision-making in multi-agent systems

Christopher Burnett, Timothy J Norman, Katia P Sycara

Research output: Chapter in Book/Report/Conference proceeding › Published conference contribution

52 Citations (Scopus)

Abstract

Trust is crucial in dynamic multi-agent systems, where agents may frequently join and leave, and the structure of the society may often change. In these environments, it may be difficult for agents to form the stable trust relationships necessary for confident interactions. Societies may break down when trust between agents is too low to motivate interactions. In such settings, agents should make decisions about whom to interact with, given their degree of trust in the available partners. We propose a decision-theoretic model of trust decision-making that allows controls to be used, as well as trust, to increase confidence in initial interactions. We consider explicit incentives, monitoring and reputation as examples of such controls. We evaluate our approach within a simulated, highly dynamic multi-agent environment, and show how this model supports the making of delegation decisions when trust is low.
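
To make the abstract's idea concrete, below is a minimal sketch of a decision-theoretic delegation choice, assuming partners are compared by expected utility weighted by a trust estimate, and that controls (e.g. monitoring or explicit incentives) raise the effective probability of success at a cost. The names and parameters here (Partner, Control, trust_boost, the utility values) are illustrative assumptions and not the formulation used in the paper.

```python
# Hypothetical sketch: expected-utility delegation under trust and controls.
# Not the paper's actual model; names and numbers are illustrative only.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Partner:
    name: str
    trust: float  # estimated probability the partner performs the delegated task well


@dataclass
class Control:
    name: str
    cost: float        # cost of applying the control (e.g. paying an incentive)
    trust_boost: float  # assumed increase in effective success probability


def expected_utility(partner: Partner,
                     u_success: float,
                     u_failure: float,
                     control: Optional[Control] = None) -> float:
    """Expected utility of delegating to `partner`, optionally under a control."""
    p = partner.trust
    cost = 0.0
    if control is not None:
        p = min(1.0, p + control.trust_boost)
        cost = control.cost
    return p * u_success + (1.0 - p) * u_failure - cost


def choose_delegation(partners, controls, u_success=10.0, u_failure=-5.0):
    """Pick the (partner, control) pair with the highest expected utility."""
    options = [(p, c) for p in partners for c in [None] + list(controls)]
    return max(options,
               key=lambda pc: expected_utility(pc[0], u_success, u_failure, pc[1]))


if __name__ == "__main__":
    partners = [Partner("a1", trust=0.4), Partner("a2", trust=0.7)]
    controls = [Control("monitoring", cost=1.0, trust_boost=0.2),
                Control("incentive", cost=2.0, trust_boost=0.35)]
    best_partner, best_control = choose_delegation(partners, controls)
    label = best_control.name if best_control else "no control"
    print(f"Delegate to {best_partner.name} with {label}")
```

In this sketch, applying a control can make delegation to a low-trust partner worthwhile when the boost to expected utility outweighs the control's cost, which mirrors the abstract's claim that controls support delegation decisions when trust is low.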
Original language: English
Title of host publication: Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence
Subtitle of host publication: Barcelona, Catalonia, Spain, 16-22 July 2011
Editors: Toby Walsh
Place of publication: Menlo Park, California
Publisher: AAAI Press/International Joint Conferences on Artificial Intelligence
Pages: 115-120
Number of pages: 6
Volume: 1
ISBN (Electronic): 9781577355168
ISBN (Print): 9781577355137
DOIs:
Publication status: Published - 2011

