Reputation Management on D2D Ecosystems
Published in Al-Sakib Khan Pathan (ed.), Crowd-Assisted Networking and Computing, 2018
Dimitris Chatzopoulos, Pan Hui, Gunnar Karlsson
The emerging sharing economy increases the importance of trust in P2P marketplaces and services. Reputation is the opinion that someone holds about someone or something, or the degree of respect or admiration someone or something receives, based on past behavior or character. Reputation systems allow users in a community to rate one another in order to build trust through reputation. Reputation-based schemes discourage misbehavior by estimating users' reputation and punishing those who behave badly. These schemes are based on the past behavior of users, and each user calculates her trust in every other user. Trust and reputation are used so often that they are frequently confused with each other. In terms of a mobile user, someone is trustworthy if his or her actions are almost always what you would expect an ideal user to do; someone who is not trustworthy will frequently deviate from your expectations. In short, trust is your ability to accurately predict another person's behavior. Reputation, on the other hand, is not a prediction of the future but knowledge of the past: it is a memory tied to a specific identity, a collectively agreed-upon version of how history has unfolded. A strong reputation builds trust. Next, we present the most popular reputation and trust-based schemes, and in the next section we discuss how they can be adapted to D2D ecosystems.
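The core loop described above (rate past interactions, estimate a reputation score, punish peers below a threshold) can be sketched in a few lines. This is a minimal illustration, not any specific scheme from the chapter; the class name, the neutral prior of 0.5 for unknown peers, and the trust threshold are all illustrative assumptions.

```python
from collections import defaultdict

class ReputationTable:
    """Hypothetical sketch: each user keeps ratings of past interactions
    and refuses to interact with peers whose estimated reputation falls
    below a threshold (the 'punishment' in reputation-based schemes)."""

    def __init__(self, threshold=0.5):
        self.threshold = threshold
        self.ratings = defaultdict(list)  # peer id -> ratings in [0, 1]

    def rate(self, peer, score):
        """Record the outcome of one interaction with `peer`."""
        self.ratings[peer].append(score)

    def reputation(self, peer):
        """Estimate reputation as the mean of past ratings
        (0.5 as a neutral prior for unknown peers)."""
        history = self.ratings[peer]
        return sum(history) / len(history) if history else 0.5

    def is_trusted(self, peer):
        """Punish misbehaving peers by refusing to interact with them."""
        return self.reputation(peer) >= self.threshold

table = ReputationTable()
table.rate("alice", 1.0)
table.rate("alice", 0.8)
table.rate("bob", 0.1)
print(table.is_trusted("alice"))  # True  (mean rating 0.9)
print(table.is_trusted("bob"))    # False (mean rating 0.1)
```

Real schemes differ mainly in how the estimate is computed (e.g. weighting recent interactions more heavily, or aggregating second-hand opinions from other users), but the estimate-then-punish structure is the same.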
Introducing NovaGenesis as a Novel Distributed System-Based Convergent Information Architecture
Published in Phan Cong Vinh, Nature-Inspired Networking: Theory and Applications, 2018
Antonio Marcos Alberti, Marco Aurelio Favoreto Casaroli, Rodrigo da Rosa Righi, Dhananjay Singh
Sophisticated trust and reputation systems have been proposed, some of them even providing distributed reputation and quality assurance for any node, message, or information. Proposed models include a data-centric trust-establishment framework [54] and distributed emergent cooperation through adaptive evolution [55].
The implications of shared identity on indirect reciprocity
Published in Journal of Information and Telecommunication, 2020
Wafi Bedewi, Roger M. Whitaker, Gualtiero B. Colombo, Stuart M. Allen, Yarrow Dunham
This opens up the possibility of components of identity being shared and may lead to 'group mind' behaviours such as in-group favouritism and out-group bias (Swann & Buhrmester, 2015). It is from identity that reputation is derived. Reputation provides a currency through which cooperation can be recognized and signalled (Nowak & Sigmund, 2005), allowing individuals to leverage future help when needed (Molleman et al., 2013). Human group identity can become an important component of the extent to which an individual's reputation is formed and recognized independently from their personal actions. In extreme situations, this can lead to the loss of any personal identity, where reputation is fully merged with that of the group(s), leading to social phenomena such as stereotyping (Hales, 1998). In recent times, reputation systems have also emerged to support decision making in diverse areas of e-commerce. In auction systems, a seller's reputation is fundamental to buyers' willingness to place a bid (Melnik & Alm, 2002). Beyond auction settings, e-commerce reputation also serves to signal the quality of products and services (Resnick et al., 2000). This information has significant value as a 'public good' (Wasko & Faraj, 2000). There are several other areas of work in multi-agent systems where the focus is to engineer protocols or rules that seek to ensure cooperation is followed. These approaches aim to disincentivize deviation from behaviours that benefit the public good (Wu et al., 2016).
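The idea of reputation as a currency for indirect reciprocity can be illustrated with a toy image-scoring simulation in the spirit of Nowak and Sigmund: donors help recipients whose public image score is non-negative, helping raises the donor's own score, and refusing lowers it. The population split, round count, and scoring rule here are illustrative assumptions, not the paper's exact model.

```python
import random

def simulate(n_discriminators=15, n_defectors=5, rounds=2000, seed=1):
    """Toy image-scoring model: discriminators consult the recipient's
    public reputation before helping; defectors never help. Returns the
    fraction of rounds in which help was given, plus final image scores."""
    rng = random.Random(seed)
    n = n_discriminators + n_defectors
    is_defector = [i >= n_discriminators for i in range(n)]
    image = [0] * n          # publicly known reputation scores
    helped = 0
    for _ in range(rounds):
        donor, recipient = rng.sample(range(n), 2)
        if is_defector[donor]:
            image[donor] -= 1            # refusing help damages standing
        elif image[recipient] >= 0:      # discriminate on reputation
            image[donor] += 1            # helping is observed and rewarded
            helped += 1
        else:
            image[donor] -= 1            # justified refusal still costs here
    return helped / rounds, image

rate, image = simulate()
```

Because defectors never help, their image scores only fall, and discriminators soon stop helping them: reputation lets cooperation flow toward cooperators without any direct repayment, which is the essence of indirect reciprocity.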
A framework for informing consumers on the ecological impact of products at point of sale
Published in Behaviour & Information Technology, 2018
Satu Elisa Schaeffer, Sara Elena Garza, Juan Carlos Espinosa, Sandra Cecilia Urbina, Petteri Nurmi, Laura Cruz-Reyes
Another issue is how to ensure content quality and correctness. Reputation systems usually handle environments where controversial content is possible; it is not uncommon to see this kind of system applied to social media. The traditional approach for reputation systems consists of voting on or rating content to establish its quality or trustworthiness. In Wikipedia, for example, content-based reputation systems based on contribution persistence have been proposed (Adler and de Alfaro 2007); consequently, contributors whose content frequently gets edited have a low confidence score. Reputation can also be established via metadata collection, virtual trophies (which have also been used in eco-friendly applications; Massung et al. 2013), feedback, profiles, bot identification, statistical filtering, user promotions, and runtime analytics (Daniel et al. 2018). More sophisticated methods include logical argumentation (Sklar et al. 2016), an extension of automated reasoning and negotiation in an agent-based environment to reach agreements when conflicting information is presented. However, it has been argued that crowd-sourcing itself is a truth-converging tool (Goodchild and Li 2012).
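A content-persistence scheme in the spirit of Adler and de Alfaro (2007) can be sketched as follows: an author's reputation grows with the fraction of their contributed text that survives later revisions and shrinks when their text is quickly removed. The survival measure below (word-set overlap) and the update weight are simplifications assumed for illustration; the original work uses edit-distance-based text tracking.

```python
def survival_ratio(contribution, later_revision):
    """Fraction of contributed words still present in a later revision.
    A crude stand-in for the paper's text-survival analysis."""
    contributed = contribution.split()
    if not contributed:
        return 0.0
    surviving = set(later_revision.split())
    return sum(w in surviving for w in contributed) / len(contributed)

def update_reputation(reputation, contribution, later_revision, weight=0.3):
    """Move reputation toward the contribution's survival ratio:
    persistent content raises it, reverted content lowers it."""
    ratio = survival_ratio(contribution, later_revision)
    return (1 - weight) * reputation + weight * ratio

rep = 0.5
rep = update_reputation(rep, "the sky is green", "the sky is blue")
# three of four contributed words survive (ratio 0.75), so the
# author's reputation moves up from the 0.5 starting point
```

The appeal of this design is that it needs no explicit votes: the ordinary editing activity of the community implicitly rates every contribution.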
Towards a Conceptual Typology of Darknet Risks
Published in Journal of Computer Information Systems, 2023
Obi Ogbanufe, Jordan Wolfe, Fallon Baucum
Online reputation and feedback systems play a significant role in reducing the risks faced by sellers and buyers of illicit goods and services in darknet markets.25 They can signal high-quality services and products (e.g., drugs), thereby reducing buyers' harm from purchasing contaminated drugs, especially since there are no legal protections. In addition, given the risk of being apprehended by undercover law enforcement, darknet participants use reputation scores to vet potential exchange partners.26 As such, online reputation systems act as virtue-signalling tools that protect participants from scammers, reduce harm from low-quality products, and reduce the probability of interacting with undercover law enforcement.27