“Determine how trustworthy a person is in just one minute.” That’s the pitch from DeepScore, a Tokyo-based company that spent last week marketing its facial and voice recognition app to potential customers and investors at CES 2021.

Here’s how it works: A person—seeking a business loan or coverage for health insurance, perhaps—looks into their phone camera and answers a short series of questions. Where do you live? How do you intend to use the money? Do you have a history of cancer? DeepScore analyzes the muscular twitches in their face and the changes in their voice and delivers a verdict to the lender or insurer: this person is trustworthy, this person probably is not.
Iyad Rahwan: How to trust machines?

Machine intelligence plays a growing role in our lives. Today, machines recommend things to us, such as news, music, and household products. They trade in our stock markets and optimise our transportation and logistics. They are also beginning to drive us around, play with our children, and diagnose our health. How do we ensure that these machines will be trustworthy? This lecture explores various psychological, social, cultural, and political factors that shape our trust in machines, and argues that the challenges of the information revolution should not be understood solely as a problem of computer science.
Previously, we built on Semaphore [148], which enables a static anonymous reputation system. Here we propose an extension of Semaphore in which we can destroy a user’s reputation without knowing their identity. We use this to build a binary reputation system, which can trivially be extended to a non-binary reputation system.
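The proposal is only summarised here, but the core idea—revoking a pseudonymous member via a public nullifier, without ever linking it back to an identity commitment—can be illustrated with a minimal, non-cryptographic sketch. The `BinaryReputation` class and the `h` helper below are hypothetical; a toy SHA-256 hash stands in for the circuit hash, and a direct set-membership check stands in for Semaphore’s zero-knowledge membership proof:

```python
import hashlib

def h(*parts: str) -> str:
    """Toy hash standing in for the circuit hash used by the real protocol."""
    return hashlib.sha256("|".join(parts).encode()).hexdigest()

class BinaryReputation:
    """Toy model: a public set of identity commitments plus a public ban
    list of nullifiers. Banning a nullifier revokes whoever produced it,
    without revealing which identity commitment it came from."""

    def __init__(self):
        self.members = set()  # identity commitments (public)
        self.banned = set()   # revoked nullifiers (public)

    def register(self, secret: str):
        # The user publishes only a commitment to their secret.
        self.members.add(h("commit", secret))

    def prove(self, secret: str, scope: str) -> str:
        # Real Semaphore would verify a zero-knowledge membership proof;
        # here we check the commitment directly for illustration only.
        if h("commit", secret) not in self.members:
            raise ValueError("not a member")
        nullifier = h("nullifier", secret, scope)
        if nullifier in self.banned:
            raise ValueError("reputation destroyed")
        return nullifier

    def destroy(self, nullifier: str):
        # Anyone holding a misbehaving signal's nullifier can ban it
        # without ever learning the underlying identity.
        self.banned.add(nullifier)
```

Note that in this toy, banning a nullifier only blocks signals within that one scope; persisting the ban across all scopes is precisely the harder problem the full scheme has to solve.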
We have studied trust for 20 years and believe that it is the ultimate currency in the relationship that all institutions—companies and brands, governments, NGOs and media—build with their stakeholders. Trust defines an organization’s license to operate, lead and succeed. Trust is the foundation that allows an organization to take responsible risk and, if it makes mistakes, to rebound from them.

For a business, especially, lasting trust is the strongest insurance against competitive disruption, the antidote to consumer indifference, and the best path to continued growth. Without trust, credibility is lost and reputation can be threatened.

Edelman’s trust research, the Edelman Trust Barometer, turns the deep data we collect into real-world insights, and our trust consulting platform, Edelman Trust Management, interprets those insights to help our clients plan, make decisions and take action.
Source: Trust | Edelman
China’s plan to establish a social credit system (SCS) has aroused concern that it amounts to building a surveillance state. Yet this view oversimplifies and misunderstands the essence of the SCS. The highest priorities of the SCS are promoting economic credibility and reinforcing court orders. Meanwhile, the SCS aims to steer citizens’ social behaviors and interactions by utilizing a redlist system that introduces numerous moderate rewards. The SCS is also more lax in execution than in planning. It reflects a unique Chinese understanding of law, which treats law as a moral guide. This article also acknowledges the concerns about the SCS. Without actively preventing positive and negative invasions in the construction of the project, the SCS authorities will risk creating further mistrust in society.
NFTs are not just cat pictures that people trade on blockchains. Today digital art, collectibles, and in-game assets are the most visible use cases for these nifty non-fungibles. But the market holds an inconspicuous secret: there is a staggering diversity of online digital content that can be placed on a blockchain in the form of NFTs.
As we use these services, they learn more and more about us. They see who we are, but we are unable to see into their operations or understand how they use our data. As a result, we have to trust online services, but we have no real guarantees that they will not abuse our trust. Companies share information about us in any number of unexpected and regrettable ways, and the information and advice they provide can be inconspicuously warped by the companies’ own ideologies or by their relationships with those who wish to influence us, whether people with money or governments with agendas.
To protect individual privacy rights, we’ve developed the idea of “information fiduciaries.” In the law, a fiduciary is a person or business with an obligation to act in a trustworthy manner in the interest of another. Examples are professionals and managers who handle our money or our estates. An information fiduciary is a person or business that deals not in money but in information. Doctors, lawyers, and accountants are examples; they have to keep our secrets and they can’t use the information they collect about us against our interests. Because doctors, lawyers, and accountants know so much about us, and because we have to depend on them, the law requires them to act in good faith—on pain of loss of their license to practice, and a lawsuit by their clients. The law even protects them to various degrees from being compelled to release the private information they have learned.
Given all the positivity surrounding SSI, and its laudable promise to give people control, it may be surprising to find an essay called “The dystopia of self-sovereign identity (SSI)”. Its author Philip Sheldrake warns the SSI community that their projects may achieve the opposite of what is intended, partly by viewing the problem too much from a technical perspective: “SSI cannot provide an ‘identity layer’ of the Internet any more than the Internet might be said to be missing a ‘truth layer’.”
The music industry group filed a copyright complaint with code repository GitHub, demanding that the project be taken down for breaching the anti-circumvention provisions of the DMCA. While this was never likely to be well received by the hordes of people who support the software, the response was unprecedented.