Facebook is at pains to address such anxieties, but its initial outline of how the new currency will be used in Messenger and WhatsApp via Calibra is not exactly reassuring. At the very least, Facebook will know the people and companies with whom its users have financially interacted, but that is likely to be only the start. “Calibra will not share account information or financial data with Facebook, Inc or any third party without customer consent,” it says (my italics). Obviously, the company has past form here: the kind of forced consent that means you either allow Facebook to gobble up your data or don’t get access to many of its services. Whatever the guarantees, the most basic point is obvious enough: why should a company with such an appalling record on personal data be trusted to so massively extend its reach?
Abstract: Emerging as a comprehensive and aggressive governance scheme in China, the “Social Credit System” (SCS) seeks to promote the norms of “trust” in Chinese society by rewarding behavior that is considered “trust-keeping” and punishing behavior considered “trust-breaking.” This Article closely examines the evolving SCS regime and corrects myths and misunderstandings popularized in the international media. We identify four key mechanisms of the SCS, i.e., information gathering, information sharing, labeling, and joint sanctions, and highlight their unique characteristics as well as their normative implications. In our view, the new governance mode underlying the SCS — what we call the “rule of trust” — relies on the fuzzy notion of “trust” and on wide-ranging, arbitrary, and disproportionate punishments. It derogates from the notion of “governing the country in accordance with the law” enshrined in China’s Constitution.

This Article contributes to legal scholarship by offering a distinctive critique of the perils of China’s SCS in terms of the party-state’s tightening social control and human rights violations. Further, we critically assess how the Chinese government uses information and communication technologies to facilitate data-gathering and data-sharing in the SCS with few meaningful legal constraints. The unbounded and uncertain notion of “trust” and the unrestrained employment of technology are a dangerous combination in the context of governance. We conclude with a caution that, with considerable sophistication, the Chinese government is preparing a much more sweeping version of the SCS, reinforced by artificial intelligence tools such as facial recognition and predictive policing. These developments will further empower the government to enhance surveillance and perpetuate authoritarianism.

Keywords: Social Credit, information and communications technologies, governance, social control, human rights
These days, it’s not a shared drill that’s redefining trust and supplanting institutional intermediaries; it’s the blockchain. Botsman now says that the blockchain is the next step in shifting trust from institutions to strangers. “Even though most people barely know what the blockchain is, a decade or so from now, it will be like the internet,” she writes. “We’ll wonder how society ever functioned without it.”
The ambitious promises all sound very familiar.
The Trust & Technology Initiative brings together and drives forward interdisciplinary research from Cambridge and beyond to explore the dynamics of trust and distrust in relation to internet technologies, society and power; to better inform trustworthy design and governance of next generation tech at the research and development stage; and to promote informed, critical, and engaging voices supporting individuals, communities and institutions in light of technology’s increasing pervasiveness in societies.
Source: Trust & Technology Initiative
What blockchain does is shift some of the trust in people and institutions to trust in technology. You need to trust the cryptography, the protocols, the software, the computers and the network. And you need to trust them absolutely, because they’re often single points of failure.

When that trust turns out to be misplaced, there is no recourse. If your bitcoin exchange gets hacked, you lose all of your money. If your bitcoin wallet gets hacked, you lose all of your money. If you forget your login credentials, you lose all of your money. If there’s a bug in the code of your smart contract, you lose all of your money. If someone successfully hacks the blockchain security, you lose all of your money. In many ways, trusting technology is harder than trusting people. Would you rather trust a human legal system or the details of some computer code you don’t have the expertise to audit?

Blockchain enthusiasts point to more traditional forms of trust—bank processing fees, for example—as expensive. But blockchain trust is also costly; the cost is just hidden. For bitcoin, that’s the cost of the additional bitcoin mined, the transaction fees, and the enormous environmental waste.

Blockchain doesn’t eliminate the need to trust human institutions. There will always be a big gap that can’t be addressed by technology alone. People still need to be in charge, and there is always a need for governance outside the system. This is obvious in the ongoing debate about changing the bitcoin block size, or in fixing the DAO attack against Ethereum. There’s always a need to override the rules, and there’s always a need for the ability to make permanent rules changes. As long as hard forks are a possibility—that’s when the people in charge of a blockchain step outside the system to change it—people will need to be in charge.
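The claim that you must “trust the cryptography” can be made concrete with a toy hash chain. The sketch below is purely illustrative (it does not reflect any real blockchain’s block format, and all function names are hypothetical): each block commits to the hash of its predecessor, so a retroactive edit anywhere breaks verification further down the chain. Note that the tamper-evidence rests entirely on trusting SHA-256 and the verification code itself, which is exactly the shift of trust the passage describes.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def build_chain(records):
    """Link each record to the hash of the previous block."""
    chain, prev = [], "0" * 64  # all-zero genesis placeholder
    for record in records:
        block = {"data": record, "prev_hash": prev}
        chain.append(block)
        prev = block_hash(block)
    return chain

def verify(chain) -> bool:
    """Recompute every link; any retroactive edit breaks a later link."""
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        prev = block_hash(block)
    return True

chain = build_chain(["alice pays bob 5", "bob pays carol 2"])
assert verify(chain)                     # untampered chain checks out
chain[0]["data"] = "alice pays bob 500"  # retroactive edit to history
assert not verify(chain)                 # the broken hash link is detected
```

The sketch also illustrates the passage’s deeper point: detection is all the cryptography gives you. Deciding what to do about a tampered or disputed history (as with the DAO hard fork) is a human, institutional decision outside the code.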
It’s hard enough to get enterprises that compete with each other to work together as a team, but it’s especially tricky when one of those rivals owns the team.

Shipping giant Maersk and tech provider IBM are wrestling with this problem with TradeLens, their distributed ledger technology (DLT) platform for supply chains.

Some 10 months ago, the project was spun off from Maersk (the largest container shipping company on the planet) into a joint venture with IBM. But in that time the network has enticed only one other carrier onto the platform: Pacific International Lines (PIL), one of eight shipping lines in Asia and 17th in the world based on cargo volumes.

As those involved admit, that’s not enough.
The recent financial crisis and, especially, anti-austerity policies reflect and, at the same time, contribute to a crisis of representative democracy. In this article, I discuss the different conceptions of trust (and their relations to democracy) that have been debated in the social sciences, and in public debates, in recent times. The financial crisis has in fact stimulated a heated debate on “whose trust” is relevant for “whose democracy”. After locating the role of trust in democratic theory, I continue with some illustrations of declining political trust in Europe, drawn from my own research on social movements, but also of the emergence, in theory and practice, of other conceptions of democracy and democratic spaces where critical trust develops. The Indignados movements in Spain and Greece, as well as the Occupy Wall Street protests in the US, are just the most visible reactions to a widespread dissatisfaction with the declining quality of democratic regimes. They testify to the declining legitimacy of traditional conceptions of democracy, as well as to the declining trust in representative institutions. At the same time, however, these movements conceptualize and practice different democratic models that emphasize participation over delegation and deliberation over majority voting. In doing so, they present a potential for reconstructing social and political trust from below.
At the tip of the hype cycle, trust-free systems based on blockchain technology promise to revolutionize interactions between peers that require high degrees of trust, usually facilitated by third party providers. Peer-to-peer platforms for resource sharing represent a frequently discussed field of application for “trust-free” blockchain technology. However, trust between peers plays a crucial and complex role in virtually all sharing economy interactions. In this article, we hence shed light on how these conflicting notions may be resolved and explore the potential of blockchain technology for dissolving the issue of trust in the sharing economy. By means of a dual literature review we find that 1) the conceptualization of trust differs substantially between the contexts of blockchain and the sharing economy, 2) blockchain technology is to some degree suitable to replace trust in platform providers, and that 3) trust-free systems are hardly transferable to sharing economy interactions and will crucially depend on the development of trusted interfaces for blockchain-based sharing economy ecosystems.
With an increasing number of technologies supporting transactions over distance and replacing traditional forms of interaction, designing for trust in mediated interactions has become a key concern for researchers in human computer interaction (HCI). While much of this research focuses on increasing users’ trust, we present a framework that shifts the perspective towards factors that support trustworthy behavior. In a second step, we analyze how the presence of these factors can be signalled. We argue that it is essential to take a systemic perspective for enabling well-placed trust and trustworthy behavior in the long term. For our analysis we draw on relevant research from sociology, economics, and psychology, as well as HCI. We identify contextual properties (motivation based on temporal, social, and institutional embeddedness) and the actor’s intrinsic properties (ability, and motivation based on internalized norms and benevolence) that form the basis of trustworthy behavior. Our analysis provides a frame of reference for the design of studies on trust in technology-mediated interactions, as well as a guide for identifying trust requirements in design processes. We demonstrate the application of the framework in three scenarios: call centre interactions, B2C e-commerce, and voice-enabled on-line gaming.
This article departs from the post-2008 financial crisis context, from its intersection with technological developments, and from the socio-technical arrangements configured by this conjuncture. It explores the plans and actions – of mainstream financial institutions, and of a community seeking alternatives to centralised economy and governance – for the use of digital platforms supported by blockchain infrastructure. In particular, it explores how such plans and actions relate to conceptions of public and peer trust, and how they appear to produce, or reinforce, reputational imaginaries and quantification practices within added-value philosophies. By illuminating a tension between the two identified case examples, I seek to render alternative communities’ and financial institutions’ conceptions, imaginaries and practices (more) visible, and to analyse their organisational marketing strategies – where there is a pragmatic and discursive operationalisation of technology, as well as of trust, as a means to gain more self-sovereignty in action while navigating markets and regulated actual-world contexts.