The cryptocurrency movement is the spiritual heir to previous open computing movements, including the open source software movement led most visibly by Linux, and the open information movement led most visibly by Wikipedia. (1991: Linus Torvalds’ forum post announcing Linux; 2001: the first Wikipedia page.)

Both of these movements were once niche and controversial. Today Linux is the dominant worldwide operating system, and Wikipedia is the most popular informational website in the world.

Crypto tokens are currently niche and controversial. If present trends continue, they will soon be seen as a breakthrough in the design and development of open networks, combining the societal benefits of open protocols with the financial and architectural benefits of proprietary networks. They are also an extremely promising development for those hoping to keep the internet accessible to entrepreneurs, developers, and other independent creators.
Source: Crypto Tokens: A Breakthrough in Open Network Design
When designed properly, decentralized, open source, tokenized cryptoasset networks solve the problem of incentive alignment between network creators and network participants. When the software is public, and the organizing body is a nonprofit rather than a for-profit, the Extraction Imperative is eliminated — because there are literally no longer shareholders with a claim on cash flows. The value is held in the network itself, represented by token ownership. Everyone who participates in the activity of the network, using and accruing tokens in the process, is effectively a network “owner.” Because there is no division between owners and participants, there is no point at which their incentives diverge.
Source: The Future Of Network Effects: Tokenization and the End of Extraction
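The claim above, that ownership and participation coincide in a tokenized network, can be put in a toy sketch (illustrative only; the `TokenNetwork` class and the numbers are invented for this sketch, not taken from the source): participants earn tokens only by contributing, so the set of token holders is by construction the set of participants, and there is no separate shareholder class whose interests can diverge.

```python
# Toy model of a tokenized network where ownership accrues only
# through participation (hypothetical sketch, not a real protocol).

class TokenNetwork:
    def __init__(self):
        self.balances = {}  # participant -> token balance

    def contribute(self, participant, work_units):
        """Mint tokens to a participant in proportion to contributed work."""
        self.balances[participant] = self.balances.get(participant, 0) + work_units

    def owners(self):
        """Owners are exactly those who hold tokens, i.e. past participants."""
        return set(self.balances)

net = TokenNetwork()
net.contribute("alice", 10)
net.contribute("bob", 5)
print(net.owners())  # -> a set containing 'alice' and 'bob', and no one else
```

The point of the sketch is structural: because tokens are minted only against participation, `owners()` can never contain an address that did not take part in the network's activity.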
The potential that distributed ledger technologies open up for peer-to-peer exchange, enabling users and developers to co-own their platforms, organize their own communities, and share the value generated according to their own rules, has led many to believe in the ‘sharing economy’ as a way to foster cooperation between individuals on a large scale, leading to a new, socially pacified post-capitalist era. In spite of such utopian expectations, however, this paper argues that capitalism has simply strengthened, not only through the growing centralization of peer-to-peer digital services on proprietary platforms, but also through highly speculative practices embedded in decentralized architectural protocols. We tackle the new challenges raised by the engineering of human interactions through algorithmic governance, stressing the necessity of carefully evaluating the sharing economy and platform cooperativism as complex phenomena whose risks, benefits and unintended consequences are inevitably intertwined in the fabric of human existence.
Source: Architecting the eSociety on Blockchain: A Provocation to Human Nature by Marcella Atzori, Mihaela Ulieru :: SSRN
The following discussion of computational capital takes the electronic database, an infrastructure for storing in-formation, as its vantage point. Following a brief look into how database systems serve in-formation desires, Mark Poster’s notion of the ‘database as discourse’ is explored and further developed. Database as discourse establishes a machinic agency, directed towards the individual in a specific mode of hailing. This mode of hailing in turn leads to a scattered form of subjectivity that is identified, with Michaela Ott and Gerald Raunig, as dividual. How does dividualization emerge from database infrastructure? What is the specific quality of the data that is produced by and harvested from in/dividuals into databases, and what are the consequences of such a shifted view?
Source: Epistemic Harvest: The Electronic Database as Discourse and Means of Data Production | a peer-reviewed journal about_
The fact that these lazy-seeming workarounds foreshadow later popular protocols seems to tell us something about decentralization. The progression of centralized hosting → Napster → Kazaa → BitTorrent seems to represent the minimum viable decentralization required to stay alive, as defined by the law at the time. These lazy workarounds match because decentralization isn’t the product; it is just a means of staying alive.

Plenty of people went further with decentralization and anonymity, but it wasn’t necessary for staying alive, and it only mattered to a privacy-focused minority of people. Beyond staying alive, decentralization is a weakness, not a strength. In many ways, 2005’s BitTorrent was more centralized than Kazaa: it decentralized file transfer and outsourced content discovery, which made it more resilient than Kazaa, which decentralized search at the protocol level.
Decentralization and other technological tricks help keep technologies online which wouldn’t last if they were centralized, but they don’t fully solve the problem. Instead, it seems like decentralized technologies depend on activists in order to fully realize the vision of the technology. Bram Cohen played this part by open sourcing his protocol, limiting his ability to profit from the system, and creating an environment where killing his client would do essentially nothing to stop BitTorrent usage. The Pirate Bay is a more obvious example of activism, going hand in hand with Piratbyrån’s anti-copyright mission. Yes, there are private torrent trackers and public options besides The Pirate Bay, but no one has provided the continuity and resilience that The Pirate Bay has in staying alive no matter the cost.
Decentralized technologies don’t take the legally impossible and make it unstoppable. Decentralization is a tactic for diffusing risk across many participants and lowering the risk for the activists who operate the most sensitive parts of the system. We see the same with Tor, where the risk of participating in the system is concentrated at the exit nodes, which can attract undesirable legal attention. Without activism, we would have beautifully designed decentralized technologies which are impossible to use in practice.
Source: Resistant protocols: How decentralization evolves – John Backus – Medium
Cloud Communities: The Dawn of Global Citizenship?, kickoff contribution by Liav Orgad
Citizenship in Cloud Cuckoo Land?, by Rainer Bauböck
Citizenship in the Era of Blockchain-Based Virtual Nations, by Primavera De Filippi
Global Citizenship for the Stay-at-Homes, by Francesca Strumia
A World Without Law; A World Without Politics, by Robert Post
Virtual Politics, Real Guns: On Cloud Community, Violence, and Human Rights, by Michael Blake
A World Wide Web of Citizenship, by Peter J. Spiro
Citizenship Forecast: Partly Cloudy with Chances of Algorithms, by Costica Dumbrava
The Separation of Territory and State: a Digital French Revolution?, by Yussef Al Tamimi
A Brave New Dawn? Digital Cakes, Cloudy Governance and Citizenship à la carte, by Jelena Dzankic
Old Divides, New Devices: Global Citizenship for Only Half of the World, by Lea Ypi
Escapist technology in the service of neo-feudalism, by Dimitry Kochenov
Cloud communities and the materiality of the digital, by Stefania Milan
Cloud Agoras: When Blockchain Technology Meets Arendt’s Virtual Public Spaces, by Dora Kostakopoulou
Global Cryptodemocracy is Possible and Desirable, by Ehud Shapiro
The Future of Citizenship: Global and Digital. A Rejoinder, by Liav Orgad
This is the essence of engineering decentralized institutions: it is about strategically using coordination problems to ensure that systems continue to satisfy certain desired properties.
Source: Engineering Security Through Coordination Problems
In which I argue that “tightly coupled” on-chain voting is overrated, the status quo of “informal governance” as practiced by Bitcoin, Bitcoin Cash, Ethereum, Zcash and similar systems is much less bad than commonly thought, that people who think that the purpose of blockchains is to completely expunge soft mushy human intuitions and feelings in favor of completely algorithmic governance (emphasis on “completely”) are absolutely crazy, and loosely coupled voting as done by Carbonvotes and similar systems is underrated, as well as describe what framework should be used when thinking about blockchain governance in the first place.
Source: Notes on Blockchain Governance
Bitcoin’s mining hardware (hashrate) has tripled since December, as can be seen above, even while price has fallen by 3x since December. It is now therefore a lot more expensive to mine a bitcoin than in December, while at the same time one mined bitcoin is worth a lot less.

At some point miners are unable to afford energy costs, or to keep up with adding more and more hardware as their old hardware becomes useless due to the constant increase in difficulty. So they close shop. Some miners, however, like Bitmain, have lower costs, presumably because they manufacture the mining hardware themselves. So as other miners struggle, like Bitfury, which has now dropped to 2%, Bitmain gains more and more hashrate, to the point that they are now nearing 51%.

The above bitcoin hashrate chart, however, even in a common-sense way, looks quite unusual, because it rarely goes down, if ever. Rather than responding to the price action, the hashrate appears completely detached. That is a situation that cannot go on for much longer, because the increased new hardware itself puts pressure on price, as the new, barely profitable miners need to sell everything they mine to cover costs.
Source: Bitmain Nears 51% of Bitcoin’s Network Hashrate
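The squeeze described above can be put into simple arithmetic (a sketch using the source’s rough figures of a 3x hashrate rise and a 3x price fall; the normalized numbers are illustrative, not market data): a miner’s expected revenue per unit of hashrate is proportional to price divided by total network hashrate, so the two moves together cut per-unit revenue by roughly 9x while the cost per hash stays the same.

```python
# Expected miner revenue per unit hashrate scales as price / network_hashrate
# (block reward and fees held constant). Normalized toy numbers below.

def revenue_per_hash(price, network_hashrate):
    return price / network_hashrate

baseline = revenue_per_hash(price=1.0, network_hashrate=1.0)     # December, normalized
current = revenue_per_hash(price=1.0 / 3, network_hashrate=3.0)  # price / 3, hashrate x 3

print(current / baseline)  # -> 0.111..., i.e. each unit of hardware earns ~9x less
```

This is why only the lowest-cost operators, such as hardware manufacturers mining with their own machines, can keep expanding while everyone else approaches their shutdown point.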
Data is being hailed as “the new oil.” The analogy seems appropriate given the growing amount of data being collected, and the advances made in its gathering, storage, manipulation and use for commercial, social and political purposes.

Big data and its application in artificial intelligence, for example, promises to transform the way we live and work, and will generate considerable wealth in the process. But data’s transformative nature also raises important questions around how the benefits are shared, privacy, public security, openness and democracy, and the institutions that will govern the data revolution.

The delicate interplay between these considerations means that they have to be treated jointly, and at every level of the governance process, from local communities to the international arena. This series of essays by leading scholars and practitioners, which is also published as a special report, will explore topics including the rationale for a data strategy, the role of a data strategy for Canadian industries, and policy considerations for domestic and international data governance.
Source: Data Governance in the Digital Age | Centre for International Governance Innovation