
Re-decentralizing the Web


The current web is in trouble. The majority of users and their data are owned by a handful of corporations that have unbalanced market power. This has been achieved by bundling users, data, and applications or services into monolithic platforms. The consequences are severe, far-reaching, and manifold: data lock-in and data leaks, omnipresent user manipulation and censorship, high barriers to entry for businesses, and stifled innovation. The situation is unsustainable.

In order to regain our online liberty and unleash innovation, we need to re-decentralize the web.

While many decentralized solutions and networks have been built over the past few years (particularly in the cryptocurrency space), most of them are still employing the single platform model. We lack protocols that allow applications and services to remain independent of the underlying network and to work across networks. This means that each application is bundled with and deployable in only a single network, creating silos where users, data, and thus developers, are locked into a particular platform. Furthermore, many networks use a logically centralized, "single source of truth" database, which establishes a major throughput bottleneck. We lack protocols and tools to build scalable, decentralized applications, databases and services.

This is holding us all back from realizing the vision of the decentralized web. We founded Haja Networks to solve this problem.

In order to build applications and systems for the decentralized web which can improve the status quo, we as an industry need to:

  • Decouple data and users from the platforms
  • Address scalability in terms of both throughput and latency
  • Create interoperability between networks, centralized and decentralized
  • Address the logical centralization of current solutions

All of this, we believe, is possible.

Today, to help realize the vision of the decentralized web, we present the Ambients protocol.

Ambients is a novel protocol for distributed computation. It allows developers to build and run decentralized applications, databases and services in a peer-to-peer network. It decouples data from the underlying platform and network and creates an interoperability layer between different networks and systems. Decentralized applications can use it to build, deploy, execute, and share programs and data in a compositional, verifiably safe, and scalable way.

In this blog post, we’ll discuss in detail what the challenges are for the current web, the motivations behind creating the Ambients protocol, and a summary of it. We’ll finish by presenting the full Ambients protocol whitepaper and inviting you to join the open source community building the protocol.


The Current Web is in Trouble

“I imagined the web as an open platform that would allow everyone, everywhere to share information, access opportunities and collaborate across geographic and cultural boundaries.” Tim Berners-Lee

The web started as a way to share research and communicate between independent entities. Over time, as value was created and unlocked, the implementation of the web took a direction that favored centralization: efficiency over resilience, and profit and control of markets over digital liberty. Walls have been built everywhere to keep users in, digital boundaries are hard to cross, access to opportunities is limited, a few control what information we can share, how we share it, and how we find information in the first place; and the online individual is easily tracked and compromised. The current web is in trouble.

“Catalini and Gans defined “the cost of networking” as a problem associated with the increasing market power of the FANG group. Reduction of that cost will lead to disentanglement of the network effect benefits from market power.” The Blockchain Effect, Catalini & Gans

Market power on the web is concentrated in a handful of corporations and their platforms. This has been achieved by building a platform and a data moat, in which the users and their data are aggregated in a single, massive database. The data is accessible to users and developers only through a proprietary application UI, but the data itself is owned and controlled by the platform. Capturing the individual user, attracting their contacts, and limiting competitive external developer access, so that users can't transact outside the system, creates substantial network effects: a powerful incentive for centralization.

This consolidated market power has led to a situation where it's hard for new businesses to compete with the incumbent platforms. New businesses can't access users or their data outside the major platforms, and very few can contend with the amount of resources the incumbents have. Moreover, new businesses can only get access to users and data by going through the incumbents, which further establishes the platforms' control over a) their users, b) who can access those users, and c) what the rules for access are.

This is a massive risk to entrepreneurial businesses, as the rules can change at any point. This in turn reduces the number of new entrants, innovation, and competition. The major platforms face no pricing or service-quality pressure, and they charge higher prices for lower-quality services. Any new business that depends on a platform can be shut down without notice. For the end user, this ultimately means that there are fewer options to choose from and that they're locked in. Centralization on the web has led to data and market monopolization, with all of the ills that it creates for the market itself.

At the same time, the vast amount of data has become a liability: data gets stolen or otherwise compromised time and again. For good reason, regulatory bodies are increasing their demands on the major platforms to take care of users' privacy. At other times, however, regulatory bodies want, and get, access to the platforms and their users' data. This makes centralized platforms a lucrative environment for mass surveillance and an easy target for censorship.

The situation is unsustainable and ultimately prevents innovation from happening, limiting the potential of the web. In order to protect users, enable a fair market, and encourage innovation and growth, a paradigm shift from centralized to decentralized models is needed.

The paradigm shift is achieved by reversing the authority.

Instead of platforms acting as gatekeepers between the user and their data, and between users and other services, in the reversed model the users own their data and control who, or which application, can access it. While in the current model data is centralized around the platforms, in the decentralized model the data is "centralized" around the user.

“In a network of autonomous systems, an agent is only concerned with assertions about its own policy; no external agent can tell it what to do, without its consent. This is the crucial difference between autonomy and centralized management.” Burgess

Instead of users asking the application if they can access and use it, the applications ask the user for permission to access their data. Instead of one big database for all users, there are countless small databases, many for every user and application. Instead of having a separate account in thousands of services, self-certified identities and user profiles work across all applications. Blog posts, activity feeds, and friend lists are owned by the users, who decide which service and user interface they use to access and operate on them. The user can allow their data to be used by multiple applications simultaneously, making the data re-usable and interoperable between applications and networks. Keeping local copies of small databases and operating on them locally is efficient and makes the user experience feel instantaneous. Applications work in disconnected environments by default.
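
As a hypothetical illustration of this reversal (the names, types, and flow below are ours, not part of any existing protocol), an application asks the user for a signed grant, and the user's self-certified identity, a plain keypair rather than a platform account, is what authorizes access:

```typescript
// Hypothetical sketch of the reversed model: the user holds a
// self-certified identity (a keypair), and applications must ask the
// user for a signed grant before accessing a piece of data. All names
// here are illustrative, not part of any real protocol.
import { generateKeyPairSync, sign, verify } from "crypto";

// A self-certified identity: the public key itself identifies the user,
// so no platform account or central registry is needed.
const user = generateKeyPairSync("ed25519");

type Grant = { app: string; resource: string; signature: Buffer };

// The user (not a platform) decides which application may access which
// data, and proves the decision with a signature.
function grantAccess(app: string, resource: string): Grant {
  const payload = Buffer.from(`${app}:${resource}`);
  return { app, resource, signature: sign(null, payload, user.privateKey) };
}

// Anyone holding the user's public key can verify the grant offline.
function isAuthorized(grant: Grant): boolean {
  const payload = Buffer.from(`${grant.app}:${grant.resource}`);
  return verify(null, payload, user.publicKey, grant.signature);
}

const grant = grantAccess("photo-app", "album/2019");
console.log(isAuthorized(grant)); // true
```

Because the grant is verifiable with the user's public key alone, any application or network can check it without consulting a central platform.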

Instead of a business requiring permission from the platforms to get access to users, it can communicate with users directly and request access to their data. Businesses can talk to other services through a unified common language and create new services by composing other services and data, creating emergent value and new business opportunities. An environment where businesses don't need to build and operate expensive infrastructure to acquire and keep users allows them to focus on their core offering instead of building a platform and data moat.

Because the data, users, and applications are decoupled, and because everything is cryptographically verified, developers can build and compose applications and services swiftly and fearlessly, and compete on user experience, algorithms, data insights, service quality, price, and ethical values such as respect for privacy. Because the rules and access to users are not dictated by the platforms, businesses can move faster to and within the market, and even small players can enter, compete, and seize opportunities.

This leveled playing field creates a new wave of innovation and decentralized applications, services, and business models. Ultimately, it is the end users who benefit from more diverse service and application offerings, better user experience, improved privacy, and lowered prices. The decentralized web will be open, free (as in freedom), creative, and fun.

A Glimpse of Hope

The rise of crypto-networks has created a new wave of interest in technologies that enable decentralization. Bitcoin and Ethereum have led the way, and as a whole we've built a lot in the past few years. We've developed technologies that can be used as building blocks for decentralized systems, such as IPFS and libp2p, and most of these new technologies, protocols, and systems are open source. A variety of new, cryptocurrency-based business models and digital governance models have been created. In a short time, systems that weren't possible to build before have been conceived, implemented, and deployed.

At the core of many crypto-networks is a global ledger. The global ledger works as a “single global truth”, which means that all transactions between network participants are recorded on the same global ledger. This creates a bottleneck: requiring everyone to synchronize with everyone else reduces the maximum throughput of the network as a whole. Many projects have tackled improving the overall throughput of a network, but a single stream can only flow so fast.

Most ledger-based networks have a programming interface for programming the network with "smart contracts". Smart contracts are programs that are run in a network by the network participants. With them, we can create decentralized applications (dApps), make payments, run business logic, implement new protocols, and more. However, most smart contract languages and execution environments (that is, the ledgers they run on) are not compatible with each other. This means that developers need to write their programs in a platform-specific language, effectively having to decide up-front in which network they wish to run their programs or services. This fragments the plethora of networks and creates an obstacle to interoperability between them. In this regard, most of the current blockchain systems create a technological silo for developers: you have to choose which platform to bet on before even building the application, because switching platforms means rewriting the application.

Furthermore, because smart contract programs are tied to the underlying crypto-network and ledger, they preclude the possibility of freely developing on open, non-cryptocurrency-based building blocks. That is, if a program is built on a specific blockchain platform, its coins, tokens, or payments are required, so developers and users have to sign up with that network and acquire its tokens.

This reminds us of the problem we have with centralized platforms and, on a broader level, doesn't seem to align with the original motivations that created the whole field.

At the same time, we’ve become painfully aware of the realities of the centralized platforms and have started countermeasures through regulation, such as the GDPR in the European Union, and even calls to break up the platform owners.

These opposing forces to centralization, especially the people and societies waking up to question the level of privacy and security the platforms offer, are a glimpse of hope. Fundamentally though, the problems still exist.

Architectural Requirements for the Decentralized Web

To build and deploy infrastructure that can realize the full potential of decentralized networks, reverse the authority, and decouple data from network effects, we think the following aspects need to be addressed:

  • Asynchronous message passing
  • Eventual consistency
  • The ability to structure large networks as smaller networks
  • A decoupling of computation from the platform
  • Usability in disconnected or disrupted environments

The biggest limitation to the current blockchain networks is the requirement for a single, global source of truth, the ledger, and forcing every participant to constantly synchronize and keep the global state to transact on it. While the motivations to do so are understandable (strong consistency guarantees, crypto-economic incentives, etc.), it creates an unbearable inefficiency for the network to operate.

To let infinitely diverse, offline-first applications be built on the decentralized web, the underlying protocols cannot be built on a single token or network, or be dependent on them. Rather, the core protocols need to be flexible enough that such networks can be built on top of them, without requiring a payment to use them. We need building blocks, not chains. The networks and protocols that require a single, global ledger are still needed and complementary to the core protocols, but the baseline needs to be as flexible, available, and efficient as possible, so that data can be decoupled from the underlying platform.

The real world is asynchronous: events occur and "things happen" disconnected from each other, at "some point in time", but rarely do we observe them at exactly the same time. All forms of consistency and consensus are in fact eventually consistent: messages are passed between participants according to a predefined set of rules (the protocol), and consensus is reached only when the required messages have been exchanged and the steps of the protocol have been executed. While the end result, the consensus, is a form of strong consistency and requires synchronization, the underlying mechanisms are asynchronous. Messages are sent, and "at some point in time" they are received and acknowledged by the receivers. From this perspective, we need to consider eventual consistency as the baseline consistency level for all data and applications. Building on asynchronous message passing and eventual consistency, we can construct stronger consistency guarantees as needed.
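
To make this concrete, here is a minimal sketch (our illustration in TypeScript, not part of the Ambients protocol) of a grow-only counter CRDT: replicas update locally and, once they have exchanged states in any order, converge to the same value.

```typescript
// Minimal sketch of a grow-only counter (G-Counter) CRDT, illustrating
// eventual consistency: replicas mutate locally and converge by merging.
// Illustrative only; not part of the Ambients protocol.

type GCounter = Map<string, number>; // replica id -> local count

// Increment the counter at a given replica (a purely local operation).
function increment(counter: GCounter, replicaId: string): GCounter {
  const next = new Map(counter);
  next.set(replicaId, (next.get(replicaId) ?? 0) + 1);
  return next;
}

// Merge two replica states; the element-wise maximum is commutative,
// associative, and idempotent, so merge order does not matter.
function merge(a: GCounter, b: GCounter): GCounter {
  const out = new Map(a);
  for (const [id, count] of b) {
    out.set(id, Math.max(out.get(id) ?? 0, count));
  }
  return out;
}

// The observed value is the sum over all replicas.
function value(counter: GCounter): number {
  return [...counter.values()].reduce((sum, n) => sum + n, 0);
}

// Two replicas diverge offline, then exchange states in either order
// and agree on the same value: eventual consistency.
const alice: GCounter = increment(new Map(), "alice");
const bob: GCounter = increment(increment(new Map(), "bob"), "bob");
console.log(value(merge(alice, bob))); // 3
console.log(value(merge(bob, alice))); // 3
```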

“You must give up on the idea of a single database for all your data, normalized data, and joins across services. This is a different world, one that requires a different way of thinking and the use of different designs and tools” Jonas Bonér

With asynchronous messaging and eventual consistency, we gain the superpower of being able to program applications and networks that can withstand disconnected and disrupted service environments. Programs operate locally first and are always available; that is, they work offline and can synchronize with others when a network connection is available. Being constantly connected is no longer needed. Overall, this leads to a better user experience than what we have today with web apps.

Every participant interacting through a single global agreement, e.g. a blockchain, creates a bottleneck, which can only be so fast. Instead of requiring disconnected and unrelated interactions (for example, Alice buying a coffee from Bob and sharing photos with Charlie) to be verified by the full network, the individual programs should form sub-networks. The sub-networks can be seen as "micro-networks" per application or interaction. For example, Alice sharing photos with Charlie forms a network of two participants, and in a chat program, a channel with 50 users forms a network of 50 participants. Dividing large networks into sub-networks per application can be seen as "built-in sharding": each participant stores and interacts with some parts of the network, but almost never all of them.
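
A toy model of this idea (the class and method names are ours, purely for illustration) keeps a separate participant set and message log per topic, so that unrelated interactions never touch each other:

```typescript
// Hypothetical sketch of per-application sub-networks: each interaction
// forms its own "micro-network" (topic), and messages are delivered only
// to that topic's participants. Names and structure are illustrative.

type Peer = string;

class SubNetwork {
  private participants = new Set<Peer>();
  private log: { from: Peer; payload: string }[] = [];

  constructor(public readonly topic: string) {}

  join(peer: Peer): void {
    this.participants.add(peer);
  }

  // Only participants of this sub-network see (and would verify) the
  // message; the rest of the global network is not involved at all.
  publish(from: Peer, payload: string): void {
    if (!this.participants.has(from)) {
      throw new Error(`${from} is not a participant of ${this.topic}`);
    }
    this.log.push({ from, payload });
  }

  size(): number {
    return this.participants.size;
  }
}

// Alice shares photos with Charlie: a sub-network of two participants.
const photos = new SubNetwork("photos:alice-charlie");
photos.join("alice");
photos.join("charlie");
photos.publish("alice", "beach.jpg");

// A chat channel with 50 users is a separate network of 50 participants;
// neither sub-network ever needs to synchronize with the other.
const chat = new SubNetwork("chat:dev-channel");
for (let i = 0; i < 50; i++) chat.join(`user-${i}`);
console.log(photos.size(), chat.size()); // 2 50
```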

Each crypto-network having its own execution environment and custom programming language causes fragmentation. This fragmentation is an inefficiency that prevents network effects from forming around programs and data: the accrued value is locked into each individual network. To overcome this challenge and unlock that value, we need to decouple the program execution layer from the platform (the ledger) and realize an efficient, distributed computation model that makes it possible to run the same program in multiple, otherwise disconnected and incompatible networks.

For this purpose, we present the Ambients protocol. Ambients is a protocol to build and run databases and programs in a peer-to-peer network. It decouples data from the underlying platform and network and creates an interoperability layer between different networks and systems. Decentralized applications can use it to build, deploy, execute, and share code and data in a compositional, verifiably safe, and scalable way.

Rethinking Databases

To understand the need for the Ambients protocol, let’s consider the challenges that developers currently face with prevailing programming models during the paradigm shift.

The paradigm shift from a platform-centric model to a decentralized one requires decentralized applications to be equally as good as, or better than, the ones offered by the platforms. Reversing the authority is only one of the steps. To succeed in creating better user experiences, we need to build our applications and services accordingly. Unfortunately, the current programming models for decentralized applications and services resemble the platforms: they use a database which is effectively centralized, oftentimes a blockchain.

With this in mind, consider that databases are a combination of storage (the database state) and programs to access, update, and query that storage (the database services). Relational databases commonly offer SQL as the interface for updating and querying the database, making SQL an interoperability layer for all programs that wish to access and manipulate the database state. Smart contract platforms work the same way: the blockchain is the centralized database state, and smart contracts provide the database services and an interface. In a way, all centralized platforms form around a centralized database into which the data moves. In fact, such a database model shifts the decentralized web towards the centralized platform model.

What happens when the database state is decentralized and local to its owner? The whole concept of a database is inverted: the database state no longer accumulates in one single place, but in multiple places in a network. Therefore, to have a truly decentralized programming model equivalent to the traditional database-centric model, the database services (or any program) need to move to where the data is. To do so, the computation (the programs) needs to be distributed.
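
A rough sketch of this inversion (our own TypeScript illustration, not the protocol's actual encoding): each user's database state lives locally, and database services are pure programs that are shipped to the data and evaluated in place.

```typescript
// Illustrative sketch (not the Ambients encoding): the database state is
// local to its owner, and "database services" are pure programs that move
// to where the data is and are evaluated against the local state.

type Post = { id: number; author: string; text: string };

// A pure, serializable query: it can be shipped to any replica because it
// depends only on its input and produces no side effects.
type Query<T> = (state: readonly Post[]) => T;

class LocalDatabase {
  constructor(private state: Post[] = []) {}

  append(post: Post): void {
    this.state = [...this.state, post];
  }

  // Instead of the data moving to a central database, the program moves
  // to the data: the query runs where the state lives.
  run<T>(query: Query<T>): T {
    return query(this.state);
  }
}

// Alice's posts live in Alice's database; a service queries them in place.
const alicesDb = new LocalDatabase();
alicesDb.append({ id: 1, author: "alice", text: "hello" });
alicesDb.append({ id: 2, author: "alice", text: "decentralize!" });

const countPosts: Query<number> = (posts) => posts.length;
console.log(alicesDb.run(countPosts)); // 2
```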

It turns out, solving this problem means much more than just getting decentralized databases.

The Ambients Protocol: Peer-to-Peer Computation

Today, we’re excited to release the full Ambients protocol whitepaper.

The Ambients protocol is a novel distributed computation protocol that allows developers to build decentralized applications, databases, and services and to deploy and run them in peer-to-peer networks. It allows developers to build distributed programs that are compositional, safe, scalable, and decentralized.

To establish these properties, the Ambients protocol defines:

  • A programming model for Ambients programs based on immutable values and pure, total functions
  • A formal basis for the programming model which is based on a process algebra called Ambient Calculus
  • A set of protocol primitives, using common computation abstractions such as monoids and functors as examples, to encode programs as algebraic Ambients programs (see the sketch after this list)
  • A compilation model to compile programs from almost any source language to Ambients executables
  • An execution model for evaluating Ambients programs as a confluent rewrite system using eventually consistent, Merkle-DAG based event logs
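
To give a feel for the algebraic style the programming model builds on, here is a sketch of a monoid, an associative combine operation with an identity element, written with immutable values and pure, total functions. This is our own TypeScript illustration of the abstraction; actual Ambients programs are encoded with the protocol's own primitives.

```typescript
// Sketch of the kind of algebraic abstraction the protocol primitives
// build on: a monoid is an associative `combine` with an identity element,
// expressed with immutable values and pure, total functions. This is our
// own TypeScript illustration, not the protocol's actual encoding.

interface Monoid<A> {
  readonly empty: A;
  combine(x: A, y: A): A; // must be associative
}

const sum: Monoid<number> = {
  empty: 0,
  combine: (x, y) => x + y,
};

const concat: Monoid<string> = {
  empty: "",
  combine: (x, y) => x + y,
};

// A pure, total fold: defined for every input and free of side effects,
// so it can be verified and evaluated anywhere in the network.
function fold<A>(m: Monoid<A>, values: readonly A[]): A {
  return values.reduce(m.combine, m.empty);
}

console.log(fold(sum, [1, 2, 3, 4]));       // 10
console.log(fold(concat, ["a", "b", "c"])); // "abc"
```

Associativity means a fold can be evaluated in any grouping, and on any peer, without changing the result, which is exactly the kind of order-independence a distributed, eventually consistent execution model needs.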

Programs built using the Ambients protocol are turned into distributed executables that can be executed safely in the Ambients network. This lets programs move to where the data is. Sharing these distributed programs is essential for the interoperability and scalability of decentralized applications, for example when building the aforementioned decentralized, local-to-its-owner database where the programs form the distributed database services.

The Ambients programming model is restrictive enough to be verifiably safe. At the same time, it is expressive enough to let developers build data structures, functions, algorithms, business logic, databases, even full-fledged systems and services. Most programming languages today have an Ambients-compliant subset of features, which means that developers can build their decentralized applications and services on the Ambients protocol using a familiar programming language.

The Ambients protocol is designed to be platform-independent. The Ambients network can overlay and connect multiple different Ambients-compliant runtime environments, from trusted, permissioned, centralized systems (like traditional web services) to trustless, permissionless, decentralized systems (like blockchain platforms).

How the compositionality, safety and scalability guarantees are achieved and how developers can benefit from building decentralized data structures, databases, protocols, applications, digital services and more, are further discussed in the whitepaper.

The Ambients protocol is open source, and free for everyone to build on. If you’d like to participate in building the Ambients protocol or just to be part of the community, we invite you to head over to Github and join us!

We invite the curious reader to dive into the whitepaper to learn how the Ambients protocol can unlock massive utility and ultimately help us to re-decentralize the web.

