Exploring the Ocean Protocol

Decentralized artificial intelligence (AI) is one of those trends that seems completely obvious conceptually but proves very difficult to implement in practice. While almost everyone agrees on the risks of centralized AI models, decentralized alternatives impose a very high barrier to entry from a technical standpoint. Among the decentralized AI stacks in the market, the Ocean Protocol is a platform with one of the most practical approaches to enabling the implementation of decentralized AI applications.

If you follow this blog, you know I am a believer in the decentralization of AI. Last year, I published a three-part essay (Part I, Part II, Part III) outlining the relevance of decentralized AI models from both the financial and technical standpoints. I followed that with another article exploring the centralization risks of the current generation of AI applications.

Practical is, in fact, one of the terms I most often use to describe the Ocean Protocol. While the platform actively leverages blockchain technologies and tokenized incentives to decentralize AI workflows, it does so without neglecting any of the tools, frameworks, and compute infrastructures that power AI workloads today. For instance, it is perfectly possible to layer the Ocean Protocol on top of AI workloads running on Spark or AWS. In that sense, the Ocean Protocol allows data science teams to introduce incremental levels of decentralization instead of undertaking a drastic re-architecture.

Challenges & Basic Principles

Part of the practicality of the Ocean Protocol stems from a set of basic principles that target some of the main challenges of decentralized AI applications. From a conceptual standpoint, the Ocean Protocol attempts to address four fundamental challenges that are a common denominator in any decentralized AI architecture:

[Figure: the four fundamental challenges of decentralized AI architectures]

To address the aforementioned challenges, the Ocean Protocol provides a model that coordinates the actions of the different parties in a decentralized AI workflow. At a high level, the interactions in any AI application can be decomposed into the following roles:

[Figure: the roles involved in a decentralized AI workflow]

To some extent, the Ocean Protocol can be seen as a decentralized orchestration layer between the roles depicted above. The interactions between the different roles are abstracted via blockchain smart contracts, while execution can remain in each role's native environment. The following figure illustrates that concept in detail:

[Figure: the Ocean Protocol as a decentralized orchestration layer between the roles above]

Architecture

The main role of the Ocean Protocol architecture is to enable decentralized communications between the entities in an AI workflow. From data or algorithm providers to analytics tools, the Ocean Protocol provides a model based on tokenized incentives and blockchain smart contracts that allows different parties to collaborate on AI workloads through fair and efficient interactions. The components of the Ocean Protocol architecture rely on three key concepts to enable those interactions:

· Service Execution Agreements (SEAs): SEAs are smart contracts that establish the dynamics of data service supply chains. Conceptually, SEAs allow the connection to, monetization of, and curation of arbitrary data services (see the sketch after this list).

· Proof-of-Service and Incentives: The Ocean Protocol relies on network rewards to incentivize the sharing of AI resources. The Proof-of-Service model acts as a higher-level consensus mechanism to assert the correct interaction between the different parties in an AI workflow.

· The Ocean Token: This component acts as the fundamental unit of exchange in the Ocean Protocol network.
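
To make these concepts a bit more tangible, here is a minimal Python sketch of the escrow-style lifecycle an SEA coordinates: the consumer locks payment, the publisher grants access, and the reward is released only once a service proof checks out. Every name here (ServiceAgreement, lock_payment, and so on) is a hypothetical stand-in for illustration, not the actual Keeper Contracts API.

```python
from dataclasses import dataclass
from enum import Enum, auto

class AgreementState(Enum):
    CREATED = auto()
    PAYMENT_LOCKED = auto()
    ACCESS_GRANTED = auto()
    REWARD_RELEASED = auto()

@dataclass
class ServiceAgreement:
    """Hypothetical model of a Service Execution Agreement (SEA)."""
    agreement_id: str
    publisher: str
    consumer: str
    price_ocean: int  # price in (hypothetical) Ocean Token units
    state: AgreementState = AgreementState.CREATED

    def lock_payment(self) -> None:
        # The consumer escrows Ocean Tokens into the agreement.
        assert self.state is AgreementState.CREATED
        self.state = AgreementState.PAYMENT_LOCKED

    def grant_access(self) -> None:
        # The publisher grants access to the data or compute service.
        assert self.state is AgreementState.PAYMENT_LOCKED
        self.state = AgreementState.ACCESS_GRANTED

    def release_reward(self, proof_of_service: bool) -> None:
        # Tokens are released to the publisher only if the service
        # proof checks out, mirroring the Proof-of-Service incentive.
        assert self.state is AgreementState.ACCESS_GRANTED
        if proof_of_service:
            self.state = AgreementState.REWARD_RELEASED

# Example flow between a data publisher and a consumer:
sea = ServiceAgreement("sea-001", publisher="0xPub", consumer="0xCon",
                       price_ocean=10)
sea.lock_payment()
sea.grant_access()
sea.release_reward(proof_of_service=True)
assert sea.state is AgreementState.REWARD_RELEASED
```

The state machine is the key design point: no party can skip ahead, so the tokenized incentive and the service delivery stay synchronized by construction.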

From an architecture standpoint, the Ocean Protocol is organized in three fundamental layers:

  • The Keeper layer, which manages service agreements, low-level access control, accounts, balances, and the incentive schema (or block reward).
  • The Verification layer, which introduces cryptographic challenges to improve the integrity and security of the services.
  • The Curation layer, which serves as a discovery mechanism and covers signalling and governance aspects. This layer accounts for human subjectivity.
[Figure: the three layers of the Ocean Protocol architecture]

The interactions between those layers are abstracted via SEAs running on the Keeper layer. The role of the Keepers is to maintain the state of the entire decentralized workflow by enforcing the corresponding smart contracts. The role of the Verifiers is to enforce the clauses expressed in the underlying smart contracts; Verifiers rely on consensus mechanisms and cryptographic proofs to do so. Curators complement the Verifiers' cryptographic rules by introducing more subjective opinions and data signals.
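
The division of labor between Keepers, Verifiers, and Curators can be sketched with toy Python classes. Everything below is invented for illustration: a simple hash comparison stands in for the real cryptographic challenges, and in-memory dictionaries stand in for on-chain state.

```python
import hashlib

class Keeper:
    """Toy Keeper: maintains agreement state for the network."""
    def __init__(self):
        self.agreements = {}  # agreement_id -> state

    def register(self, agreement_id: str) -> None:
        self.agreements[agreement_id] = "fulfilling"

    def settle(self, agreement_id: str, verified: bool) -> None:
        self.agreements[agreement_id] = "settled" if verified else "disputed"

class Verifier:
    """Toy Verifier: checks the delivered payload against the
    commitment recorded in the agreement. The hash comparison is a
    stand-in for a real cryptographic challenge."""
    def verify(self, payload: bytes, committed_hash: str) -> bool:
        return hashlib.sha256(payload).hexdigest() == committed_hash

class Curator:
    """Toy Curator: layers subjective quality signals on top of the
    Verifier's objective checks."""
    def __init__(self):
        self.signals = {}

    def signal(self, agreement_id: str, score: float) -> None:
        self.signals.setdefault(agreement_id, []).append(score)

# Wiring the three layers together for a single data delivery:
data = b"training-set-v1"
commitment = hashlib.sha256(data).hexdigest()

keeper, verifier, curator = Keeper(), Verifier(), Curator()
keeper.register("sea-001")
keeper.settle("sea-001", verifier.verify(data, commitment))
curator.signal("sea-001", score=0.9)
print(keeper.agreements["sea-001"])  # settled
```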

If we apply the three-layer model of the Ocean Protocol to a decentralized AI workflow, we get something like the following:

[Figure: the three-layer model applied to a decentralized AI workflow]

Let’s take a deep dive into some of the key technical building blocks of the Ocean Protocol architecture:

· Pleuston Frontend: A marketplace template that enables functionalities such as data publishing and consumption.

· Data Science Tools: Data science tools are the interface to Ocean used by AI researchers and data scientists. Typically written in Python, these tools and libraries expose a high-level API that allows one to integrate Ocean capabilities into various computation pipelines.

· Squid: Squid is a high-level API specification abstracting the interaction with the most relevant Ocean Protocol components. It allows one to use Ocean capabilities without worrying about the details of the underlying Keeper Contracts or metadata storage systems (a sketch of this style of facade follows the list below).

· Aquarius: Aquarius is a Python application running in the backend that enables metadata management. It abstracts access to different metadata stores, allowing Aquarius to integrate different metadata repositories. Its OceanDB plugin system can integrate different data stores (e.g., Elasticsearch, MongoDB, BigchainDB) that implement the OceanDB interfaces.

· Brizo: Brizo is a component providing capabilities for publishers. It interacts with the publisher’s cloud and/or on-premise infrastructure, enabling functionalities such as compute, storage, and the gathering of service proofs.
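
Putting a few of these components together, the sketch below shows what a publish-then-consume flow might look like through a Squid-style facade backed by an Aquarius-style metadata store. The OceanClient and MetadataStore classes and all of their methods are hypothetical stand-ins rather than the real squid-py or Aquarius APIs; the point is simply that marketplace, metadata, and keeper interactions stay hidden behind one thin interface.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class MetadataStore:
    """Aquarius-style metadata store behind a pluggable interface;
    a real deployment might back this with Elasticsearch, MongoDB,
    or BigchainDB through the OceanDB plugin system."""
    records: Dict[str, dict] = field(default_factory=dict)

    def write(self, did: str, metadata: dict) -> None:
        self.records[did] = metadata

    def read(self, did: str) -> dict:
        return self.records[did]

class OceanClient:
    """Hypothetical Squid-style facade: publish and consume assets
    without touching Keeper Contracts directly."""
    def __init__(self, metadata_store: MetadataStore):
        self.metadata = metadata_store
        self._next_id = 0

    def publish(self, name: str, url: str, price_ocean: int) -> str:
        self._next_id += 1
        did = f"did:op:{self._next_id:04d}"  # decentralized identifier
        self.metadata.write(did, {"name": name, "url": url,
                                  "price": price_ocean})
        return did

    def consume(self, did: str) -> str:
        # A real flow would negotiate an SEA and fetch the asset
        # through the publisher's Brizo gateway.
        return self.metadata.read(did)["url"]

ocean = OceanClient(MetadataStore())
did = ocean.publish("weather-data", "https://example.org/weather.csv", 10)
print(ocean.consume(did))
```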

Put together, these components abstract the fundamental dynamics of any decentralized AI application. The Ocean Protocol is still in very early stages but has already achieved some important customer wins and partnerships. The platform was also accepted by the prestigious crypto marketplace CoinList to conduct a new token sale. Together with efforts like SingularityNET or Numerai’s Erasure, the Ocean Protocol is one of the most viable stacks powering the next generation of decentralized AI applications.

