The Ocean Protocol is One of the Most Practical Platforms for Decentralized Machine Learning
The platform includes key building blocks for decentralizing data, computations and models in AI systems.
Decentralization is one of those ideas that sounds great in theory but is incredibly hard to achieve in practice. While, in principle, we love the idea of federated interactions without a centralized authority, our social and economic foundations operate around centralized control systems. Decentralization has been a core ideal in the software industry for decades and has recently been catalyzed by the rise of Bitcoin and the evolution of blockchain architectures. However, of all the ideas we hear about decentralized architectures, the reality is that only a few domains are well suited for that approach. Artificial intelligence (AI) is one of the few areas in the current software industry that can really benefit from the emergence of decentralized models.
The top existential question for the next decade of AI is not whether we will be able to achieve artificial general intelligence (AGI), but whether the breakthroughs in AI will remain under the control of a handful of large technology companies or will be opened to the rest of the world. Democratizing access to data, AI models and resources is essential to ensure a healthy evolution of AI. Decentralized AI is one of those trends that seems completely obvious conceptually but turns out to be very difficult to implement in practice. While almost everyone agrees on the risks of centralized AI models, decentralized alternatives impose a very high barrier to entry from the technical standpoint. Among the decentralized AI stacks in the market, the Ocean Protocol is a platform with one of the most practical approaches to enabling the implementation of decentralized AI applications.
If you follow this blog, you know I am a believer in the decentralization of AI. Last year, I published a three-part essay (Part I, Part II, Part III) outlining the relevance of decentralized AI models from both the financial and technical standpoints. I followed that with another article exploring the centralization risks of the current generation of AI applications.
Practical is, in fact, one of the terms I often use to describe the Ocean Protocol. While the platform actively leverages blockchain technologies and tokenized incentives to decentralize AI workflows, it does so without neglecting any of the tools, frameworks and compute infrastructures that power AI workloads today. For instance, it is perfectly possible to layer the Ocean Protocol on top of AI workloads running on Spark or AWS. In that sense, the Ocean Protocol allows data science teams to introduce incremental levels of decentralization instead of a drastic re-architecture.
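To make that idea of incremental decentralization concrete, here is a minimal sketch of what it can look like in practice. The `register_asset` helper below is hypothetical, not an actual Ocean Protocol API; the point is simply that the existing pipeline stays untouched and only a thin publishing step is added at the end:

```python
import hashlib
import json
from pathlib import Path

import pandas as pd


def run_existing_pipeline(raw_path: str, out_path: str) -> str:
    """Unchanged, centralized data-prep step (could equally be a Spark or AWS job)."""
    df = pd.read_csv(raw_path)
    features = df.dropna().assign(label=lambda d: (d["target"] > 0).astype(int))
    features.to_csv(out_path, index=False)
    return out_path


def register_asset(path: str, price_ocean: float) -> dict:
    """Hypothetical publishing step: in a real deployment this would call an
    Ocean client to register metadata and attach a service agreement."""
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    metadata = {"name": Path(path).name, "checksum": digest, "price": price_ocean}
    print("publishing metadata:", json.dumps(metadata, indent=2))
    return metadata


if __name__ == "__main__":
    out = run_existing_pipeline("raw_transactions.csv", "features.csv")
    register_asset(out, price_ocean=5.0)  # the only "decentralized" addition
```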
Challenges & Basic Principles
Part of the practicality of the Ocean Protocol stems from a set of basic principles that target some of the main challenges of decentralized AI applications. From the conceptual standpoint, the Ocean Protocol attempts to address four fundamental challenges that are a common denominator in any decentralized AI architecture:
To address the aforementioned challenges, the Ocean Protocol provides a model that coordinates the actions of the different parties in a decentralized AI workflow. At a high level, the interactions in any AI application can be decomposed into the following roles:
To some extent, the Ocean Protocol can be seen as a decentralized orchestration layer between the roles depicted above. The interactions between the different roles are abstracted via blockchain smart contracts, while the execution can remain in its native environment. The following figure illustrates that concept in detail:
Architecture
The main role of the Ocean Protocol architecture is to enable decentralized communications between entities in an AI workflow. From data or algorithm providers to analytics tools, the Ocean Protocol provides a model based on tokenized incentives and blockchain smart contracts that allows different parties to collaborate on AI workloads through fair and efficient interactions. The components of the Ocean Protocol architecture rely on the following key concepts to enable their interactions:
· Service Execution Agreements (SEAs): SEAs are smart contracts that establish the dynamics of data service supply chains. Conceptually, SEAs allow the connection to, monetization of, and curation of arbitrary data services (see the toy sketch after this list).
· Proof-of-Service and Incentives: The Ocean Protocol relies on network rewards to incentivize the sharing of AI resources. The Proof-of-Service model acts as a higher-level consensus mechanism to assert the correct interaction between the different parties in an AI workflow.
· The Ocean Token: This component acts as the fundamental unit of exchange in the Ocean Protocol network.
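To see how SEAs, Proof-of-Service and the Ocean token fit together, the toy sketch below models an agreement that escrows tokens and only releases payment once a proof of service checks out. None of this is Ocean code; every class and method name is hypothetical, and the proof is reduced to a simple hash comparison:

```python
from dataclasses import dataclass
from enum import Enum, auto
import hashlib


class State(Enum):
    CREATED = auto()
    FULFILLED = auto()
    RELEASED = auto()
    REFUNDED = auto()


@dataclass
class ServiceAgreement:
    """Toy stand-in for a Service Execution Agreement (SEA)."""
    consumer: str
    provider: str
    price: int                  # amount of (toy) Ocean tokens to escrow
    expected_proof: str         # hash the provider must reproduce
    state: State = State.CREATED
    escrow: int = 0

    def lock_payment(self) -> None:
        """Consumer escrows tokens when the agreement is created."""
        self.escrow = self.price

    def submit_proof(self, delivered_payload: bytes) -> None:
        """Provider submits a proof of service: here, a hash of the delivered data."""
        proof = hashlib.sha256(delivered_payload).hexdigest()
        if proof == self.expected_proof:
            self.state = State.FULFILLED

    def settle(self) -> str:
        """Keeper-style settlement: release the escrow on success, refund otherwise."""
        if self.state is State.FULFILLED:
            self.state = State.RELEASED
            return f"{self.escrow} tokens released to {self.provider}"
        self.state = State.REFUNDED
        return f"{self.escrow} tokens refunded to {self.consumer}"


if __name__ == "__main__":
    data = b"decentralized-dataset-v1"
    sea = ServiceAgreement(
        consumer="alice", provider="bob", price=10,
        expected_proof=hashlib.sha256(data).hexdigest(),
    )
    sea.lock_payment()
    sea.submit_proof(data)
    print(sea.settle())  # -> "10 tokens released to bob"
```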
From an architecture standpoint, the Ocean Protocol is organized in three fundamental layers:
- The Keeper layer, which manages service agreements, low-level access control, accounts, balances, and the incentive scheme (or block reward).
- The Verification layer, which introduces cryptographic challenges to improve the integrity and security of the services.
- The Curation layer, which serves as a discovery mechanism and covers signalling and governance aspects. This layer accounts for human subjectivity.
The interactions between those layers are abstracted via SEAs running on the Keeper layer. The role of the Keepers is to maintain the state of the entire decentralized workflow by enforcing the corresponding smart contracts. The role of the Verifiers is to enforce the clauses expressed in the underlying smart contracts; Verifiers rely on consensus mechanisms and cryptographic proofs to enforce their role. Curators complement the Verifiers’ cryptographic rules by introducing more subjective opinions and data signals.
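The division of labor between Verifiers and Curators can also be illustrated with a toy sketch (again hypothetical, not the actual protocol): the verifier issues an unpredictable challenge that the provider can only answer by holding the real data, while curators contribute a subjective, stake-weighted quality signal:

```python
import hashlib
import secrets


def issue_challenge() -> bytes:
    """Verifier picks an unpredictable nonce so proofs cannot be precomputed."""
    return secrets.token_bytes(16)


def prove_possession(dataset: bytes, nonce: bytes) -> str:
    """Provider proves it holds the data by hashing it together with the nonce."""
    return hashlib.sha256(nonce + dataset).hexdigest()


def verify(dataset_commitment: bytes, nonce: bytes, proof: str) -> bool:
    """Verifier recomputes the proof against its own copy (or commitment) of the data."""
    return proof == hashlib.sha256(nonce + dataset_commitment).hexdigest()


def curation_score(stakes: dict[str, float]) -> float:
    """Curators add a subjective signal, e.g. stake-weighted quality votes."""
    return sum(stakes.values()) / max(len(stakes), 1)


if __name__ == "__main__":
    dataset = b"training-set-v3"
    nonce = issue_challenge()
    proof = prove_possession(dataset, nonce)
    print("proof accepted:", verify(dataset, nonce, proof))
    print("curation score:", curation_score({"carol": 0.9, "dave": 0.7}))
```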
If we apply the three-layer model of the Ocean Protocol to a decentralized AI workflow, we get something like the following:
Let’s deep dive into some of the key technical building blocks of the Ocean Protocol architecture:
· Pleuston Frontend: Pleuston is a marketplace frontend template that enables functionalities such as data publishing and consumption.
· Data Science Tools: Data science tools are the interface to Ocean used by AI researchers and data scientists. Typically written in Python, those tools and libraries expose a high-level API allowing one to integrate Ocean capabilities in various computation pipelines.
· Squid: Squid is a high-level API specification abstracting the interaction with the most relevant Ocean Protocol components. It allows one to use Ocean capabilities without worrying about the details of the underlying Keeper contracts or metadata storage systems (see the sketch after this list).
· Aquarius: Aquarius is a Python application running in the backend that enables metadata management. It abstracts access to the underlying metadata store through the OceanDB plugin system, which can plug in different data stores (e.g. Elasticsearch, MongoDB, BigchainDB) that implement the OceanDB interfaces.
· Brizo: Brizo is a component providing capabilities for publishers. It interacts with the publisher’s cloud and/or on-premise infrastructure and enables functionalities such as compute, storage or the gathering of service proofs.
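For a sense of what the developer experience looks like, here is a rough publish-and-consume flow through a Squid-style client. It is loosely modeled on early squid-py releases; the import path, class names, method names, signatures and metadata layout are assumptions here and will differ across library versions:

```python
# Rough sketch of a publish/consume flow through a Squid-style client.
# Loosely modeled on early squid-py releases; names and signatures are
# assumptions and will differ across library versions.
from squid_py import Ocean, Config  # assumed import path

ocean = Ocean(Config("config.ini"))   # Keeper, Aquarius and Brizo endpoints live in the config
publisher = ocean.accounts.list()[0]  # assumed account helper

metadata = {
    "main": {  # metadata layout is simplified/assumed
        "name": "Weather features 2019",
        "type": "dataset",
        "files": [{"url": "https://example.org/weather.csv"}],
    }
}

# Publishing registers the metadata (via Aquarius) and sets up the service
# agreement conditions (via the Keeper contracts) behind one call.
ddo = ocean.assets.create(metadata, publisher)
print("published DID:", ddo.did)

# A consumer later orders and downloads the asset; Brizo serves the files
# once the on-chain payment conditions are fulfilled.
consumer = ocean.accounts.list()[1]
agreement_id = ocean.assets.order(ddo.did, consumer)                 # signature is an assumption
ocean.assets.consume(agreement_id, ddo.did, consumer, "./downloads")  # signature is an assumption
```

The important point is the shape of the workflow: metadata goes through Aquarius, payment conditions through the Keeper contracts, and file delivery through Brizo, all behind a single high-level client.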
Put together, these components abstract the fundamental dynamics of any decentralized AI application. The Ocean Protocol is still in very early stages, but it has already achieved some important customer wins and partnerships. The platform was also accepted by the prestigious crypto marketplace CoinList to conduct a new token sale. Together with efforts like SingularityNET or Numerai’s Erasure, the Ocean Protocol is one of the most viable stacks powering the next generation of decentralized AI applications.