Fastai is now using Python and PyTorch to be productive and hackable.



A recent paper from the fastai team explains how Python and PyTorch help the library stay concise and clear.


Image by alan9187 from Pixabay

Fastai is a modern deep learning library that simplifies training fast and accurate neural nets. It is organised around two major design goals:

  1. To be approachable and rapidly productive.
  2. To be deeply hackable and configurable.

The framework provides engineers with high-level components that can quickly and easily deliver state-of-the-art results in standard deep learning domains, and provides researchers with low-level components that can be mixed and matched to build new approaches. The new version, fastai v2, which is expected to be released officially around July 2020, uses the dynamic nature of the Python language and the flexibility of PyTorch to stay concise and clear. The library is specifically designed with ease of use, flexibility, and performance in mind.

Architecture

Fastai’s carefully layered architecture is the key to its being both productive and configurable. Most modern deep learning libraries focus on one or the other, but fastai is specifically designed to be both at the same time. The team wanted the clarity and development speed of Keras together with the customisability of PyTorch.

Fastai uses decoupled abstractions that represent the underlying patterns of many deep learning and data-processing techniques, and these abstractions form its layered architecture. This is how fastai achieves the best of both worlds. There is a high-level API of ready-to-use functions for training models in various applications. This high-level API is built on top of multiple composable low-level APIs, which can be switched and swapped as needed for particular behaviour.
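The "composable, swappable components" idea can be sketched in plain Python. The names below are hypothetical, chosen only to illustrate the pattern; they are not fastai's actual classes or functions.

```python
# Illustrative sketch of a layered API: a high-level helper that
# delegates to interchangeable low-level components.

def sgd_step(params, grads, lr=0.1):
    """Low-level component: one plain SGD update."""
    return [p - lr * g for p, g in zip(params, grads)]

def damped_step(params, grads, lr=0.1):
    """A drop-in alternative low-level component (state handling
    omitted; this only illustrates swappability)."""
    return [p - lr * 0.9 * g for p, g in zip(params, grads)]

def train_step(params, grads, stepper=sgd_step):
    """High-level helper: picks a sensible default, but any
    compatible low-level `stepper` can be swapped in."""
    return stepper(params, grads)

print(train_step([1.0, 2.0], [0.5, 0.5]))               # default component
print(train_step([1.0, 2.0], [0.5, 0.5], damped_step))  # swapped component
```

Because the high level only assumes a call signature, a user can replace any one layer without touching the others, which is the essence of the layered design described above.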

The following diagram from the fastai paper shows the layered API:

[Figure: fastai's layered API]

Users can either rely on the high-level API to train models for common applications, or drop down to the mid-level or low-level APIs to build a more customised solution.

Beginners and practitioners will mostly use the high-level API. It offers concise APIs over four main application areas: vision, text, tabular and time-series analysis, and collaborative filtering. All of these are highly optimised for ease of use, because the APIs choose intelligent default values and behaviours based on all available information. This use of intelligent defaults, grounded in the team's experience and best practices, extends to incorporating state-of-the-art research wherever possible. As a result, even beginners with little knowledge of the system can train models of research-grade quality.
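The "intelligent defaults with explicit override" pattern can be sketched in plain Python. This is a hypothetical illustration of the idea, not fastai's real code or heuristics.

```python
# Hypothetical sketch: infer a reasonable setting from the data
# unless the caller explicitly provides one.

def pick_batch_size(n_items, requested=None):
    """Choose a batch size from the dataset size unless the user
    explicitly asks for one (an explicit value always wins)."""
    if requested is not None:
        return requested
    # Heuristic default: largest power of two up to 64 that fits.
    bs = 64
    while bs > n_items:
        bs //= 2
    return max(bs, 1)

print(pick_batch_size(1000))       # -> 64 (heuristic default)
print(pick_batch_size(10))         # -> 8
print(pick_batch_size(1000, 32))   # -> 32 (explicit override wins)
```

The point is that defaults are computed from available information at call time, so the easy path stays short while full control remains one keyword argument away.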

The mid-level API, designed for scalability and customisation, provides the core deep learning and data-processing methods for each of these applications. It keeps the low-level API from becoming cluttered too fast, as happens in many two-layered frameworks, and it provides a layer of abstraction for anyone who wants to customise the high-level API without having to learn much about the low-level APIs.

The low-level APIs provide a library of optimised primitives and functional and object-oriented foundations, which allow the mid-level APIs to be developed and customised.

The mid-level and low-level APIs make the most sense for researchers, and they are designed in such a way that researchers can exploit most, if not all, of the capabilities of the underlying language and framework.

Getting the most out of Python and PyTorch

Built on top of Python-based libraries such as PyTorch, NumPy, PIL, and pandas, the library does not aim to supplant or hide these lower-level foundations; this is central to its goal of hackability. For instance, in a fastai model a developer can interact directly with the underlying PyTorch primitives, and within a PyTorch model one can incrementally adopt components from the fastai library as conveniences rather than as an integrated package.
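Incremental adoption means each component is useful on its own, outside any framework. The sketch below is a hypothetical, framework-free illustration of that idea (it is not a fastai class): a small early-stopping helper that works equally well inside a training framework or in a hand-written loop.

```python
# Hypothetical sketch: a self-contained "convenience" component that can
# be dropped into a plain training loop, mirroring how fastai pieces can
# be adopted individually rather than as an integrated package.

class EarlyStopper:
    """Signal a stop when the loss has not improved for `patience` epochs."""
    def __init__(self, patience=2):
        self.patience = patience
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, loss):
        if loss < self.best:
            self.best, self.bad_epochs = loss, 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience  # True => stop training

# Used directly in a plain, framework-free loop:
stopper = EarlyStopper(patience=2)
losses = [1.0, 0.8, 0.9, 0.95, 0.7]
stopped_at = None
for epoch, loss in enumerate(losses):
    if stopper.step(loss):
        stopped_at = epoch
        break
print(stopped_at)  # -> 3 (two epochs without improvement after 0.8)
```

Because the component only needs a loss value per step, it imposes nothing on the surrounding code, which is exactly what makes piecewise adoption practical.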

This is powerful for research and related tasks, since there are many ways to experiment by leveraging existing tools and frameworks without making things complex.

Along the same lines, rather than treating Python itself as the lowest level of computation, fastai depends on a layer of well-defined abstractions at the lower level, and the mid-level APIs rely on these lower-level APIs for functionality. Fastai also ships a few additions designed to make Python easier to use, including a NumPy-like API for lists called L, and some decorators to make delegation or patching easier.
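Fastai's actual L class and decorators live in its fastcore package; the toy versions below only sketch the two ideas and are not the real implementations.

```python
# Toy sketches of two fastai/fastcore ideas (not the real implementations):
# an L-like list with NumPy-style fancy indexing, and a patch decorator
# that adds a method to an existing class after the fact.

class MiniL(list):
    """A list that also accepts a list of indices, like NumPy fancy indexing."""
    def __getitem__(self, idx):
        if isinstance(idx, (list, tuple)):
            return MiniL(list.__getitem__(self, i) for i in idx)
        return list.__getitem__(self, idx)

def patch(cls):
    """Decorator: attach the decorated function to `cls` as a method."""
    def _inner(f):
        setattr(cls, f.__name__, f)
        return f
    return _inner

xs = MiniL([10, 20, 30, 40])
print(xs[[0, 2]])   # -> [10, 30]

@patch(MiniL)
def total(self):
    return sum(self)

print(xs.total())   # -> 100
```

Patching lets library users extend classes they did not define, which is one of the ways fastai keeps its layers open to customisation.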

This means that Python is used where it can provide value to the library's users and benefits to the framework. For instance, the transform pipeline system is built on top of the foundations provided by PyTorch. But the framework itself is designed so that the language, and language-based libraries, will not become a bottleneck when customising a new solution.
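The transform-pipeline idea can be sketched in plain Python. This is a hypothetical illustration in the spirit of fastai's pipeline (not its real API): transforms compose in order, and decoding runs them in reverse so data can be turned back into an interpretable form.

```python
# Hypothetical sketch of a composable transform pipeline.

class Pipeline:
    """Apply a sequence of transforms in order; decode runs them in reverse."""
    def __init__(self, *tfms):
        self.tfms = tfms

    def __call__(self, x):
        for encode, _ in self.tfms:
            x = encode(x)
        return x

    def decode(self, x):
        for _, decode in reversed(self.tfms):
            x = decode(x)
        return x

# Each transform is an (encode, decode) pair.
scale = (lambda x: x * 2, lambda x: x / 2)
shift = (lambda x: x + 1, lambda x: x - 1)

pipe = Pipeline(scale, shift)
print(pipe(3))          # -> 7   (3*2 + 1)
print(pipe.decode(7))   # -> 3.0 ((7-1) / 2)
```

Keeping each transform reversible is what allows a pipeline like this to both prepare data for a model and map predictions back for display.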

Conclusion:

Fastai seems incredibly promising as a library that improves productivity and customisation at the same time. As the team says:

We believe fastai meets its design goals. A user can create and train a state-of-the-art vision model using transfer learning with four understandable lines of code.

The system's intelligently layered architecture provides a way to exploit the capabilities of the language and its libraries to a greater extent while keeping the system stable and easy to maintain. This results in faster turnaround times.

Early results from using fastai are very positive. We have used the fastai library to rewrite the entire fast.ai course “Practical Deep Learning for Coders”, which contains 14 hours of material, across seven modules, and covers all the applications described in this paper

The library seems tempting for researchers and engineers alike.

Please read the full paper from fastai.

Thanks for your time.

