Fastai is now using Python and PyTorch to be productive and hackable.



A recent paper from the fastai team explains how the library leverages Python and PyTorch to stay concise, productive, and clear.


Image by alan9187 from Pixabay

Fastai is a modern deep learning library that simplifies training fast and accurate neural networks. It is organised around two major design goals:

  1. To be approachable and rapidly productive.
  2. To be deeply hackable and configurable.

The framework provides engineers with high-level components that can quickly and easily deliver state-of-the-art results in standard deep learning domains, and provides researchers with low-level components that can be mixed and matched to build new approaches. The new version of the library, fastai v2, expected to be officially released around July 2020, uses the dynamic nature of the Python language and the flexibility of PyTorch to stay concise and clear. The library is specifically designed with ease of use, flexibility, and performance in mind.

Architecture

Fastai's carefully layered architecture is the key to its being both productive and configurable. Most modern deep learning libraries focus on one of these goals, but fastai is specifically designed to achieve both at the same time. The team wanted the clarity and development speed of Keras combined with the customisability of PyTorch.

Fastai uses decoupled abstractions that represent the underlying patterns of many deep learning and data-processing techniques, and these abstractions form its layered architecture. This is how fastai gets the best of both worlds. There is a high-level API with ready-to-use functions to train models for various applications. This high-level API is built on top of multiple composable lower-level APIs, which can be switched and swapped as needed for particular behaviour.

The following diagram from fast.ai shows the layered API.


Users can rely on the high-level API to train models for common applications, or drop down to the mid-level and low-level APIs when they want to hack together a more custom solution.

Beginners and practitioners will mostly use the high-level API. It offers concise APIs over four main application areas: vision, text, tabular and time-series analysis, and collaborative filtering. These application areas are highly optimised for ease of use because the APIs choose intelligent default values and behaviours based on all available information. This use of intelligent defaults, drawn from the team's experience and best practices, extends to incorporating state-of-the-art research wherever possible. This means that beginners with little knowledge of the internals can still train models of research-grade quality.
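To give a concrete feel for the high-level API, here is a minimal sketch of training a pet-breed classifier with transfer learning, along the lines of the example in the fastai paper; the dataset, regular expression, and hyperparameters are illustrative choices, not requirements.

```python
from fastai.vision.all import *

# Download the Oxford-IIIT Pets dataset and build DataLoaders from file names
path = untar_data(URLs.PETS)
dls = ImageDataLoaders.from_name_re(
    path, get_image_files(path/"images"),
    pat=r"(.+)_\d+.jpg",        # the breed label is encoded in the file name
    item_tfms=Resize(224))

# Transfer learning from an ImageNet-pretrained ResNet
learn = cnn_learner(dls, resnet34, metrics=error_rate)
learn.fine_tune(1)
```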

The mid-level API, designed for scalability and customisation, provides the core deep learning and data-processing methods for each of these applications. It keeps the low-level API from becoming too cluttered too fast, as happens in many two-layered frameworks. It also provides a layer of abstraction for anyone who wants to customise the high-level API without having to learn much about the low-level APIs.

The low-level APIs provide a library of optimised primitives and functional and object-oriented foundations, on top of which the mid-level APIs are developed and customised.

The mid-level and low-level APIs make more sense for researchers and are designed in such a way that they can exploit most, if not all, of the capabilities of the underlying language and framework.
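As an illustration of the kind of building block the mid-level exposes, here is a minimal sketch using the data block API; the dataset (Imagenette, with images stored in class-named folders) and the specific transforms are assumptions chosen for this example.

```python
from fastai.vision.all import *

# Mid-level data API: declare how to assemble the data, then build DataLoaders
path = untar_data(URLs.IMAGENETTE_160)/"train"
dblock = DataBlock(
    blocks=(ImageBlock, CategoryBlock),       # input and target types
    get_items=get_image_files,                # how to list the raw items
    splitter=RandomSplitter(valid_pct=0.2),   # train/validation split
    get_y=parent_label,                       # label comes from the folder name
    item_tfms=Resize(160))
dls = dblock.dataloaders(path)
```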

Getting the most out of Python and PyTorch

Fastai is built on top of several Python-based libraries, including PyTorch, NumPy, PIL, and pandas. To achieve its goal of hackability, the library does not aim to supplant or hide these lower-level foundations. For instance, within a fastai model, a developer can interact directly with the underlying PyTorch primitives; and within a PyTorch model, one can incrementally adopt components from the fastai library as conveniences rather than as an integrated package.

This is really powerful for research and related tasks, because it opens up many ways to experiment by leveraging existing tools and frameworks without making things complex.
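As a rough sketch of that interoperability, a plain PyTorch nn.Module can be handed to a fastai Learner, and the underlying module stays directly accessible; the toy model and the MNIST sample dataset below are illustrative assumptions, not anything prescribed by the library.

```python
from torch import nn
from fastai.vision.all import *

# DataLoaders for the small MNIST sample (images of 3s and 7s) shipped with fastai
path = untar_data(URLs.MNIST_SAMPLE)
dls = ImageDataLoaders.from_folder(path)

# An ordinary PyTorch model, defined with plain torch.nn
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 28 * 28, 50), nn.ReLU(),
    nn.Linear(50, 2))

# fastai's Learner wraps the PyTorch module without hiding it
learn = Learner(dls, model, loss_func=CrossEntropyLossFlat(), metrics=accuracy)
learn.fit_one_cycle(1)

assert isinstance(learn.model, nn.Module)   # still a plain PyTorch module
```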

Along the same lines, rather than treating Python itself as the lowest level of computation, fastai depends on a layer of well-defined abstractions at the lower level, and the mid-level APIs depend on these lower-level APIs for their functionality. On top of this, fastai adds a few extras designed to make Python easier to use, including a NumPy-like API for lists called L, and some decorators to make delegation or patching easier.
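For a flavour of these additions, here is a short sketch of the L class; it lives in the fastcore package that fastai v2 builds on, and the exact import path is an assumption of this sketch.

```python
from fastcore.foundation import L   # fastcore is a dependency of fastai v2

xs = L(1, 2, 3, 4)                   # list-like container with extra conveniences
print(xs.map(lambda o: o * 2))       # (#4) [2,4,6,8]  -- map returns another L
print(xs[[0, 2]])                    # (#2) [1,3]      -- index with a list of indices
print(xs.filter(lambda o: o > 2))    # (#2) [3,4]
```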

This means that Python is used where it provides value to the users of the library and benefits to the framework. For instance, the transform pipeline system is built on top of the foundations provided by PyTorch, but the framework is designed so that neither the language nor language-based libraries become a bottleneck when customising a new solution.
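As a minimal sketch of what the transform pipeline foundations look like, a Transform defines how to encode and decode an item, and a Pipeline composes transforms and can reverse them. The NegateTfm class below is a made-up example, and the import path from fastcore is an assumption.

```python
from fastcore.transform import Pipeline, Transform

class NegateTfm(Transform):
    "Toy transform: negate an int going forward, negate again when reversing"
    def encodes(self, x: int): return -x
    def decodes(self, x: int): return -x

pipe = Pipeline([NegateTfm()])
print(pipe(3))           # -3
print(pipe.decode(-3))   # 3
```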

Conclusion

Fastai seems incredibly promising as a library that can improve productivity and customisability at the same time. As the team says:

We believe fastai meets its design goals. A user can create and train a state-of-the-art vision model using transfer learning with four understandable lines of code.

The system's intelligently layered architecture provides a way to use the capabilities of the language and its libraries to a greater extent while keeping the system stable and easy to maintain. This results in faster turnaround times.

Early results from using fastai are very positive. We have used the fastai library to rewrite the entire fast.ai course “Practical Deep Learning for Coders”, which contains 14 hours of material, across seven modules, and covers all the applications described in this paper

The library seems tempting for researchers and engineers alike.

Please read the full paper from fastai.

Thanks for your time.

