Fastai is now using Python and PyTorch to be productive and hackable.

Category: IT · Published: 6 years ago


A recent paper published by the fastai team explains how Python and PyTorch help the library stay concise and clear.


Image by alan9187 from Pixabay

Fastai is a modern deep learning library that simplifies the training of fast and accurate neural nets. It is organised around two major design goals:

  1. To be approachable and rapidly productive.
  2. To be deeply hackable and configurable.

The framework provides engineers with high-level components that can quickly and easily deliver state-of-the-art results in standard deep learning domains, and provides researchers with low-level components that can be mixed and matched to build new approaches. The new version, fastai v2, which is expected to be released officially around July 2020, uses the dynamic nature of the Python language and the flexibility of PyTorch to be concise and clear. The library is specifically designed with ease of use, flexibility, and performance in mind.

Architecture

Fastai’s carefully layered architecture is the key to its being both productive and configurable. Most modern deep learning libraries focus on one or the other, but fastai is specifically designed to be both at the same time. The team wanted the clarity and development speed of Keras combined with the customisability of PyTorch.

Fastai uses decoupled abstractions to represent the underlying patterns of many deep learning and data-processing techniques, and these abstractions form its layered architecture. This is how fastai achieves the best of both worlds. There is a high-level API with ready-to-use functions to train models for various applications. This high-level API is built on top of multiple composable low-level APIs, which can be swapped out as needed for particular behaviour.
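The idea of a high-level entry point built from swappable lower-level pieces can be sketched in plain Python. This is a hypothetical toy, not fastai code: `preprocess`, `mean_normalize`, and `minmax_normalize` are invented names used only to illustrate the composition pattern.

```python
# Hypothetical sketch (not fastai code): a high-level function assembled from
# swappable low-level components, mirroring fastai's layered design.

def mean_normalize(xs):
    # Low-level component: centre values around their mean.
    m = sum(xs) / len(xs)
    return [x - m for x in xs]

def minmax_normalize(xs):
    # Alternative component with the same contract: rescale to [0, 1].
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

def preprocess(xs, normalize=mean_normalize):
    # High-level entry point: a sensible default is chosen, but any
    # component honouring the same contract can be swapped in.
    return normalize(xs)

print(preprocess([1, 2, 3]))                              # default component
print(preprocess([1, 2, 3], normalize=minmax_normalize))  # swapped component
```

A caller who never touches `normalize` gets sensible behaviour for free; a caller who needs something custom replaces one component without rewriting the high-level function.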

The following diagram shows fastai's layered API.


Users can rely on the high-level API to train a model for common applications, or drop down to the mid-level or low-level APIs when they want to build a more custom solution.

Beginners and practitioners will mostly use the high-level API. It offers concise APIs over four main application areas: vision, text, tabular and time-series analysis, and collaborative filtering. All of these application areas are optimised for ease of use, because the APIs choose intelligent default values and behaviours based on all available information. This use of intelligent defaults, drawn from the system's experience and best practices, extends to incorporating state-of-the-art research wherever possible. As a result, beginners with little knowledge of the system can train models of research-grade quality.
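The flavour of "intelligent defaults" can be illustrated with a toy helper. This is a hypothetical example, not fastai's actual API: `pick_task` is an invented name, and the heuristic is deliberately simplistic.

```python
# Hypothetical illustration (not fastai's API) of intelligent defaults:
# infer a sensible setting from the data itself unless the caller overrides it.

def pick_task(targets, task=None):
    # An explicit choice always wins over the inferred default.
    if task is not None:
        return task
    # Default heuristic: float targets suggest regression, otherwise classification.
    if any(isinstance(t, float) for t in targets):
        return "regression"
    return "classification"

print(pick_task([0, 1, 1, 0]))               # inferred: classification
print(pick_task([0.3, 1.7, 2.2]))            # inferred: regression
print(pick_task([0, 1], task="regression"))  # caller's explicit override
```

The point is that the default is derived from the data the library can already see, so the common case needs no configuration, while every default remains overridable.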

The mid-level API, designed for scalability and customisation, provides the core deep learning and data-processing methods for each of these applications. It keeps the low-level API from becoming cluttered, as happens in many two-layered frameworks, and it gives anyone who wants to customise the high-level API a layer of abstraction to work with, without having to learn much about the low-level APIs.

The low-level APIs provide a library of optimised primitives and functional and object-oriented foundations, on which the mid-level can be developed and customised.

The mid-level and low-level APIs make the most sense for researchers, and they are designed in such a way that researchers can exploit most, if not all, of the capabilities of the underlying language and framework.

Getting the most out of Python and PyTorch

Built on top of Python-based libraries such as PyTorch, NumPy, PIL, and pandas, the library does not aim to supplant or hide these lower-level foundations, in keeping with its goal of hackability. For instance, in a fastai model a developer can interact directly with the underlying PyTorch primitives; and within a PyTorch model, one can incrementally adopt components from the fastai library as conveniences rather than as an integrated package.

This is powerful for research and related tasks, because it opens up many ways to experiment with existing tools and frameworks without making things complex.

Along the same lines, rather than treating Python itself as the lowest level of computation, fastai depends on a layer of well-defined abstractions at the lower level, and the mid-level APIs depend on these lower-level APIs for functionality. Fastai also ships a few additions designed to make Python easier to use, including a NumPy-like API for lists called L, and some decorators to make delegation or patching easier.
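The spirit of these additions can be sketched in a few lines of plain Python. These are hypothetical simplifications, not fastcore's real implementations: the real L has far more conveniences, and the real `patch` decorator infers the target class from a type annotation rather than taking it as an argument.

```python
# Minimal sketches (hypothetical, not fastcore's real code) of a NumPy-like
# list and a class-patching decorator in the style of fastai's additions.

class L(list):
    """A list that also accepts a list of indices or a boolean mask, NumPy-style."""
    def __getitem__(self, idx):
        if isinstance(idx, (list, tuple)):
            if idx and all(isinstance(i, bool) for i in idx):
                # Boolean mask: keep elements where the mask is True.
                return L(x for x, keep in zip(self, idx) if keep)
            # List of indices: gather the corresponding elements.
            return L(list.__getitem__(self, i) for i in idx)
        res = list.__getitem__(self, idx)
        return L(res) if isinstance(idx, slice) else res

def patch(cls):
    """Decorator that attaches a function to an existing class as a method."""
    def deco(fn):
        setattr(cls, fn.__name__, fn)
        return fn
    return deco

@patch(L)
def total(self):
    # Added to L after the fact, without editing the class definition.
    return sum(self)

xs = L([10, 20, 30, 40])
print(xs[[0, 2]])                      # index with a list of positions
print(xs[[True, False, True, False]])  # index with a boolean mask
print(xs.total())                      # method attached via @patch
```

Fancy indexing keeps data-wrangling code terse, and patching lets library layers extend each other's types without subclassing.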

This means that Python is used where it can provide value to the library's users and benefits to the framework. For instance, the transform pipeline system is built on the foundations provided by PyTorch, but the framework is designed so that neither the language nor language-based libraries become a bottleneck when customising a new solution.
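The core idea behind a transform pipeline, applying transforms in order and undoing them in reverse, can be sketched without any framework. This is a hypothetical toy: fastai's real Pipeline adds type dispatch, setup, and much more, and `Scale` and `Shift` are invented example transforms.

```python
# Hypothetical sketch of a reversible transform pipeline (not fastai's real
# Pipeline): each transform knows how to encode and how to undo itself.

class Scale:
    def __init__(self, factor): self.factor = factor
    def encodes(self, x): return x * self.factor
    def decodes(self, x): return x / self.factor

class Shift:
    def __init__(self, offset): self.offset = offset
    def encodes(self, x): return x + self.offset
    def decodes(self, x): return x - self.offset

class Pipeline:
    """Applies transforms in order; decodes in reverse order to invert them."""
    def __init__(self, tfms): self.tfms = list(tfms)
    def __call__(self, x):
        for t in self.tfms:
            x = t.encodes(x)
        return x
    def decode(self, x):
        for t in reversed(self.tfms):
            x = t.decodes(x)
        return x

pipe = Pipeline([Scale(2), Shift(3)])
print(pipe(5))          # (5 * 2) + 3
print(pipe.decode(13))  # undo Shift, then undo Scale
```

Reversibility is what lets a library decode a processed batch back into something human-readable, e.g. turning normalised tensors back into viewable images or label indices back into class names.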

Conclusion

Fastai seems incredibly promising as a library that can improve productivity and customisation at the same time. As the team says:

We believe fastai meets its design goals. A user can create and train a state-of-the-art vision model using transfer learning with four understandable lines of code.

The system's intelligently layered architecture provides a way to use the capabilities of the language and its APIs to a greater extent while keeping the system stable and easy to maintain. This results in faster turnaround times.

Early results from using fastai are very positive. We have used the fastai library to rewrite the entire fast.ai course “Practical Deep Learning for Coders”, which contains 14 hours of material, across seven modules, and covers all the applications described in this paper

The library seems tempting for researchers and engineers alike.

Please read the full paper from fastai.

Thanks for your time.

