Tensorflow Extended, ML Metadata and Apache Beam on the Cloud
A practical and self-contained example using GCP Dataflow
The fully end-to-end example that TensorFlow Extended provides by running tfx template copy taxi $target-dir
produces 17 files scattered across 5 directories.
If you are looking for a smaller, simpler, self-contained example
that actually runs on the cloud rather than locally, this is it. Setting up the required cloud services is also covered here.
What’s going to be covered
We are going to generate statistics and a schema for the Chicago taxi trips CSV dataset, which you will find under the data directory after running the tfx template copy taxi command.
Generated artifacts, such as the data statistics or the schema, can then be viewed from a Jupyter notebook, either by connecting to the ML Metadata store or simply by downloading the artifacts from plain file/binary storage.
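For example, from a notebook you can query the ML Metadata store directly. Below is a minimal sketch, assuming the pipeline wrote its metadata to a SQLite file; the metadata.db path is a placeholder, not a value from this article:

from ml_metadata.metadata_store import metadata_store
from ml_metadata.proto import metadata_store_pb2

# Point the connection config at the pipeline's metadata database (placeholder path).
connection_config = metadata_store_pb2.ConnectionConfig()
connection_config.sqlite.filename_uri = 'metadata.db'
store = metadata_store.MetadataStore(connection_config)

# Each registered artifact records where its payload (examples, statistics, schema, ...) lives.
for artifact in store.get_artifacts():
    print(artifact.type_id, artifact.uri)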
Full code sample at the bottom of the article
Services Used
The whole pipeline can also run on your local machine (or on different cloud providers or your own Spark cluster). The same example scales to bigger datasets; if you wish to understand how this happens transparently, read this article.
Execution Process
- If running locally, code is not serialised or sent to the cloud (of course). Otherwise, Beam sends everything to a staging location (typically bucket storage). Check out cloudpickle to get some intuition on how the serialisation is done.
- Your cloud runner of choice (ours is Dataflow) checks that all the referenced resources exist and are accessible (for example, the pipeline output and the temporary file storage).
- Compute instances are started and your pipeline is executed in a distributed fashion, showing up in the job inspector both while it is running and after it has finished.
It is good naming practice to use /temp or /tmp for temporary files and /staging or /binaries for the staging directory.
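Concretely, a Dataflow run usually only needs the following Apache Beam pipeline options. This is a minimal sketch: the bucket, project and region values are placeholders, not values from this article.

# Beam options for running on Dataflow; substitute your own project, region and bucket.
beam_pipeline_args = [
    '--runner=DataflowRunner',
    '--project=your-gcp-project',
    '--region=us-central1',
    '--temp_location=gs://your-bucket/tmp',         # temporary files
    '--staging_location=gs://your-bucket/staging',  # serialised code and binaries
]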
The TFX Pipeline
TensorFlow Extended provides its own component wrappers around plain old Beam components. They are a bit more federated in form: artifacts are only produced and consumed. This means components do not stream the whole dataset around every time; they just pass resource locator strings to each other. Your dataset gets streamed once for analysis and preprocessing speed, then saved in small chunks as tfrecords
for maximum performance, taking full advantage of the fast storage technology behind Storage Buckets.
This is why, when you declare custom components, you declare strongly typed input and output channels (artifact types and names), which get mapped to multiple, tagged inputs and outputs on the Beam side. You return these with a Dict. Feel free to look into the source of the default TFX components for more insight into this.
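As a rough sketch of that shape, here is a minimal function-based custom component, assuming a recent TFX release with the experimental @component decorator; the component name and the print statements are purely illustrative and not part of this example's pipeline:

from tfx.dsl.component.experimental.decorators import component
from tfx.dsl.component.experimental.annotations import InputArtifact, OutputArtifact
from tfx.types.standard_artifacts import Examples, ExampleStatistics

@component
def ExampleInspector(
    examples: InputArtifact[Examples],
    statistics: OutputArtifact[ExampleStatistics],
) -> None:
    # Only resource locators travel between components: examples.uri points at
    # the tfrecord shards written upstream, and statistics.uri is where this
    # component is expected to write its own output.
    print('Reading examples from', examples.uri)
    print('Writing statistics to', statistics.uri)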
This channel-based design is also why you wire components together like this:
# CsvExampleGen ingests the CSV dataset and emits an 'examples' artifact
example_gen = CsvExampleGen(...)
# StatisticsGen consumes that artifact through the named output channel
statistics_gen = StatisticsGen(examples=example_gen.outputs['examples'])
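Putting it together, a minimal pipeline that produces statistics and a schema could look roughly like the sketch below. It assumes a recent TFX release (where CsvExampleGen takes input_base); the bucket paths, project, pipeline name and metadata path are placeholders, not values from this article.

from tfx.components import CsvExampleGen, StatisticsGen, SchemaGen
from tfx.orchestration import metadata, pipeline
from tfx.orchestration.beam.beam_dag_runner import BeamDagRunner

DATA_ROOT = 'gs://your-bucket/data'          # directory holding the taxi CSV (placeholder)
PIPELINE_ROOT = 'gs://your-bucket/pipeline'  # where artifacts get written (placeholder)
METADATA_PATH = 'metadata/metadata.db'       # SQLite ML Metadata store (placeholder)

example_gen = CsvExampleGen(input_base=DATA_ROOT)
statistics_gen = StatisticsGen(examples=example_gen.outputs['examples'])
schema_gen = SchemaGen(statistics=statistics_gen.outputs['statistics'])

taxi_pipeline = pipeline.Pipeline(
    pipeline_name='taxi_stats_and_schema',
    pipeline_root=PIPELINE_ROOT,
    components=[example_gen, statistics_gen, schema_gen],
    metadata_connection_config=metadata.sqlite_metadata_connection_config(METADATA_PATH),
    # The Dataflow options from the earlier sketch; leave the list empty to run
    # locally with the DirectRunner instead.
    beam_pipeline_args=[
        '--runner=DataflowRunner',
        '--project=your-gcp-project',
        '--region=us-central1',
        '--temp_location=gs://your-bucket/tmp',
        '--staging_location=gs://your-bucket/staging',
    ],
)

BeamDagRunner().run(taxi_pipeline)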