Using upstream Apache Airflow Hooks and Operators in Cloud Composer


Source: Using upstream Apache Airflow Hooks and Operators in Cloud Composer from Google Cloud

For engineers or developers in charge of integrating, transforming, and loading a variety of data from an ever-growing collection of sources and systems, Cloud Composer has dramatically reduced the number of cycles spent on workflow logistics. Built on Apache Airflow, Cloud Composer makes it easy to author, schedule, and monitor data pipelines across multiple clouds and on-premises data centers.

Let’s walk through an example of how Cloud Composer makes building a pipeline across public clouds easier. As you design your new workflow that’s going to bring data from another cloud (Microsoft Azure’s ADLS, for example) into Google Cloud, you notice that upstream Apache Airflow already has an ADLS hook that you can use to copy data. You insert an import statement into your DAG file, save, and attempt to test your workflow. “ImportError – no module named x.” Now what?

As it turns out, functionality that has been committed upstream, such as brand-new Hooks and Operators, might not have made its way into Cloud Composer just yet. Don't worry, though: you can still use these upstream additions by leveraging the Apache Airflow Plugin interface.

Using the upstream AzureDataLakeHook as an example, all you have to do is the following:

  1. Copy the code into a separate file (ensuring adherence to the Apache License)

  2. Import the AirflowPlugin module (from airflow.plugins_manager import AirflowPlugin)

  3. Add the below snippet to the bottom of the file:
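A minimal sketch of that registration snippet follows; the plugin class name and the `name` value are illustrative choices, not names prescribed by Airflow, and the hook class here is a stand-in for the full upstream code you copied in step 1:

```python
# Registration snippet to place at the bottom of the file that contains
# the copied AzureDataLakeHook code.
try:
    from airflow.plugins_manager import AirflowPlugin
except ImportError:
    # Stand-in so the snippet can be inspected outside an Airflow install.
    class AirflowPlugin(object):
        pass


class AzureDataLakeHook(object):
    """Stand-in for the hook code copied from upstream Apache Airflow."""


class AirflowAzureDataLakePlugin(AirflowPlugin):
    # The plugin's name determines the import path your DAGs will use:
    # hooks listed below become importable from airflow.hooks.<name>.
    name = "azure_data_lake_plugin"
    hooks = [AzureDataLakeHook]
```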

Once you have completed the above steps, you need to ensure that all other dependencies required by the functionality you added are included in your Cloud Composer environment. In this example we need to include the azure-datalake-store package. To install this package into your environment, you can use the Cloud Console. Navigate to Cloud Composer, click on your environment, followed by PyPI Packages, and then click “Edit.” It may take a few moments for the operation to complete, but once it succeeds, you should see a view similar to the screenshot below:
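If you prefer the command line over the Cloud Console, the same dependency can be installed with gcloud; the environment name, region, and version pin below are placeholders for your own values:

```shell
# Install the azure-datalake-store package into the Composer environment.
# "my-composer-env", "us-central1", and the version pin are placeholders.
gcloud composer environments update my-composer-env \
    --location us-central1 \
    --update-pypi-package "azure-datalake-store==0.0.19"
```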

[Screenshot: the PyPI packages view for the Cloud Composer environment]

Next, we need to make the plugin available to the Cloud Composer environment. To do this, you can copy the plugin to the plugins folder following the instructions here. This command will look something like this:
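With placeholder values for the environment name, region, and plugin file name, that upload command looks roughly like:

```shell
# Copy the plugin file into the environment's plugins/ folder.
# The environment name, region, and file name are placeholders.
gcloud composer environments storage plugins import \
    --environment my-composer-env \
    --location us-central1 \
    --source azure_data_lake_hook.py
```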

Once the plugin has been imported, you can use it. This simple example snippet shows how to import the plugin and leverage the AzureDataLakeHook functionality it now provides, in conjunction with the GoogleCloudStorageHook, to copy data from ADLS to Cloud Storage:

You could easily extend this to create a more robust Operator that provides this functionality, and use the same plugin mechanism to make it available to your specific workflows.

In summary, you can use features from the upstream Apache Airflow codebase, including newer connectors to external data sources, even with Cloud Composer, Google's managed Airflow service. For more on working with upstream components, check out the Airflow documentation here.

Except as otherwise noted, the content of this article is licensed under the Creative Commons Attribution 3.0 License, and code samples are licensed under the Apache 2.0 License. For details, see our Terms of Service.

Tags: Cloud

