afctl – a CLI to manage and deploy Airflow projects faster and smoother

afctl

afctl is a CLI tool built to make creating and deploying Airflow projects faster and smoother. As of now, there is no other tool that lets users generate a boilerplate code structure for Airflow projects and makes development and deployment seamless.

Requirements

  • Python 3.5+
  • Docker

Getting Started

1. Installation

Create a new Python virtualenv. You can use the following command:

python3 -m venv <name>

Activate your virtualenv and install afctl:

source /path_to_venv/bin/activate
pip3 install afctl

2. Initialize a new afctl project.

The project is created in your present working directory. Along with it, a configuration file with the same name is generated in the /home/.afctl_configs directory.

afctl init <name of the project>

E.g.

afctl init project_demo
  • The following directory structure will be generated:
.
├── deployments
│   └── project_demo-docker-compose.yml
├── migrations
├── plugins
├── project_demo
│   ├── commons
│   └── dags
├── requirements.txt
└── tests

If you already have a git repository and want to turn it into an afctl project, run the following command:

afctl init .

3. Add a new module to the project.

afctl generate module -n <name of the module>

For example, the following commands generate this directory structure:

afctl generate module -n first_module
afctl generate module -n second_module

.
├── deployments
│   └── project_demo-docker-compose.yml
├── migrations
├── plugins
├── project_demo
│   ├── commons
│   └── dags
│       ├── first_module
│       └── second_module
├── requirements.txt
└── tests
    ├── first_module
    └── second_module

4. Generate dag

afctl generate dag -n <name of dag> -m <name of module>

For example, the following command generates this directory structure:

afctl generate dag -n new -m first_module

.
├── deployments
│   └── project_demo-docker-compose.yml
├── migrations
├── plugins
├── project_demo
│   ├── commons
│   └── dags
│       ├── first_module
│       │   └── new_dag.py
│       └── second_module
├── requirements.txt
└── tests
    ├── first_module
    └── second_module

The generated DAG file will look like this:

from airflow import DAG
from datetime import datetime, timedelta

default_args = {
    'owner': 'project_demo',
    # 'depends_on_past': ,
    # 'start_date': ,
    # 'email': ,
    # 'email_on_failure': ,
    # 'email_on_retry': ,
    # 'retries': 0
}

dag = DAG(dag_id='new', default_args=default_args, schedule_interval='@once')
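
The skeleton contains no tasks yet. As a minimal sketch (not generated by afctl, and assuming Airflow 1.10.x, where BashOperator lives under airflow.operators.bash_operator), a task could be attached to it as shown below. Note that the generated file leaves start_date commented out, and Airflow requires a start_date on either the DAG or the task; datetime is already imported in the skeleton.

from airflow.operators.bash_operator import BashOperator

hello = BashOperator(
    task_id='say_hello',
    bash_command='echo "hello from project_demo"',
    # start_date is required by Airflow; this value is only a placeholder
    start_date=datetime(2020, 1, 1),
    dag=dag,
)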

5. Deploy project locally

You can add the Python packages that your DAGs require to requirements.txt; they will be installed automatically when you deploy.
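
For example, a requirements.txt for DAGs that call HTTP APIs and work with tabular data might look like this (the packages and versions are purely illustrative, not something afctl requires):

requests==2.22.0
pandas==0.25.3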

  • To deploy your project, run the following command (make sure Docker is running):
afctl deploy local

If you do not want to see the logs, you can run

afctl deploy local -d

This will run it in detached mode and won't print the logs to the console (see the Docker commands after this list for checking on a detached deployment).

  • You can access the Airflow webserver in your browser at localhost:8080
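
The commands below are plain Docker, not part of afctl, but they are handy for checking on a deployment running in detached mode:

docker ps                      # list the containers started for the project
docker logs -f <container-id>  # follow the logs of the webserver or scheduler container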

6. Deploy the project to production

  • Here we will be deploying our project to Qubole. Sign up at us.qubole.com.
  • Add the git origin and access token (if you want to keep the project as a private repo on GitHub) to the configs.
  • Push the project to GitHub once it is complete.
  • Deploying to Qubole requires adding deployment configurations:
afctl config add -d qubole -n <name of deployment> -e <env> -c <cluster-label> -t <auth-token>

This command will modify your config file. You can view your config file with the following command:

afctl config show

For example:

afctl config add -d qubole -n demo -e https://api.qubole.com -c airflow_1102 -t khd34djs3
  • To deploy, run the following command:
afctl deploy qubole -n <name>
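
For instance, using the example deployment named demo that was added above:

afctl deploy qubole -n demo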

The following video also walks through all the steps of deploying a project using afctl:

https://www.youtube.com/watch?v=A4rcZDGtJME&feature=youtu.be

Manage configurations

The configuration file used for deployment contains the following information:

global:
  airflow_version:
  git:
    origin:
    access-token:
deployment:
  qubole:
  local:
    compose:
  • airflow_version can be added to the project when you initialize it:
afctl init <name> -v <version>
  • The global configs (airflow_version, origin, access-token) can all be added/updated with the following command (a filled-in example follows this list):
afctl config global -o <git-origin> -t <access-token> -v <airflow_version>
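
After setting these, the global section of the config file might look like the following (the version, origin, and token values are placeholders, not real credentials):

global:
  airflow_version: 1.10.10
  git:
    origin: https://github.com/<username>/project_demo.git
    access-token: <your-access-token>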

Usage

The commands currently supported are:

  • init
  • config
  • deploy
  • list
  • generate

To learn more, run

afctl <command> -h

Caution

Not yet ported for Windows.

Credits

Docker-compose file: https://github.com/puckel/docker-airflow

