2017-07-23

I built the following DAG in Airflow; it executes a set of EMR steps to run my pipeline.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.exceptions import AirflowException
from airflow.contrib.operators.emr_add_steps_operator import EmrAddStepsOperator
from airflow.contrib.sensors.emr_step_sensor import EmrStepSensor

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2017, 7, 20, 10, 0),
    'email': ['[email protected]'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 5,
    'retry_delay': timedelta(minutes=2),
}

dag = DAG('dag_import_match_hourly',
          default_args=default_args,
          description='Fancy Description',
          schedule_interval=timedelta(hours=1),
          dagrun_timeout=timedelta(hours=2))

try:
    # cluster_id and create_step() are defined elsewhere in the module
    merge_s3_match_step = EmrAddStepsOperator(
        task_id='merge_s3_match_step',
        job_flow_id=cluster_id,
        aws_conn_id='aws_default',
        steps=create_step('Merge S3 Match'),
        dag=dag
    )

    mapreduce_step = EmrAddStepsOperator(
        task_id='mapreduce_match_step',
        job_flow_id=cluster_id,
        aws_conn_id='aws_default',
        steps=create_step('MapReduce Match Hourly'),
        dag=dag
    )

    merge_hdfs_step = EmrAddStepsOperator(
        task_id='merge_hdfs_step',
        job_flow_id=cluster_id,
        aws_conn_id='aws_default',
        steps=create_step('Merge HDFS Match Hourly'),
        dag=dag
    )

    ## Sensors
    check_merge_s3 = EmrStepSensor(
        task_id='watch_merge_s3',
        job_flow_id=cluster_id,
        step_id="{{ task_instance.xcom_pull('merge_s3_match_step', key='return_value')[0] }}",
        aws_conn_id='aws_default',
        dag=dag
    )

    check_mapreduce = EmrStepSensor(
        task_id='watch_mapreduce',
        job_flow_id=cluster_id,
        step_id="{{ task_instance.xcom_pull('mapreduce_match_step', key='return_value')[0] }}",
        aws_conn_id='aws_default',
        dag=dag
    )

    check_merge_hdfs = EmrStepSensor(
        task_id='watch_merge_hdfs',
        job_flow_id=cluster_id,
        step_id="{{ task_instance.xcom_pull('merge_hdfs_step', key='return_value')[0] }}",
        aws_conn_id='aws_default',
        dag=dag
    )

    mapreduce_step.set_upstream(merge_s3_match_step)
    merge_s3_match_step.set_downstream(check_merge_s3)

    mapreduce_step.set_downstream(check_mapreduce)

    merge_hdfs_step.set_upstream(mapreduce_step)
    merge_hdfs_step.set_downstream(check_merge_hdfs)

except AirflowException as ae:
    print(ae)

The DAG works fine, but I would like to use the sensors to make sure the next step is executed if and only if the EMR job has completed correctly. I tried a few things, but none of them worked; the code above does not do the job properly. Does anybody know how to use EmrStepSensor to achieve this?

Are those custom sensors that you wrote? I'm setting up Airflow with EMR and need a way to check the status of the steps in the cluster. – luckytaxi

No, those sensors are available in the `contrib` package — see [https://github.com/apache/incubator-airflow/blob/master/airflow/contrib/sensors/emr_step_sensor.py](https://github.com/apache/incubator-airflow/blob/master/airflow/contrib/sensors/emr_step_sensor.py) – davideberdin
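For reference, the linked contrib sensor works by repeatedly polling the state of the EMR step. The sketch below is a simplified stand-in for that poke logic (the state names come from the EMR API; the function name and error type are made up here, not the sensor's real API):

```python
def poke_emr_step(step_state):
    """Simplified version of an EMR step sensor's poke contract:
    return False to keep waiting, True when the step succeeded,
    and raise when the step ended unsuccessfully."""
    if step_state in ('PENDING', 'RUNNING'):
        return False  # not finished yet; the sensor keeps poking
    if step_state == 'COMPLETED':
        return True   # success: downstream tasks may start
    # e.g. CANCELLED, FAILED, INTERRUPTED
    raise RuntimeError('EMR step finished in state %s' % step_state)
```

Because a failed step raises instead of returning, the sensor task itself fails, which is what stops Airflow from scheduling the downstream tasks.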

Thx ... I didn't realize there were others in the `contrib` directory. – luckytaxi

Answer

It looks like your EmrStepSensor tasks need their dependencies set correctly. For example, if you want to wait for check_mapreduce to complete before starting the next step, it should be merge_hdfs_step.set_upstream(check_mapreduce) or, equivalently, check_mapreduce.set_downstream(merge_hdfs_step). The chain would then be TaskA >> SensorA >> TaskB >> SensorB >> TaskC >> SensorC; try setting up the dependencies that way.
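To see that ordering concretely, here is a minimal stand-in sketch: plain Python objects instead of real Airflow operators (so it runs without Airflow installed), wired Task >> Sensor >> Task >> Sensor >> Task >> Sensor as described above. The `Node` class and its `>>` behavior only mimic Airflow's dependency operators:

```python
class Node:
    """Stand-in for an Airflow task; records downstream edges."""
    def __init__(self, task_id):
        self.task_id = task_id
        self.downstream = []

    def set_downstream(self, other):
        self.downstream.append(other)

    def __rshift__(self, other):
        # Mimics Airflow's `a >> b`: adds the edge, returns `b` so
        # that `a >> b >> c` chains left to right.
        self.set_downstream(other)
        return other

# Same task names as in the question's DAG
merge_s3   = Node('merge_s3_match_step')
check_s3   = Node('watch_merge_s3')
mapreduce  = Node('mapreduce_match_step')
check_mr   = Node('watch_mapreduce')
merge_hdfs = Node('merge_hdfs_step')
check_hdfs = Node('watch_merge_hdfs')

# Each add-steps task is gated on the sensor of the previous one
merge_s3 >> check_s3 >> mapreduce >> check_mr >> merge_hdfs >> check_hdfs
```

With real operators the same line works verbatim, since Airflow tasks support `>>`; the key difference from the question's code is that each downstream EmrAddStepsOperator depends on the previous *sensor*, not on the previous add-steps task.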

That totally works, thank you! – davideberdin