In order to use the production image we are already working on a Helm chart, but we might also want to add a production-ready Docker Compose file that is able to run an Airflow installation. For local tests and small deployments, being able to have such a docker-compose environment would be really nice.

We seem to have reached consensus that we need several docker-compose "sets" of files. They should be varianted, making it possible to specify a number of parameters:

- Executor: CeleryExecutor, and possibly the Kubernetes Executor (do we need a Kubernetes Executor in a Compose file? I guess not).
- Database: Postgres, SQLite, or MySQL.
- Broker: Redis or RabbitMQ (should we choose one?).

Depending on the setup, those Docker Compose files should do proper DB initialisation.

Below is an example Docker Compose (the source link is missing from the original post) that we might use as a base, together with #8548. A few values did not survive either and are marked with placeholder comments:

```yaml
version: "3.7"

x-airflow-environment: &airflow-environment
  AIRFLOW__CORE__EXECUTOR: CeleryExecutor
  AIRFLOW__WEBSERVER__RBAC: "True"
  AIRFLOW__CORE__LOAD_EXAMPLES: "False"
  AIRFLOW__CELERY__BROKER_URL: ""        # value lost; typically points at the redis service
  AIRFLOW__CORE__SQL_ALCHEMY_CONN: ""    # value lost; typically points at the postgres service

services:
  postgres:
    image: postgres:9.6
    container_name: af_postgres
    environment:
      POSTGRES_USER: airflow
      POSTGRES_DB: airflow
      POSTGRES_PASSWORD: airflow

  redis:
    image: redis    # image tag lost

  # One-shot job: initialise the DB and create the admin user on first run.
  # The email passed to -e was lost; airflow@example.com is a placeholder.
  init:
    image: airflow:1.10.10
    user: airflow
    environment:
      <<: *airflow-environment
    depends_on:
      - postgres
      - redis
    command: bash -c "airflow list_users || (airflow initdb && airflow create_user --role Admin --username airflow --password airflow -e airflow@example.com -f airflow -l airflow)"
    restart: on-failure

  webserver:
    image: airflow:1.10.10
    user: airflow
    environment:
      <<: *airflow-environment
    volumes:
      - ./dags:/opt/airflow/dags
    command: webserver
    ports:
      - "8080:8080"    # port mapping lost
    healthcheck:
      test: ["CMD-SHELL", "true"]    # actual check command lost
      interval: 30s
      timeout: 30s
      retries: 3
    restart: always

  scheduler:
    image: airflow:1.10.10
    user: airflow
    environment:
      <<: *airflow-environment
    volumes:
      - ./dags:/opt/airflow/dags
    command: scheduler
    restart: always

  worker:
    image: airflow:1.10.10
    user: airflow
    environment:
      <<: *airflow-environment
    depends_on:
      - redis
    volumes:
      - ./dags:/opt/airflow/dags
    command: worker
    restart: always

  flower:
    image: airflow:1.10.10
    user: airflow
    environment:
      <<: *airflow-environment
    volumes:
      - ./logs:/opt/airflow/logs
    command: flower
    restart: always
```

This is just an example, so this issue will not implement all of it. We will likely split those docker-compose files into separate Postgres/SQLite/MySQL variants, similarly to what we do in the CI scripts; I wanted to keep that as a separate issue, and we will deal with user creation in #8606.
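To make the "variant sets" idea concrete, here is a minimal sketch of how layered Compose files could be selected at run time. The file names (`docker-compose.yaml` as the base, `docker-compose.mysql.yaml` as a database variant) are hypothetical; the merging behaviour of repeated `-f` flags is standard docker-compose.

```bash
# Bring up the base stack (Postgres + Redis, as defined above).
docker-compose -f docker-compose.yaml up -d

# Layer a hypothetical MySQL variant over the base; keys defined in
# later files override the same keys in earlier ones.
docker-compose -f docker-compose.yaml -f docker-compose.mysql.yaml up -d
```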
Quick start documentation is planned in #8542. I recently added quick start guides to the official Apache Airflow documentation. Unfortunately, that guide has not been released yet; for now you can use the development version, and when a stable version is released it will be very easy for you to migrate.

A second, larger example from the original post runs three Celery workers against apache/airflow:1.10.10, with the shared settings factored into YAML anchors (again, values that were lost are marked):

```yaml
x-database-env: &database-env
  POSTGRES_USER: airflow
  POSTGRES_DB: airflow
  POSTGRES_PASSWORD: airflow

x-airflow-env: &airflow-env
  AIRFLOW__CORE__EXECUTOR: CeleryExecutor
  AIRFLOW__WEBSERVER__RBAC: "True"
  AIRFLOW__CORE__CHECK_SLAS: "False"
  AIRFLOW__CORE__STORE_SERIALIZED_DAGS: "False"
  AIRFLOW__CORE__PARALLELISM: 50
  AIRFLOW__CORE__LOAD_EXAMPLES: "False"
  AIRFLOW__CORE__LOAD_DEFAULT_CONNECTIONS: "False"
  AIRFLOW__SCHEDULER__SCHEDULER_HEARTBEAT_SEC: 10
  AIRFLOW__CORE__FERNET_KEY: FB0o_zt4e3Ziq3LdUUO7F2Z95cvFFx16hU8jTeR1ASM=
  # Restart workers every 30 minutes instead of every 30 seconds.
  AIRFLOW__WEBSERVER__WORKER_REFRESH_INTERVAL: 1800
  REDIS_HOST: redis
  REDIS_PORT: 6379

services:
  postgres:
    image: postgres    # image tag lost
    environment:
      <<: *database-env
    command: postgres -c listen_addresses=* -c logging_collector=on -c log_destination=stderr -c max_connections=200

  redis:
    image: redis    # image tag lost

  webserver:
    image: apache/airflow:1.10.10
    environment:
      <<: [*database-env, *airflow-env]
      ADMIN_PASSWORD: airflow
    volumes:
      - ./airflow-data/dags:/opt/airflow/dags
      - ./airflow-data/plugins:/opt/airflow/plugins
    command: webserver
    ports:
      - "8080:8080"    # port mapping lost

  scheduler:
    image: apache/airflow:1.10.10
    container_name: airflow_scheduler_cont
    environment:
      <<: [*database-env, *airflow-env]
    volumes:
      - ./airflow-data/dags:/opt/airflow/dags
      - ./airflow-data/plugins:/opt/airflow/plugins
      - /var/run/docker.sock:/var/run/docker.sock
    command: scheduler

  airflow-worker1:
    image: apache/airflow:1.10.10
    container_name: airflow_worker1_cont
    environment:
      <<: [*database-env, *airflow-env]
    volumes:
      - ./airflow-data/dags:/opt/airflow/dags
      - ./airflow-data/plugins:/opt/airflow/plugins
    command: worker

  airflow-worker2:
    image: apache/airflow:1.10.10
    container_name: airflow_worker2_cont
    environment:
      <<: [*database-env, *airflow-env]
    volumes:
      - ./airflow-data/dags:/opt/airflow/dags
      - ./airflow-data/plugins:/opt/airflow/plugins
    command: worker

  airflow-worker3:
    image: apache/airflow:1.10.10
    container_name: airflow_worker3_cont
    environment:
      <<: [*database-env, *airflow-env]
    volumes:
      - ./airflow-data/dags:/opt/airflow/dags
      - ./airflow-data/plugins:/opt/airflow/plugins
    command: worker

  flower:
    image: apache/airflow:1.10.10
    environment:
      <<: [*database-env, *airflow-env]
    command: flower
    ports:
      - "5555:5555"    # port mapping lost
```

The original post also captures a local Airflow 2 session, from a checkout of Hankehly's airflow-2-docker-example (the requirements path passed to `-r` was truncated):

```console
$ pip install --user --no-cache-dir --no-warn-script-location -r ...
Hankehly ~/src/airflow-2-docker-example $ poetry run airflow db init
```

Separately, an issue report appended to the post:

Apache Airflow version: 2.0.1. Kubernetes version (if you are using Kubernetes, use kubectl version): v1.17.4. Environment: Dev. OS (e.g. from /etc/os-release): RHEL7. What happened: after running fine for some time, my airflow tasks got stuck.
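Note that the `initdb`, `list_users`, and `create_user` commands used in the 1.10 compose files were renamed in Airflow 2; the `airflow db init` call in the session above is the new form. A minimal sketch of the equivalents (the email value is a placeholder):

```bash
# Airflow 1.10: airflow initdb
airflow db init

# Airflow 1.10: airflow list_users
airflow users list

# Airflow 1.10: airflow create_user
airflow users create \
    --role Admin --username airflow --password airflow \
    --email airflow@example.com --firstname airflow --lastname airflow
```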