Questions tagged with Amazon Managed Workflows for Apache Airflow (MWAA)
Hello team,
I have a use case to read data from an on-premises table to S3 using MWAA. Once the read completes, I need to update a flag in the on-premises table (a typical Oracle update records from Glue...
Hi,
I have been trying to establish a connection from MWAA to Snowflake. As part of this, I have the requirements.txt file below. When I updated the MWAA environment with the requirements file, I could see from...
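As context for questions like this one: on MWAA, the Snowflake provider is typically added through the environment's requirements.txt, and AWS recommends pinning against the Airflow constraints file. A minimal sketch (the Airflow and Python versions in the constraint URL are assumptions and must match your environment):

```
--constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.7.2/constraints-3.11.txt"
apache-airflow-providers-snowflake
```

After uploading a new requirements.txt version, the environment must be updated to point at it before the provider becomes importable in DAGs.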
Hi!
I have an MWAA environment running Airflow 2.2.2,
and I also use the provider "apache-airflow-providers-amazon" version 2.4.
I'm launching the tasks using FARGATE, but all are launched with the capacity...
While deploying MWAA in the AWS Asia Pacific (Seoul) Region, the following error occurred:
"The limit of 0 environments has been reached."
I have never deployed MWAA in that region, so why am I getting this...
I previously created an Airflow environment in sa-east-1, and now I want to create an Airflow environment in us-east-1.
But when I try to create the Airflow environment I see the message:
"The limit of 0...
**Background:**
I am trying to build a system with MWAA to pull data from various databases on different schedules.
E.g.:
- workFlow1 pulls data from mysqldb1 every hour
- workFlow2 pulls data from...
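The "one DAG per source, each on its own schedule" setup this question describes is usually handled with dynamic DAG generation: loop over a table of source configs and register one DAG per entry. A minimal sketch of the pattern (the source names beyond workFlow1 are hypothetical; a plain dict stands in for `airflow.DAG` so the sketch is self-contained):

```python
# Sketch: one workflow definition per source, each with its own schedule.
SOURCES = [
    {"name": "workFlow1", "conn_id": "mysqldb1", "schedule": "@hourly"},
    {"name": "workFlow2", "conn_id": "mysqldb2", "schedule": "@daily"},  # hypothetical
]

def build_dag(source: dict) -> dict:
    """Build a per-source DAG definition (stand-in for airflow.DAG(...))."""
    return {
        "dag_id": f"pull_{source['name']}",
        "schedule": source["schedule"],
        "conn_id": source["conn_id"],
    }

dags = {d["dag_id"]: d for d in map(build_dag, SOURCES)}
print(sorted(dags))  # -> ['pull_workFlow1', 'pull_workFlow2']
```

In a real MWAA DAG file you would additionally expose each generated DAG as a module-level global (e.g. `globals()[dag.dag_id] = dag`), since Airflow discovers DAGs by scanning module globals.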
Hi, I'm creating the PythonVirtualenvOperator like this:
```
virtualenv_task = PythonVirtualenvOperator(
    task_id="virtualenv_python",
    dag=dag,
    op_args=redshift_con,
    ...
```
![MWAA Issue](/media/postImages/original/IMhlVGa4RAQTCo4Ri1wNwtLA)
When trying to create an Amazon MWAA environment, it insists I haven't entered valid numbers for the max and min worker counts. Can anyone...
I'm running a DAG every minute that gets exchange rates from an API, puts them in a CSV and uploads them to an S3 bucket.
However, intermittently I get this error: FileNotFoundError: [Errno 2] No...
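A frequent cause of an intermittent `FileNotFoundError` in a high-frequency DAG like this is two runs (or two tasks on different workers) sharing one hard-coded local path, with one run deleting the file before the other uploads it. One sketch of a fix, assuming the CSV is written and uploaded within the same task (the rate values are hypothetical):

```python
import csv
import os
import tempfile
import uuid

def write_rates_csv(rates: dict) -> str:
    """Write exchange rates to a uniquely named temp file and return its path.

    A per-run unique filename in the worker's temp dir avoids collisions
    between overlapping runs. The S3 upload should then happen in the same
    task, e.g. via S3Hook(...).load_file(filename=path, ...) (not shown).
    """
    path = os.path.join(tempfile.gettempdir(), f"rates_{uuid.uuid4().hex}.csv")
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["currency", "rate"])
        writer.writerows(rates.items())
    return path

path = write_rates_csv({"EUR": 0.92, "JPY": 151.3})  # hypothetical rates
```

Writing and uploading in one task also avoids relying on local disk surviving between tasks, which MWAA does not guarantee across workers.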
Hi Team,
All the examples I am seeing pass application configuration overrides as a dictionary. How do I pass this value from an S3 path? This configuration will be pushed from Airflow.
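One pattern for questions like this is to fetch the JSON document from S3 at task time and pass the resulting dict to the operator, rather than expecting the operator itself to accept an S3 path. A minimal sketch (the bucket and key are hypothetical; the boto3 call requires AWS credentials, so only the URI parsing is exercised here):

```python
import json
from urllib.parse import urlparse

def parse_s3_uri(uri: str):
    """Split an s3://bucket/key URI into (bucket, key)."""
    parsed = urlparse(uri)
    if parsed.scheme != "s3" or not parsed.netloc:
        raise ValueError(f"not an S3 URI: {uri!r}")
    return parsed.netloc, parsed.path.lstrip("/")

def load_overrides_from_s3(uri: str) -> dict:
    """Fetch a JSON configuration-overrides document from S3."""
    import boto3  # available on MWAA workers
    bucket, key = parse_s3_uri(uri)
    body = boto3.client("s3").get_object(Bucket=bucket, Key=key)["Body"].read()
    return json.loads(body)

bucket, key = parse_s3_uri("s3://my-config-bucket/emr/overrides.json")  # hypothetical path
```

The dict returned by `load_overrides_from_s3` can then be passed wherever the examples use an inline dictionary.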
Hi Team,
We are invoking our EMR Serverless jobs using the Airflow EMR job submit operator. The EMR application is configured with a set of Spark default runtime configurations. While invoking the job from...
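For context on questions like this: with EMR Serverless, per-job Spark settings are passed in the `jobDriver` payload of `StartJobRun` (which the Airflow operator accepts as `job_driver`), and as I understand it these take precedence over the application-level Spark defaults. A minimal sketch of building that payload (the script location and conf value are hypothetical):

```python
def make_job_driver(entry_point: str, extra_spark_conf: dict) -> dict:
    """Build the jobDriver payload for an EMR Serverless StartJobRun call.

    Per-job --conf flags set here override the Spark defaults configured
    on the EMR Serverless application itself.
    """
    conf_flags = " ".join(f"--conf {k}={v}" for k, v in extra_spark_conf.items())
    return {
        "sparkSubmit": {
            "entryPoint": entry_point,
            "sparkSubmitParameters": conf_flags,
        }
    }

driver = make_job_driver(
    "s3://my-bucket/jobs/etl.py",          # hypothetical script location
    {"spark.executor.memory": "4g"},
)
```

Application-wide defaults, by contrast, belong in the application's configuration (or in `configurationOverrides` on the job), not in every job driver.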
Hi all,
I'm currently working with MWAA version 2.8.1 (latest), upgraded from 2.7.2, and encountered an issue while trying to use the KubernetesPodOperator class in a Python DAG. I'm facing...