Comments:
BUN!
Nice tutorial, thank you for your effort. Learned a lot.
This tutorial was really useful for understanding how Airflow works. Thanks, you are great!
thanks! also some intro to Docker and Postgres would be good
This doesn't work on Windows?
And going through all the steps as you did for the webserver part, it gives an error even when using ports 8081 and 8082.
I'm stuck at around 8 minutes. I've followed the steps, but when I open the webserver at port 8080 I get the following error: sqlite3.OperationalError: no such table: session. When I inspect airflow.db there's no session table in the DB. I've tried the following commands without any luck: airflow db migrate and airflow db upgrade. Has anyone encountered the same problem as me?
Edit:
Airflow version: 2.7.3
Python version: 3.9.11
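One way to diagnose this (a sketch, not from the video; it assumes the default AIRFLOW_HOME of ~/airflow): list the tables actually present in airflow.db. If session is missing, moving the file aside and re-running airflow db migrate rebuilds every table, at the cost of local run history.

```shell
# Check whether the 'session' table exists in the local SQLite metadata DB.
# Assumes the default AIRFLOW_HOME of ~/airflow; adjust the path if yours differs.
python3 - <<'EOF'
import os, sqlite3
db = os.path.expanduser("~/airflow/airflow.db")
if not os.path.exists(db):
    print("no airflow.db found at", db)
else:
    with sqlite3.connect(db) as conn:
        tables = {row[0] for row in conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table'")}
    print("session table present:", "session" in tables)
EOF
# If the table is missing, rebuild the DB (WARNING: wipes local run history):
#   mv ~/airflow/airflow.db ~/airflow/airflow.db.bak
#   airflow db migrate
```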
very useful guide for beginners
thank u much
After assigning the port I get the error "ModuleNotFoundError: No module named 'pwd'". I found that Windows machines don't have that module, so what should I do?
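For anyone hitting this: Python's pwd module (the Unix user database) only exists on Unix-like systems, and Airflow imports it at startup, so a native Windows install fails. Airflow does not support native Windows; the usual workarounds are WSL2 or the Docker setup. A quick check (use `python` instead of `python3` if that's how your interpreter is named):

```shell
# "No module named 'pwd'" means the Unix-only pwd module is unavailable,
# which is the case on native Windows. This shows which situation you're in:
if python3 -c "import pwd" 2>/dev/null; then
  echo "pwd module available: native 'airflow webserver' can run here"
else
  echo "pwd module missing: run Airflow under WSL2 or Docker instead"
fi
```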
I'm completely new to using airflow and this video has been so good to get started with the key concepts and use cases! Very clear explanations, concise information and brilliantly broken down into logical modules. Thank you for creating this content for us :)
How do I change the python version once I am in the directory?
And I am getting the following error message:
"ERROR: Could not build wheels for google-re2, which is required to install pyproject.toml-based projects"
I am not able to connect to airflow can anyone help with that
I literally followed each step, and I don't have the example DAGs, which makes me fear that other components (like hooks or operators) weren't properly installed.
Any idea what it could be?
I have load_examples = True in airflow.cfg.
Thanks in advance.
Amazing tutorial
Your voice is very quiet; please speak louder, dear.
This tutorial is very different when you use Windows OS to run it :(
Can anyone help? How can I solve the pwd module error on Windows?
Hi, I have a few questions, slightly confused here.
I tried to run the command "docker-compose up -d" and it didn't work, so I launched Docker Desktop first, then ran the command again, and it worked. My question: is the Docker daemon essentially a process (which is initiated by launching Docker Desktop)?
Secondly, when I launched Docker Desktop and looked under my Containers tab, it had 6 of 7 containers running. Would running "docker-compose up -d" again only run them in the background, or would it cause the application to run extra containers?
Would love some clarifications if possible.
Cheers!
Hello, great tutorial!!
I have a little problem with the Postgres part.
I get: FATAL: password authentication failed for user "airflow"
I have followed the video step by step, so I don't think I missed anything.
If you could provide any help it would be appreciated :) Cheers!
Has anyone else had the "No module named 'pwd'" error after airflow webserver -p 8080?
the example dags are not showing in the UI for me, any reason why?
coder2j: "BOOOOM" also, great tutorial, I'm learning a lot with this.
I'm using a Mac, but it still shows the warning below while running docker-compose up airflow-init. Should I run echo -e "AIRFLOW_UID=$(id -u)" > .env? If yes, why? Please advise.
WARN[0000] The "AIRFLOW_UID" variable is not set. Defaulting to a blank string.
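A minimal sketch, assuming the official Airflow docker-compose.yaml: the compose file reads AIRFLOW_UID from a .env file next to it so that files the containers create in ./dags and ./logs are owned by your host user. On Linux that matters; on macOS Docker Desktop handles file ownership, so the blank/default value is harmless, but creating .env still silences the warning.

```shell
# Create the .env file next to docker-compose.yaml so the warning goes away.
# On Linux this matters: the containers write files into ./dags and ./logs
# owned by this UID. On macOS the default is harmless; setting it just
# silences the warning. Plain 'echo' is enough; the '-e' flag is not needed.
echo "AIRFLOW_UID=$(id -u)" > .env
cat .env    # should show AIRFLOW_UID=<your numeric user id>
```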
Thank you, this was really informative. Other sources didn't give me information as good as this video did.
Simply put your video is really good! Thank you for making it.
Just finished this tutorial, took me about a week plus I made changes to the dags to fully understand it.
It is maybe a little bit outdated; I was running the latest version of Airflow, 2.6.3, but it is still a very good tutorial. The only real difference is that the tree view is deprecated. The only challenge I had was using MinIO S3, as the S3 connection is also deprecated. I modified the code to use Azure Blob Storage instead and I'm pretty happy. Great tutorial, thank you for this.
Also, you're at 5.9k likes; where is the video on the docker operator @coder2j??
Great explanation!
Mr.GOAT
well done, it was an awesome session. Thank you
you got 5000+ likes. Please create a video on the Airflow docker operator.
I have proof that Google is still using Apache 2.0 on Microsoft Azure.
Wow this overview is great. Thanks so much for doing this.
Can I follow these steps for installing Airflow on Windows?
This is amazing. I wasn't expecting a tutorial like this. Thank you <3
This video is very very awesome. I appreciate your hard work. Thank you very much.
I just love the keyboard sound, oof. Better than having background music.
Awesome tutorial! Thank you
Brilliant tutorial! Thank you a lot!
I'm using Windows and I can't find any bin/activate
thank you!!
great job, thanks for the content man👍
Thank you, it's very useful!
I performed all of your procedure, but when I log in to Apache Airflow none of the 26 standard DAGs are there. Can you tell me why?
Why don't my Airflow DAGs have example files like yours? Where can I download them?
Thank you for sharing the airflow concepts. I was looking for a way to use the GCP Secret Manager SA connection. I have a few questions regarding the Secret Manager service account in the Airflow connection. My end goal is to deploy DAGs into Composer that use a different service account than the current environment cluster SA. In my scenario we have multiple tasks/DAGs (using PythonOperator) that need to use our Secret Manager SA connection. I have created the Secret Manager service account in the Airflow connection (GCP connection). Question: How do we use the created connection to run the DAGs? Can we add the connection to default_args? If yes/no, how can we reference this connection from other tasks?
If possible, can you please help me out? Many thanks.
Thanks for sharing