Comments:
Amazing content! Thanks
Thanks for your videos, I find them helpful. I was able to get a message published by a Python script to Pub/Sub written into the data column of a BigQuery table simply by creating a BigQuery subscription on the same topic, without using Dataflow. Since the topic is schemaless, it receives whatever the Python script publishes. My question is: is there a way to populate a BigQuery table using the same schema received in Pub/Sub?
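The setup described above (Python publisher → schemaless topic → BigQuery subscription) can be sketched roughly as below. This is a hedged illustration, not the commenter's actual script: the table columns and row values are made-up placeholders, and the real `publisher.publish` call (from `google-cloud-pubsub`) is left commented out since it needs GCP credentials.

```python
import json

# Hypothetical columns of the target BigQuery table. With a schemaless
# topic, nothing enforces this shape, so validating before publishing
# is one way to keep messages and table in sync.
TABLE_COLUMNS = {"id", "name", "ts"}

def make_payload(row: dict) -> bytes:
    """Check a row against the table columns, then serialize it into
    the UTF-8 JSON bytes a Pub/Sub message carries in its data field."""
    unknown = set(row) - TABLE_COLUMNS
    if unknown:
        raise ValueError(f"columns not in table: {sorted(unknown)}")
    return json.dumps(row).encode("utf-8")

payload = make_payload({"id": 1, "name": "alice", "ts": "2024-01-01T00:00:00Z"})
# publisher.publish(topic_path, payload)  # real publish needs google-cloud-pubsub
```

To have the subscription map fields to table columns (rather than dumping everything into the data column), the topic generally needs a schema attached and the BigQuery subscription configured to use it.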
Awesome explanation, exactly what I wanted to know. If I migrate from the Control-M workload automation tool to GCP, how would I connect Control-M to Pub/Sub?
Hi, thanks for the great, informative video. Can you explain the flow if the data source is a REST API? Can Dataflow be configured to extract from a REST API into BigQuery without involving Cloud Functions or Apache Beam scripts? Thanks a lot in advance.
Hey bro, thanks for the video. I have an ETL process running on a VM using Docker and Kafka, and the data lands in BigQuery as soon as I run the producer and consumer manually. I wanted to use Cloud Composer to automate this (so the ETL process starts automatically, e.g. whenever I log in to my VM), but I couldn't. Can you tell me if it's possible to do this with Dataflow? I'm having trouble setting it up.
Hey, thx for teaching, good explanation. I want to ask a stupid question > <
Why not send data directly to BigQuery? (only 1 step)
versus sending to Pub/Sub => Dataflow => BigQuery (3 steps). Thx!!!
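The "one step" alternative asked about above does exist: the BigQuery client library's streaming insert (`insert_rows_json` in `google-cloud-bigquery`) writes rows directly. What the extra Pub/Sub and Dataflow steps buy is buffering, transformation, and retry handling; going direct means doing at least the batching yourself. A minimal, stdlib-only sketch of that batching part (the insert call itself is a commented-out stub, since it needs GCP credentials; table and row names are invented):

```python
def batches(rows, size=500):
    """Split rows into chunks of at most `size`, since streaming
    inserts are sent a bounded number of rows per request."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

rows = [{"id": n} for n in range(1200)]  # placeholder rows
chunks = list(batches(rows))
# for chunk in chunks:
#     client.insert_rows_json("my-project.my_dataset.my_table", chunk)
```

With 1200 rows and a chunk size of 500, this yields three chunks of 500, 500, and 200 rows.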