Comments:
Hi, I have a scenario like this: a file moves from one folder to another, then the file is processed (loaded to Snowflake), and then it is archived (moved again from the source folder to an Archive folder). I'm using GetFile and PutFile with flowfiles. How do I make each step depend on the previous one, so they run conditionally one after the other?
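Outside NiFi, the ordering the comment asks about (move, then load, then archive, each step only after the previous one succeeds) can be sketched in plain Python. This is only an illustration of the dependency chain, with made-up folder names and a placeholder load step; in NiFi itself the same ordering is expressed by chaining processors through their success relationships.

```python
import shutil
from pathlib import Path

# Hypothetical folder layout; adjust to your environment.
SOURCE = Path("source")
STAGING = Path("staging")
ARCHIVE = Path("archive")

def load_to_snowflake(path: Path) -> None:
    # Placeholder for the real load step (e.g. a Snowflake PUT + COPY INTO).
    print(f"loading {path.name}")

def process_file(name: str) -> None:
    staged = STAGING / name
    shutil.move(str(SOURCE / name), str(staged))   # step 1: move source -> staging
    load_to_snowflake(staged)                      # step 2: load, only after the move
    shutil.move(str(staged), str(ARCHIVE / name))  # step 3: archive, only after the load
```

If any step raises, the later steps never run, which is the same guarantee a success-relationship chain gives you in a flow.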
Hi Steven, thank you a lot for what you've shown! One question: how do I fill in the Catalog correctly?
"Input requirement: This component DOES NOT ALLOW an incoming relationship." Is it possible to start this processor after another one?
Sir, your videos are amazing!
Hello Steven, thank you for these videos. I have used NiFi for several months. I can easily handle migrating tables that contain 5-20 million rows. Now I am loading a big transaction table that contains 550 million rows. Generally I prefer QueryDatabaseTable > PutDatabaseRecord for small migrations, and GenerateTableFetch > ExecuteSQL > PutDatabaseRecord for medium-sized tables (5-20 million rows). I can't use the first method because I get heap-size memory errors, even though I tried limiting the maximum rows per flowfile. For a very big table like 550 million rows, I can use parallel flows where GenerateTableFetch creates different partitions according to a WHERE condition and the order-by or maximum-value column parameters. This method works, but I'm trying to find much faster methods. Maybe you can give me advice on handling a very large table migration from PostgreSQL to VerticaDB without any transformations.
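The partitioning idea in the comment above (GenerateTableFetch splitting a huge table into ranged queries that parallel flows can fetch independently) can be sketched as plain query generation. This is a hedged illustration with hypothetical table and column names, not the actual SQL GenerateTableFetch emits:

```python
def partition_queries(table: str, id_col: str, min_id: int,
                      max_id: int, partition_size: int) -> list[str]:
    """Generate non-overlapping ranged SELECTs over an id column,
    so several fetchers can pull partitions of a big table in parallel."""
    queries = []
    low = min_id
    while low <= max_id:
        high = min(low + partition_size - 1, max_id)
        queries.append(
            f"SELECT * FROM {table} WHERE {id_col} BETWEEN {low} AND {high}"
        )
        low = high + 1
    return queries
```

Each generated query can then be run by a separate ExecuteSQL instance (or concurrent task), which is roughly how the parallel-flow approach described above spreads a 550M-row table across workers.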
Can you please put together a video on getting data from MySQL, publishing it to Kafka, consuming it, and then putting it in S3?
Good afternoon. I managed to perform all the steps successfully! Thank you very much. Just one question: is there a way to send only the new records to the data warehouse, without needing to send all the records again, to avoid duplicating information?
Hello, thanks for the training videos. How do you keep the QueryDatabaseTable processor from reading the old events over and over again?
I'm glad that I've found your video. Sir, would you please let me know how to transform the data? I basically want to do some kind of masking or update before inserting the data into the destination database. Thanks in advance.
Nice explanations, thank you! Could you demonstrate how I can do this for updates too? In this case an incremental load works very well, but how can I capture the updates as well? Thanks.
Hello! I'm trying to configure data selection from Hive (serialization type = JSON) into Oracle. In the PutDatabaseRecord config I need to select a Record Reader; I selected JsonPathReader, but the state of the JsonPathReader is Invalid. What could be wrong?
Thank you so much, I am using your videos to learn NiFi. I am getting java.sql.SQLDataException: None of the fields in the record map to the columns defined, and I still couldn't fix the issue.
Hi Steven,
Thanks a lot for the NiFi videos; they've been VERY helpful. I'm running into a challenge with the QueryDatabaseTable processor. My incremental load relies on two columns, an ID and an update_time column. When I put these two columns in the Maximum-value Columns property, the generated polling query has a WHERE clause with an AND between the two column conditions, which results in losing some of the changes. I would like to change that to an OR: I'd like my WHERE clause to be WHERE ID > {MAX_ID} OR update_time > {MAX_Update_time}. Any idea how to achieve that?
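The data-loss the comment above describes is easy to see with made-up data: an AND between the two max-value conditions drops rows that moved past only one of the two thresholds. A small Python sketch of the filtering logic (illustrative sample rows and thresholds, not NiFi behavior verbatim):

```python
from datetime import datetime

# Hypothetical change rows: (id, update_time)
rows = [
    (100, datetime(2023, 1, 1)),  # previously seen id, freshly updated
    (205, datetime(2022, 6, 1)),  # new id, old timestamp
    (210, datetime(2023, 2, 1)),  # new id, fresh timestamp
]

# State captured from the previous poll.
max_id, max_ts = 200, datetime(2022, 12, 31)

# An AND between the two max-value conditions only keeps rows
# that advanced past BOTH thresholds, missing two kinds of change:
picked_and = [r for r in rows if r[0] > max_id and r[1] > max_ts]

# The OR condition the comment asks for catches all three:
picked_or = [r for r in rows if r[0] > max_id or r[1] > max_ts]
```

One hedged workaround, since the Maximum-value Columns behavior is fixed: run a custom query through ExecuteSQL and track the two high-water marks yourself, building the OR clause into the query.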
Awesome video, I like this tutorial. I have one doubt: if I have multiple tables to read and insert, how can we achieve that?
Hi Steven,
Thanks for the videos; very helpful information.
Oh boy oh boy Steven, I have no words to describe how useful your videos have been. We were quoted $4,400/month by a company for an AWS solution that I am now able to run on a single $100/month server with NiFi, with capacity to spare, and a second $100/month server ready for DR. Sure, there are still lots of things to do and learn, but the numbers speak for themselves!
Your presentation style and pace are great; those 26 minutes went by fast! Looking forward to learning some more! Cheers.