Azure Data Factory - Partition a large table and create files in ADLS using copy activity


All About BI !

1 year ago

9,842 views



Comments:

@payalkalantri7525 - 19.07.2023 12:42

Just a small tip: for all kinds of I/O errors and connection timeout errors, you can set CONNECTIONTIMEOUT in your DB connection string. This works for ADF.

Reply
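For context, a minimal sketch of what the commenter is suggesting: an Azure SQL linked service whose connection string carries an explicit timeout. The server, database, and the 120-second value are placeholders; note that the exact keyword spelling varies by driver ("Connection Timeout" or "Connect Timeout" in SQL Server connection strings), so check your driver's documentation rather than relying on the CONNECTIONTIMEOUT spelling from the comment.

```json
{
  "name": "AzureSqlLinkedService",
  "properties": {
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": "Server=tcp:myserver.database.windows.net,1433;Database=mydb;Connection Timeout=120;"
    }
  }
}
```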
@payalkalantri7525 - 19.07.2023 11:51

Very nice explanation; however, partitioning does not help with I/O exception errors in ADF.

Reply
@Kishyist - 28.03.2023 09:42

How do we keep the filename prefix in the sink, like filename1_ProductID_NA_760_0.txt? Also, in your example, how did you get sql_SalesLT_Product_ProductID_NA_760_0.txt? Could you share the sink details you used?

Reply
@scarabic5937 - 09.03.2023 19:42

Thank you for your awesome tutorials. Could you provide the sink parameters, please, as they are not shown in the video?

Reply
@mohitarora792 - 07.02.2023 10:12

What settings did you use in the sink to get multiple parallel writes?

Reply
@Thegameplay2 - 19.01.2023 12:00

When it comes to MPP systems, it is not recommended to have more than 60 partitions.

Reply
@nr3807 - 09.08.2022 06:13

Can you please make a video for a source system like DB2, where these options don't show up in the copy activity? If we have a very big table, what would be the better approach in that scenario?

Reply
@user-hd9lk6qx4u - 07.08.2022 13:52

Nice, ma'am

Reply