30.Access Data Lake Storage Gen2 or Blob Storage with an Azure service principal in Azure Databricks

WafaStudies

2 years ago

39,977 views

Comments:

govardhanreddy samireddy - 28.08.2023 07:07

The videos on connecting to Data Lake Storage are confusing. Why should we use a service principal when we can access the storage through Azure Key Vault directly?
goSmart - 13.08.2023 18:28

Thank you so much for your video. It was much-needed help.
MBA_ANALYST - 25.05.2023 13:50

♥♥
Ramu M - 03.01.2023 06:24

I want to know what the benefit is of using this service principal ID, name, and value with the OAuth function, when we can access files from Blob Storage directly using just a secret scope. Is there any advantage to this?
Bhromon India - 27.10.2022 12:42

Love it, very well explained.
mallik_ CMC - 15.10.2022 19:51

The content explanation is very nice, but one suggestion: use proper naming conventions. That will give new users a better understanding.
Ravikumar kumashi - 28.09.2022 01:14

Very crisp and clean. Thank you for this video.
Ishaan Gupta - 25.08.2022 12:03

Hi, one question: don't you have to mount the file system again using these Azure service principal configurations? I think you are able to read the data because your storage is already mounted via direct access keys.
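For context on the mount question above: mounting is a separate access pattern from the session-level configuration shown in the video. A minimal sketch of mounting ADLS Gen2 with service-principal credentials follows, with all account, container, and credential names being hypothetical placeholders:

```python
# Hypothetical placeholders throughout; in practice the secret would come
# from a Databricks secret scope rather than a literal string.
tenant_id = "<tenant-id>"

# Mount-style access: these configs are passed once to dbutils.fs.mount and
# the mount persists in the workspace until explicitly unmounted, unlike
# session-scoped spark.conf.set settings.
mount_configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-client-id>",
    "fs.azure.account.oauth2.client.secret": "<client-secret>",
    "fs.azure.account.oauth2.client.endpoint":
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
}

def mount_container(dbutils, container, storage_account, mount_point):
    """Mount an ADLS Gen2 container using the service-principal configs.
    Only needed for mount-style (/mnt/...) access; direct abfss:// reads
    work without any mount once session configs are set."""
    dbutils.fs.mount(
        source=f"abfss://{container}@{storage_account}.dfs.core.windows.net/",
        mount_point=mount_point,
        extra_configs=mount_configs,
    )
```

So no, re-mounting is not required for the direct-access method in the video; but if the storage was previously mounted with access keys, reads through that existing mount would indeed still be using those keys.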
Joe Stopansky - 31.07.2022 16:11

Thanks for the video; it is very informative. Using this method, do you need to execute the spark.conf.set() commands every time you restart the cluster? My guess is that you would, since you are only affecting the configs of this specific Spark session.
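The guess in the comment above is correct: spark.conf.set() is session-scoped, so the settings must be re-applied after each cluster restart (or moved into the cluster's Spark config to persist). A minimal sketch of the pattern, with hypothetical placeholder names:

```python
storage_account = "<storage-account>"   # hypothetical placeholder
tenant_id = "<tenant-id>"               # hypothetical placeholder
suffix = f"{storage_account}.dfs.core.windows.net"

# Session-scoped settings: they live only in the current Spark session and
# vanish on cluster restart, so this dict must be re-applied each time.
session_configs = {
    f"fs.azure.account.auth.type.{suffix}": "OAuth",
    f"fs.azure.account.oauth.provider.type.{suffix}":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    f"fs.azure.account.oauth2.client.id.{suffix}": "<application-client-id>",
    f"fs.azure.account.oauth2.client.secret.{suffix}": "<client-secret>",
    f"fs.azure.account.oauth2.client.endpoint.{suffix}":
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
}

def apply_session_configs(spark, configs):
    """Run at the top of the notebook after every cluster restart."""
    for key, value in configs.items():
        spark.conf.set(key, value)
```

Keeping the keys in a dict and applying them in a loop makes it easy to put this in a shared setup notebook that other notebooks %run first.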
Johnpaul Prathipati - 21.07.2022 04:50

Hi sir, I have been following every video in this Databricks playlist. Could you tell me how many more videos there will be to complete it?
bamidele james - 20.07.2022 18:10

Hi, can you cover how to set up a shared external Hive metastore to be used across multiple Databricks workspaces? The purpose is to be able to reference dev workspace data in a prod instance.
Sewa Studies - 19.07.2022 12:00

Out of 10 API calls made through ADF in Azure, 6 succeed and 4 fail. How do I get only the failed APIs?
Amol Parihar - 18.07.2022 14:47

Please make a video on the PolyBase and JDBC approaches.
Bollywood Badshah. - 17.07.2022 17:49

Thanks a lot, sir.
Dinesh Deshpande - 17.07.2022 17:03

Sir, how can we configure the Azure Databricks Hive metastore for an external ETL tool like Informatica? The purpose is to fetch data from Hive tables and use the Databricks engine for pushdown optimization to improve the performance of data fetching.