How to install (Latest) Apache Spark and Scala in Ubuntu

Sreyobhilashi IT

9 years ago

24,681 views

Comments:

@pradeepk4335 - 19.07.2022 11:20

Hi sir, the other videos in this playlist are hidden, please unlock them.

@anilkumarkrishnappa7775 - 03.07.2018 00:40

You have explained it very nicely... thank you so much... you saved me a lot of time.

@TheVikash620 - 14.03.2018 10:16

There was no need to install Scala all over again once the Scala distribution had already been downloaded. The probable reason Scala didn't start after running the 'scala' command is that you missed re-initializing the .bashrc file. This can be done with the command source ~/.bashrc, or by simply restarting the terminal.
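
A minimal sketch of what the .bashrc entries might look like, assuming the distribution was unpacked to /usr/local/scala (that path is an assumption; adjust it to wherever the tarball was extracted):

# assumed install location; change to the actual Scala directory
export SCALA_HOME=/usr/local/scala
export PATH=$PATH:$SCALA_HOME/bin

# reload .bashrc in the current shell instead of reopening the terminal
source ~/.bashrc
scala -version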

@growwitharosh5052 - 26.12.2017 07:13

Hi bro, can you please share the locations (file paths) of these parameters? I found only one.

spark.driver.cores
Path= spark-2.2.0-bin-hadoop2.7/conf/spark-defaults.conf
spark.driver.memory
Path=?
spark.executor.cores
Path=?
spark.executor.memory
Path=?
spark.reducer.maxSizeInFlight
Path=?
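
All of these properties can normally be set in the same spark-defaults.conf that already holds spark.driver.cores, or passed per job with --conf; a rough sketch (the values below are placeholders, not recommendations):

# append placeholder values to the same conf file used for spark.driver.cores
cat >> spark-2.2.0-bin-hadoop2.7/conf/spark-defaults.conf <<'EOF'
spark.driver.memory            1g
spark.executor.cores           2
spark.executor.memory          2g
spark.reducer.maxSizeInFlight  48m
EOF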

@da811208 - 29.09.2017 18:08

Hello, my versions are Spark 2.2.0 and Hadoop 2.7.2,
but when I use this instruction: SPARK_JAR=/usr/local/spark/lib/spark-assembly-2.2.0-hadoop2.7.2.jar HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop MASTER=yarn-client /usr/local/spark/bin/spark-shell
I get some errors. Could you tell me how to run Spark on YARN? Thank you~
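
Spark 2.x no longer ships a spark-assembly jar, so the SPARK_JAR variable is not needed; a rough sketch of a YARN client-mode launch, assuming the same /usr/local/spark and /usr/local/hadoop paths as above:

# point Spark at the cluster configuration
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
# Spark 2.x syntax: --master yarn replaces MASTER=yarn-client
/usr/local/spark/bin/spark-shell --master yarn --deploy-mode client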

@chandankumar-xs1fy - 04.08.2017 22:13

Dear Sir,

Spark no longer supports Java 1.7, so how can I install Spark with Hadoop 2.6?
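
Recent Spark releases (2.2 onward) need Java 8, and Hadoop 2.6 also runs on it, so upgrading the JDK is usually the simplest path; a rough sketch for Ubuntu, assuming the stock OpenJDK 8 packages are available:

# install OpenJDK 8 and point JAVA_HOME at it
sudo apt-get update
sudo apt-get install -y openjdk-8-jdk
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
java -version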

@pankajkumar-kk8ox - 28.06.2017 23:29

Sir, I did what you explained in this tutorial and ran the spark-shell command. After this I opened ( hadoop:4040/jobs/ ) in my browser and it shows "The connection has timed out"... please help, I'm in big trouble...
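
The UI on port 4040 is served by the driver only while spark-shell is still running, and the "hadoop" part of the URL must resolve to that machine; a quick check from the host running the shell (assuming it is still open in another terminal) could be:

# the Spark UI lives on the driver for the lifetime of the shell session
curl -I http://localhost:4040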

@srinidhiskanda754 - 30.04.2016 18:03

Hi, nice tutorial, thanks. I followed every step, but when I finally run spark-shell it shows "command not found". What went wrong?
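
"command not found" usually means the Spark bin directory is not on the PATH; a minimal sketch, assuming Spark was unpacked to /usr/local/spark (adjust the path to the actual location):

# add these two lines to ~/.bashrc, then reload it
export SPARK_HOME=/usr/local/spark
export PATH=$PATH:$SPARK_HOME/bin
source ~/.bashrc
spark-shell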

@SOURAJITDATTA - 09.04.2016 12:36

It was really so helpful, bro...!
KUDOS

@vjusof - 07.04.2016 10:53

I keep getting the below error:

"Failed to find Spark assembly in /opt/spark-1.6.1/assembly/target/scala-2.10.
You need to build Spark before running this program."

Can someone please help :(
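
That message typically means the source package was downloaded rather than a pre-built binary; two possible ways out, sketched below (the download URL is an assumption based on the 1.6.1 release naming, verify it against the Apache archive):

# option 1: fetch a pre-built package instead of the source tarball
wget https://archive.apache.org/dist/spark/spark-1.6.1/spark-1.6.1-bin-hadoop2.6.tgz
tar -xzf spark-1.6.1-bin-hadoop2.6.tgz

# option 2: build the source tree already in /opt/spark-1.6.1
cd /opt/spark-1.6.1 && build/mvn -DskipTests clean package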

@gwavebabe7748 - 06.09.2015 05:23

The presenter mumbled along. Very hard to understand.

@MauriBrunner - 28.08.2015 20:58

To install Spark, must I have previously installed Hadoop? Thank you
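
Not necessarily; the pre-built "bin-hadoop" packages bundle the Hadoop client libraries, so for local experiments Spark runs without a Hadoop install, for example (run from the extracted Spark directory):

# local mode uses the machine's cores, no Hadoop cluster needed
./bin/spark-shell --master "local[*]"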

@shubhamkumar8832 - 27.07.2015 22:16

Can you upload a tutorial on installing Spark on top of Hadoop?

@ramyasundaresan - 03.07.2015 20:55

I have installed Spark and Hive successfully... it works well.
But when Spark starts up, lots of Hive warnings are produced... what are the correct steps to set up Hive on Spark?
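
A common first step, assuming standard HIVE_HOME and SPARK_HOME install locations (both paths are assumptions), is to let Spark see the same metastore configuration:

# make Spark SQL pick up the existing Hive metastore settings
cp $HIVE_HOME/conf/hive-site.xml $SPARK_HOME/conf/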

@ramyasundaresan - 03.07.2015 20:54

Amazing Tutorial! :D

@brajeshkokkonda - 17.06.2015 23:26

Excellent explanation!!! I had a big headache installing Spark (I used Maven and SBT to build it, and it took too much time). But your video made it very simple. Thank you very much!!!
