Comments:
Can you please make a tutorial to install Pig, Hive, and HBase?
Saved my time. I struggled a lot to install Hadoop before watching this video! Thank you, sir!
Thanks man, it worked!
Man, thank you very much for your help
To-the-point explanation, and I've successfully installed Hadoop on my Ubuntu OS. Thank you for your time.
Thanks a lot for the video!
Thank you, sir, I was legit struggling even after attending college classes on this!
Thank you so much, bro, for making this detailed video on Hadoop installation. I was wandering here and there, but now I've found the one I needed.
Didn't work
Out of so many tutorials out there, this is the one to follow. Thanks!
Explain why you did what you did with those configurations... otherwise the video is useless.
Is the first link dead?
Thank you! Bless you for the amazing video!
Thanks a lot!!! Great job man
I love it so much, made my day!!! ♡♥
Bro, it shows "start-all.ssh: command not found"
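For anyone hitting "start-all.ssh: command not found": the script's name is start-all.sh, and it lives in Hadoop's sbin/ directory, which is not on PATH by default, so the shell can't find it by name alone. A minimal sketch (the hadoop-3.2.3 directory below is a tiny stand-in built on the spot to illustrate the paths; in practice sbin/ comes from the extracted Hadoop tarball, and your version may differ):

```shell
# Stand-in layout purely for illustration; the real sbin/ ships with Hadoop.
mkdir -p hadoop-3.2.3/sbin
printf '#!/bin/sh\necho "daemons started"\n' > hadoop-3.2.3/sbin/start-all.sh
chmod +x hadoop-3.2.3/sbin/start-all.sh

# Either call the script by its path...
hadoop-3.2.3/sbin/start-all.sh
# ...or put sbin/ on PATH once, then call it by name:
export PATH="$PATH:$PWD/hadoop-3.2.3/sbin"
start-all.sh
```

Note start-all.sh is deprecated in Hadoop 3; start-dfs.sh and start-yarn.sh in the same sbin/ directory are the preferred pair.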
Can I move the hadoop-3.2.3 content to /etc/ to avoid keeping it in /home/user/Download/?
I had an issue when running "ssh localhost":
"The authenticity of host 'localhost (127.0.0.1)' can't be established."
Has anyone else had the same issue and can help? 😢
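For anyone stuck on the "ssh localhost" step: the authenticity prompt is normal on a first connection; answer "yes" once and the host key is cached in ~/.ssh/known_hosts. A "permission denied" afterwards usually means no passwordless key is set up yet. A sketch using the OpenSSH default paths:

```shell
# Create the key pair only if one doesn't exist yet (empty passphrase),
# then authorize it for logins to this same machine.
mkdir -p ~/.ssh && chmod 700 ~/.ssh
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
# ssh localhost   # should now log in without asking for a password
```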
This works for me! Really helpful.
Thanks for the video 😊
I'm getting this error at the "hadoop-3.3.5/bin/hdfs namenode -format" step:
zsh: no such file or directory: hadoop-3.3.5/bin/hdfs
Thanks!!! This works for me!
The namenode is not showing in jps. Every time, I have to format the namenode with "hadoop namenode -format". Can anyone please suggest a solution?
Thanks for this video. It helped me a lot, but I've got a problem. I get: "server: ssh: connect to host server port 22: No route to host". My port 22 is allowed and I don't know what I should do. When I try the command "hadoop fs -ls", I get: "ls: RPC response exceeds maximum data length". I'd appreciate your help.
While doing "ssh localhost", I'm getting "permission denied".
The link to the Medium article for installation is not working.
Gigachad
I followed all the steps, but when I try to access localhost it doesn't show anything. Can someone help me?
localhost: ERROR: Cannot set priority of namenode process 17674
Starting datanodes
localhost: ERROR: Cannot set priority of datanode process 17777
Starting secondary namenodes [pop-os]
pop-os: ERROR: Cannot set priority of secondarynamenode process 17964
Starting resourcemanager
ERROR: Cannot set priority of resourcemanager process 18155
Starting nodemanagers
localhost: ERROR: Cannot set priority of nodemanager process 18266
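For anyone seeing the "Cannot set priority of ... process" errors above: this message usually just means the daemon exited right after starting, and the real cause is in the log file under Hadoop's logs/ directory (hadoop-&lt;user&gt;-namenode-&lt;host&gt;.log and friends). A very common cause is JAVA_HOME not being set inside Hadoop's own env file, since the daemons do not inherit the login shell's value. A sketch of the fix (the JDK path is an example; point it at your actual installation, e.g. the output of "readlink -f $(which java)" minus the trailing bin/java):

```shell
# Add to etc/hadoop/hadoop-env.sh inside your Hadoop directory.
# The path below is an example JDK location, not a universal one.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
```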
Hmm, I'm not able to extract hadoop-3.2.4.tar.gz... uh, help?
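If tar refuses to extract hadoop-3.2.4.tar.gz, the download is usually truncated or corrupted; re-download it and compare the size and checksum against the Apache mirror page before retrying. The extraction flags themselves are just -xzf. A self-contained sketch (the archive here is a tiny stand-in built on the spot, not the real Hadoop tarball):

```shell
# Build a stand-in archive purely for illustration; in practice you already
# have hadoop-3.2.4.tar.gz from the Apache download page.
mkdir -p hadoop-3.2.4/bin && touch hadoop-3.2.4/bin/hdfs
tar -czf hadoop-3.2.4.tar.gz hadoop-3.2.4
rm -r hadoop-3.2.4

# -x extract, -z gunzip, -f archive file; add -v to list files as they unpack.
tar -xzf hadoop-3.2.4.tar.gz
ls hadoop-3.2.4/bin/hdfs   # sanity check: the hdfs launcher should exist
```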
Thank you
How did you install the JDK if it is not downloaded?
Based
Good walk-through for Hadoop installation.
Thank you very much
When I'm browsing localhost:9870 and go into the file system, it shows "Failed to retrieve data from /webhdfs/v1/?op=LISTSTATUS: Server Error". Please help.
Very good
Can I work with OpenJDK 11 instead of 8?
Working like a charm. Made my day!
Thank you!
When I am executing "hadoop-3.2.3/bin/hdfs namenode -format", it shows an error that it cannot execute... What should I do?
When we restart the PC, I need to reformat the namenode again in order to use Hadoop. Is there any fix?
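The reformat-after-every-reboot problem (also the namenode missing from jps, mentioned earlier in the thread) is almost always because HDFS defaults to storing its metadata under hadoop.tmp.dir, which resolves to /tmp and is wiped on reboot. A sketch of the fix: point the namenode and datanode directories at a persistent location in etc/hadoop/hdfs-site.xml (the /home paths below are examples; any persistent, writable directory works), then format one last time and restart the daemons.

```xml
<configuration>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>/home/youruser/hadoopdata/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/home/youruser/hadoopdata/datanode</value>
  </property>
</configuration>
```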
well done :)
You are a life saver, thank you so much!!
How to install Ubuntu on a Mac M1?
Really helpful!