I am getting a Connection Refused error in my Hadoop cluster

This forum is for the Hadoop ecosystem (HDFS, MapReduce, Hive, HBase, Pig, Sqoop, Sqoop2, Avro, Solr, HCatalog, Impala, Oozie, ZooKeeper) and Hadoop distributions such as Cloudera and Hortonworks.
alpeshviranik
Posts: 81
Joined: Thu Jul 17, 2014 4:58 pm
Contact:

I am getting a Connection Refused error in my Hadoop cluster

Postby alpeshviranik » Mon Jul 21, 2014 7:33 pm

When I try to run the Hadoop command below on my cluster:

bin/hadoop dfs -put /Users/alpesh_virani/test /Users/alpesh_virani/tmp/in

09:40:45 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.0:8701. Already tried 0 time(s).
09:40:45 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.0:8701. Already tried 1 time(s).
Bad connection to FS. command aborted. exception: Call to localhost/127.0.0.0:8701 failed on connection exception: java.net.ConnectException: Connection refused


Guest

Re: I am getting a Connection Refused error in my Hadoop cluster

Postby Guest » Wed Jul 23, 2014 12:07 am

This issue is related to host, network, and firewall configuration. Since it is your cluster, only you can track down the problem.

The first thing to do is check whether your HDFS processes are running: run the 'jps' command to list the running Java processes.

You should see at least the 'NameNode' and 'DataNode' processes running. Please check and let me know.
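For reference, this is roughly what filtering the 'jps' output looks like on a healthy single-node setup. The process list and PIDs below are illustrative, not taken from your cluster:

```shell
# Illustrative 'jps' output from a healthy single-node HDFS setup
# (your PIDs, and possibly the process list, will differ).
jps_output="12001 NameNode
12102 DataNode
12203 SecondaryNameNode
12304 Jps"

# The -put command needs at least the NameNode and DataNode daemons up.
# Note grep -E 'NameNode' also matches SecondaryNameNode, which is fine here.
echo "$jps_output" | grep -E 'NameNode|DataNode'
```

If either daemon is missing from your real 'jps' output, the client has nothing to connect to, and Connection refused is exactly what you would see.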

The Hadoop wiki covers this error in detail:
http://wiki.apache.org/hadoop/ConnectionRefused

You get a ConnectionRefused Exception when there is a machine at the address specified, but no program is listening on the specific TCP port the client is using, and no firewall in the way is silently dropping TCP connection requests. If you do not know what a TCP connection request is, please consult the specification.
Unless there is a configuration error at either end, the most common cause is that the Hadoop service isn't running.
1. Check that the hostname the client is using is correct.
2. Check that the IP address the client resolves for that hostname is correct.
3. Check that there isn't an entry mapping your hostname to 127.0.0.1 or 127.0.1.1 in /etc/hosts (Ubuntu is notorious for this).
4. Check that the port the client is using matches the one the server is offering the service on.
5. On the server, try 'telnet localhost <port>' to see if the port is open there.
6. On the client, try 'telnet <server> <port>' to see if the port is accessible remotely.
7. Try connecting to the server/port from a different machine, to see if it is just the single client misbehaving.
8. If you are using a Hadoop-based product from a third party, including those from Cloudera, Hortonworks, Intel, EMC, and others, please use the support channels provided by the vendor.
9. Please do not file Hadoop bug reports about this problem; they will be closed as Invalid.
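Checks 2, 5, and 6 can be scripted from the shell even when telnet isn't installed, using bash's built-in /dev/tcp redirection. A minimal sketch, assuming bash; the host and port below are placeholders, so substitute the NameNode address from your core-site.xml:

```shell
# Report whether a TCP port accepts connections, without needing telnet.
# Uses bash's /dev/tcp pseudo-device; host/port arguments are placeholders.
port_open() {
  if (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null; then
    echo open
  else
    echo closed
  fi
}

# Check 2: what address does this machine resolve for the hostname?
getent hosts localhost 2>/dev/null || true

# Checks 5/6: probe the port. Port 1 is almost certainly closed on a
# typical machine, so this example prints "closed".
port_open 127.0.0.1 1
```

Run it against the actual host and port from your error message; "closed" on the server itself means the daemon isn't listening, while "open" on the server but "closed" from the client points at the network or a firewall.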

