javax.net.ssl.SSLException: java.lang.RuntimeException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty

Postby forum_admin » Tue Jun 12, 2018 7:49 pm

I am getting the error below while trying to access an S3 bucket from an AWS instance. I am using Ubuntu.
hdfs@localhost:~$ hadoop fs -ls s3a://mydata/telecom_data_usa/migrationtables

Code: Select all

18/06/12 19:43:26 INFO http.AmazonHttpClient: Unable to execute HTTP request: java.lang.RuntimeException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty
javax.net.ssl.SSLException: java.lang.RuntimeException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty
   at sun.security.ssl.Alerts.getSSLException(Alerts.java:208)
   at sun.security.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1914)
   at sun.security.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1872)
   at sun.security.ssl.SSLSocketImpl.handleException(SSLSocketImpl.java:1855)
   at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1376)
   at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1353)
   at org.apache.http.conn.ssl.SSLSocketFactory.connectSocket(SSLSocketFactory.java:553)
   at org.apache.http.conn.ssl.SSLSocketFactory.connectSocket(SSLSocketFactory.java:412)
   at com.amazonaws.http.conn.ssl.SdkTLSSocketFactory.connectSocket(SdkTLSSocketFactory.java:128)
   at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:179)
   at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:328)
   at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:612)
   at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:447)
   at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:884)
   at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
   at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)
   at com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:728)
   at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:489)
   at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:310)
   at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3785)
   at com.amazonaws.services.s3.AmazonS3Client.headBucket(AmazonS3Client.java:1107)
   at com.amazonaws.services.s3.AmazonS3Client.doesBucketExist(AmazonS3Client.java:1070)
   at org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:294)
   at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:256)
   at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2795)
   at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
   at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2829)
   at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2811)
   at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:390)
   at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
   at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:325)
   at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:235)
   at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:218)
   at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:103)
   at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
   at org.apache.hadoop.fs.FsShell.run(FsShell.java:297)
   at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
   at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
   at org.apache.hadoop.fs.FsShell.main(FsShell.java:356)
Caused by: java.lang.RuntimeException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty



Re: javax.net.ssl.SSLException: java.lang.RuntimeException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty

Postby forum_admin » Tue Jun 12, 2018 7:50 pm

Run the command below as the root user. It worked for me on Ubuntu.

Code: Select all

update-ca-certificates -f
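
If that alone doesn't fix it, this error almost always means the JVM's cacerts truststore is empty or missing. On Ubuntu the Java truststore is typically the one generated by the ca-certificates-java package at /etc/ssl/certs/java/cacerts (that path and the default "changeit" password are assumptions based on the standard Debian/Ubuntu packaging, so adjust for your setup). A rough check-and-repair sequence:

Code: Select all

# List the Java truststore; a healthy one reports hundreds of trusted entries
keytool -list -keystore /etc/ssl/certs/java/cacerts -storepass changeit | head -n 5

# If the file is missing or shows 0 entries, reinstall the package that builds it
# and regenerate the certificates again
sudo apt-get install --reinstall ca-certificates-java
sudo update-ca-certificates -f

After that, re-run the hadoop fs -ls s3a://... command. The S3A client uses the JVM's default SSL context, so once the truststore is repopulated no Hadoop configuration change should be needed.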

