Wednesday, November 14, 2018

Hive installation on Windows 10





Hive Introduction


We have already covered an outline of Hadoop and HBase, as well as their installation on Windows, in my previous posts. There we learned that Hadoop can perform only batch processing, and that data is accessed only sequentially, which means one has to scan the entire dataset even for the simplest of jobs.

In such a scenario, processing a huge dataset produces another huge dataset, which must in turn be processed sequentially. At this point a new solution is needed that can reach any point in the data in a single unit of time (random access). HBase can store massive amounts of data, from terabytes to petabytes, and allows the fast random reads and writes that Hadoop alone cannot handle.

HBase is an open-source, non-relational (NoSQL), distributed, column-oriented database that runs on top of HDFS and provides real-time read/write access to those large datasets. It is modeled on Google's Bigtable, is written primarily in Java, and is designed to provide quick random access to huge amounts of data.

Next in this series, we will walk through Apache Hive. Hive is a data warehouse infrastructure that works on top of the Hadoop Distributed File System and MapReduce to encapsulate Big Data, and it makes querying and analysis stress-free. In effect, it is an ETL tool for the Hadoop ecosystem that enables developers to write Hive Query Language (HQL) statements very similar to SQL statements.
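To give a feel for how close HQL is to SQL, here is a tiny illustrative query, run non-interactively from the Windows command prompt once Hive is installed. The employees table and its columns are hypothetical, purely for illustration:

```shell
:: HQL reads almost exactly like SQL. "employees" is a hypothetical
:: table used only to illustrate the syntax.
hive -e "SELECT department, COUNT(*) AS headcount FROM employees GROUP BY department;"
```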

Hive Installation


In brief, Hive is a data warehouse software project built on top of Hadoop that facilitates reading, writing, and managing large datasets residing in distributed storage using SQL. Before moving ahead, it is essential to install Hadoop first. I am assuming Hadoop is already installed; if not, please go to my previous post on how to install Hadoop on Windows.

I went through the Hive (2.1.0) installation on top of a Derby metastore (10.12.1.1), though you can use any stable version.

Download Hive 2.1.0
  • https://archive.apache.org/dist/hive/hive-2.1.0/


Download Derby Metastore 10.12.1.1
  • https://archive.apache.org/dist/db/derby/db-derby-10.12.1.1/


Download hive-site.xml
  • https://mindtreeonline-my.sharepoint.com/:u:/g/personal/m1045767_mindtree_com1/EbsE-U5qCIhIpo8AmwuOzwUBstJ9odc6_QA733OId5qWOg?e=2X9cfX
  • https://drive.google.com/file/d/1qqAo7RQfr5Q6O-GTom6Rji3TdufP81zd/view?usp=sharing


STEP - 1: Extract the Hive file


Extract the file apache-hive-2.1.0-bin.tar.gz and place it under "D:\Hive"; you can use any preferred location – 

[1] You will get another tar file after extraction – 

Hive folder

[2] Go inside the apache-hive-2.1.0-bin.tar folder and extract it again – 

Extracted folder

[3] Copy the inner folder “apache-hive-2.1.0-bin” to the root folder "D:\Hive", then remove all other files and folders – 

Hive folder details
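As a side note, the two-stage extraction above can be collapsed into a single step if your Windows 10 build ships the bundled bsdtar (available since version 1803); this is just an optional shortcut under that assumption:

```shell
:: Optional shortcut: extract the .tar.gz in one pass with Windows' bundled tar.
:: Assumes the archive was downloaded to D:\Hive.
cd /d D:\Hive
tar -xf apache-hive-2.1.0-bin.tar.gz
:: Result: D:\Hive\apache-hive-2.1.0-bin containing bin, conf, lib, ...
```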

STEP - 2: Extract the Derby file


Similar to Hive, extract the file db-derby-10.12.1.1-bin.tar.gz and place it under "D:\Derby"; you can use any preferred location –

Derby folder
Derby folder details

STEP - 3: Moving hive-site.xml file


Drop the downloaded “hive-site.xml” file into the Hive configuration location “D:\Hive\apache-hive-2.1.0-bin\conf”. 

Hive-site.xml

STEP - 4: Moving Derby libraries


Next, we need to copy all the Derby libraries to the Hive library location – 
[1] Go to the library folder under the Derby location, D:\Derby\db-derby-10.12.1.1-bin\lib.

Derby libraries

[2] Select all and copy all libraries.

[3] Go to the library folder under the Hive location, D:\Hive\apache-hive-2.1.0-bin\lib.

Moved libraries

[4] Drop all selected libraries here.

Libraries



STEP - 5: Configure Environment variables


Set the following environment variables (user variables) on Windows 10 – 
  • HIVE_HOME - D:\Hive\apache-hive-2.1.0-bin
  • HIVE_BIN - D:\Hive\apache-hive-2.1.0-bin\bin
  • HIVE_LIB - D:\Hive\apache-hive-2.1.0-bin\lib
  • DERBY_HOME - D:\Derby\db-derby-10.12.1.1-bin
  • HADOOP_USER_CLASSPATH_FIRST - true
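If you prefer the command line over the GUI dialog, the same user variables can be set with setx; a sketch (note that setx values only become visible in newly opened command prompts):

```shell
:: Set the user variables listed above. Open a NEW command prompt
:: afterwards for the values to take effect.
setx HIVE_HOME "D:\Hive\apache-hive-2.1.0-bin"
setx HIVE_BIN "D:\Hive\apache-hive-2.1.0-bin\bin"
setx HIVE_LIB "D:\Hive\apache-hive-2.1.0-bin\lib"
setx DERBY_HOME "D:\Derby\db-derby-10.12.1.1-bin"
setx HADOOP_USER_CLASSPATH_FIRST "true"
```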


This PC -> Right Click -> Properties -> Advanced System Settings -> Advanced -> Environment Variables 

HIVE_HOME


HIVE_BIN

HIVE_LIB

DERBY_HOME

HADOOP_USER

STEP - 6: Configure System variables


Next, we need to update the system variables, adding the Hive and Derby bin directories to the Path – 

Variable: Path 
Value: 
  1. D:\Hive\apache-hive-2.1.0-bin\bin
  2. D:\Derby\db-derby-10.12.1.1-bin\bin


System Variables

STEP - 7: Working with hive-site.xml


Now we need to cross-check the Hive configuration file for the Derby details – 
  • hive-site.xml


[1] Edit the file D:\Hive\apache-hive-2.1.0-bin\conf\hive-site.xml, paste in the XML below, and save the file.

<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:derby://localhost:1527/metastore_db;create=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>org.apache.derby.jdbc.ClientDriver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>
  <property>
    <name>hive.server2.enable.impersonation</name>
    <value>true</value>
    <description>Enable user impersonation for HiveServer2</description>
  </property>
  <property>
    <name>hive.server2.authentication</name>
    <value>NONE</value>
    <description>Client authentication type. NONE: no authentication check; LDAP: LDAP/AD based authentication; KERBEROS: Kerberos/GSSAPI authentication; CUSTOM: custom authentication provider (use with property hive.server2.custom.authentication.class)</description>
  </property>
  <property>
    <name>datanucleus.autoCreateTables</name>
    <value>true</value>
  </property>
</configuration>
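Once the Derby network server is running (Step 9), you can sanity-check the ConnectionURL above with Derby's bundled ij tool; this is an optional check, not a required installation step:

```shell
:: Optional sanity check: connect to the metastore URL with Derby's ij tool.
:: Assumes the Derby network server from Step 9 is already running.
cd /d D:\Derby\db-derby-10.12.1.1-bin\bin
echo connect 'jdbc:derby://localhost:1527/metastore_db;create=true'; | ij
```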

STEP - 8: Start Hadoop


Here we need to start Hadoop first -

Open a command prompt, change directory to “D:\Hadoop\hadoop-2.8.0\sbin”, and type "start-all.cmd" to start Hadoop.

Start Hadoop



It will open four cmd instances for the following daemons – 
  • Hadoop Datanode
  • Hadoop Namenode
  • Yarn Nodemanager
  • Yarn Resourcemanager


 Hadoop Started



It can also be verified via the browser – 
  • Namenode (hdfs) - http://localhost:50070 
  • Datanode - http://localhost:50075
  • All Applications (cluster) - http://localhost:8088 etc.


Hadoop In Browser

Since the ‘start-all.cmd’ command has been deprecated, you can use the commands below, in order - 
  • “start-dfs.cmd” and 
  • “start-yarn.cmd”
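Put together, the non-deprecated startup sequence looks like this, run from the same sbin directory as above:

```shell
:: Start the HDFS daemons first, then YARN, instead of the deprecated start-all.cmd.
cd /d D:\Hadoop\hadoop-2.8.0\sbin
start-dfs.cmd
start-yarn.cmd
```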


STEP - 9: Start Derby server


After Hadoop has started successfully, change directory to “D:\Derby\db-derby-10.12.1.1-bin\bin” and type “startNetworkServer -h 0.0.0.0” to start the Derby network server.

Start Derby

Derby Started

STEP - 10: Start Hive


The Derby server is now started and ready to accept connections, so open a new command prompt with administrator privileges and move to the Hive bin directory “D:\Hive\apache-hive-2.1.0-bin\bin” – 

[1] Type “jps -m” to check that NetworkServerControl is running.

Validate server

[2] Type “hive” to start the Hive shell.

Start Hive

Hive Started

Congratulations, Hive is installed!! 😊

STEP - 11: Some hands-on activities


[1] Create Database in Hive - 
CREATE DATABASE IF NOT EXISTS TRAINING;

Create database

[2] Show Database - 
SHOW DATABASES;

Show Databases

[3] Creating Hive Tables - 
CREATE TABLE IF NOT EXISTS testhive(col1 char(10), col2 char(20));

Create table

Create table

[4] DESCRIBE Table Command in Hive - 
Describe Students

describe students
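The command behind the screenshot above is DESCRIBE. Assuming a table named students in the TRAINING database (the table name comes from the screenshots; the earlier example created testhive), it would look like:

```shell
:: Lists column names and types for the table. Substitute your own
:: database and table names as needed.
hive -e "USE training; DESCRIBE students;"
```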

[5] Usage of LOAD Command for Inserting Data Into Hive Tables
Create a sample text file using ‘|’ delimiter – 

text file

Load data
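The screenshots do not show the statements themselves, so here is a sketch of what step [5] does, assuming a pipe-delimited file at D:\students.txt and a two-column students table (the file path and schema are my assumptions):

```shell
:: Create a table whose row format matches the '|' delimited text file,
:: then load the local file into it. Path and schema are illustrative.
hive -e "USE training; CREATE TABLE IF NOT EXISTS students (id INT, name STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '|';"
hive -e "USE training; LOAD DATA LOCAL INPATH 'D:/students.txt' INTO TABLE students;"
```

After the load, the SELECT in step [6] should return the rows from the file.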

[6] Hive Select Data from Table - 
SELECT * FROM STUDENTS;

Select


Stay in touch for more posts

37 comments:

  1. thanks! I'll try to come with more articles, meanwhile I went through your training website www.fitaacademy.com
    ....it covered almost all courses...looks great !

  2. I recently came across your blog and have been reading along. I thought I would leave my first comment. I don’t know what to say except that I have enjoyed reading.

  3. I followed the process as said. Whenever I type hive on cmd . The output is "The syntax of the command is incorrect. File Not Found. Not a Valid JAR: C:/Users/Dell/org.apache.hive.beeline.cli.HiveCli"
    My hadoop and hive home is in the E: drive and i have double checked every variable.

  4. Hi Shwetabh,
    Regret about delay response.

    I trust you went through the same steps as I stated in the article. Do a cross check and follow again, but don't forget Hadoop installation is essential to proceed Hive installation.

    So if you missed the Hadoop installation, please visit the precise post and let me know if you still face difficulties.

  5. Hi
    I followed exact steps as given above
    But i get this error FAILED: HiveException java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    while executing "create database training"

  6. I AM ALSO GETTING THE SAME EXCEPTION FAILED: HiveException java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

  7. Hi, I am getting below issue once I run the hive on the cmd:

    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/C:/Hive/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/C:/Users/dream/Downloads/hadoop-3.1.0/hadoop-3.1.0/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
    ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
    Connecting to jdbc:hive2://
    Error applying authorization policy on hive configuration: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    Beeline version 2.1.0 by Apache Hive
    Error applying authorization policy on hive configuration: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    Connection is already closed.

    Not sure what is causing this.

    Replies
    1. me too! do you resolve the problem?

    2. i have the same problem as yours did you find a solution till now or not

    3. i have solve the problem if u are using hive 2.1.0 and your hadoop version is 3.0 or above then it will not run on windows so install hadoop 2.8.0 then run hive it will work

    4. yes issue is with versions, i have installed hadoop lower version to 3.0, it worked for me with no issues

  8. Really it is very useful blog, by following this blog i installed hive in windows 10 successfully, but while installing we need to care about steps what he mentioned, thanks a lot for your contribution @Rajendra kumar.

  9. Hi,

    I am not able to use sql count query in hive, getting this below error

    Error: Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask (state=08S01,code=2).

    Please let me know how to solve this.

  10. Hi,
    I have followed the steps as suggested But when i start hive am getting Error message as
    Error Message :
    -----------------------------------
    C:\Bigdata\apache-hive-3.1.2-bin\bin>hive
    'hive' is not recognized as an internal or external command,
    operable program or batch file.

    C:\Bigdata\apache-hive-3.1.2-bin\bin>jps -m
    1776 StartLocalClient
    24592 NodeManager
    28816 NetworkServerControl start -h 0.0.0.0
    8384 Jps -m
    27240 NameNode
    7464 ResourceManager

    C:\Bigdata\apache-hive-3.1.2-bin\bin>


    am using hadoop version -
    --------------------------------------------
    Hadoop 3.1.3
    Source code repository https://gitbox.apache.org/repos/asf/hadoop.git -r ba631c436b806728f8ec2f54ab1e289526c90579
    Compiled by ztang on 2019-09-12T02:47Z
    Compiled with protoc 2.5.0
    From source with checksum ec785077c385118ac91aadde5ec9799
    This command was run using /C:/Bigdata/hadoop-3.1.3/share/hadoop/common/hadoop-common-3.1.3.jar

    Note : I have copied the hive-site.xml as well


    user variable
    variable value
    DERBY_HOME C:\Bigdata\db-derby-10.12.1.1-bin
    HIVE_HOME C:\Bigdata\apache-hive-3.1.2-bin
    HIVE_BIN C:\Bigdata\apache-hive-3.1.2-bin\bin
    HIVE_LIB C:\Bigdata\apache-hive-3.1.2-bin\lib
    HADOOP_USER_CLASSPATH_FIRST true


    System variable :
    C:\Bigdata\apache-hive-3.1.2-bin\bin
    C:\Bigdata\db-derby-10.12.1.1-bin\bin




    Could you please help to solve the above issue.. I tried with cmd prompt running as administrator but still am getting same issue
    Thanks in advance..

    Replies
    1. Hi Mahesh..
      First you try with Mentioned version in blog and the try for new versions, you will only come to know the solution.

  11. you want me to use same version of hive that has showed in this blog.. .am i correct.. let me try and verify once....

  12. Yes .... You are correct, because I used same version and successfully installed

  13. Hi Gireesh,

    I have tried installing Hadoop and Hive as mentioned in blog.. Previously hadoop version am using 3.1.3 .. Now i have uninstalled and re-installed the older version

    Hadoop version :
    --------

    C:\Users\mahesh_timothy>hadoop version
    Hadoop 2.8.0
    Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r 91f2b7a13d1e97be65db92ddabc627cc29ac0009
    Compiled by jdu on 2017-03-17T04:12Z
    Compiled with protoc 2.5.0
    From source with checksum 60125541c2b3e266cbf3becc5bda666
    This command was run using /C:/Bigdata/hadoop-2.8.0/share/hadoop/common/hadoop-common-2.8.0.jar

    After post installation of hive.. When i try to connect hive .still am facing below issue.. I have tried running cmd prompt in administrator mode and still facing same.


    C:\Bigdata\apache-hive-2.1.0-bin\bin>hive
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/C:/Bigdata/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/C:/Bigdata/hadoop-2.8.0/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
    ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
    Connecting to jdbc:hive2://
    Error applying authorization policy on hive configuration: java.net.ConnectException: Call From W10FYRVRC2/172.31.237.241 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    Beeline version 2.1.0 by Apache Hive
    Error applying authorization policy on hive configuration: java.net.ConnectException: Call From W10FYRVRC2/172.31.237.241 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused: no further information; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    Connection is already closed.

    C:\Bigdata\apache-hive-2.1.0-bin\bin>

    Please help me

    Replies
    1. Did you find a solution, because I got the same issue.

    2. No.. I didn't find any solution till now.. could you please share if any info you have ..Thankyou

  14. Thanks a lot for this article I was able to get it hive up and running on windows. The only caveat is the versions of all components have to match exactly too, i tried using a more recent version of derby and hive it bombed!

  15. Getting this issue when i am trying to execute this command

    c:\Derby\db-derby-10.15.2.0\bin>startNetworkServer -h 0.0.0.0
    Error: A JNI error has occurred, please check your installation and try again
    Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/derby/drda/NetworkServerControl has been compiled by a more recent version of the Java Runtime (class file version 53.0), this version of the Java Runtime only recognizes class file versions up to 52.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:495)

    c:\Hive\apache-hive-3.1.2\bin>hive
    'hive' is not recognized as an internal or external command,
    operable program or batch file.

  16. C:\Windows\System32>hive
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/C:/hive/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/C:/hadoop-3.0.0/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
    ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
    Connecting to jdbc:hive2://
    Error applying authorization policy on hive configuration: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    Beeline version 2.1.0 by Apache Hive
    Error applying authorization policy on hive configuration: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    Connection is already closed.

    Now getting this error when trying with above said version of Hive and Derby

  17. I am also getting the same error.

  18. I followed same steps but getting below mentioned error, using hadoop version 3.1.3. I can't solve the error, can you help?

    C:\apache-hive-2.1.0-bin\bin>hive
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/C:/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/C:/hadoop-3.1.3/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
    ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
    Connecting to jdbc:hive2://
    com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
    Beeline version 2.1.0 by Apache Hive
    com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
    Connection is already closed.

    Replies
    1. make sure the hadoop version is compatible with the hive version. That can cause this too

  19. I followed your steps and installation is done database and table is created I also load some data via txt file, but when I use "select * from employee" it gave me this error:

    ERROR org.apache.hadoop.hdfs.KeyProviderCache - Could not find uri with key [hadoop.security.key.provider.path] to create a keyProvider !!

    and now it's disappeared. when I re-type the same command it's showing something like this

    hive> select * from employee;
    -chgrp: 'LONEWOLF\Sudarshan' does not match expected pattern for group
    Usage: hadoop fs [generic options]
    .
    .
    .
    Usage: hadoop fs [generic options] -chgrp [-R] GROUP PATH...
    OK




    4 rows selected (3.61 seconds)

    it's taking space for selected rows. Don't understand what's happening.

  20. I'm not able to run beeline why?

    File Not Found
    Error: A JNI error has occurred, please check your installation and try again
    Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
    at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
    at java.lang.Class.getMethod0(Class.java:3018)
    at java.lang.Class.getMethod(Class.java:1784)
    at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
    Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 7 more

  21. thanks man, greetings from chile

  22. C:\hive\bin>jps -m
    12800 Jps -m
    3856 NodeManager
    9680 NameNode
    2188 ResourceManager
    492 DataNode

    C:\hive\bin>hive
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/C:/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/C:/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
    ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
    Connecting to jdbc:hive2://
    org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483) ~[datanucleus-rdbms-4.1.7.jar:?]
    ... 83 more
    Error applying authorization policy on hive configuration: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    Connection is already closed.

    C:\hive\bin>

    I already applied all things but still getting error. could someone please help me to resolve my error?

  23. I installed

    hadoop-2.8.0.tar.gz
    db-derby-10.12.1.1-bin.tar.gz
    apache-hive-2.1.0-src.tar.gz

    hadoop is running fine. derby is doing its job. jps -m gives correct result but hive is not starting.

    but still getting error

    Error applying authorization policy on hive configuration: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    Connection is already closed.

    could you please help me ?
    Thanks,
    Ajendra.

  24. Getting below error while copying data from my local to hive table

    FAILED: SemanticException Line 1:23 Invalid path '"D:\data files\sample.txt"': No files matching path file:/D:/data%20files/sample.txt
    16:57:28.513 [e65e6439-3c3e-4142-8192-0a4db221da42 main] ERROR org.apache.hadoop.hive.ql.Driver - FAILED: SemanticException Line 1:23 Invalid path '"D:\data files\sample.txt"': No files matching path file:/D:/data%20files/sample.txt

    Error: Error while compiling statement: FAILED: SemanticException Line 1:23 Invalid path '"D:\data files\sample.txt"': No files matching path file:/D:/data%20files/sample.txt (state=42000,code=40000)
