Hive Installation with MySQL Database Metastore
- October 13, 2021
- Posted by: Ankush Thavali
- Category: Hadoop

In this blog, we will walk through installing Apache Hive with MySQL as the metastore database.
Steps to Download and Configure the MySQL Metastore for Hive

Step 1: Download Hive from its official download page (the tar file is about 143 MB in size) and extract it.
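If you prefer the command line, you can fetch the tarball directly; the Apache archive URL below is an assumption, so adjust the version and mirror to match what you actually want to install.
wget https://archive.apache.org/dist/hive/hive-2.1.1/apache-hive-2.1.1-bin.tar.gz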
tar -xvf apache-hive-2.1.1-bin.tar.gz
Step 2: Once you have downloaded Hive, download the MySQL Java connector from the Maven repository.
https://mvnrepository.com/artifact/mysql/mysql-connector-java/8.0.24
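If you would rather download the connector from the shell, something like the following should work; the direct Maven Central path is an assumption, so verify it against the repository page above.
wget https://repo1.maven.org/maven2/mysql/mysql-connector-java/8.0.24/mysql-connector-java-8.0.24.jar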
Then rename the jar, copy it into the Hive lib folder, and (optionally) symlink the system-wide copy as well.
mv mysql-connector-java-8.0.24.jar mysql-connector-java.jar
cp /home/hdoop/mysql-connector-java.jar $HIVE_HOME/lib/
sudo ln -s /usr/share/java/mysql-connector-java.jar $HIVE_HOME/lib/mysql-connector-java.jar
Step 3: Move to the lib folder inside your apache-hive-2.1.1-bin folder and delete the file log4j-slf4j-impl-2.4.1.jar. We delete it because the same SLF4J binding is also bundled with Hadoop, and having both on the classpath can cause errors.
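The removal itself is a single command (the exact jar version may differ slightly depending on your Hive build):
rm $HIVE_HOME/lib/log4j-slf4j-impl-2.4.1.jar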
There is a similar conflict with the Guava library, so replace Hive's older Guava jar with the one that ships with Hadoop:
rm $HIVE_HOME/lib/guava-19.0.jar
cp $HADOOP_HOME/share/hadoop/common/lib/guava-27.0-jre.jar $HIVE_HOME/lib/
Now the most important part: set the Hive path in your .bashrc file. Open .bashrc with the commands below.
cd /home/hdoop
sudo vi ~/.bashrc
Add the following Hive path entries, updating the path if yours is different.
#Hive Path
export HIVE_HOME=/home/hdoop/apache-hive-2.1.1-bin
export PATH=$PATH:$HIVE_HOME/bin
Then reload the file so the changes take effect:
source ~/.bashrc
Step 4: Now start Hadoop with the commands below:
start-dfs.sh
start-yarn.sh
You can confirm that all the daemons have started properly by running the jps command; on a single-node setup you should typically see NameNode, DataNode, SecondaryNameNode, ResourceManager, and NodeManager.
Step 5: Once Hadoop has started, create the directories Hive needs. Run the commands below in your terminal:
hdfs dfs -mkdir -p /user/hive/warehouse
hdfs dfs -mkdir -p /tmp/hive
The warehouse directory is where Hive stores the data for its databases and tables; these are the default locations Hive is configured to use.
Step 6: Now change the permissions on these directories with the commands below.
hdfs dfs -chmod 777 /tmp/
hdfs dfs -chmod 777 /user/hive/warehouse
hdfs dfs -chmod 777 /tmp/hive
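As a quick sanity check, you can list the directories to confirm they exist with the new permissions:
hdfs dfs -ls /user/hive
hdfs dfs -ls /tmp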
Step 7: Now install MySQL with the command below.
sudo apt-get install mysql-server
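On most Ubuntu installations the MySQL service starts automatically; if it does not, the following commands (assuming a systemd-based distribution) start it and check its status:
sudo systemctl start mysql
sudo systemctl status mysql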
Step 8: Log in to the MySQL shell and create the metastore database with the commands below (use the root password you set for MySQL).
sudo mysql -u root -p
mysql> CREATE DATABASE metastore_db;
mysql> USE metastore_db;
Change the username and path below if yours are different, and select the metastore schema .sql file that matches your Hive version series.
mysql> SOURCE /home/{user-name}/apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-0.14.0.mysql.sql;
In my case the username is hdoop:
mysql> SOURCE /home/hdoop/apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-0.14.0.mysql.sql;
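If you are not sure which schema file matches your Hive build, list the bundled scripts first; for the 2.1.x series the matching file is typically hive-schema-2.1.0.mysql.sql.
ls $HIVE_HOME/scripts/metastore/upgrade/mysql/hive-schema-*.mysql.sql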
Step 9: Now create the hive user (and an hdfs user) with their passwords using the commands below in the MySQL shell.
mysql> create user 'hive'@'%' identified by 'hive';
Query OK, 0 rows affected (0.02 sec)
mysql> grant all PRIVILEGES on *.* to 'hive'@'%' with grant option;
Query OK, 0 rows affected (0.02 sec)
mysql> create user 'hdfs'@'%' identified by 'hdfs';
Query OK, 0 rows affected (0.04 sec)
mysql> grant all PRIVILEGES on *.* to 'hdfs'@'%' with grant option;
Query OK, 0 rows affected (0.03 sec)
mysql> FLUSH PRIVILEGES;
Query OK, 0 rows affected (0.02 sec)
mysql> exit
Step 10: Now open hive-env.sh and append your Hadoop path to it.
export HADOOP_HOME=/home/hdoop/hadoop-3.2.1
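If hive-env.sh does not exist yet, the Hive distribution ships a template in the conf directory that you can copy and then edit, roughly as follows:
cd $HIVE_HOME/conf
cp hive-env.sh.template hive-env.sh
vi hive-env.sh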
Step 11: Create a new file named hive-site.xml and add the following content to it.
cd $HIVE_HOME/conf
vi hive-site.xml
<configuration>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://localhost/metastore_db?createDatabaseIfNotExist=true</value>
<description>metadata is stored in a MySQL server</description>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
<description>MySQL JDBC driver class</description>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hive</value>
<description>user name for connecting to mysql server</description>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>hive</value>
<description>hivepassword for connecting to mysql server</description>
</property>
<property>
<name>hive.metastore.warehouse.dir</name>
<value>/user/hive/warehouse</value>
<description>location of default database for the warehouse</description>
</property>
<property>
<name>hive.metastore.uris</name>
<value>thrift://localhost:9083</value>
<description>Thrift URI for the remote metastore.</description>
</property>
<property>
<name>hive.server2.enable.doAs</name>
<value>false</value>
</property>
</configuration>
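Optionally, before starting any services you can ask Hive's schematool to confirm that it can reach the MySQL metastore and report the schema version; this assumes schematool from $HIVE_HOME/bin is on your PATH.
schematool -dbType mysql -info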
Step 12: Start the Hive metastore. Run the following command in your terminal:
hive --service metastore
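Running it this way keeps the terminal occupied; if you prefer to run the metastore in the background, a simple sketch (the log path is just an example) is:
nohup hive --service metastore > /tmp/hive-metastore.log 2>&1 &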
Open the Hive console with the hive command (use a new terminal if the metastore is still running in the foreground of the previous one).
hive
Test Hive: if you were able to get into the Hive console without any errors, verify your metastore configuration with the following steps.
Create a table in Hive:
hive> create table test(id int, name string);
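If the table is created successfully, its metadata should land in the MySQL metastore. Assuming the default schema created above, a quick check (the TBLS table and its columns come from the standard Hive metastore schema) looks like this:
mysql -u hive -p
mysql> SELECT TBL_NAME, TBL_TYPE FROM metastore_db.TBLS;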