Sqoop command to import data from a MySQL database to HDFS
Step 1. Log in to the MySQL database.
cloudera@cloudera-vm:~$ mysql -uroot -p
Enter password:
Welcome to the MySQL monitor. Commands end with ; or \g.
Your MySQL connection id is 54
Server version: 5.1.61-0ubuntu0.10.10.1 (Ubuntu)
Copyright (c) 2000, 2011, Oracle and/or its affiliates. All rights reserved.
Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
Step 2. Check the database name, table name, and the data in the table.
mysql> show databases;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| mysql              |
| test               |
+--------------------+
3 rows in set (0.00 sec)
mysql> use test;
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A
Database changed
mysql> show tables;
+----------------+
| Tables_in_test |
+----------------+
| employee       |
+----------------+
1 row in set (0.00 sec)
mysql> select * from employee;
+-------+-------+--------+
| empid | ename | salary |
+-------+-------+--------+
|     2 | Smith |    200 |
|     3 | Kathy |   5600 |
|     1 | John  |   5000 |
+-------+-------+--------+
3 rows in set (0.00 sec)
Step 3. Run the Sqoop command to import the MySQL table data into HDFS.
cloudera@cloudera-vm:~$ sqoop import --connect jdbc:mysql://127.0.0.1:3306/test --username root --password tiger --table employee -m 1 --target-dir /user/cloudera/emp
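Note that passing the password with --password leaves it visible in shell history and in `ps` output. Sqoop also accepts `-P` (prompt interactively) or `--password-file`; a sketch of the same import using a password file, assuming a recent Sqoop release that supports `--password-file` and a hypothetical local path /tmp/sqoop.pw:

```shell
# Hypothetical: store the password in a file readable only by the owner.
# printf (not echo) avoids a trailing newline being read as part of the password.
printf 'tiger' > /tmp/sqoop.pw
chmod 400 /tmp/sqoop.pw

# Same import as above, but the password never appears on the command line.
sqoop import \
  --connect jdbc:mysql://127.0.0.1:3306/test \
  --username root \
  --password-file file:///tmp/sqoop.pw \
  --table employee -m 1 \
  --target-dir /user/cloudera/emp
```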
Step 4. Cross-check the output at the HDFS level.
cloudera@cloudera-vm:~$ hadoop fs -ls /user/cloudera/emp
Found 3 items
-rw-r--r--   1 cloudera supergroup          0 2019-01-21 07:26 /user/cloudera/emp/_SUCCESS
drwxr-xr-x   - cloudera supergroup          0 2019-01-21 07:26 /user/cloudera/emp/_logs
-rw-r--r--   1 cloudera supergroup         37 2019-01-21 07:26 /user/cloudera/emp/part-m-00000
cloudera@cloudera-vm:~$ hadoop fs -cat /user/cloudera/emp/part-m-00000
2,Smith,200
3,Kathy,5600
1,John,5000
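Since the part-m-00000 file is plain comma-separated text, ordinary Unix tools can double-check it against MySQL. A small sketch, run here against a local copy of the three rows (on the cluster you could fetch one with `hadoop fs -cat /user/cloudera/emp/part-m-00000 > /tmp/emp.csv`; the /tmp path is just an assumed scratch location):

```shell
# Local copy of the exported rows, as shown by hadoop fs -cat above.
cat > /tmp/emp.csv <<'EOF'
2,Smith,200
3,Kathy,5600
1,John,5000
EOF

# Count the rows and total the salary column; the row count should match
# the "3 rows in set" reported by MySQL.
awk -F, '{n++; sum+=$3} END {print n " rows, total salary " sum}' /tmp/emp.csv
```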