Sqoop Installation
SQOOP
SQOOP = (SQ)L + HAD(OOP)
Relational database systems often store valuable data in a company. If that data is made available, it can be managed and processed by Apache Hadoop, which is fast becoming the standard for big data processing. Several relational database vendors have championed developing integration with Hadoop in one or more of their products.
Transferring data to and from relational databases is challenging and laborious. Because data transfer requires careful handling, Apache Sqoop, short for “SQL to Hadoop,” was created to perform bidirectional data transfer between Hadoop and almost any external structured datastore. Taking advantage of MapReduce, Hadoop’s execution engine, Sqoop performs the transfers in a parallel manner.
Step 1: Download sqoop
Download the sqoop-1.4.5.bin.tar.gz file into your required location and extract it.
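A minimal sketch of the extraction, assuming the archive was downloaded to the home directory (the extracted directory name may differ by build; rename it to sqoop-1.4.5 so it matches the path used in .bashrc below):
cd $HOME
tar -xzf sqoop-1.4.5.bin.tar.gz
mv sqoop-1.4.5.bin sqoop-1.4.5    # adjust the source name to whatever directory the tar created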
Step 2: Set path in bashrc
Open the .bashrc file and set the Sqoop path.
export SQOOP_HOME=$HOME/sqoop-1.4.5
export PATH=$PATH:$SQOOP_HOME/bin
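After saving the file, reload .bashrc and check that Sqoop is on the path (sqoop version simply prints the installed version):
source ~/.bashrc
sqoop version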
Step 3 : Copy mysql connector
Copy the mysql-connector.jar file from /usr/share/java to the sqoop-1.4.5/lib directory.
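A sketch of the copy, assuming the paths above (on some distributions the jar is named mysql-connector-java.jar instead):
cp /usr/share/java/mysql-connector.jar $HOME/sqoop-1.4.5/lib/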
Step 4 : Open mysql
To open MySQL, type the following command in the terminal.
mysql -h localhost -u root -p
Step 5 : Create database in mysql
Syntax : create database databasename;
> create database sqoopdb;
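The new database can be confirmed by listing the databases on the server:
> show databases;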
Step 6 : change the database to sqoopdb
Syntax : use databasename;
> use sqoopdb;
Step 7 : create table emp
Syntax : create table tablename(column1 datatype, column2 datatype, …);
> create table emp(empno int,ename varchar(10),sal int);
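The column definitions can be verified with describe:
> describe emp;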
Step 8 : insert values into emp table
Syntax : insert into tablename values(value1,value2 …);
> insert into emp values (11,'aaa',600);
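More rows can be inserted the same way (the extra row below is only sample data), and the table contents checked with a select:
> insert into emp values (12,'bbb',700);
> select * from emp;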
Step 9 : Grant privileges on sqoopdb
> grant all privileges on sqoopdb.* to '%'@'localhost';
Step 10 : Grant privileges to the anonymous user at localhost
> grant all privileges on sqoopdb.* to ''@'localhost';
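Depending on the MySQL version, it may also help to reload the grant tables and exit the client before running Sqoop:
> flush privileges;
> exit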
Step 11 : Run the sqoop commands
Go to the sqoop-1.4.5/bin folder in the terminal and run the following command.
To load the data from MySQL into HDFS:
./sqoop-import --connect jdbc:mysql://localhost/sqoopdb --username root --password root --table emp -m 1
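By default the imported files land under the user's HDFS home directory (here /user/geouser/emp). If a different location is wanted, Sqoop's --target-dir option can be added; the path below is only illustrative:
./sqoop-import --connect jdbc:mysql://localhost/sqoopdb --username root --password root --table emp --target-dir /user/geouser/emp_import -m 1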
Step 12 : View imported table
To view the table imported from MySQL into HDFS,
open the NameNode web UI in the browser (e.g., http://localhost:50070) and browse to the import path /user/geouser.
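The same path can also be listed from the terminal, assuming the command is run from the Hadoop bin folder as in the next step:
./hadoop fs -ls /user/geouser/emp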
Step 13:
To view the data in the table, use the cat command in Hadoop shown below.
./hadoop fs -cat /user/geouser/emp/part-m-00000