
HIVE installation

The document outlines the installation and configuration of Hive with a local metastore, including steps for setting up the environment and creating necessary directories. It details the creation of a database, managed and external tables for transactional records, and the process for loading data into these tables. Additionally, it provides commands for querying and describing the table metadata in HiveQL.

Uploaded by

Komal Kumar Sahu

WEEK 8: i) Hive installation

ii) Working with HiveQL


HIVE Installation with local metastore configuration

1) Copy the Hive tar file to your home path (/home/hduser/install) and extract it using the commands below:
cd /home/hduser/install/
tar xvzf apache-hive-0.14.0-bin.tar.gz
sudo mv apache-hive-0.14.0-bin /usr/local/hive
2) Make an entry in the bash profile as below:
vi ~/.bashrc
export HIVE_HOME=/usr/local/hive
export PATH=$PATH:$HIVE_HOME/bin

Then source the bash profile as below:

source ~/.bashrc
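After sourcing, the new variables should be visible in the current shell. The sketch below is a quick sanity check, assuming the same /usr/local/hive install path used in this guide; it sets the variables directly so it can be run standalone.

```shell
# Sanity-check sketch: confirm HIVE_HOME is set and its bin directory is on PATH.
# The path /usr/local/hive is the one assumed earlier in this guide.
export HIVE_HOME=/usr/local/hive
export PATH=$PATH:$HIVE_HOME/bin

echo "HIVE_HOME=$HIVE_HOME"
case ":$PATH:" in
  *":$HIVE_HOME/bin:"*) echo "hive bin is on PATH" ;;
  *) echo "hive bin missing from PATH" ;;
esac
```

If the second line prints "hive bin missing from PATH", re-check the .bashrc entry and source it again.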

3) After you complete the above steps, execute the commands below to create the warehouse directories and set permissions:

hadoop fs -mkdir -p /user/hive/warehouse
hadoop fs -chmod g+w /user/hive/warehouse
hadoop fs -chmod g+w /tmp
4) cd to the bin folder inside the Hive path:

cd /usr/local/hive/bin

Put an entry in hive-config.sh as below:
vi hive-config.sh
export HADOOP_HOME=/usr/local/hadoop
A. Create Database

create database retail;

B. Select Database

use retail;

set hive.cli.print.current.db=true;

C. Create a table for storing transactional records

Creating a managed table:

create table txnrecords(txnno INT, txndate STRING, custno INT,
amount DOUBLE, category STRING, product STRING, city STRING,
state STRING, spendby STRING)
row format delimited
fields terminated by ','
lines terminated by '\n'
stored as textfile;
!mkdir -p /home/hduser/hive/data;
Copy all the content from the data path on the pen drive to the above path:
/home/hduser/hive/data
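To see how the delimiters in the table definition map raw data onto the txnrecords schema, here is a small sketch. The sample record below is invented for illustration; only the comma-delimited layout matches the schema above.

```shell
# Sketch: one made-up comma-delimited record, split the way Hive's
# "fields terminated by ','" clause splits it against the txnrecords schema:
# txnno, txndate, custno, amount, category, product, city, state, spendby
rec='1,06-26-2011,4007024,40.33,Exercise,Weights,Clarksville,Tennessee,credit'

echo "$rec" | awk -F',' '{ print "txnno="$1, "txndate="$2, "custno="$3, "amount="$4 }'
# prints: txnno=1 txndate=06-26-2011 custno=4007024 amount=40.33
```

Each comma-separated field lands in the corresponding column of the table, in declaration order.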
Creating an external table:

create external table externaltxnrecords(txnno INT, txndate STRING, custno INT,
amount DOUBLE, category STRING, product STRING, city STRING,
state STRING, spendby STRING)
row format delimited
fields terminated by ','
stored as textfile
location '/user/hduser/hive/externaldata';
D. Load the data into the table [from Linux client]

LOAD DATA LOCAL INPATH '/home/hduser/hive/data/txns' OVERWRITE INTO TABLE txnrecords;

Load the data into the table [from HDFS]

cd /home/hduser/hive/data/
hadoop fs -copyFromLocal txns txns1

LOAD DATA INPATH '/user/hduser/txns1' OVERWRITE INTO TABLE txnrecords;

Select the loaded data (check the Map/Reduce execution):

select count(1) from txnrecords;
select * from txnrecords limit 10;
select * from txnrecords where category='Puzzles';
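The three queries above count all rows, preview the first ten, and filter on the category column. As a plain-shell sketch of what each one computes, here are awk equivalents run against a tiny invented sample of the comma-delimited txns file (the values are made up; only the column layout matches the txnrecords schema).

```shell
# Sketch: awk analogues of the three HiveQL queries, on a made-up sample.
cat > /tmp/txns_sample <<'EOF'
1,06-26-2011,4007024,40.33,Exercise,Weights,Clarksville,Tennessee,credit
2,05-26-2011,4006742,198.44,Puzzles,Jigsaw,Long Beach,California,credit
3,06-01-2011,4009775,5.58,Puzzles,Cube,Anaheim,California,credit
EOF

# select count(1) from txnrecords;
awk 'END { print NR }' /tmp/txns_sample
# prints: 3

# select * from txnrecords limit 10;
head -n 10 /tmp/txns_sample

# select * from txnrecords where category='Puzzles';  (category is field 5)
awk -F',' '$5 == "Puzzles"' /tmp/txns_sample
```

Unlike these shell one-liners, the Hive versions are compiled into MapReduce jobs, which is why the count query takes noticeably longer to start.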
E. Describing metadata or schema of the table

describe formatted txnrecords;


describe formatted externaltxnrecords;
