This is the 25th day of my participation in the November Gwen Challenge. Check out the details: "The Last Gwen Challenge in 2021".

Database-level statements

Showing databases

Syntax:

show databases;

Case study:

show databases;

database_name |
--------------+
default       |
hive_databases|

Creating a database

Syntax:

CREATE (DATABASE|SCHEMA) [IF NOT EXISTS] database_name   -- DATABASE and SCHEMA are equivalent
  [COMMENT database_comment]                             -- database description
  [LOCATION hdfs_path]                                   -- the location where the database is stored in HDFS
  [WITH DBPROPERTIES (property_name=property_value, ...)]; -- specify additional properties

Case study:

  CREATE DATABASE IF NOT EXISTS hive_databases
  COMMENT 'Hive database'
  WITH DBPROPERTIES ('create'='jacquesh');

Selecting a database

Syntax:

USE database_name;

Case study:

use hive_databases;

Deleting a database

Syntax:


DROP (DATABASE|SCHEMA) [IF EXISTS] database_name [RESTRICT|CASCADE];

The default behavior is RESTRICT: the drop fails if the database still contains tables. To drop the database together with its tables, use CASCADE.

Case study:

DROP DATABASE IF EXISTS hive_databases CASCADE;

Displaying database details

Syntax:

DESC DATABASE [EXTENDED] database_name;  -- EXTENDED also displays the additional properties

Case study:

DESC DATABASE EXTENDED hive_databases;

db_name       |comment      |location                                            |owner_name|owner_type|parameters       |
--------------+-------------+----------------------------------------------------+----------+----------+-----------------+
hive_databases|Hive database|hdfs://cluster/user/hive/warehouse/hive_databases.db|hive2     |USER      |{create=jacquesh}|

Table-level statements

Creating tables

Internal and external tables

Syntax:

CREATE [TEMPORARY] [EXTERNAL] TABLE [IF NOT EXISTS] [database.]table_name  -- table name
  [(col_name data_type [COMMENT col_comment],
    ... [constraint_specification])]             -- column names and data types
  [COMMENT table_comment]                        -- table description
  [PARTITIONED BY (col_name data_type [COMMENT], ...)]  -- partitioning rules for a partitioned table
  [
    CLUSTERED BY (col_name, col_name, ...)
    [SORTED BY (col_name [ASC|DESC], ...)] INTO num_buckets BUCKETS
  ]                                              -- bucketing rules for a bucketed table
  [SKEWED BY (col_name, col_name, ...) ON ((col_value, col_value, ...), (col_value, col_value, ...), ...)
    [STORED AS DIRECTORIES]
  ]                                              -- specifies skewed columns and their values
  [[ROW FORMAT row_format]
   [STORED AS file_format]
     | STORED BY 'storage.handler.class.name' [WITH SERDEPROPERTIES (...)]
  ]                                              -- row format, storage file format, or a custom storage handler
  [LOCATION hdfs_path]                           -- where the table is stored in HDFS
  [TBLPROPERTIES (property_name=property_value, ...)]  -- table properties
  [AS select_statement];                         -- create the table from a query result
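Several of the optional clauses above, such as PARTITIONED BY, do not appear in the examples that follow. As a minimal sketch of a partitioned table (the table and column names here are illustrative, not from the examples below):

    CREATE TABLE IF NOT EXISTS access_logs
    (
        ip  STRING,
        url STRING,
        ts  TIMESTAMP
    )
    PARTITIONED BY (dt STRING)                      -- each dt value becomes its own HDFS subdirectory
    ROW FORMAT DELIMITED FIELDS TERMINATED BY "\t";

Queries that filter on dt can then skip whole partitions instead of scanning the full table.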

Internal table

CREATE TABLE temps
(
    -- field definitions start --
    empno    INT,
    ename    STRING,
    job      STRING,
    mgr      INT,
    hiredate TIMESTAMP,
    sal      DECIMAL(7,2),
    comm     DECIMAL(7,2),
    deptno   INT
    -- field definitions end --
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY "\t";  -- set the field separator to "\t"

After the table is created, you can see its directory in HDFS (with the default configuration).
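The practical difference between internal and external tables shows up on DROP: for a managed (internal) table like temps, Hive owns the data. A sketch, assuming a local data file (the file path is illustrative):

    -- Load a local file into the internal table (the path is illustrative):
    LOAD DATA LOCAL INPATH '/tmp/emp.txt' INTO TABLE temps;

    -- Dropping a managed (internal) table removes both the metastore entry
    -- and the table's data directory in HDFS:
    DROP TABLE temps;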

External table

CREATE EXTERNAL TABLE temps  -- an external table is created by adding the EXTERNAL keyword
(
    -- field definitions start --
    empno    INT,
    ename    STRING,
    job      STRING,
    mgr      INT,
    hiredate TIMESTAMP,
    sal      DECIMAL(7,2),
    comm     DECIMAL(7,2),
    deptno   INT
    -- field definitions end --
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY "\t"  -- set the field separator to "\t"
LOCATION '/hive/emp_external';                  -- the data location, which is an HDFS path
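By contrast, for an external table Hive only tracks the metadata, so dropping the table leaves the data files in place:

    -- Dropping an external table removes only the metastore entry;
    -- the files under '/hive/emp_external' remain in HDFS:
    DROP TABLE temps;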