
This article mainly studies Hibernate's HBM2DDL (schema generation) mechanism.

SchemaManagementTool

hibernate-core-5.0.12.Final-sources.jar!/org/hibernate/tool/schema/spi/SchemaManagementTool.java

public interface SchemaManagementTool extends Service {
    public SchemaCreator getSchemaCreator(Map options);
    public SchemaDropper getSchemaDropper(Map options);
    public SchemaMigrator getSchemaMigrator(Map options);
    public SchemaValidator getSchemaValidator(Map options);
}

This tool defines four capabilities: create, drop, migrate, and validate.
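These four capabilities correspond to the familiar `hibernate.hbm2ddl.auto` settings. A minimal sketch of that mapping (the class and method names here are illustrative, not Hibernate's own):

```java
// Illustrative mapping from hbm2ddl.auto values to SchemaManagementTool operations.
public class Hbm2DdlMapping {

    // Which SchemaManagementTool capability a given setting ends up invoking.
    static String toolActionFor(String hbm2ddlAuto) {
        switch (hbm2ddlAuto) {
            case "create":       // drop existing schema, then create
            case "create-drop":  // same, plus drop again on shutdown
                return "SchemaCreator.doCreation";
            case "update":
                return "SchemaMigrator.doMigration";
            case "validate":
                return "SchemaValidator.doValidation";
            default:             // "none" or unset: no DDL is executed
                return "no-op";
        }
    }

    public static void main(String[] args) {
        System.out.println(toolActionFor("update")); // SchemaMigrator.doMigration
    }
}
```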

SchemaCreatorImpl

hibernate-core-5.0.12.Final-sources.jar!/org/hibernate/tool/schema/internal/SchemaCreatorImpl.java

public class SchemaCreatorImpl implements SchemaCreator {

    @Override
    public void doCreation(Metadata metadata, boolean createNamespaces, List<Target> targets) throws SchemaManagementException {
        doCreation( metadata, createNamespaces, targets.toArray( new Target[ targets.size() ] ) );
    }

    // ...
}

The main logic is in doCreation, which creates schema objects in the following order:

  • Create the catalog/schema
  • Create the Before Table Auxiliary Objects
  • Create sequences
  • Create tables
    • Create indexes
    • Create uniques
  • Create the foreign keys
  • Create the After Table Auxiliary Objects

These steps are implemented mainly with the help of the various dialects' exporters.
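The creation order above matters: tables must exist before indexes and foreign keys can reference them. A toy sketch of the ordering, collecting the DDL each phase would emit (the statements are invented examples, not Hibernate output):

```java
import java.util.ArrayList;
import java.util.List;

// Toy sketch of SchemaCreatorImpl's creation order: each phase appends its DDL
// before the next runs, so tables exist before foreign keys reference them.
public class CreationOrder {

    static List<String> doCreation() {
        List<String> ddl = new ArrayList<>();
        ddl.add("CREATE SCHEMA app");                           // 1. catalog/schema
        ddl.add("-- before-table auxiliary objects");           // 2. auxiliary objects (before)
        ddl.add("CREATE SEQUENCE app.order_seq");               // 3. sequences
        ddl.add("CREATE TABLE app.orders (id bigint)");         // 4. tables
        ddl.add("CREATE INDEX idx_orders ON app.orders (id)");  // 4a. indexes
        ddl.add("ALTER TABLE app.orders ADD CONSTRAINT uk_orders UNIQUE (id)"); // 4b. uniques
        ddl.add("ALTER TABLE app.orders ADD FOREIGN KEY (customer_id) REFERENCES app.customers"); // 5. foreign keys
        ddl.add("-- after-table auxiliary objects");            // 6. auxiliary objects (after)
        return ddl;
    }

    public static void main(String[] args) {
        doCreation().forEach(System.out::println);
    }
}
```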

Dialect

hibernate-core-5.0.12.Final-sources.jar!/org/hibernate/dialect/Dialect.java

public abstract class Dialect implements ConversionContext {
    // ......

    // DDL support ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

    private StandardTableExporter tableExporter = new StandardTableExporter( this );
    private StandardSequenceExporter sequenceExporter = new StandardSequenceExporter( this );
    private StandardIndexExporter indexExporter = new StandardIndexExporter( this );
    private StandardForeignKeyExporter foreignKeyExporter = new StandardForeignKeyExporter( this );
    private StandardUniqueKeyExporter uniqueKeyExporter = new StandardUniqueKeyExporter( this );
    private StandardAuxiliaryDatabaseObjectExporter auxiliaryObjectExporter = new StandardAuxiliaryDatabaseObjectExporter( this );

    // ...
}

Exporters for tables, sequences, indexes, foreign keys, unique keys, and auxiliary database objects are all defined here.
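The exporter pattern itself is simple: each exporter knows how to turn one kind of model object into its create/drop SQL, and a dialect can swap in its own implementation. A self-contained sketch of the idea (the interface shape is simplified from Hibernate's `Exporter<T>`, and the model class is invented for illustration):

```java
// Minimal re-creation of the exporter pattern used by Dialect.
interface Exporter<T> {
    String[] getSqlCreateStrings(T object);
    String[] getSqlDropStrings(T object);
}

// Hypothetical stand-in for Hibernate's sequence model object.
class SequenceModel {
    final String name;
    SequenceModel(String name) { this.name = name; }
}

// A standard exporter; a dialect could replace it to emit, say,
// "create sequence ... start with 1 increment by 1".
class StandardSequenceExporterSketch implements Exporter<SequenceModel> {
    @Override
    public String[] getSqlCreateStrings(SequenceModel seq) {
        return new String[] { "create sequence " + seq.name };
    }

    @Override
    public String[] getSqlDropStrings(SequenceModel seq) {
        return new String[] { "drop sequence " + seq.name };
    }
}

public class ExporterDemo {
    public static void main(String[] args) {
        Exporter<SequenceModel> exporter = new StandardSequenceExporterSketch();
        System.out.println(exporter.getSqlCreateStrings(new SequenceModel("hibernate_sequence"))[0]);
    }
}
```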

StandardTableExporter

hibernate-core-5.0.12.Final-sources.jar!/org/hibernate/tool/schema/internal/StandardTableExporter.java

public class StandardTableExporter implements Exporter<Table> {
   //.....
    @Override
    public String[] getSqlCreateStrings(Table table, Metadata metadata) {
        final QualifiedName tableName = new QualifiedNameParser.NameParts(
                Identifier.toIdentifier( table.getCatalog(), table.isCatalogQuoted() ),
                Identifier.toIdentifier( table.getSchema(), table.isSchemaQuoted() ),
                table.getNameIdentifier()
        );

        final JdbcEnvironment jdbcEnvironment = metadata.getDatabase().getJdbcEnvironment();
        StringBuilder buf =
                new StringBuilder( tableCreateString( table.hasPrimaryKey() ) )
                        .append( ' ' )
                        .append(
                                jdbcEnvironment.getQualifiedObjectNameFormatter().format(
                                        tableName,
                                        jdbcEnvironment.getDialect()
                                )
                        )
                        .append( "(" );

        boolean isPrimaryKeyIdentity = table.hasPrimaryKey()
                && table.getIdentifierValue() != null
                && table.getIdentifierValue().isIdentityColumn( metadata.getIdentifierGeneratorFactory(), dialect );
        // this is the much better form moving forward as we move to metamodel
        //boolean isPrimaryKeyIdentity = hasPrimaryKey
        //      && table.getPrimaryKey().getColumnSpan() == 1
        //      && table.getPrimaryKey().getColumn( 0 ).isIdentity();

        // Try to find out the name of the primary key in case the dialect needs it to create an identity
        String pkColName = null;
        if ( table.hasPrimaryKey() ) {
            Column pkColumn = (Column) table.getPrimaryKey().getColumns().iterator().next();
            pkColName = pkColumn.getQuotedName( dialect );
        }

        final Iterator columnItr = table.getColumnIterator();
        boolean isFirst = true;
        while ( columnItr.hasNext() ) {
            final Column col = (Column) columnItr.next();
            if ( isFirst ) {
                isFirst = false;
            }
            else {
                buf.append( "," );
            }
            String colName = col.getQuotedName( dialect );

            buf.append( colName ).append( ' ' );

            if ( isPrimaryKeyIdentity && colName.equals( pkColName ) ) {
                // to support dialects that have their own identity data type
                if ( dialect.getIdentityColumnSupport().hasDataTypeInIdentityColumn() ) {
                    buf.append( col.getSqlType( dialect, metadata ) );
                }
                buf.append( ' ' )
                        .append( dialect.getIdentityColumnSupport().getIdentityColumnString( col.getSqlTypeCode( metadata ) ) );
            }
            else {
                buf.append( col.getSqlType( dialect, metadata )  );

                String defaultValue = col.getDefaultValue();
                if ( defaultValue != null ) {
                    buf.append( " default " ).append( defaultValue );
                }

                if ( col.isNullable() ) {
                    buf.append( dialect.getNullColumnString() );
                }
                else {
                    buf.append( " not null" );
                }
            }

            if ( col.isUnique() ) {
                String keyName = Constraint.generateName( "UK_", table, col );
                UniqueKey uk = table.getOrCreateUniqueKey( keyName );
                uk.addColumn( col );
                buf.append(
                        dialect.getUniqueDelegate()
                                .getColumnDefinitionUniquenessFragment( col )
                );
            }

            if ( col.getCheckConstraint() != null && dialect.supportsColumnCheck() ) {
                buf.append( " check (" )
                        .append( col.getCheckConstraint() )
                        .append( ")" );
            }

            String columnComment = col.getComment();
            if ( columnComment != null ) {
                buf.append( dialect.getColumnComment( columnComment ) );
            }
        }
        if ( table.hasPrimaryKey() ) {
            buf.append( "," )
                    .append( table.getPrimaryKey().sqlConstraintString( dialect ) );
        }

        buf.append( dialect.getUniqueDelegate().getTableCreationUniqueConstraintsFragment( table ) );

        applyTableCheck( table, buf );

        buf.append( ')' );

        if ( table.getComment() != null ) {
            buf.append( dialect.getTableComment( table.getComment() ) );
        }

        applyTableTypeString( buf );

        List<String> sqlStrings = new ArrayList<String>();
        sqlStrings.add( buf.toString() );

        applyComments( table, tableName, sqlStrings );

        applyInitCommands( table, sqlStrings );

        return sqlStrings.toArray( new String[ sqlStrings.size() ] );
    }
}

The SQL generation above produces different statements for different databases, based on the dialect passed in.
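For example, the identity-column fragment appended for the primary key differs per database. A toy sketch of how that dialect hook changes the generated column definition (the dialect classes here are simplified stand-ins, not Hibernate's, though the returned fragments mirror typical MySQL and SQL Server DDL):

```java
// Simplified stand-ins showing how a dialect hook changes generated DDL.
abstract class MiniDialect {
    abstract String getIdentityColumnString();
}

class MiniMySQLDialect extends MiniDialect {
    @Override String getIdentityColumnString() { return "not null auto_increment"; }
}

class MiniSQLServerDialect extends MiniDialect {
    @Override String getIdentityColumnString() { return "identity not null"; }
}

public class DialectDemo {
    // Builds the primary-key column definition the way getSqlCreateStrings
    // does for an identity column: name, type, then the dialect's fragment.
    static String pkColumn(MiniDialect dialect) {
        return "id bigint " + dialect.getIdentityColumnString();
    }

    public static void main(String[] args) {
        System.out.println(pkColumn(new MiniMySQLDialect()));     // id bigint not null auto_increment
        System.out.println(pkColumn(new MiniSQLServerDialect())); // id bigint identity not null
    }
}
```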

Summary

To implement DDL generation, Hibernate relies on the Dialect abstraction to smooth over the differences between databases: each dialect supplies the exporters that create tables, indexes, sequences, primary keys, foreign keys, and so on, and it also handles mapping field types to database column types.