Preface

It has been a month since I joined my new company, and with my immediate work finished, a few days ago I finally had time to study the code of one of the company's older projects. While reading it, I found that Spring AOP was used to separate database reads from writes. In the spirit of learning, I decided to write an example project that achieves read/write separation with Spring AOP.

The deployment environment

Database: MySQL

Number of libraries: 2, one master and one slave

See also: MySQL master-slave replication on Windows

Start project

First, of course, we build a Spring Boot project and introduce the following dependencies in the pom.xml file:

<dependencies>
        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>druid-spring-boot-starter</artifactId>
            <version>1.1.10</version>
        </dependency>
        <dependency>
            <groupId>org.mybatis.spring.boot</groupId>
            <artifactId>mybatis-spring-boot-starter</artifactId>
            <version>1.3.2</version>
        </dependency>
        <dependency>
            <groupId>tk.mybatis</groupId>
            <artifactId>mapper-spring-boot-starter</artifactId>
            <version>2.1.5</version>
        </dependency>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>8.0.16</version>
        </dependency>
        <!-- Dependencies required for dynamic data sources: start -->
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-jdbc</artifactId>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-aop</artifactId>
            <scope>provided</scope>
        </dependency>
        <!-- Dependencies required for dynamic data sources: end -->
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
            <optional>true</optional>
        </dependency>
        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>fastjson</artifactId>
            <version>1.</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-jpa</artifactId>
        </dependency>
    </dependencies>

The directory structure

Once the basic dependencies are introduced and the directories are organized, the project skeleton is in place.

Build table

Create a table named `user` by executing the following SQL in the master library; master-slave replication will generate the corresponding table and data in the slave library.

DROP TABLE IF EXISTS `user`;
CREATE TABLE `user` (
  `user_id` bigint(20) NOT NULL COMMENT 'user id',
  `user_name` varchar(255) DEFAULT ' ' COMMENT 'User name',
  `user_phone` varchar(50) DEFAULT ' ' COMMENT 'User phone',
  `address` varchar(255) DEFAULT ' ' COMMENT 'address',
  `weight` int(3) NOT NULL DEFAULT '1' COMMENT 'Weight, the bigger the better',
  `created_at` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP COMMENT 'Creation time',
  `updated_at` datetime DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP COMMENT 'Update Time',
  PRIMARY KEY (`user_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

INSERT INTO `user` VALUES ('1196978513958141952', 'Test 1', '18826334748', 'Haizhu District, Guangzhou', '1', '2019-11-20 10:28:51', '2019-11-22 14:28:26');
INSERT INTO `user` VALUES ('1196978513958141953', 'Test 2', '18826274230', 'Tianhe District, Guangzhou', '2', '2019-11-20 10:29:37', '2019-11-22 14:28:14');
INSERT INTO `user` VALUES ('1196978513958141954', 'Test 3', '18826273900', 'Tianhe District, Guangzhou', '1', '2019-11-20 10:30:19', '2019-11-22 14:28:30');

Configure the master and slave data sources

In application.yml, the main content is the data source configuration for the master and slave libraries:

server:
  port: 8001
spring:
  jackson:
    date-format: yyyy-MM-dd HH:mm:ss
    time-zone: GMT+8
  datasource:
    type: com.alibaba.druid.pool.DruidDataSource
    driver-class-name: com.mysql.cj.jdbc.Driver
    master:
      url: jdbc:mysql://127.0.0.1:3307/user?serverTimezone=Asia/Shanghai&useUnicode=true&characterEncoding=UTF-8&autoReconnect=true&failOverReadOnly=false&useSSL=false&zeroDateTimeBehavior=convertToNull&allowMultiQueries=true
      username: root
      password:
    slave:
      url: jdbc:mysql://127.0.0.1:3308/user?serverTimezone=Asia/Shanghai&useUnicode=true&characterEncoding=UTF-8&autoReconnect=true&failOverReadOnly=false&useSSL=false&zeroDateTimeBehavior=convertToNull&allowMultiQueries=true
      username: root
      password:

Since there are two data sources, a master and a slave, we represent them with an enum so that each can be referenced by name:

@Getter
public enum DynamicDataSourceEnum {
    MASTER("master"),
    SLAVE("slave");

    private final String dataSourceName;

    DynamicDataSourceEnum(String dataSourceName) {
        this.dataSourceName = dataSourceName;
    }
}

The data source configuration class DataSourceConfig defines two data sources, masterDb and slaveDb:

@Configuration
@MapperScan(basePackages = "com.xjt.proxy.mapper", sqlSessionTemplateRef = "sqlTemplate")
public class DataSourceConfig {

    // Master library
    @Bean
    @ConfigurationProperties(prefix = "spring.datasource.master")
    public DataSource masterDb() {
        return DruidDataSourceBuilder.create().build();
    }

    /**
     * Slave library
     */
    @Bean
    @ConditionalOnProperty(prefix = "spring.datasource", name = "slave", matchIfMissing = true)
    @ConfigurationProperties(prefix = "spring.datasource.slave")
    public DataSource slaveDb() {
        return DruidDataSourceBuilder.create().build();
    }

    // Routing data source: registers both targets and switches between them
    @Bean
    public DataSource dynamicDb(@Qualifier("masterDb") DataSource masterDataSource,
        @Autowired(required = false) @Qualifier("slaveDb") DataSource slaveDataSource) {
        DynamicDataSource dynamicDataSource = new DynamicDataSource();
        Map<Object, Object> targetDataSources = new HashMap<>();
        targetDataSources.put(DynamicDataSourceEnum.MASTER.getDataSourceName(), masterDataSource);
        if (slaveDataSource != null) {
            targetDataSources.put(DynamicDataSourceEnum.SLAVE.getDataSourceName(), slaveDataSource);
        }
        dynamicDataSource.setTargetDataSources(targetDataSources);
        dynamicDataSource.setDefaultTargetDataSource(masterDataSource);
        return dynamicDataSource;
    }

    @Bean
    public SqlSessionFactory sessionFactory(@Qualifier("dynamicDb") DataSource dynamicDataSource) throws Exception {
        SqlSessionFactoryBean bean = new SqlSessionFactoryBean();
        bean.setMapperLocations(
            new PathMatchingResourcePatternResolver().getResources("classpath*:mapper/*Mapper.xml"));
        bean.setDataSource(dynamicDataSource);
        return bean.getObject();
    }

    @Bean
    public SqlSessionTemplate sqlTemplate(@Qualifier("sessionFactory") SqlSessionFactory sqlSessionFactory) {
        return new SqlSessionTemplate(sqlSessionFactory);
    }

    @Bean(name = "dataSourceTx")
    public DataSourceTransactionManager dataSourceTx(@Qualifier("dynamicDb") DataSource dynamicDataSource) {
        DataSourceTransactionManager dataSourceTransactionManager = new DataSourceTransactionManager();
        dataSourceTransactionManager.setDataSource(dynamicDataSource);
        return dataSourceTransactionManager;
    }
}

Set the routing

To make it easy to find the corresponding data source, we use a ThreadLocal to store the data source key for each thread:

public class DataSourceContextHolder {
    private static final ThreadLocal<String> DYNAMIC_DATASOURCE_CONTEXT = new ThreadLocal<>();
    public static void set(String datasourceType) {
        DYNAMIC_DATASOURCE_CONTEXT.set(datasourceType);
    }
    public static String get() {
        return DYNAMIC_DATASOURCE_CONTEXT.get();
    }
    public static void clear() {
        DYNAMIC_DATASOURCE_CONTEXT.remove();
    }
}
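As a side note, the clear() method matters because application servers reuse threads from a pool. The following framework-free sketch (class and method names are illustrative, not part of the project) shows what happens when a "request" forgets to clear the ThreadLocal:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Simulates two "requests" handled by the same pooled thread. If the first
// request sets the ThreadLocal and never clears it, the second request
// inherits the stale data source key.
public class HolderLeakDemo {
    private static final ThreadLocal<String> CTX = new ThreadLocal<>();

    static String simulateTwoRequests() throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(1); // one reused thread
        pool.submit(() -> CTX.set("slave")).get();              // request 1 sets, forgets to clear
        String leaked = pool.submit(() -> CTX.get()).get();     // request 2 sees the leftover
        pool.shutdown();
        return leaked;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(simulateTwoRequests()); // prints "slave"
    }
}
```

This is exactly the leak the aspect below guards against by calling DataSourceContextHolder.clear() in its finally block.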

Getting the route

public class DynamicDataSource extends AbstractRoutingDataSource {
    @Override
    protected Object determineCurrentLookupKey() {
        return DataSourceContextHolder.get();
    }
}

The role of AbstractRoutingDataSource is to route to the corresponding data source based on a lookup key. It maintains a set of target data sources, keeps a mapping between routing keys and those targets, and provides a method that looks up the data source for a key.
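To make that mechanism concrete, here is a framework-free sketch of the lookup (the class and method names are illustrative, not Spring's actual internals):

```java
import java.util.HashMap;
import java.util.Map;

// Mimics what AbstractRoutingDataSource does when a connection is requested:
// resolve the current lookup key, pick the mapped target, and fall back to
// the default target when the key is absent or unmapped.
public class RoutingSketch {
    private final Map<Object, Object> targetDataSources = new HashMap<>();
    private Object defaultTargetDataSource;

    public void addTarget(Object key, Object dataSource) {
        targetDataSources.put(key, dataSource);
    }

    public void setDefaultTarget(Object dataSource) {
        defaultTargetDataSource = dataSource;
    }

    // lookupKey plays the role of determineCurrentLookupKey(), i.e. whatever
    // our ThreadLocal holder currently contains (possibly null).
    public Object route(Object lookupKey) {
        Object target = (lookupKey == null) ? null : targetDataSources.get(lookupKey);
        return (target != null) ? target : defaultTargetDataSource;
    }
}
```

With the configuration above, a null key (nothing stored in the ThreadLocal) falls through to the default target, which we set to the master.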

An annotation for selecting the data source

To make switching easy, we write an annotation whose value is the data source enum; the default is the master library:

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@Documented
public @interface DataSourceSelector {

    DynamicDataSourceEnum value() default DynamicDataSourceEnum.MASTER;
    boolean clear() default true;
}

Switching data sources with AOP

Here AOP finally comes into play. We define an aspect that switches the data source for annotated methods:

@Slf4j
@Aspect
@Order(value = 1)
@Component
public class DataSourceContextAop {

    @Around("@annotation(com.xjt.proxy.dynamicdatasource.DataSourceSelector)")
    public Object setDynamicDataSource(ProceedingJoinPoint pjp) throws Throwable {
        boolean clear = true;
        try {
            Method method = this.getMethod(pjp);
            DataSourceSelector dataSourceImport = method.getAnnotation(DataSourceSelector.class);
            clear = dataSourceImport.clear();
            DataSourceContextHolder.set(dataSourceImport.value().getDataSourceName());
            log.info("======== Data source to: {}", dataSourceImport.value().getDataSourceName());
            return pjp.proceed();
        } finally {
            if (clear) {
                DataSourceContextHolder.clear();
            }

        }
    }
    private Method getMethod(JoinPoint pjp) {
        MethodSignature signature = (MethodSignature) pjp.getSignature();
        return signature.getMethod();
    }
}

Now that we’re done with the setup, let’s test the results.

Write the Service class, which contains both read and update methods:

@Service
public class UserService {

    @Autowired
    private UserMapper userMapper;

    @DataSourceSelector(value = DynamicDataSourceEnum.SLAVE)
    public List<User> listUser() {
        List<User> users = userMapper.selectAll();
        return users;
    }

    @DataSourceSelector(value = DynamicDataSourceEnum.MASTER)
    public int update() {
        User user = new User();
        user.setUserId(Long.parseLong("1196978513958141952"));
        user.setUserName("Modified name 2");
        return userMapper.updateByPrimaryKeySelective(user);
    }

    @DataSourceSelector(value = DynamicDataSourceEnum.SLAVE)
    public User find() {
        User user = new User();
        user.setUserId(Long.parseLong("1196978513958141952"));
        return userMapper.selectByPrimaryKey(user);
    }
}

The object being updated is the user with userId 1196978513958141952, and the find method then reads that same record.

Then let's write a test class to see whether it works:

@RunWith(SpringRunner.class)
@SpringBootTest
class UserServiceTest {

    @Autowired
    UserService userService;

    @Test
    void listUser() {
        List<User> users = userService.listUser();
        for (User user : users) {
            System.out.println(user.getUserId());
            System.out.println(user.getUserName());
            System.out.println(user.getUserPhone());
        }
    }
    @Test
    void update() {
        userService.update();
        User user = userService.find();
        System.out.println(user.getUserName());
    }
}

Test results:

1. Reading method

2. Update method

After execution, comparing the two databases shows the modified data in both the master and the slave library, which means our read/write separation works. Of course, the update method could instead be pointed at the slave library, in which case only the slave's data would be modified and the change would not reach the master.

Notes

The example above, while simple, matches a typical read/write separation setup. It is worth noting that the point of read/write separation is to relieve pressure on the write library, i.e. the master, and it rests on the principle of data consistency: the data in the master and slave libraries must agree. Therefore, if a method involves any write logic, all database operations in that method should go to the master library.

Otherwise, suppose a write completes but the data has not yet been synchronized to the slave library when a read begins. If that read is routed to the slave, it sees inconsistent data, which is not acceptable.
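A toy simulation of that failure mode (plain maps stand in for the two databases; all names here are made up for illustration):

```java
import java.util.HashMap;
import java.util.Map;

// Two in-memory "databases". A write lands on the master; replication lag
// means the slave still holds the old value, so a slave-routed read is stale.
public class StaleReadDemo {
    static final Map<String, String> MASTER = new HashMap<>();
    static final Map<String, String> SLAVE = new HashMap<>();

    static String read(String route, String key) {
        return "master".equals(route) ? MASTER.get(key) : SLAVE.get(key);
    }

    // Returns {what the slave sees, what the master sees} right after a write.
    static String[] simulate() {
        MASTER.put("user:1", "old");
        SLAVE.put("user:1", "old");   // initially in sync
        MASTER.put("user:1", "new");  // write goes to the master only
        // ... replication has not caught up yet ...
        return new String[] { read("slave", "user:1"), read("master", "user:1") };
    }

    public static void main(String[] args) {
        String[] views = simulate();
        System.out.println("slave sees:  " + views[0]); // old (stale)
        System.out.println("master sees: " + views[1]); // new (fresh)
    }
}
```

This is why a method containing write logic should route all of its operations, reads included, to the master.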

Finally, here is the GitHub address of the project. If you are interested, take a look, and feel free to give it a star.

Address: github.com/Taoxj/mysql…

Reference:

www.cnblogs.com/cjsblog/p/9…