How to Migrate Data from Amazon Aurora MySQL to TiDB Cloud

  1. Make sure your environment meets the migration prerequisites.
  2. Prepare your working environment.
  3. Create an Amazon Elastic Compute Cloud (Amazon EC2) instance on which to run the migration tools.
  4. Use Dumpling to export the data from Amazon Aurora.
  5. Use TiDB Lightning to import the data into TiDB Cloud.

Migration prerequisites

Amazon Aurora MySQL database requirements

Before migrating, make sure your Aurora database uses only character sets and collations that TiDB supports. The supported character sets are:

  • utf8
  • utf8mb4
  • ascii
  • latin1
  • binary

The supported collations are:

  • ascii_bin
  • binary
  • latin1_bin
  • utf8_bin
  • utf8_general_ci
  • utf8mb4_bin
  • utf8mb4_general_ci

Stop writing data to the Aurora database while exporting data
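The export later in this guide runs Dumpling with --consistency="none", so any writes during the export can leave the dump inconsistent. One way to block writes, assuming you can keep a mysql session open for the whole export (setting the read_only parameter in the Aurora DB cluster parameter group is an alternative):

```sql
-- Run in a session that stays open for the entire export.
-- FLUSH TABLES WITH READ LOCK blocks writes until the lock is released.
FLUSH TABLES WITH READ LOCK;

-- ... run Dumpling from the EC2 instance while the lock is held ...

-- After the export completes, release the lock:
UNLOCK TABLES;
```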

TiDB Cloud cluster requirements

EC2 instance for running the migration tools

The EC2 instance that runs Dumpling and TiDB Lightning needs sufficient resources in three areas:

  • Network access
  • Storage
  • Computing resources
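Storage is the requirement most likely to bite: the instance must hold the entire export on disk. A minimal pre-flight sketch, in which DATA_SIZE_GIB is an assumed placeholder for your estimated Aurora data size:

```shell
# Compare free space on the current volume with the estimated data size.
# DATA_SIZE_GIB is a placeholder -- replace it with your own estimate.
DATA_SIZE_GIB=50
avail_gib=$(df --output=avail -BG . | tail -n 1 | tr -dc '0-9')
if [ "$avail_gib" -gt "$DATA_SIZE_GIB" ]; then
    echo "free space OK: ${avail_gib} GiB available"
else
    echo "not enough free space: ${avail_gib} GiB available"
fi
```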

Prepare the working environment for data migration

Ensure Amazon Aurora is accessible

Launch the EC2 instance

  • The instance should be in the same VPC as your Amazon Aurora service, which makes connecting to Amazon Aurora straightforward.
  • Ensure that the free disk space is larger than the size of your data.
  1. Open the Amazon EC2 console. Choose Launch Instances.
  2. Choose an Amazon Machine Image (AMI). In this procedure, we use the AMI named Red Hat Enterprise Linux 8 (HVM), SSD Volume Type.

Check the connectivity with the Amazon Aurora database

Install the MySQL client on the EC2 instance and verify that it can reach the Aurora endpoint:

sudo yum install -y mysql
# Replace the endpoint with your Aurora database endpoint
mysql -h database-1-instance-1.cbrlnigpiufa.us-west-2.rds.amazonaws.com -u admin -p --ssl-mode=DISABLED
[ec2-user@ip-172-30-1-86 ~]$ mysql -h database-1-instance-1.cbrlnigpiufa.us-west-2.rds.amazonaws.com -u admin -p --ssl-mode=DISABLED
Enter password:
Welcome to the MySQL monitor. Commands end with ; or \g.
Your MySQL connection id is 24
Server version: 5.6.10 MySQL Community Server (GPL)
Copyright (c) 2000, 2019, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql>

Check the database collation settings

Run the following query to list any columns or tables that use a collation TiDB does not support. An empty result means the database is ready to export.

SELECT * FROM (
    (SELECT table_schema, table_name, column_name, collation_name
       FROM information_schema.columns
      WHERE character_set_name IS NOT NULL)
    UNION ALL
    (SELECT table_schema, table_name, NULL, table_collation
       FROM information_schema.tables)
) x
WHERE table_schema NOT IN ('performance_schema', 'mysql', 'information_schema')
  AND collation_name NOT IN ('utf8_bin', 'utf8mb4_bin', 'ascii_bin', 'latin1_bin',
      'binary', 'utf8_general_ci', 'utf8mb4_general_ci');
Empty set (0.04 sec)
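If the query returns rows instead of an empty set, those objects use collations TiDB does not support, and you can convert them before exporting. A sketch in which mydb.mytable is a placeholder name; the conversion rewrites the table, so back it up first:

```sql
-- Convert a table to a supported character set and collation.
-- mydb.mytable is a placeholder; this statement rewrites the whole table.
ALTER TABLE mydb.mytable
  CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci;
```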

Set up VPC peering for network access to TiDB Cloud

After the VPC peering connection between your VPC and TiDB Cloud is active, verify connectivity from the EC2 instance:

mysql -h tidb.265f3598.23110bc6.us-west-2.prod.aws.tidbcloud.com -P 4000 -p

Migrate data from Amazon Aurora to TiDB Cloud

Export data from the source database

  1. Download the TiDB Toolkit using the following commands.
mkdir tidb-toolkit-latest-linux-amd64 && \
wget -qO- https://download.pingcap.org/tidb-toolkit-latest-linux-amd64.tar.gz | tar -xzv -C tidb-toolkit-latest-linux-amd64 --strip-components 1
  2. Set the connection parameters for your Aurora database, then run Dumpling to export the data:

export_username=<Aurora username>
export_password=<Aurora password>
export_endpoint=<the endpoint for Amazon Aurora MySQL>
backup_dir=<backup directory>

./tidb-toolkit-latest-linux-amd64/bin/dumpling \
-u "$export_username" \
-p "$export_password" \
-P 3306 \
-h "$export_endpoint" \
--filetype sql \
--threads 8 \
-o "$backup_dir" \
-f "*.*" -f '!/^(mysql|INFORMATION_SCHEMA|PERFORMANCE_SCHEMA|METRICS_SCHEMA|INSPECTION_SCHEMA)$/.*' \
--consistency="none" \
-F 256MiB
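When Dumpling finishes, it is worth confirming that the dump actually landed on disk before moving on. A quick sanity check, reusing the $backup_dir variable set above (defaulted here to a placeholder path so the snippet is self-contained):

```shell
# Count exported .sql files and report the total size of the dump.
backup_dir=${backup_dir:-./aurora-dump}   # same directory passed to Dumpling
mkdir -p "$backup_dir"                    # no-op when the export exists
sql_files=$(find "$backup_dir" -name '*.sql' | wc -l)
echo "exported SQL files: $sql_files"
du -sh "$backup_dir"
```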

Import data into TiDB Cloud

Set the connection parameters for your TiDB Cloud cluster, then run TiDB Lightning to import the data:

backup_dir=<backup directory>
tidb_endpoint=<endpoint of the cluster in TiDB Cloud>
tidb_username=<TiDB username>
tidb_password=<TiDB password>
tidb_port=<TiDB port>
./tidb-toolkit-latest-linux-amd64/bin/tidb-lightning --backend tidb -check-requirements=false \
-d=$backup_dir \
-server-mode=false \
-tidb-host="$tidb_endpoint" \
-tidb-port="$tidb_port" \
-tidb-user="$tidb_username" \
-tidb-password="$tidb_password"
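After TiDB Lightning completes, a simple spot check is to run the same row-count query against both endpoints and compare the results. A sketch in which mydb.mytable is a placeholder table name; the mysql invocations are shown as comments because they require live connections:

```shell
# Build one row-count query to run against both the source and the target.
# mydb.mytable is a placeholder table name.
check_sql="SELECT COUNT(*) AS row_count FROM mydb.mytable;"
echo "$check_sql"
# Against Aurora:
#   mysql -h "$export_endpoint" -P 3306 -u "$export_username" -p -e "$check_sql"
# Against TiDB Cloud:
#   mysql -h "$tidb_endpoint" -P "$tidb_port" -u "$tidb_username" -p -e "$check_sql"
# The two counts should match for every migrated table.
```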

Conclusion

In this article, we walked through migrating data from Amazon Aurora MySQL to TiDB Cloud: checking prerequisites, preparing an EC2 working environment, exporting the data with Dumpling, and importing it with TiDB Lightning.


PingCAP is the team behind TiDB, an open-source, MySQL-compatible NewSQL database. Official website: https://pingcap.com/ GitHub: https://github.com/pingcap
