Connecting to MySQL
MySQL is a reliable, high-performance, open-source database that runs on all major operating systems. It offers robust security through SSL and access control, and its compatibility with multiple programming languages makes it versatile for application development.
MySQL integrates with the SkyPoint AI Platform (AIP) through a built-in connector, enabling centralized data synchronization and management. This integration simplifies operations such as managing data tables, enhances analytics and governance, breaks down data silos, and ensures secure, efficient data handling for improved compliance and productivity.
This document guides you through the process of connecting MySQL to the SkyPoint AI Platform.
Prerequisites
You need the following details to configure and import data using the MySQL connector (a quick way to verify them is sketched after this list):
- Server name
- Database name
- Port number
- Username
- Password
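Before configuring the connector, it can help to confirm that these values actually work from a machine that can reach the server. The sketch below is a minimal connectivity check using the mysql-connector-python package; the host, credentials, and database name are placeholders, and the port assumes the MySQL default of 3306.

```python
# pip install mysql-connector-python
import mysql.connector

# Placeholder values -- replace them with the details listed above.
config = {
    "host": "mysql.example.com",   # Server name
    "port": 3306,                  # Port number (3306 is the MySQL default)
    "user": "skypoint_reader",     # Username
    "password": "********",        # Password
    "database": "sales",           # Database name
}

try:
    conn = mysql.connector.connect(**config)
except mysql.connector.Error as err:
    # A failure here usually points to a wrong credential, host, or port,
    # or a firewall blocking the connection.
    print("Connection failed:", err)
else:
    cursor = conn.cursor()
    cursor.execute("SELECT VERSION()")
    print("Connected to MySQL server version:", cursor.fetchone()[0])
    cursor.close()
    conn.close()
```

If this check succeeds, the same values can be entered on the connector's Configuration page.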
To import data using the MySQL connector
Follow the steps below to create and configure a new dataflow for the MySQL import connector:
- Navigate to Dataflow > Imports.
- Click New dataflow.
The Set dataflow name page appears.
- Enter the desired name for the dataflow in the Name text field.
- Click Next to proceed.
The Choose connector page appears.
Add MySQL connector
- On the Choose connector page, use the search feature to locate and select the MySQL connector.
- Enter the Display Name for your dataflow in the provided text field.
- Optionally, add a Description in the designated text area.
- Click Next to proceed.
The Configuration page appears.
Connect to the MySQL account
- Fill in the required details on the Configuration page.
- Click Connect.
Once the connection is established, the connector can be used to import data from MySQL tables.
- Scroll down to the Table Details section, select the checkboxes for the tables you wish to import, and use the dropdown menu to label them as either Data or Metadata.
By default, all tables in the Table Details section are selected. You can choose to import only specific tables that are relevant to your data processing needs. For example, to import customer data, select tables containing details such as name, email, address, and contact information. If you want to check which tables the source database exposes, see the sketch after this procedure.
| Item | Description |
|---|---|
| Purpose | Option to assign a purpose (Data or Metadata) for each table. |
| Data | Loads customer data. |
| Metadata | Loads metadata. |
| File name | Displays the name of the file that you imported. |
| Entity name | Displays the imported table name by default. You can rename it if required. |
- Scroll to the bottom of the page to ensure all required tables are selected.
- Click Save to apply the changes.
Congratulations! You have saved the MySQL connector dataflow, which now appears on the Dataflow > Imports page.
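If you are unsure which source tables to pick under Table Details, you can also list them straight from MySQL and compare the result against what the connector shows. This is a minimal sketch reusing the placeholder connection values from the prerequisites section; the database name sales is illustrative.

```python
import mysql.connector

conn = mysql.connector.connect(
    host="mysql.example.com",
    port=3306,
    user="skypoint_reader",
    password="********",
    database="sales",
)
cursor = conn.cursor()

# List every table in the database with its column count so you can
# decide which tables to mark as Data and which as Metadata.
cursor.execute(
    """
    SELECT table_name, COUNT(*) AS column_count
    FROM information_schema.columns
    WHERE table_schema = %s
    GROUP BY table_name
    ORDER BY table_name
    """,
    ("sales",),
)
for table_name, column_count in cursor.fetchall():
    print(f"{table_name}: {column_count} columns")

cursor.close()
conn.close()
```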
Run, Edit, and Delete the imported data
Once the tables are imported, you can run, edit, or delete them from the dataflow. Follow the steps below:
- Go to the Dataflow > Imports page.
| Item | Description |
|---|---|
| Name | Displays the name of the imported dataflow. |
| Type | Displays the connector type symbol. |
| Connector Name | Displays the connector name. |
| Status | Indicates whether the data was imported successfully. |
| Tables Count | Displays the number of tables imported. |
| Created Date | Displays the date of creation. |
| Last Refresh Type | Displays the refresh type: Full or Incremental. |
| Updated Date | Displays the last modified date. |
| Last Refresh | Displays the latest refresh date, which updates each time you refresh the data. |
| Actions | Provides multiple options for managing dataflows. |
- Select the horizontal ellipsis under the Actions column and do the following:
| If you want to | Then |
|---|---|
| Modify the dataflow | Select Edit and modify the dataflow. Click Save to apply your changes. |
| Execute the dataflow | Select Run. |
| Revert the data to its previous state | Select Rollback. |
| Delete the dataflow | Select Remove, and then click the Delete button. All tables imported from the data source are deleted. |
| See the run history of the dataflow | Select Run history. |
- Click Run to execute the dataflow. Once the execution is successful, the data pipeline status updates to Completed.
In the Dataflow's Run History Description, you can view error messages related to data import failures from a data source. Additionally, you can check the status, start time, and end time of the data pipeline execution.
Next step
After completing the data import, start the Master Data Management (MDM) - Resolve process to create a single, unified view of the data. With SkyPoint MDM, you can ensure that your data is accurate, consistent, and reliable, making it easier to use for business processes, analytics, and reporting.