Connecting to Azure Synapse Analytics
This guide provides a step-by-step approach to connecting Azure Synapse Analytics with Skypoint AI. The Azure Synapse Analytics Connector enables integration with Azure Synapse Analytics, a cloud-based data warehouse and analytics service, by ingesting structured data such as transactions, general ledger entries, and vendor records from Synapse SQL pools. It securely extracts data from Synapse tables or views and loads it into the platform for further transformation, reporting, or AI-driven insights. The connector supports service principal authentication, automated scheduling, and data transformation, making it easy to integrate data into Skypoint’s unified data models and to enable AI-driven financial and operational insights.
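Before configuring the connector, the identity it connects with, whether a SQL user or an Azure AD service principal, needs read access to the tables or views you plan to import. The snippet below is a minimal sketch of granting that access with Python and pyodbc; the server, database, admin credentials, schema name dbo, and the user name skypoint_reader are illustrative placeholders rather than values Skypoint requires, and the user must already exist in the SQL pool.

```python
import pyodbc

# Connect to the Synapse dedicated SQL pool with an administrator account.
# All connection values below are placeholders.
admin_conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=tcp:your-workspace.sql.azuresynapse.net,1433;"
    "DATABASE=your_sql_pool;"
    "UID=sqladminuser;PWD=your_admin_password;"
    "Encrypt=yes;TrustServerCertificate=no;"
)

with pyodbc.connect(admin_conn_str, autocommit=True) as conn:
    # Grant read access on the dbo schema to the (hypothetical) database user
    # the connector will authenticate as. For a service principal, the user is
    # created from the external provider by an Azure AD admin beforehand.
    conn.cursor().execute("GRANT SELECT ON SCHEMA::dbo TO [skypoint_reader];")
```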
Prerequisite
You will need the following details to configure and import data using Azure Synapse Analytics:
- Server Name
- Database Name
- Port Number
- User Name
- Password
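
If you want to verify these details before building the dataflow, a quick direct connection to the Synapse SQL pool will confirm them. The sketch below uses Python with pyodbc; all values are placeholders, the ODBC driver name assumes version 18 is installed, and the Synapse workspace firewall must allow the client’s IP address.

```python
import pyodbc

# Substitute the details listed above; these are placeholders.
server = "your-workspace.sql.azuresynapse.net"
database = "your_sql_pool"
port = 1433
username = "your_user"
password = "your_password"

conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    f"SERVER=tcp:{server},{port};"
    f"DATABASE={database};"
    f"UID={username};PWD={password};"
    "Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;"
)

# A successful SELECT 1 confirms the server name, database, port,
# and credentials are valid and that the firewall allows this client.
with pyodbc.connect(conn_str) as conn:
    row = conn.cursor().execute("SELECT 1;").fetchone()
    print("Connection OK:", row[0] == 1)
```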
To import data using Azure Synapse Analytics
Follow these steps to create a new dataflow for the Azure Synapse Analytics import connector:
- Go to Dataflow > Imports.

- Click New dataflow.

- Enter a Dataflow name in the Name text area.
- Click Next.

Add Azure Synapse Analytics connector
- On the Choose Connector page, use the Search feature to locate and select the Azure Synapse Analytics Connector.

- Enter a Display Name for your dataflow in the text area.
- Optionally, add a Description in the text area.
- Click Next to proceed.

Connect to the Azure Synapse Analytics account
- Fill in the required details on the Configuration page.
- Click Connect.

- Scroll down to the Table Details section, select the checkboxes for the tables you wish to import, and use the dropdown menu to label them as either Data or Metadata. (If you are not sure which tables and views are available in the SQL pool, see the sketch at the end of this section.)

| Item | Description |
|---|---|
| Purpose | Option to assign a purpose (Data or Metadata) for each table. |
| Data | Loads the table as customer data. |
| Metadata | Loads the table as metadata. |
| File name | Displays the name of the file that you imported. |
| Entity name | Displays the imported table name by default. You can rename it if required. |
- Click Save to apply the changes.
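
If you are unsure which tables and views the SQL pool exposes before making selections in Table Details, you can list them with a query against INFORMATION_SCHEMA. The sketch below is a minimal example in Python with pyodbc, using the same placeholder connection details as above; it is an optional check, not part of the Skypoint configuration itself.

```python
import pyodbc

# Placeholder connection details; reuse the values from the Configuration page.
conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=tcp:your-workspace.sql.azuresynapse.net,1433;"
    "DATABASE=your_sql_pool;"
    "UID=your_user;PWD=your_password;Encrypt=yes;"
)

# INFORMATION_SCHEMA.TABLES lists both base tables and views,
# which are the objects selectable in the Table Details section.
query = """
    SELECT TABLE_SCHEMA, TABLE_NAME, TABLE_TYPE
    FROM INFORMATION_SCHEMA.TABLES
    ORDER BY TABLE_SCHEMA, TABLE_NAME;
"""

with pyodbc.connect(conn_str) as conn:
    for schema, name, table_type in conn.cursor().execute(query):
        print(f"{schema}.{name} ({table_type})")
```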

Run, edit, and delete the imported data
Once the tables are imported, you can run, edit, and delete the imported dataflow. Follow these steps:
- Go to the Dataflow > Imports page.

| Item | Description |
|---|---|
| Name | Displays the name of the imported Dataflow. |
| Type | Displays the connector type symbol. |
| Status | Indicates whether the data is imported successfully. |
| Tables count | Displays the number of tables imported. |
| Created Date | Displays the date of creation. |
| Last refresh type | Displays the refresh value: Full or Incremental. |
| Updated Date | Displays the last modified date. |
| Last Refresh | Displays the latest refresh date, which updates each time you refresh the data. |
| Group by | Option to view the items in a specific group. For example, type, status, tables count, etc. |
- Select the horizontal ellipsis under the Actions column and do the following:
| If you want to | Then |
|---|---|
| Modify the Dataflow | Select Edit and modify the Dataflow. Click Save to apply your changes. |
| Execute the Dataflow | Select Run. |
| Bring the data to its previous state | Select Rollback. |
| Delete the Dataflow | Select Remove and then click the Delete button. All tables imported through the dataflow are deleted. |
| See the run history of the Dataflow | Select Run history. |
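
The Last refresh type column above shows whether a refresh was Full or Incremental. Purely as an illustration of that difference, and not of Skypoint’s internal implementation, an incremental pull from a Synapse table typically filters on a watermark column such as a last-modified timestamp, while a full refresh re-reads the whole table. The table dbo.transactions and the column modified_on below are hypothetical.

```python
import pyodbc
from datetime import datetime

# Placeholder connection details.
conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=tcp:your-workspace.sql.azuresynapse.net,1433;"
    "DATABASE=your_sql_pool;"
    "UID=your_user;PWD=your_password;Encrypt=yes;"
)

# Watermark recorded after the previous successful refresh (placeholder value).
last_refresh = datetime(2024, 1, 1)

# A full refresh would re-read everything: SELECT * FROM dbo.transactions;
# an incremental refresh only reads rows changed since the watermark.
incremental_query = "SELECT * FROM dbo.transactions WHERE modified_on > ?;"

with pyodbc.connect(conn_str) as conn:
    rows = conn.cursor().execute(incremental_query, last_refresh).fetchall()
    print(f"{len(rows)} rows changed since {last_refresh:%Y-%m-%d}")
```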

