Connecting to DockHealth
This guide covers how to connect DockHealth with Skypoint AI, including setup, configuration, and best practices.
DockHealth is a HIPAA-compliant platform designed to streamline healthcare operations through task management and workflow automation. It enables healthcare teams to securely capture, assign, and track both clinical and administrative tasks, fostering collaboration and accountability. Key features include robust task management, workflow automation, and team communication tools.
Skypoint AI Platform (AIP) integrates with DockHealth through a built-in connector that automates the secure processing of daily CSV files from SFTP sites. The connector provides reliable data ingestion, transformation, and storage into platforms such as Databricks, improving operational efficiency and data accuracy.
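To make the ingestion step concrete, here is a minimal sketch of parsing a daily CSV export with Python's standard library. The column names and sample rows are hypothetical, for illustration only; the actual DockHealth export layout will differ.

```python
import csv
import io

# Illustrative daily export; real DockHealth CSV columns will differ.
sample_csv = """task_id,assignee,status,updated_at
T-1001,dr.patel,open,2024-05-01T09:15:00Z
T-1002,nurse.lee,completed,2024-05-01T10:42:00Z
"""

def parse_daily_export(text):
    """Parse a CSV export into a list of row dictionaries keyed by header."""
    return list(csv.DictReader(io.StringIO(text)))

rows = parse_daily_export(sample_csv)
print(len(rows))          # 2
print(rows[1]["status"])  # completed
```

In practice the connector handles this parsing, transformation, and loading for you; the sketch only shows the shape of the data being moved.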
Prerequisites
You need the following details to configure and import data using the DockHealth connector:
- Host
- Username
- Private Key
- Directory
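Before configuring the connector, you can verify these SFTP details yourself. Below is a minimal sketch using the third-party `paramiko` library (a common Python SFTP client, not part of the Skypoint connector); the host, username, key path, and directory values are placeholders.

```python
def filter_csvs(names):
    """Keep only CSV file names (case-insensitive)."""
    return [n for n in names if n.lower().endswith(".csv")]

def list_remote_csvs(host, username, key_path, directory):
    """Connect over SFTP with a private key and list CSV files in a directory.
    Requires the third-party 'paramiko' package (imported lazily here)."""
    import paramiko  # third-party SSH/SFTP client
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(hostname=host, username=username, key_filename=key_path)
    try:
        sftp = client.open_sftp()
        return filter_csvs(sftp.listdir(directory))
    finally:
        client.close()

# Example (placeholder values):
# list_remote_csvs("sftp.example.com", "svc_user", "/keys/id_rsa", "/exports/daily")
```

If this call lists the expected daily files, the same four values should work in the connector's Configuration page.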
To import data using the DockHealth connector
Follow these steps to create and configure a new dataflow for the DockHealth import connector:
- Navigate to Dataflow > Imports.
- Click New dataflow.
The Set dataflow name page appears.
- Enter the desired name for the dataflow in the Name text field.
- Click Next to proceed.
The Choose connector page appears.
Add DockHealth connector
- On the Choose connector page, use the Search box to locate and select the DockHealth connector.
- Enter the Display Name for your dataflow in the provided text field.
- Optionally, add a Description in the designated text area.
- Click Next to proceed.
The Configuration page appears.
Connect to the DockHealth account
- Fill in the required details on the Configuration page.
- Click Connect.
Once the connection is established, the connector can be used to import data from DockHealth tables.
- Scroll down to the Table Details section, select the checkboxes for the tables you wish to import, and use the dropdown menu to label them as either Data or Metadata.
By default, all tables in the Table Details section are selected. You can choose to import only specific tables that are relevant to your data processing needs. For example, to import customer data, select tables containing details like name, email, address, and contact information.
Item | Description
---|---
Purpose | Option to assign a purpose (Data or Metadata) to each table.
Data | Loads the table contents as customer data.
Metadata | Loads the table as metadata.
File name | Displays the name of the imported file.
Entity name | Displays the imported table name by default. You can rename it if required.
- Click Save to apply the changes.
The DockHealth connector dataflow is saved and appears on the Dataflow > Imports page.
Run, Edit, and Delete the imported data
Once the table is imported, you can run, edit, or delete it from the dataflow. Follow these steps:
- Go to the Dataflow > Imports page.
Item | Description
---|---
Name | Displays the name of the imported dataflow.
Type | Displays the connector type icon.
Connector Name | Displays the connector name.
Status | Indicates whether the data was imported successfully.
Tables Count | Displays the number of tables imported.
Created Date | Displays the date of creation.
Last Refresh Type | Displays the refresh type: Full or Incremental.
Updated Date | Displays the last modified date.
Last Refresh | Displays the latest refresh date, which updates each time you refresh the data.
Actions | Provides options for managing dataflows.
- Select the horizontal ellipsis under the Actions column and do the following:
If you want to | Then
---|---
Modify the dataflow | Select Edit, make your changes, and click Save.
Execute the dataflow | Select Run.
Restore the data to its previous state | Select Rollback.
Delete the dataflow | Select Remove, then click Delete. All tables in the data source are deleted.
See the run history of the dataflow | Select Run history.
- Click Run to execute the dataflow. Once the execution succeeds, the data pipeline status updates to Completed.
In the dataflow's Run history description, you can view error messages for data import failures from a data source, along with the status, start time, and end time of each pipeline execution.
Next step
After completing the data import, start the Master Data Management (MDM) - Resolve process to create a single, unified view of the data. With Skypoint MDM, you can ensure that your data is accurate, consistent, and reliable, making it easier to use for business processes, analytics, and reporting.