
Amazon S3

Dataflow lets you integrate your data from different sources.

Step 01 - Add Dataflow

  • Click on Add Dataflow on the right top corner of the screen.
    • Enter a Name. The name must start with a letter and may contain only letters and numbers, with no spaces (see the sketch after this list).
    • Click Next.
    • You will be directed to choose a Connector.
    • Search Amazon S3 and click on it.
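
The naming rule above can be expressed as a simple pattern check. The snippet below is only an illustrative sketch of that rule, not part of Skypoint; the pattern and the function name `validate_dataflow_name` are assumptions made for this example.

```python
import re

# Illustrative check for the Dataflow name rule described above:
# must start with a letter and may contain only letters and numbers (no spaces).
# The pattern and function name are assumptions for this sketch, not Skypoint APIs.
NAME_PATTERN = re.compile(r"^[A-Za-z][A-Za-z0-9]*$")

def validate_dataflow_name(name: str) -> bool:
    """Return True if the name satisfies the stated naming rule."""
    return bool(NAME_PATTERN.fullmatch(name))

print(validate_dataflow_name("CustomerOrders2024"))  # True
print(validate_dataflow_name("Customer Orders"))     # False: contains a space
print(validate_dataflow_name("2024Orders"))          # False: starts with a digit
```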

Step 02 - Access Source Data

To allow Skypoint to access your Amazon S3 account, you'll need the following details (a sketch for verifying them appears at the end of this step):

  • Access Key ID (check your Amazon S3 source credentials)
  • Secret Access Key
  • S3 Bucket
    • Navigate to the folder you want to download the data from.
    • Click Save.
    • Once you select the S3 Bucket, you will see a Data loaded successfully notification.
    • Data from the source will be loaded into a table with the headers File Name, Entity Name, Datetime Format, Delimiter, First Row as Header, and Advanced Settings.
      • File Name
        The file name is the name of the file as it exists in the Amazon S3 source.
      • Entity Name
        Entity Name is the unique name that is created for the data collected from the source.
      • Datetime Format
        A number of datetime formats are available, and the Skypoint Modern Data Stack Platform is set to detect them automatically.
      • Delimiter
        The delimiter is the character used as a boundary or separator between fields in the imported data. The currently available separators are Comma (,), Semicolon (;), Pipe (|), Tab (\t), Start of Heading (\u0001), and No Delimiter.
      • First Row as Header
        This checkbox treats the first row of the imported file as the header of the table. The system will collect the data according to the header contents.
      • Advanced Settings
        Advanced Settings allow you to fine-tune the import process in detail.
  • Advanced Settings
    When you click the Advanced Settings link, a pop-up window appears with additional options. These options describe key characteristics of the source data and how it is organized. If you get any errors, check these details against the data source. (A parsing sketch that uses these settings appears at the end of this step.)
    • Compression Type
      The compression type in Advanced Settings is the method used to compress the files in the Amazon S3 source. (To identify which compression method is used, check with your source credentials.)
    • Row Delimiter
      The data stream has a separator that identifies the boundary of each row. If the source uses a different separator, change this setting to improve the accuracy of data ingestion.
    • Encoding
      Because the data arrives as a stream, an encoding is always used to decipher it. If the source uses a different encoding, select the appropriate value. The default encoding is UTF-8.
    • Escape Character
      An escape character is a metacharacter that marks the start or end of a sequence. You can manually select the Escape Character from the dropdown: "/" (Forward Slash), "\" (Backslash), or No Escape Character.
    • Quote Character
      Three Quote Character types are available in the Quote Character dropdown: (") Double Quote, (') Single Quote, or No Quote Character.
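
Before entering the credentials in Skypoint, you may want to confirm that the Access Key ID, Secret Access Key, and bucket work outside the platform. The sketch below uses the boto3 library to list a few objects in the bucket; the bucket name, folder prefix, and placeholder keys are assumptions for illustration only, not values from Skypoint.

```python
import boto3

# Placeholder values -- replace with your own Access Key ID, Secret Access Key,
# bucket, and folder (prefix). These are assumptions for this sketch.
ACCESS_KEY_ID = "YOUR_ACCESS_KEY_ID"
SECRET_ACCESS_KEY = "YOUR_SECRET_ACCESS_KEY"
BUCKET = "your-s3-bucket"
PREFIX = "path/to/folder/"

# Create an S3 client with the same credentials you plan to give Skypoint.
s3 = boto3.client(
    "s3",
    aws_access_key_id=ACCESS_KEY_ID,
    aws_secret_access_key=SECRET_ACCESS_KEY,
)

# List a handful of objects under the folder to confirm the keys and bucket work.
response = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX, MaxKeys=5)
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```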
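
The Delimiter, First Row as Header, and Advanced Settings options correspond to common CSV parsing parameters. As a rough illustration (not Skypoint's internal implementation), the sketch below reads one file with pandas using the same kinds of settings; the file name and the specific values chosen here are assumptions.

```python
import pandas as pd

# Illustrative parsing of a source file with settings that mirror the options above.
# The file name and the specific values chosen here are assumptions for this sketch.
df = pd.read_csv(
    "orders.csv.gz",
    sep=",",              # Delimiter: Comma (,), Semicolon (;), Pipe (|), Tab (\t), ...
    header=0,             # First Row as Header: treat the first row as column names
    encoding="utf-8",     # Encoding: UTF-8 is the default
    quotechar='"',        # Quote Character: Double Quote (")
    escapechar="\\",      # Escape Character: Backslash (\)
    compression="gzip",   # Compression Type: e.g. gzip for .gz files
    lineterminator="\n",  # Row Delimiter: newline marks the end of each row
)
print(df.head())
```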

Step 03 - Action Button

The Action button provides specific actions for the imported file list.

  • Show All
    When you select the Show All option from the dropdown, the list shows all the files available in the folder.
  • Show Selected
    Once you click the Show Selected option, the list shows only the files you have selected in the folder.
  • Select All
    With one click, you can select all the files in the list imported from the selected folder.
  • Clear All
    If you don't want all the files selected, a single click of the Clear All button removes all the selections.

Step 04 - Save

  • Click the Save button.
    • You will be taken to the Dataflow page.