SL2: SAP Data Services Setup

SAP Data Services => Source Datastore creation:

  1. Open the SAP Data Services Designer application.

  2. Right-click on your SAP Data Services project name in Project Explorer.

  3. Select New > Datastore.

  4. Fill in Datastore Name. For example, NPL.

  5. In the Datastore type field, select SAP Applications.

  6. In the Application server name field, provide the instance name of the SAP SLT application server.

  7. Specify the access credentials. If possible, create a dedicated SAP user for Data Services.

  8. Open the Advanced tab.

  9. In ODP Context, enter SLT~ALIAS, where ALIAS is the queue alias that you specified during SLT configuration.

  10. Click OK.
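
The ODP context in step 9 follows a fixed naming pattern: the literal prefix SLT~ followed by the queue alias. As a quick sanity check, here is a minimal sketch (a hypothetical helper, not part of SAP Data Services) that builds and validates that string; the uppercase-alias assumption reflects common SAP naming conventions:

```python
import re

def odp_context(alias: str) -> str:
    """Build the ODP context string for an SLT-based datastore.

    The 'SLT~' prefix is fixed; ALIAS is the queue alias specified
    during SLT configuration. The uppercase/underscore check below is
    an assumption based on typical SAP naming, not a documented rule.
    """
    if not re.fullmatch(r"[A-Z0-9_]+", alias):
        raise ValueError(f"unexpected alias format: {alias!r}")
    return f"SLT~{alias}"
```

For the example alias used in this lab, `odp_context("NPL")` yields `SLT~NPL`, which is what you would type into the ODP Context field.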

SAP Data Services => Target Data Store creation:

Configure a File Location object to point to the S3 Bucket

  1. Select File Formats

  2. Right-click on File Locations and select New.

  3. Set the Protocol to Amazon Cloud Storage and enter the access key and secret key credentials saved earlier.

  4. Save the credentials and test the connection to confirm connectivity works.
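
If the connectivity test in Designer fails, it can help to verify the keys outside Data Services first. Below is a minimal, stdlib-only sketch: pass it a boto3 S3 client (e.g. `boto3.client("s3")` built from the same access/secret keys), or any object exposing the same `head_bucket` call; the bucket name is whatever you created earlier:

```python
def can_reach_bucket(s3_client, bucket: str) -> bool:
    """Return True if the given client can see the bucket.

    `s3_client` is any object exposing the boto3-style
    head_bucket(Bucket=...) method, which raises an exception
    on a missing bucket or bad credentials.
    """
    try:
        s3_client.head_bucket(Bucket=bucket)
        return True
    except Exception:
        return False
```

Typical usage would be `can_reach_bucket(boto3.client("s3"), "my-extract-bucket")`; a `False` result points at the keys or bucket policy rather than at Data Services.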

SAP Data Services => Importing ODP objects:

The following steps import ODP objects from the source datastore for the initial and delta loads and make them available in SAP Data Services.

  1. Open the SAP Data Services Designer application.

  2. Expand the source datastore for replication load in the Project Explorer.

  3. Select the External Metadata option in the upper portion of the right panel. The list of nodes with available tables and ODP objects appears.

  4. Click on the ODP objects node to retrieve the list of available ODP objects. The list might take a long time to display.

  5. Click on the Search button.

  6. In the dialog, select External data in the Look in menu and ODP object in the Object type menu.

  7. In the Search dialog, select the search criteria to filter the list of source ODP object(s).

  8. Select the ODP object to import from the list.

  9. Right-click and select the Import option.

  10. Fill in the Name of Consumer.

  11. Fill in the Name of Project.

  12. Select Changed-data capture (CDC) option in Extraction mode.

  13. Click Import. This starts the import of the ODP object into Data Services. The ODP object is now available in the object library under the NPLT node.

➡️ For more information, see the Importing ODP source metadata section in the SAP Data Services documentation.

➡️ Now that the source and target connections are created, it is time to create the data flow.

SAP Data Services => Creating Data Flow:

  1. Open the SAP Data Services Designer application.

  2. Right-click on your SAP Data Services project name in Project Explorer.

  3. Select Project > New > Data flow.

  4. Fill in the Name field. For example, DF_S3.

  5. Click on Finish.

Create a Batch job

  1. Right-click on the project and select New Batch Job.
  2. Drag and drop the data flow onto the job canvas, then double-click it.

Build your data flow

  1. Drag the ODP object onto the data flow workspace.

  2. Double-click the ODP object and set Initial load to No.

  3. Drag a Query transform from the Transforms tab to the data flow workspace and connect it to the ODP object.

  4. Open the Query transform, right-click on Schema Out, and create a file format.

  5. Under Location, select the S3 file location saved earlier. In Directory, you can enter a folder name within your S3 bucket.

  6. A new file format object is created in the File Formats tab. Drag it onto the data flow workspace.

  7. Connect the output of the Query transform to the file format.

  8. Save your changes.
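
The S3 file location plus the directory entered in step 5 together determine where the output lands. As a rough illustration of that layout (bucket, folder, and file names below are hypothetical, and the date-stamped file name is just one common partitioning choice, not something Data Services enforces):

```python
from datetime import date

def s3_target_uri(bucket: str, directory: str, file_name: str) -> str:
    """Compose the S3 URI a file-format target would write to.

    Mirrors the bucket + directory + file name convention; stray
    slashes around the directory are normalized.
    """
    directory = directory.strip("/")
    return f"s3://{bucket}/{directory}/{file_name}"

# e.g. one file per run, stamped with the load date (illustrative names):
uri = s3_target_uri("my-sap-extracts", "snwd_pd/",
                    f"delta_{date.today():%Y%m%d}.csv")
```

Keeping initial and delta runs in distinct, date-stamped objects like this makes it easier for downstream AWS analytics services to pick up only new files.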

Execute your data flow

  1. Right-click the job name and select Execute. The initial set of records is loaded.

  2. Add more records to table SNWD_PD via transaction SEPM_PD.

  3. Run the job again; it should load only the deltas.

➡️ The jobs can be scheduled in the background for large tables.
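
Conceptually, each delta run emits change records rather than full rows, and a downstream consumer replays them onto the initial snapshot. The sketch below shows that idea with a hypothetical record layout — the key/mode/row field names and the I/U/D mode letters are illustrative, not the exact columns SLT/ODP emits:

```python
def apply_deltas(snapshot: dict, deltas: list) -> dict:
    """Apply CDC-style change records to an initial snapshot.

    Each delta record carries a key, a change mode, and (for inserts
    and updates) the new row image. Field names and mode letters are
    illustrative assumptions for this sketch.
    """
    state = dict(snapshot)
    for rec in deltas:
        if rec["mode"] == "D":           # delete: drop the row
            state.pop(rec["key"], None)
        else:                            # "I" (insert) or "U" (update)
            state[rec["key"]] = rec["row"]
    return state

# Initial load produced two products; the delta run then updates one,
# inserts one, and deletes one:
initial = {"P1": {"name": "Notebook"}, "P2": {"name": "Mouse"}}
deltas = [
    {"key": "P2", "mode": "U", "row": {"name": "Wireless Mouse"}},
    {"key": "P3", "mode": "I", "row": {"name": "Keyboard"}},
    {"key": "P1", "mode": "D"},
]
current = apply_deltas(initial, deltas)
```

This is the reconciliation step an AWS-side consumer (for example, a job reading the delta files from S3) would perform to reconstruct the current table state.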


In this session, you learned how to use SLT to extract your SAP data to AWS analytics solutions for post-processing analytics.