
tAzureStorageInputTable

Retrieves a set of entities that satisfy the specified filter criteria from an Azure
storage table.

tAzureStorageInputTable Standard properties

These properties are used to configure tAzureStorageInputTable
running in the Standard Job framework.

The Standard
tAzureStorageInputTable component belongs to the Cloud family.

The component in this framework is available when you are using one of the Talend solutions with Big Data.

Basic settings

Property Type

Select the way the connection details
will be set.

  • Built-In: The connection details will be set
    locally for this component. You need to specify the values for all
    related connection properties manually.

  • Repository: The connection details stored
    centrally in Repository > Metadata will be reused by this component. You need to click
    the […] button next to it and in the pop-up
    Repository Content dialog box, select the
    connection details to be reused, and all related connection
    properties will be automatically filled in.

This property is not available when another connection component is selected
from the Connection Component drop-down list.

Connection Component

Select the component whose connection details will be
used to set up the connection to Azure storage from the drop-down list.

Account Name

Enter the name of the storage account you need to access. A storage account
name can be found in the Storage accounts dashboard of the Microsoft Azure Storage
system to be used. Ensure that the administrator of the system has granted you the
appropriate access permissions to this storage account.

Account Key

Enter the key associated with the storage account you need to access. Two
keys are available for each account and by default, either of them can be used for
this access.

Protocol

Select the protocol to be used for the connection to be created.

Use Azure Shared Access Signature

Select this check box to use a shared access signature (SAS) to access the
storage resources without the need for the account key. For more information,
see Using Shared Access Signatures (SAS).

In the Azure Shared Access Signature field displayed,
enter your account SAS URL between double quotation marks. You can get the
SAS URL for each allowed service on the Microsoft Azure portal after generating
the SAS. The SAS URL format is
https://<$storagename>.<$service>.core.windows.net/<$sastoken>,
where <$storagename> is the storage account name,
<$service> is the allowed service name
(blob, file, queue or table), and
<$sastoken> is the SAS token value. For more
information, see Constructing the Account SAS URI.

Note that the SAS has a validity period: when generating it, you can set the start
time at which the SAS becomes valid and the expiry time after which it is no longer
valid. Make sure your SAS is still valid when running your Job.
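
For illustration only, the following sketch shows how such an account SAS could be used
to reach the Table service with the Azure Storage SDK for Java (the
com.microsoft.azure.storage library); the storage account name and SAS token shown here
are hypothetical placeholders, and this is not the code generated by the component.

    import com.microsoft.azure.storage.StorageCredentialsSharedAccessSignature;
    import com.microsoft.azure.storage.table.CloudTableClient;
    import java.net.URI;

    public class SasConnectionSketch {
        public static CloudTableClient connectWithSas() throws Exception {
            String storageName = "mystorageaccount";                       // <$storagename>
            String sasToken = "sv=2017-04-17&ss=t&srt=sco&sp=rl&sig=...";  // <$sastoken> (truncated placeholder)

            // Build a Table service client authenticated with the SAS token only,
            // without using the storage account key.
            StorageCredentialsSharedAccessSignature credentials =
                    new StorageCredentialsSharedAccessSignature(sasToken);
            return new CloudTableClient(
                    new URI("https://" + storageName + ".table.core.windows.net"), credentials);
        }
    }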

Table name

Specify the name of the table from which the entities will be retrieved.

Schema and Edit schema

A schema is a row description. It defines the number of fields (columns) to
be processed and passed on to the next component. The schema is either Built-In or stored remotely in the Repository.

  • Built-In: You create and store the
    schema locally for this component only. Related topic: see
    Talend Studio User Guide.

  • Repository: You have already created
    the schema and stored it in the Repository. You can reuse it in various projects and
    Job designs. Related topic: see
    Talend Studio User Guide.

The schema of this component is predefined with the following columns that describe
the three system properties of each entity:

  • PartitionKey: the partition key for the partition that the
    entity belongs to.

  • RowKey: the row key for the entity within the
    partition.

    PartitionKey and RowKey are string
    type values that uniquely identify every entity in a table, and the user must
    include them in every insert, update, and delete operation.

  • Timestamp: the time that the entity was last modified.
    This DateTime value is maintained by the Azure server and it cannot be modified
    by the user.

For more information about these system properties, see Understanding the Table Service
Data Model.
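
To make these system properties concrete, here is a minimal sketch of an entity carrying
them, written with the Azure Storage SDK for Java; the EmployeeEntity class and its
EmployeeName property are hypothetical examples, not part of the component.

    import com.microsoft.azure.storage.table.TableServiceEntity;

    // PartitionKey, RowKey and Timestamp are the three system properties inherited
    // from TableServiceEntity; any extra getter/setter pair becomes a custom property.
    public class EmployeeEntity extends TableServiceEntity {
        private String employeeName;             // custom property "EmployeeName"

        public EmployeeEntity() { }              // no-argument constructor used by the SDK

        public EmployeeEntity(String partitionKey, String rowKey) {
            this.setPartitionKey(partitionKey);  // identifies the partition the entity belongs to
            this.setRowKey(rowKey);              // unique row key within that partition
            // The timestamp is maintained by the Azure server, so it is not set here.
        }

        public String getEmployeeName() { return employeeName; }
        public void setEmployeeName(String employeeName) { this.employeeName = employeeName; }
    }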

Click Edit schema to make changes to the schema.
If the current schema is of the Repository type, three
options are available:

  • View schema: choose this option to view the
    schema only.

  • Change to built-in property: choose this
    option to change the schema to Built-in for
    local changes.

  • Update repository connection: choose this
    option to change the schema stored in the repository and decide whether to propagate
    the changes to all the Jobs upon completion. If you just want to propagate the
    changes to the current Job, you can select No
    upon completion and choose this schema metadata again in the [Repository Content] window.

Use filter expression

Select this check box and complete the Filter expressions
table displayed to specify the conditions used to filter the entities to be retrieved
by clicking the [+] button to add as many rows as needed, each
row for a condition, and setting the value for the following parameters for each
condition.

  • Column: specify the name of the property on which you want
    to apply the condition.
  • Function: click the cell and select the comparison operator
    you want to use from the drop-down list.
  • Value: specify the value used to compare the property
    to.
  • Predicate: select the predicate used to combine the
    conditions.
  • Field type: click the cell and select the type of the
    column from the drop-down list.

The generated filter expression will be displayed in the read-only
Effective filter field.

For more information about the filter expressions, see Querying Tables and Entities.
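
To illustrate how such a filter expression is composed, here is a sketch in plain Java
using the TableQuery helpers of the Azure Storage SDK for Java; the property names,
comparison functions, and values below are hypothetical examples, not taken from this
documentation.

    import com.microsoft.azure.storage.table.TableQuery;
    import com.microsoft.azure.storage.table.TableQuery.Operators;
    import com.microsoft.azure.storage.table.TableQuery.QueryComparisons;

    public class FilterSketch {
        public static String buildEffectiveFilter() {
            // First condition: PartitionKey equal to "Paris" (String field).
            String byPartition = TableQuery.generateFilterCondition(
                    "PartitionKey", QueryComparisons.EQUAL, "Paris");
            // Second condition: RowKey greater than or equal to "100" (String field).
            String byRow = TableQuery.generateFilterCondition(
                    "RowKey", QueryComparisons.GREATER_THAN_OR_EQUAL, "100");
            // The two conditions combined with the AND predicate, comparable to the
            // expression shown in the read-only Effective filter field,
            // e.g. (PartitionKey eq 'Paris') and (RowKey ge '100').
            return TableQuery.combineFilters(byPartition, Operators.AND, byRow);
        }
    }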

Die on error

Select the check box to stop the execution of the Job when an error
occurs.

Advanced settings

Name mappings

Complete this table to map the column name of the
component schema with the property name of the Azure table entity if they are
different.

  • Schema column name: enter the column name of the
    component schema between double quotation marks.
  • Entity property name: enter the property name of the
    Azure table entity between double quotation marks.

For example, suppose three schema columns,
CompanyID, EmployeeID, and
EmployeeName, are used to feed the values of the
PartitionKey, RowKey, and
Name entity properties respectively. Since the
PartitionKey and RowKey columns have
already been added to the schema automatically, you do not need to specify the
mapping relationship for them. You only need to add one row, setting the
Schema column name cell with the value
"EmployeeName" and the Entity property name cell with the value
"Name", to specify the mapping relationship for the
EmployeeName column when retrieving data from the Azure table.
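
As an illustration of what this mapping means at read time, the sketch below (plain Java,
using DynamicTableEntity from the Azure Storage SDK for Java; the class and method names
are hypothetical) shows how the "EmployeeName" / "Name" row of the example above would be
resolved for one retrieved entity.

    import com.microsoft.azure.storage.table.DynamicTableEntity;

    public class NameMappingSketch {
        // Sketch only: resolves one entity retrieved from the Azure table.
        public static String employeeNameOf(DynamicTableEntity entity) {
            // PartitionKey and RowKey feed the CompanyID and EmployeeID schema
            // columns automatically, so no mapping row is needed for them.
            // The entity property "Name" feeds the EmployeeName schema column,
            // which is exactly what the single mapping row declares.
            return entity.getProperties().get("Name").getValueAsString();
        }
    }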

tStatCatcher Statistics

Select this check box to gather the Job processing metadata at the Job level
as well as at each component level.

Global variables

NB_LINE

The number of rows processed. This is an After variable and it returns an
integer.

ERROR_MESSAGE

The error message generated by the component when an error occurs. This
is an After variable and it returns a string.

Usage

Usage rule

This component is usually used as a start component of a Job or Subjob and it
always needs an output link.

Handling data with Microsoft Azure Table storage

Here is an example of using Talend components to connect to a
Microsoft Azure storage account that gives you access to the Azure Table storage service,
write some employee data into an Azure storage table, and then retrieve the employee data
from the table and display it on the console.

The employee data used in this example consists of six columns: Id,
Name, Site, and Job of String type,
Date of Date type, and Salary of Double type.

Creating a Job for handling data with Azure Table storage

Create a Job to connect to an Azure storage account, write some employee data
into an Azure storage table, and then retrieve that information from the table and
display it on the console.

  1. Create a new Job and add a tAzureStorageConnection
    component, a tFixedFlowInput component, a
    tAzureStorageOutputTable component, a
    tAzureStorageInputTable component, and a
    tLogRow component by typing their names in the design
    workspace or dropping them from the Palette.
  2. Link the tFixedFlowInput component to the
    tAzureStorageOutputTable component using a
    Row > Main
    connection.
  3. Do the same to link the tAzureStorageInputTable
    component to the tLogRow component.
  4. Link the tAzureStorageConnection component to the
    tFixedFlowInput component using a
    Trigger > OnSubjobOk
    connection.
  5. Do the same to link the tFixedFlowInput component to the
    tAzureStorageInputTable component.

Connecting to an Azure Storage account

Configure the tAzureStorageConnection component to open
the connection to an Azure Storage account.

The Azure Storage account, which allows you to access the Azure Table storage service
and store the provided employee data, has already been created. For more information
about how to create an Azure Storage account, see About Azure storage accounts.

  1. Double-click the tAzureStorageConnection component to
    open its Basic settings view on the
    Component tab.

  2. In the Account Name field, specify the name of the
    storage account you need to access.
  3. In the Account Key field, specify the key associated
    with the storage account you need to access. A sketch of the equivalent connection
    in plain Java is shown after these steps.
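
For reference only, the account name and account key entered above correspond to a
storage connection string. Below is a minimal sketch in plain Java with the Azure Storage
SDK for Java, using placeholder values rather than real credentials; it is not the code
generated by tAzureStorageConnection.

    import com.microsoft.azure.storage.CloudStorageAccount;
    import com.microsoft.azure.storage.table.CloudTableClient;

    public class ConnectionSketch {
        public static CloudTableClient connect() throws Exception {
            // Placeholder account name and key -- replace with your own values.
            String connectionString = "DefaultEndpointsProtocol=https;"
                    + "AccountName=mystorageaccount;"
                    + "AccountKey=<account-key>";
            CloudStorageAccount account = CloudStorageAccount.parse(connectionString);
            return account.createCloudTableClient();   // client for the Azure Table service
        }
    }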

Writing data into an Azure Storage table

Configure the tFixedFlowInput component and the
tAzureStorageOutputTable component to write the employee data
into an Azure Storage table.

  1. Double-click the tFixedFlowInput component to open its
    Basic settings view on the
    Component tab.

  2. Click the […] button next to Edit schema to open the
    schema dialog box and define the schema by adding six columns:
    Id, Name,
    Site, and Job of String type,
    Date of Date type, and Salary
    of Double type. Then click OK to save the changes and
    accept the propagation prompted by the pop-up dialog box.

    Note that in this example, the Site and
    Id columns are used to feed the values of the
    PartitionKey and RowKey
    system properties of each entity and they should be of String type, and the
    Name column is used to feed the value of the
    EmployeeName property of each entity.

  3. In the Mode area, select Use Inline
    Content (delimited file), and in the
    Content field displayed, enter the employee data that
    will be written into the Azure Storage table.
  4. Double-click the tAzureStorageOutputTable component to
    open its Basic settings view on the
    Component tab.

  5. From the Connection Component drop-down list, select the component whose
    connection details will be used to set up the connection to the Azure Storage
    service, tAzureStorageConnection_1 in this example.
  6. In the Table name field, enter the name of the table
    into which the employee data will be written, employee in
    this example.
  7. From the Action on table drop-down list, select the
    operation to be performed on the specified table, Drop table if exist
    and create in this example.
  8. Click Advanced settings to open its view.

  9. Click the [+] button under the
    Name mappings table to add three rows and map the
    schema column name with the property name of each entity in the Azure table. In
    this example:

    • the Site column is used to feed the value of the
      PartitionKey system property, so in the first row
      set the Schema column name cell with the value
      "Site" and the Entity property name cell with the value
      "PartitionKey";
    • the Id column is used to feed the value of the
      RowKey system property, so in the second row
      set the Schema column name cell with the value
      "Id" and the Entity property name cell with the value
      "RowKey";
    • the Name column is used to feed the value of the
      EmployeeName property, so in the third row
      set the Schema column name cell with the value
      "Name" and the Entity property name cell with the value
      "EmployeeName".

    A sketch of the equivalent write operation in plain Java is shown after this list.

Retrieving data from the Azure Storage table

Configure the tAzureStorageInputTable component and the
tLogRow component to retrieve the employee data from the
Azure Storage table.

  1. Double-click the tAzureStorageInputTable component to
    open its Basic settings view.

  2. From the Connection Component drop-down list, select the component whose
    connection details will be used to set up the connection to the Azure Storage
    service, tAzureStorageConnection_1 in this example.
  3. In the Table name field, enter the name of the table
    from which the employee data will be retrieved, employee
    in this example.
  4. Click the […] button next to Edit schema to open
    the schema dialog box.

    Note that the schema has already been predefined with two read-only columns
    RowKey and PartitionKey of
    String type, and another column Timestamp of Date
    type. The RowKey and
    PartitionKey columns correspond to the
    Id and Site columns of the
    tAzureStorageOutputTable schema.

  5. Define the schema by adding another four columns that hold other employee data,
    Name and Job of String type,
    Date of Date type, and Salary
    of Double type. Then click OK to save the changes and
    accept the propagation prompted by the pop-up dialog box.
  6. Click Advanced settings to open its view.

  7. Click the [+] button under the
    Name mappings table to add one row and set the
    Schema column name cell with the value
    "Name" and the Entity property name cell with the value
    "EmployeeName" to map the schema column name with the property name of each
    entity in the Azure table.

    Note that for the tAzureStorageInputTable component,
    the PartitionKey and RowKey
    columns have already been added automatically to the schema and you do not
    need to specify the mapping relationship for them.

  8. Double-click the tLogRow component to open its
    Basic settings view and, in the
    Mode area, select Table (print values in
    cells of a table) for a better display of the result. A sketch of the equivalent
    read and display in plain Java is shown after these steps.
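
The sketch below (plain Java, Azure Storage SDK for Java; the ReadSketch class is a
hypothetical illustration, not the code generated by tAzureStorageInputTable or tLogRow)
shows the equivalent read and display of the employee entities.

    import com.microsoft.azure.storage.table.CloudTable;
    import com.microsoft.azure.storage.table.DynamicTableEntity;
    import com.microsoft.azure.storage.table.TableQuery;

    public class ReadSketch {
        public static void printEmployees(CloudTable employeeTable) {
            // No filter expression is used, so every entity of the table is retrieved.
            TableQuery<DynamicTableEntity> query = TableQuery.from(DynamicTableEntity.class);
            for (DynamicTableEntity entity : employeeTable.execute(query)) {
                System.out.println(
                        entity.getPartitionKey() + " | "                                  // Site column
                        + entity.getRowKey() + " | "                                      // Id column
                        + entity.getTimestamp() + " | "                                   // Timestamp column
                        + entity.getProperties().get("EmployeeName").getValueAsString()); // Name column
            }
        }
    }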

Executing the Job to handle data with Azure Table storage

After setting up the Job and configuring the components used in the Job for
handling data with Azure Table storage, you can then execute the Job and verify the Job
execution result.

  1. Press Ctrl + S to save the Job.
  2. Press F6 to execute the Job.

    The Job is executed successfully and the employee data is
    displayed on the console, with the timestamp value that indicates when each
    entity was inserted.

  3. If needed, double-check the employee data that has been written into the Azure
    Storage table employee using Microsoft Azure Storage
    Explorer.
