
tAzureStorageOutputTable

Performs the defined action on a given Azure storage table and inserts, replaces,
merges or deletes entities in the table based on the incoming data from the preceding
component.

tAzureStorageOutputTable Standard properties

These properties are used to configure tAzureStorageOutputTable
running in the Standard Job framework.

The Standard
tAzureStorageOutputTable component belongs to the Cloud family.

The component in this framework is available when you are using one of the Talend solutions with Big Data.

Basic settings

Property Type

Select the way the connection details
will be set.

  • Built-In: The connection details will be set
    locally for this component. You need to specify the values for all
    related connection properties manually.

  • Repository: The connection details stored
    centrally in Repository > Metadata will be reused by this component. You need to click
    the […] button next to it and in the pop-up
    Repository Content dialog box, select the
    connection details to be reused, and all related connection
    properties will be automatically filled in.

This property is not available when another connection component is selected
from the Connection Component drop-down list.

Connection Component

Select the component whose connection details will be
used to set up the connection to Azure storage from the drop-down list.

Account Name

Enter the name of the storage account you need to access. A storage account
name can be found in the Storage accounts dashboard of the Microsoft Azure Storage
system to be used. Ensure that the administrator of the system has granted you the
appropriate access permissions to this storage account.

Account Key

Enter the key associated with the storage account you need to access. Two
keys are available for each account and by default, either of them can be used for
this access.

Protocol

Select the protocol for this connection to be created.
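
These three Basic settings (Account Name, Account Key and Protocol) together amount to a
standard Azure storage connection string. As a minimal, hypothetical sketch using the
classic Azure Storage SDK for Java (com.microsoft.azure.storage), with placeholder
account name, key and table name:

    import com.microsoft.azure.storage.CloudStorageAccount;
    import com.microsoft.azure.storage.table.CloudTable;

    public class AzureTableConnectionSketch {
        public static void main(String[] args) throws Exception {
            String accountName = "mystorageaccount"; // placeholder Account Name
            String accountKey  = "<account-key>";    // placeholder Account Key
            String protocol    = "https";            // value chosen for Protocol

            // The three settings map onto a standard storage connection string.
            String connectionString =
                    "DefaultEndpointsProtocol=" + protocol + ";"
                  + "AccountName=" + accountName + ";"
                  + "AccountKey=" + accountKey;

            CloudStorageAccount account = CloudStorageAccount.parse(connectionString);
            CloudTable table = account.createCloudTableClient()
                                      .getTableReference("employees"); // placeholder table name
            System.out.println("Table exists: " + table.exists());
        }
    }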

Use Azure Shared Access Signature

Select this check box to use a shared access signature (SAS) to access the
storage resources without the need for the account key. For more information,
see Using Shared Access Signatures (SAS).

In the Azure Shared Access Signature field displayed,
enter your account SAS URL between double quotation marks. You can get the
SAS URL for each allowed service on the Microsoft Azure portal after generating
the SAS. The SAS URL format is
https://<$storagename>.<$service>.core.windows.net/<$sastoken>,
where <$storagename> is the storage account name,
<$service> is the allowed service name (blob, file,
queue or table), and <$sastoken> is the SAS token value. For more
information, see Constructing the Account SAS URI.

Note that a SAS has a validity period: when generating it, you can set the start
time at which the SAS becomes valid and the expiry time after which it is no
longer valid. Make sure your SAS is still valid when running your Job.
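
The same SDK also accepts a SAS through a connection string that names the table
endpoint and the signature instead of the account key. A minimal, hypothetical sketch
(placeholder storage account name, placeholder token and table name):

    import com.microsoft.azure.storage.CloudStorageAccount;
    import com.microsoft.azure.storage.table.CloudTable;

    public class AzureTableSasSketch {
        public static void main(String[] args) throws Exception {
            String storageName = "mystorageaccount";                   // <$storagename>
            String sasToken    = "sv=2017-04-17&ss=t&sig=PLACEHOLDER"; // <$sastoken> (placeholder)

            // No account key is needed: the table endpoint plus the SAS token
            // are enough to build the credentials.
            String connectionString =
                    "TableEndpoint=https://" + storageName + ".table.core.windows.net;"
                  + "SharedAccessSignature=" + sasToken;

            CloudStorageAccount account = CloudStorageAccount.parse(connectionString);
            CloudTable table = account.createCloudTableClient()
                                      .getTableReference("employees"); // placeholder table name
            System.out.println("Resolved table URI: " + table.getUri());
        }
    }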

Table name

Specify the name of the table into which the entities will be written.

Schema and Edit schema

A schema is a row description. It defines the number of fields (columns) to
be processed and passed on to the next component. The schema is either Built-In or stored remotely in the Repository.

  • Built-In: You create and store the
    schema locally for this component only. Related topic: see the
    Talend Studio User Guide.

  • Repository: You have already created
    the schema and stored it in the Repository. You can reuse it in various projects and
    Job designs. Related topic: see the
    Talend Studio User Guide.

Click Edit schema to make changes to the schema.
If the current schema is of the Repository type, three
options are available:

  • View schema: choose this option to view the
    schema only.

  • Change to built-in property: choose this
    option to change the schema to Built-in for
    local changes.

  • Update repository connection: choose this
    option to change the schema stored in the repository and decide whether to propagate
    the changes to all the Jobs upon completion. If you just want to propagate the
    changes to the current Job, you can select No
    upon completion and choose this schema metadata again in the [Repository Content] window.

Partition Key

Select the schema column that holds the partition key value from the drop-down
list.

Row Key

Select the schema column that holds the row key value from the drop-down list.

Action on data

Select an action to be performed on data of the table defined.

  • Insert: insert a new entity into the table.
  • Insert or replace: replace an existing entity or insert a
    new entity if it does not exist. When replacing an entity, any properties of the
    previous entity are removed if the new entity does not define them.
  • Insert or merge: merge an existing entity or insert a new
    entity if it does not exist. When merging an entity, any properties of the previous
    entity are retained if the new entity does not define or include them.
  • Merge: update an existing entity without removing the
    property value of the previous entity if the new entity does not define its
    value.
  • Replace: update an existing entity and remove the property
    value of the previous entity if the new entity does not define its value.
  • Delete: delete an existing entity.

For performance reasons, the incoming data is processed in parallel and in random
order. Therefore, it is not recommended to perform any order-sensitive data operation
(for example, insert or replace) if there are duplicated rows in your data.
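
For reference, the six actions correspond closely to the entity operations exposed by the
classic Azure Storage SDK for Java. A hypothetical sketch (connection string taken from an
environment variable; table name, keys and the Name property are illustrative):

    import com.microsoft.azure.storage.CloudStorageAccount;
    import com.microsoft.azure.storage.table.CloudTable;
    import com.microsoft.azure.storage.table.DynamicTableEntity;
    import com.microsoft.azure.storage.table.EntityProperty;
    import com.microsoft.azure.storage.table.TableOperation;

    public class ActionOnDataSketch {
        public static void main(String[] args) throws Exception {
            CloudTable table = CloudStorageAccount.parse(System.getenv("AZURE_STORAGE_CONNSTR"))
                    .createCloudTableClient().getTableReference("employees");

            // One incoming row: the Partition Key and Row Key columns become the
            // entity keys, the remaining columns become entity properties.
            DynamicTableEntity entity = new DynamicTableEntity("CompanyA", "E0001");
            entity.getProperties().put("Name", new EntityProperty("Jane Doe"));

            table.execute(TableOperation.insert(entity));          // Insert
            table.execute(TableOperation.insertOrReplace(entity)); // Insert or replace
            table.execute(TableOperation.insertOrMerge(entity));   // Insert or merge

            entity.setEtag("*"); // merge, replace and delete require an ETag ("*" matches any)
            table.execute(TableOperation.merge(entity));           // Merge
            table.execute(TableOperation.replace(entity));         // Replace
            table.execute(TableOperation.delete(entity));          // Delete
        }
    }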

Action on table

Select an operation to be performed on the table defined.

  • Default: No operation is carried out.

  • Drop and create table: The table is removed and
    created again.

  • Create table: The table is created; this option
    expects that the table does not exist yet.

  • Create table if does not exist: The table is
    created if it does not exist.

  • Drop table if exist and create: The table is
    removed if it already exists and created again.
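
In SDK terms, these options translate roughly into the table-level calls below
(hypothetical sketch using the classic Azure Storage SDK for Java; the table name and the
selected action are placeholders):

    import com.microsoft.azure.storage.CloudStorageAccount;
    import com.microsoft.azure.storage.table.CloudTable;

    public class ActionOnTableSketch {
        public static void main(String[] args) throws Exception {
            CloudTable table = CloudStorageAccount.parse(System.getenv("AZURE_STORAGE_CONNSTR"))
                    .createCloudTableClient().getTableReference("employees");

            String action = "Create table if does not exist"; // placeholder choice
            switch (action) {
                case "Create table":
                    table.create();             // fails if the table already exists
                    break;
                case "Create table if does not exist":
                    table.createIfNotExists();
                    break;
                case "Drop table if exist and create":
                    table.deleteIfExists();     // note: Azure may hold a dropped table name
                    table.create();             // for a short while before it can be recreated
                    break;
                case "Drop and create table":
                    table.delete();
                    table.create();
                    break;
                default:
                    // Default: no table operation is carried out
                    break;
            }
        }
    }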

Process in batch

Select this check box to process the input entities in batch.

Note that the entities to be processed in batch should belong to the same partition
group, that is, they should have the same partition key value.
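
A hypothetical sketch of such a batched write with the classic Azure Storage SDK for Java;
every entity added to the batch uses the same partition key, mirroring the constraint
above (table name, keys and values are illustrative):

    import com.microsoft.azure.storage.CloudStorageAccount;
    import com.microsoft.azure.storage.table.CloudTable;
    import com.microsoft.azure.storage.table.DynamicTableEntity;
    import com.microsoft.azure.storage.table.EntityProperty;
    import com.microsoft.azure.storage.table.TableBatchOperation;

    public class BatchWriteSketch {
        public static void main(String[] args) throws Exception {
            CloudTable table = CloudStorageAccount.parse(System.getenv("AZURE_STORAGE_CONNSTR"))
                    .createCloudTableClient().getTableReference("employees");

            TableBatchOperation batch = new TableBatchOperation();
            for (int i = 1; i <= 3; i++) {
                // Same partition key ("CompanyA") for every entity in the batch.
                DynamicTableEntity entity = new DynamicTableEntity("CompanyA", "E000" + i);
                entity.getProperties().put("Name", new EntityProperty("Employee " + i));
                batch.insertOrReplace(entity);
            }
            table.execute(batch); // the whole batch succeeds or fails as a unit
        }
    }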

Die on error

Select this check box to stop the execution of the Job when an error
occurs.

Advanced settings

Name mappings

Complete this table to map the column name of the
component schema with the property name of the Azure table entity if they are
different.

  • Schema column name: enter the column name of the
    component schema between double quotation marks.
  • Entity property name: enter the property name of the
    Azure table entity between double quotation marks.

For example, if there are three schema columns
CompanyID, EmployeeID, and
EmployeeName that are used to feed the values for the
PartitionKey, RowKey, and
Name entity properties respectively, then you need to add the
following rows for the mapping when writing data into the Azure table.

  • the Schema column name cell with the value
    "CompanyID" and the Entity property name cell
    with the value "PartitionKey".
  • the Schema column name cell with the value
    "EmployeeID" and the Entity property name cell
    with the value "RowKey".
  • the Schema column name cell with the value
    "EmployeeName" and the Entity property name cell
    with the value "Name".
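
In other words, the mapping decides which entity property each incoming column feeds. A
minimal, hypothetical sketch of the resulting entity for one such row (classic Azure
Storage SDK for Java; the values are illustrative):

    import com.microsoft.azure.storage.table.DynamicTableEntity;
    import com.microsoft.azure.storage.table.EntityProperty;

    public class NameMappingSketch {
        public static void main(String[] args) {
            // Incoming row: CompanyID, EmployeeID, EmployeeName
            String companyId    = "CompanyA";
            String employeeId   = "E0001";
            String employeeName = "Jane Doe";

            // "CompanyID"    -> PartitionKey
            // "EmployeeID"   -> RowKey
            // "EmployeeName" -> Name
            DynamicTableEntity entity = new DynamicTableEntity(companyId, employeeId);
            entity.getProperties().put("Name", new EntityProperty(employeeName));

            System.out.println(entity.getPartitionKey() + " / " + entity.getRowKey()
                    + " / " + entity.getProperties().get("Name").getValueAsString());
        }
    }
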
tStatCatcher Statistics

Select this check box to gather the Job processing metadata at the Job level
as well as at each component level.

Global variables

NB_LINE

The number of rows processed. This is an After variable and it returns an
integer.

NB_SUCCESS

The number of rows successfully processed. This is an After variable and it returns
an integer.

NB_REJECT

The number of rows rejected. This is an After variable and it returns an
integer.

ERROR_MESSAGE

The error message generated by the component when an error occurs. This
is an After variable and it returns a string.
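
In a Job, these After variables can be read once the component has finished, for example
from a tJava connected through an OnSubjobOk trigger. A minimal sketch, assuming the
component instance is named tAzureStorageOutputTable_1 (the globalMap object is provided
by the generated Job code):

    // tJava snippet; "tAzureStorageOutputTable_1" is an assumed instance name.
    Integer nbLine    = (Integer) globalMap.get("tAzureStorageOutputTable_1_NB_LINE");
    Integer nbSuccess = (Integer) globalMap.get("tAzureStorageOutputTable_1_NB_SUCCESS");
    Integer nbReject  = (Integer) globalMap.get("tAzureStorageOutputTable_1_NB_REJECT");
    String  errorMsg  = (String)  globalMap.get("tAzureStorageOutputTable_1_ERROR_MESSAGE");

    System.out.println("processed=" + nbLine + ", succeeded=" + nbSuccess
            + ", rejected=" + nbReject + ", lastError=" + errorMsg);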

Usage

Usage rule

This component is usually used as an end component of a Job or Subjob and it
always needs an input link.

Related scenario

No scenario is available for the Standard version of this component yet.

Document from Talend: https://help.talend.com