tLoop
Depending on the Talend product you are using, this component can be used in one, some or all of the following Job frameworks:
- Standard: see tLoop Standard properties. The component in this framework is available in all Talend products.
- Spark Batch: see tLoop properties for Apache Spark Batch. The component in this framework is available in all subscription-based Talend products with Big Data and Talend Data Fabric.
tLoop Standard properties
These properties are used to configure tLoop running in the Standard Job framework.
The Standard
tLoop component belongs to the Orchestration family.
The component in this framework is available in all Talend
products.
Basic settings
Loop Type |
Select the type of loop to be carried out: either For or While.
For: the task or Job is carried out a defined number of times.
While: the task or Job is carried out until the loop condition is no longer met. |
For |
From: the first instance number from which the loop starts.
To: the last instance number at which the loop ends.
Step: the step by which the loop value is incremented at each iteration.
Values are increasing: select this check box to loop on increasing values. |
While |
Declaration: enter an expression that initializes the loop variable.
Condition: enter the condition under which the loop keeps running.
Iteration: enter the expression that updates the loop variable after each iteration. |
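The two loop types map directly onto ordinary Java loop constructs. The following sketch is illustrative only, not Talend-generated code; it shows how a For loop configured with From=1, To=5, Step=1 and a While loop defined by a declaration, a condition and an iteration expression behave:

```java
import java.util.ArrayList;
import java.util.List;

public class LoopTypes {
    // For loop type: From = 1, To = 5, Step = 1, values increasing.
    static List<Integer> forValues() {
        List<Integer> values = new ArrayList<>();
        for (int value = 1; value <= 5; value += 1) { // From / To / Step
            values.add(value); // the iterated task runs once per value
        }
        return values;
    }

    // While loop type: Declaration, Condition and Iteration map directly
    // onto the three clauses of a Java while loop.
    static int whileIterations() {
        int iterations = 0;
        int i = 0;            // Declaration
        while (i < 3) {       // Condition
            iterations++;     // the iterated task runs here
            i++;              // Iteration
        }
        return iterations;
    }

    public static void main(String[] args) {
        System.out.println("For values: " + forValues());
        System.out.println("While ran " + whileIterations() + " times");
    }
}
```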
Global Variables
Global Variables |
ERROR_MESSAGE: the error message generated by the component when an error occurs. This is an After variable and it returns a string.
CURRENT_VALUE: the current value of the loop variable. Only available for a For type loop. This is a Flow variable and it returns an integer.
CURRENT_ITERATION: the sequence number of the current iteration. This is a Flow variable and it returns an integer.
A Flow variable functions during the execution of a component, while an After variable functions after the execution of the component.
To fill up a field or expression with a variable, press Ctrl + Space to access the variable list and choose the variable to use from it.
For further information about variables, see the Talend Studio documentation. |
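In generated Job code these variables are read from the runtime globalMap, keyed by the component's name. A minimal sketch, assuming a component named tLoop_1 (a hypothetical example); here a plain HashMap stands in for the globalMap that the Talend runtime populates for you:

```java
import java.util.HashMap;
import java.util.Map;

public class GlobalVarsSketch {
    // Simulates three iterations of a component named tLoop_1 (hypothetical
    // name) and returns the last CURRENT_ITERATION value read back.
    static int lastIteration() {
        // Stand-in for Talend's runtime globalMap; in a real Job this map
        // is populated by the generated code, not by you.
        Map<String, Object> globalMap = new HashMap<>();
        int current = 0;
        for (int iteration = 1; iteration <= 3; iteration++) {
            globalMap.put("tLoop_1_CURRENT_ITERATION", iteration); // Flow variable
            globalMap.put("tLoop_1_CURRENT_VALUE", iteration);     // For-type loops only
            // Typical read from a downstream component (e.g. in a tJava expression):
            current = (Integer) globalMap.get("tLoop_1_CURRENT_ITERATION");
        }
        return current;
    }

    public static void main(String[] args) {
        System.out.println("last iteration = " + lastIteration());
    }
}
```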
Usage
Usage rule |
tLoop is to be used as a start component; it must be connected to the next component with a Row > Iterate link. |
Connections |
Outgoing links (from this component to another):
Row: Iterate.
Trigger: On Subjob Ok; On Subjob Error.
Incoming links (from one component to this one):
Row: Iterate;
Trigger: On Subjob Ok; On Subjob Error.
For further information regarding connections, see the Talend Studio documentation. |
Executing a Job multiple times using a loop
This scenario implements a loop that executes a child Job five times, with a pause of three seconds between executions.
Procedure
- Create a Parent Job, and drop tLoop, tRunJob, and tSleep components onto the workspace.
- Connect tLoop to tRunJob using a Row > Iterate connection, and connect tRunJob to tSleep using a Row > Main connection.
- Create a Child Job, and drop tRowGenerator and tLogRow components onto the workspace.
- Connect tRowGenerator to tLogRow using a Row > Main connection.
- On the Basic settings view of the tLoop component, choose the For loop type and type in the instance number to start from (1), to finish with (5), and the step (1).
- On the Basic settings view of the tRunJob component, select Child Job in the Job field.
- On the Basic settings view of the tSleep component, enter 3 in the Pause field.
- Double-click tRowGenerator to open the schema editor, then click the plus button to add four new columns:
  - id, to generate sequence numbers
  - firstname, to generate random first names
  - lastname, to generate random last names
  - city, to generate random city names
- Press F6 to run the Parent Job.

The Child Job will be executed five times with a three-second pause between executions, displaying random personal information on the Run console as configured in the tRowGenerator component.
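The Parent Job's control flow amounts to a simple loop with a pause. As a plain-Java sketch (illustrative only; runChildJob is a hypothetical stand-in for what tRunJob does, not Talend-generated code):

```java
public class ParentJobSketch {
    static int childRuns = 0;

    // Hypothetical stand-in for tRunJob: launch the Child Job once.
    static void runChildJob(int iteration) {
        childRuns++;
        System.out.println("Child Job execution #" + iteration);
    }

    // tLoop (For, from 1 to 5, step 1) driving tRunJob, then tSleep.
    static void runParentJob(long pauseMillis) throws InterruptedException {
        for (int i = 1; i <= 5; i++) {
            runChildJob(i);            // tRunJob: execute the Child Job
            Thread.sleep(pauseMillis); // tSleep: pause between executions
        }
    }

    public static void main(String[] args) throws InterruptedException {
        runParentJob(3000); // three-second pause, as configured in tSleep
    }
}
```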
tLoop properties for Apache Spark Batch
These properties are used to configure tLoop running in the Spark Batch Job framework.
The Spark Batch
tLoop component belongs to the Orchestration family.
The component in this framework is available in all subscription-based Talend products with Big Data
and Talend Data Fabric.
Basic settings
Loop Type |
Select the type of loop to be carried out: either For or While.
For: the task or Job is carried out a defined number of times.
While: the task or Job is carried out until the loop condition is no longer met. |
For |
From: the first instance number from which the loop starts.
To: the last instance number at which the loop ends.
Step: the step by which the loop value is incremented at each iteration.
Values are increasing: select this check box to loop on increasing values. |
While |
Declaration: enter an expression that initializes the loop variable.
Condition: enter the condition under which the loop keeps running.
Iteration: enter the expression that updates the loop variable after each iteration. |
Global Variables
Global Variables |
ERROR_MESSAGE: the error message generated by the component when an error occurs. This is an After variable and it returns a string.
CURRENT_VALUE: the current value of the loop variable. Only available for a For type loop. This is a Flow variable and it returns an integer.
CURRENT_ITERATION: the sequence number of the current iteration. This is a Flow variable and it returns an integer.
A Flow variable functions during the execution of a component, while an After variable functions after the execution of the component.
To fill up a field or expression with a variable, press Ctrl + Space to access the variable list and choose the variable to use from it.
For further information about variables, see the Talend Studio documentation. |
Usage
Usage rule |
tLoop is to be used as a start component; it must be connected to the next component with a Row > Iterate link. |
Spark Connection |
In the Spark Configuration tab in the Run view, define the connection to a given Spark cluster for the whole Job. In addition, since the Job expects its dependent jar files for execution, you must specify the directory in the file system to which these jar files are transferred so that Spark can access them.
This connection is effective on a per-Job basis. |
Related scenarios
No scenario is available for the Spark Batch version of this component
yet.