tOracleOutputBulkExec
Executes the Insert action in the specified Oracle database.
As a dedicated component, it provides performance gains during Insert operations to an
Oracle database.
The tOracleOutputBulk and tOracleBulkExec components are used together in a two-step
process. In the first step, an output file is generated. In the second step, this
file is used in the INSERT operation that feeds a database. These two steps are fused
together in the tOracleOutputBulkExec
component.
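Outside Talend, the same two-step process can be sketched as: write a delimited bulk file, then hand it to Oracle's SQL*Loader (sqlldr) utility, which is what performs the actual bulk INSERT. The file name, credentials, TNS alias, and control file below are placeholders, not values the component uses.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class BulkThenLoad {
    public static void main(String[] args) throws IOException {
        // Step 1 (tOracleOutputBulk's role): generate the delimited bulk file.
        Path bulkFile = Path.of("employees.csv");
        Files.write(bulkFile, List.of("1;Alice;2020-01-15", "2;Bob;2021-03-02"));

        // Step 2 (tOracleBulkExec's role): pass the file to SQL*Loader.
        // Credentials, alias, and control file are hypothetical; sqlldr must be
        // installed where this runs. The command is printed, not started, here.
        ProcessBuilder sqlldr = new ProcessBuilder(
                "sqlldr", "userid=scott/tiger@ORCL", "control=employees.ctl");
        System.out.println(String.join(" ", sqlldr.command()));
    }
}
```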
tOracleOutputBulkExec Standard properties
These properties are used to configure tOracleOutputBulkExec running in the Standard Job framework.
The Standard
tOracleOutputBulkExec component belongs to the Databases family.
The component in this framework is available in all Talend
products.
This component supports the dynamic database
connector. The properties related to database settings vary depending on your database
type selection. For more information about dynamic database connectors, see Dynamic database components.
Basic settings
Database |
Select a type of database from the list and click Apply. |
Property type |
Either Built-In or Repository. |
 |
Built-In: No property data stored centrally. |
 |
Repository: Select the repository file where the properties are stored. |
Use an existing connection |
Select this check box and in the Component List select the relevant connection component to reuse the connection details you already defined.
Note: When a Job contains the parent Job and the child Job, if you
need to share an existing connection between the two levels, for example, to share the connection created by the parent Job with the child Job, you have to register the database connection to be shared at the parent level and then read that registered connection at the child level using a dedicated connection component.
For an example of how to share a database connection across Job levels, see the related documentation. |
Connection type |
Drop-down list of available drivers:
Oracle OCI: Select this connection type to use the Oracle Call Interface, a set of C-language APIs providing native access to the Oracle database.
Oracle Service Name: Select this connection type to use the TNS alias that you give when you connect to the remote database.
Oracle SID: Select this connection type to uniquely identify a particular database on a system.
Oracle Custom: Select this connection type to access a clustered database. |
DB Version |
Select the Oracle version in use. |
Host |
Database server IP address. Currently, only localhost, 127.0.0.1, or the exact IP address of the local machine is allowed for this component to work properly. |
Port |
Listening port number of DB server. |
Database |
Name of the database. |
Schema |
Name of the schema. |
Username and Password |
DB user authentication data. To enter the password, click the [...] button next to the password field, enter the password between double quotes in the dialog box that opens, and click OK to save the settings. |
Table |
Name of the table to be written. Note that only one table can be written at a time. |
Action on table |
On the table defined, you can perform one of the following operations:
None: No operation is carried out.
Drop and create table: The table is removed and created again.
Create table: The table does not exist and gets created.
Create table if not exists: The table is created if it does not exist.
Drop table if exists and create: The table is removed if it already exists and created again.
Clear table: The table content is deleted. You have the possibility to rollback the operation.
Truncate table: The table content is deleted. You do not have the possibility to rollback the operation. |
File Name |
Name of the file to be generated and loaded. Warning:
This file is generated on the machine specified by the URI in the Host field, so it should be located on the same machine as the database server. |
Create directory if not exists |
This check box is selected by default. It creates the directory that holds the output file if it does not already exist. |
Append |
Select this check box to add the new rows at the end of the file. |
Action on data |
On the data of the table defined, you can perform:
Insert: Insert data to an empty table. If duplicates are found, the Job stops.
Update: Update the existing data.
Append: Append data to the table, whether the table is empty or not.
Replace: If the table already contains data, delete the existing rows and replace them with the new data.
Truncate: If the table already contains data, truncate the table before loading the new data. |
Schema and Edit |
A schema is a row description. It defines the number of fields (columns) to be processed and passed on to the next component. Click Edit
schema to make changes to the schema. |
 |
Built-In: You create and store the schema locally for this component only. |
 |
Repository: You have already created the schema and stored it in the Repository; you can reuse it in various projects and Job designs. When the schema to be reused has default values that are integers or functions, ensure that these default values are not enclosed in quotation marks; if they are, remove the quotation marks manually. You can find more details about how to verify default values in a retrieved schema in the related documentation. |
Field separator |
Character, string or regular expression to separate fields. |
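The Connection type choices above correspond to different Oracle connect-string forms. As a hedged illustration (host, port, SID, service name, and TNS alias below are invented), the JDBC URLs a Java client would build look like:

```java
public class OracleUrls {
    public static void main(String[] args) {
        String host = "dbhost";   // placeholder host
        int port = 1521;          // Oracle's default listener port

        // Oracle SID: thin-driver URL addressing one instance by its SID.
        String sidUrl = "jdbc:oracle:thin:@" + host + ":" + port + ":ORCL";

        // Oracle Service Name: note the //host:port/service form.
        String serviceUrl = "jdbc:oracle:thin:@//" + host + ":" + port
                + "/orcl.example.com";

        // Oracle OCI: the native client resolves a tnsnames.ora alias.
        String ociUrl = "jdbc:oracle:oci:@MYDB";

        System.out.println(sidUrl);
        System.out.println(serviceUrl);
        System.out.println(ociUrl);
    }
}
```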
Advanced settings
Advanced separator (for number) |
Select this check box to change the data separators used for numbers:
Thousands separator: define the separator used for thousands.
Decimal separator: define the separator used for decimals. |
Use existing control file |
Select this check box and browse to the .ctl control file you want to use. |
Field separator |
Character, string or regular expression to separate fields. |
Row separator |
String (for example, "\n" on Unix) used to separate rows. |
Specify .ctl file’s INTO TABLE clause manually |
Select this check box to enter manually the INTO TABLE clause of the control file. |
Use schema’s Date Pattern to load Date field |
Select this check box to use the date pattern indicated in the schema to load Date fields. |
Specify field condition |
Select this check box to define a condition for loading data. |
Preserve blanks |
Select this check box to preserve blank spaces. |
Trailing null columns |
Select this check box to load data with all empty columns. |
Load options |
Click the [+] button to add data loading options:
Parameter: select a loading parameter from the list.
Value: enter a value for the parameter. |
NLS Language |
From the drop-down list, select the language for your data if the data requires a specific NLS language setting. |
Set Parameter NLS_TERRITORY |
Select this check box to modify the conventions used for date and time formatting, and select the territory to use from the list. |
Encoding |
Select the encoding from the list or select Custom and define it manually. This field is compulsory for database data handling. |
Oracle encoding type |
Select the Oracle-specific encoding type for the data to be processed. |
Output |
Select the type of output for the standard output of the Oracle database: to console or to global variable. |
Convert columns and table names to uppercase |
Select this check box to put columns and table names in upper case. |
Bulk file parameters |
Set the parameters, such as Buffer Size, used when generating the bulk file. |
tStatCatcher Statistics |
Select this check box to gather the Job processing metadata at the Job level as well as at each component level. |
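Several of the advanced settings above map directly onto clauses of a SQL*Loader control file. A hypothetical employees.ctl, with table and column names invented for illustration, could look like:

```sql
-- employees.ctl (hypothetical). A generated control file can contain
-- clauses like these; the "Specify .ctl file's INTO TABLE clause manually"
-- check box lets you write the INTO TABLE part yourself.
LOAD DATA
INFILE 'employees.csv'
APPEND                               -- matches Action on data: Append
INTO TABLE employees
FIELDS TERMINATED BY ';'             -- matches Field separator
(
  id,
  name,
  hire_date DATE 'YYYY-MM-DD'        -- matches the schema's date pattern
)
```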
Usage
Usage rule |
This component is mainly used when no particular transformation is required on the data to be loaded into the database. |
Dynamic settings |
Click the [+] button to add a row in the table, then fill the Code field with a context variable to choose your database connection dynamically from multiple connections planned in your Job.
The Dynamic settings table is available only when the Use an existing connection check box is selected in the Basic settings view.
For examples on using dynamic parameters, see Reading data from databases through context-based dynamic connections and Reading data from different MySQL databases using dynamically loaded connection parameters. For more information on Dynamic settings and context variables, see the Talend Studio User Guide. |
Limitation |
The database server/client must be installed on the same machine where the Studio is installed or where the Job using this component is deployed, so that the component functions properly. |
Related scenarios
For use cases related to tOracleOutputBulkExec,
see the following scenarios: