tSalesforceOutputBulkExec
Bulk-loads data from a given file into a Salesforce object.
The tSalesforceOutputBulk and tSalesforceBulkExec components are used together in a two-step process: in the first step, an output file is generated; in the second step, this file is used to feed the Salesforce database. The tSalesforceOutputBulkExec component fuses these two steps together. The advantage of keeping the two steps separate is that the data can be transformed before it is loaded into Salesforce.
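For reference, the sketch below illustrates the two-step idea in plain Java: a CSV bulk file is first written, then submitted to Salesforce through the Bulk API. It assumes the Force.com WSC library (com.sforce.async) and an already established session; the sessionId and instanceUrl values are placeholders, and the code actually generated by the component differs.

import com.sforce.async.BulkConnection;
import com.sforce.async.ContentType;
import com.sforce.async.JobInfo;
import com.sforce.async.OperationEnum;
import com.sforce.ws.ConnectorConfig;

import java.io.FileInputStream;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Arrays;

public class BulkLoadSketch {
    public static void main(String[] args) throws Exception {
        // Step 1: generate the bulk file (what tSalesforceOutputBulk does).
        Files.write(Paths.get("bulk_account.csv"), Arrays.asList(
                "Name,Phone",
                "GenePoint,(650) 867-3450"));

        // Step 2: feed the file to Salesforce (what tSalesforceBulkExec does).
        // <sessionId> and <instanceUrl> are placeholders for an existing login session.
        ConnectorConfig config = new ConnectorConfig();
        config.setSessionId("<sessionId>");
        config.setRestEndpoint("<instanceUrl>/services/async/57.0");
        BulkConnection bulk = new BulkConnection(config);

        JobInfo job = new JobInfo();
        job.setObject("Account");                 // the Module Name
        job.setOperation(OperationEnum.insert);   // the Output Action
        job.setContentType(ContentType.CSV);
        job = bulk.createJob(job);

        try (InputStream csv = new FileInputStream("bulk_account.csv")) {
            bulk.createBatchFromStream(job, csv); // one batch of data from the bulk file
        }
        bulk.closeJob(job.getId());
    }
}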
tSalesforceOutputBulkExec Standard properties
These properties are used to configure tSalesforceOutputBulkExec running in the Standard Job framework.
The Standard
tSalesforceOutputBulkExec component belongs to the Business and the Cloud families.
The component in this framework is available in all Talend
products.
Basic settings
Property Type
Select the way the connection details will be set: Built-In or Repository. This property is not available when another connection component is selected from the Connection Component drop-down list.

Connection Component
Select the component that opens the database connection to be reused by this component.

Connection type
Select the type of the connection from the drop-down list, either Basic or OAuth.

User Id
The Salesforce username. This property is available only when Basic is selected from the Connection type drop-down list. (A login sketch using these credentials follows this table.)

Password
The Salesforce password associated with the username. This property is available only when Basic is selected from the Connection type drop-down list.

Security Token
The Salesforce security token. For more information, see Reset Your Security Token. This property is available only when Basic is selected from the Connection type drop-down list.
Issuer
The OAuth Consumer Key generated when your connected app is created in Salesforce. This property is available only when the OAuth JSON Web Token (JWT) flow is used.

Subject
The Salesforce username. This property is available only when the OAuth JWT flow is used.

Audience
The JSON Web Token audience. You can set your own audience value. This property is available only when the OAuth JWT flow is used.

Expiration time (in seconds)
The expiration time of the assertion (in seconds), which must be within five minutes. This property is available only when the OAuth JWT flow is used.

Key store
The path to the keystore file in Java Keystore (JKS) format. The keystore file can be generated by creating a certificate signed by Salesforce. This property is available only when the OAuth JWT flow is used.

Key store password
The keystore password. This property is available only when the OAuth JWT flow is used.

Certificate alias
The unique name of the certificate signed by Salesforce. This property is available only when the OAuth JWT flow is used.
Client Id
The OAuth Consumer Key generated when your connected app is created in Salesforce. This property is available only when OAuth is selected from the Connection type drop-down list.

Client Secret
The OAuth Consumer Secret generated when your connected app is created in Salesforce. This property is available only when OAuth is selected from the Connection type drop-down list.

Callback Host
The host value in the OAuth authentication callback URL that is defined when creating your connected app. This property is available only when OAuth is selected from the Connection type drop-down list.

Callback Port
The port value in the OAuth authentication callback URL that is defined when creating your connected app. This property is available only when OAuth is selected from the Connection type drop-down list.

Token File
The path to the token file that stores the refresh token used to get an access token without re-authorization. This property is available only when OAuth is selected from the Connection type drop-down list.
Module Name
Click the […] button next to the field and, in the dialog box displayed, select the Salesforce object (module) to access.

Schema and Edit schema
A schema is a row description. It defines the number of fields (columns) to be processed and passed on to the next component. Click Edit schema to make changes to the schema. Click Sync columns to retrieve the schema from the previous component connected in the Job.

Output Action
Select the operation to be performed on the Salesforce data from the drop-down list (Insert, Update, Upsert, or Delete).

Bulk File Path
Specify the path to the file that stores the data to be processed.

Append
Select this check box to append new data at the end of the file if it already exists, instead of overwriting it.

Ignore Null
Select this check box to ignore NULL values.
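For reference only, here is a minimal sketch of how the Basic connection settings (User Id, Password, Security Token, and the Salesforce URL from the Advanced settings) map onto a Salesforce SOAP login. It assumes the Force.com WSC partner API library, which is an assumption for illustration rather than the component's own generated code; note that Salesforce expects the security token appended to the password.

import com.sforce.soap.partner.Connector;
import com.sforce.soap.partner.PartnerConnection;
import com.sforce.ws.ConnectorConfig;

public class LoginSketch {
    public static void main(String[] args) throws Exception {
        ConnectorConfig config = new ConnectorConfig();
        config.setUsername("user@example.com");                 // User Id (placeholder)
        config.setPassword("myPassword" + "mySecurityToken");   // Password + Security Token (placeholders)
        config.setAuthEndpoint("https://login.salesforce.com/services/Soap/u/57.0"); // Salesforce URL
        PartnerConnection connection = Connector.newConnection(config);
        System.out.println("Session id: " + config.getSessionId());
    }
}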
Advanced settings
Salesforce URL
The Webservice URL required to connect to Salesforce.
API version
The Salesforce API version. This property is available only when OAuth is selected from the Connection type drop-down list.
Need compression
Select this check box to activate SOAP message compression, which can improve performance.

Trace HTTP message
Select this check box to output the HTTP interactions on the console.

Client Id
Enter the ID of the real user to differentiate between those who use the same account and password to access Salesforce.

Timeout
Enter the query timeout, in milliseconds, for Salesforce.

Use Proxy
Select this check box to use a proxy server, and in the Host, Port, User Id, and Password fields displayed, specify the connection parameters of the proxy server. (A proxy configuration sketch follows this table.)
Bulk API V2
Select this check box to create a Bulk API V2 job, and then set the options displayed for the bulk file (such as its column delimiter and line ending). Compared with Bulk API V1, Bulk API V2 simplifies the way data is loaded, as Salesforce creates and manages the batches automatically.
Relationship mapping for upsert
Click the [+] button to add lines and map the schema columns to the external ID fields of the related Salesforce objects. This property is available only when Upsert is selected from the Output Action drop-down list.

Concurrency Mode
Select the concurrency mode for the job: Parallel (batches are processed in parallel) or Serial (batches are processed one after another).
Rows to Commit
Specify the number of lines per data batch to be processed.

Bytes to Commit
Specify the number of bytes per data batch to be processed.

Wait Time Check Batch
Specify the wait time (in milliseconds) between checks of whether the batches have been processed.

tStatCatcher Statistics
Select this check box to gather the Job processing metadata at the Job level as well as at each component level.
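As an illustration of the Use Proxy option above, the sketch below shows one way the proxy parameters can be applied through the Force.com WSC ConnectorConfig; this is an assumption made for illustration, not the component's own generated code.

import com.sforce.ws.ConnectorConfig;

public class ProxySketch {
    public static void main(String[] args) {
        ConnectorConfig config = new ConnectorConfig();
        // Placeholder values for the Host, Port, User Id, and Password fields
        // displayed when the Use Proxy check box is selected.
        config.setProxy("proxy.example.com", 8080);
        config.setProxyUsername("proxyUser");
        config.setProxyPassword("proxyPassword");
        // The config would then be used to open the Salesforce connection as usual.
    }
}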
Global Variables
ERROR_MESSAGE
The error message generated by the component when an error occurs. This is an After variable and it returns a string.
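For example, this variable can be read in a downstream tJava component through the Talend globalMap; the component label tSalesforceOutputBulkExec_1 below is an assumption, so adjust it to your Job.

// Snippet for a tJava component triggered after tSalesforceOutputBulkExec_1;
// globalMap is provided by the generated Job code.
String bulkError = (String) globalMap.get("tSalesforceOutputBulkExec_1_ERROR_MESSAGE");
if (bulkError != null) {
    System.err.println("Bulk load failed: " + bulkError);
}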
Usage
Usage rule
This component is mainly used when no particular transformation is required on the data to be loaded into Salesforce.

Limitation
The bulk data to be processed in Salesforce should be in .csv format.
Inserting bulk data into Salesforce
This scenario describes a four-component Job that bulk-loads data from a file into Salesforce, performs the intended action on the data, and displays the Job execution results on the console.
The content of the input file SalesforceAccount.txt used
in this example is as follows:
Name;ParentId;Phone;Fax
Burlington Textiles Corp of America;;(336) 222-7000;(336) 222-8000
Dickenson plc;; (785) 241-6200;(785) 241-6201
GenePoint;;(650) 867-3450;(650) 867-9895
Edge Communications;talend;(512) 757-6000;(512) 757-9000
Grand Hotels & Resorts Ltd;talend;(312) 596-1000;(312) 596-1500
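If you want to check this file outside the Studio, the short sketch below reads it the way the tFileInputDelimited component is configured in this scenario (semicolon field separator, first row as header); D:/SalesforceAccount.txt is the example path used here.

import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

public class ReadAccountsSketch {
    public static void main(String[] args) throws Exception {
        // Read the semicolon-delimited file used by tFileInputDelimited in this scenario.
        List<String> lines = Files.readAllLines(Paths.get("D:/SalesforceAccount.txt"));
        String[] header = lines.get(0).split(";", -1); // Name;ParentId;Phone;Fax
        System.out.println("Columns: " + String.join(", ", header));
        for (String line : lines.subList(1, lines.size())) {
            String[] fields = line.split(";", -1);     // -1 keeps empty ParentId values
            System.out.printf("%s -> phone %s%n", fields[0], fields[2]);
        }
    }
}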
Setting up the Job for inserting bulk data into Salesforce
- Create a new Job and add a tFileInputDelimited component, a tSalesforceOutputBulkExec component, and two tLogRow components by typing their names on the design workspace or dropping them from the Palette.
- Link the tFileInputDelimited component to the tSalesforceOutputBulkExec component using a Row > Main connection.
- Link the tSalesforceOutputBulkExec component to the first tLogRow component using a Row > Main connection.
- Link the tSalesforceOutputBulkExec component to the second tLogRow component using a Row > Reject connection.
Configuring the Job for inserting bulk data into Salesforce
- Double-click the tFileInputDelimited component to open its Basic settings view.
- In the File name/Stream field, browse to or enter the path to the input data file. In this example, it is D:/SalesforceAccount.txt.
- Click the […] button next to Edit schema and, in the pop-up schema dialog box, define the schema by adding four columns of String type: Name, ParentId, Phone, and Fax. When done, click OK to save the changes and close the dialog box.
- Double-click the tSalesforceOutputBulkExec component to open its Basic settings view.
- In the User Id, Password, and Security Token fields, enter the user authentication information required to access Salesforce.
- Click the […] button next to the Module Name field and, in the pop-up dialog box, select the object you want to access. In this example, it is Account.
- In the Bulk File Path field, browse to or enter the path to the CSV file that stores the data for bulk processing. The bulk file to be processed must be in .csv format.
- Double-click the first tLogRow component to open its Basic settings view.
- In the Mode area, select Table (print values in cells of a table) for better readability of the results.
- Do the same to configure the second tLogRow component.
Executing the Job to insert bulk data into Salesforce
- Press Ctrl + S to save the Job.
- Press F6 to execute the Job.
On the console of the Run view, you can check the execution result.
In the tLogRow_1 table, you can read the data inserted into Salesforce.
In the tLogRow_2 table, you can read the data rejected due to incompatibility with the Account objects you have accessed.
Note that if you want to transform the input data before loading it into Salesforce, you need to use tSalesforceOutputBulk and tSalesforceBulkExec together to achieve this purpose.