tGreenplumGPLoad
Bulk loads data into a Greenplum table either from an existing data file, an input flow, or directly from a data flow in streaming mode through a named-pipe.
tGreenplumGPLoad inserts data into a Greenplum database table using Greenplum's gpload utility.
tGreenplumGPLoad Standard properties
These properties are used to configure tGreenplumGPLoad running in the Standard Job framework.
The Standard tGreenplumGPLoad component belongs to the Databases family.
The component in this framework is available in all Talend products.
Basic settings
Property type |
Either Built-In or Repository. |
 |
Built-In: No property data is stored centrally. |
 |
Repository: Select the repository file where the properties are stored. |
Host |
Database server IP address. |
Port |
Listening port number of the DB server. |
Database |
Name of the Greenplum database. |
Schema |
Exact name of the schema. |
Username and Password |
DB user authentication data. To enter the password, click the [...] button next to the Password field, and then in the pop-up dialog box enter the password between double quotes and click OK to save the settings. |
Table |
Name of the table into which the data is to be inserted. |
Action on table |
On the table defined, you can perform one of the following operations:
None: No operation is carried out.
Clear table: The table content is deleted.
Create table: The table does not exist and gets created.
Create table if not exists: The table is created if it does not exist.
Drop and create table: The table is removed and created again.
Drop table if exists and create: The table is removed if it already exists and created again.
Truncate table: The table content is deleted. You do not have the possibility to rollback the operation. |
Action on data |
On the data of the table defined, you can perform the following operations:
Insert: Add new entries to the table.
Update: Make changes to existing entries.
Merge: Updates or adds data to the table.
Warning:
It is necessary to specify at least one column as a primary key on which the Update and Merge operations are based. |
Schema and Edit schema |
A schema is a row description. It defines the number of fields (columns) to be processed and passed on to the next component. |
 |
Built-In: You create and store the schema locally for this component only. |
 |
Repository: You have already created the schema and stored it in the Repository, so it can be reused in various projects and Job designs. |
 |
Click Edit schema to make changes to the schema. |
Data file |
Full path to the data file to be used. If this component is used as a standalone component, this file holds the data to be loaded; if it is used with an input flow, the incoming data is written to this file before being loaded. |
Use named-pipe |
Select this check box to use a named-pipe instead of an intermediate data file. This option is only available when the component receives data from an input flow.
Note:
This component in named-pipe mode uses a JNI interface to write and load data on any Windows platform. |
Named-pipe name |
Specify a name for the named-pipe to be used. Ensure that the name entered is valid. |
Die on error |
This check box is selected by default. Clear the check box to skip the row on error and complete the process for error-free rows. |
Advanced settings
Use existing control file (YAML formatted) |
Select this check box to provide a control file to be used with the gpload utility instead of setting the options in this component. |
Control file |
Enter the path to the control file to be used, between double quotation marks, or click [...] and browse to the control file. |
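A gpload control file is a YAML document. The following is a minimal sketch only; the host, credentials, file path, and table names are placeholders, not values taken from this document:

```yaml
# Hypothetical gpload control file; all names and values are placeholders.
VERSION: 1.0.0.1
DATABASE: mydb
USER: gpadmin
HOST: mdw.example.com
PORT: 5432
GPLOAD:
  INPUT:
    - SOURCE:
        FILE:
          - /data/customers.csv
    - FORMAT: csv
    - DELIMITER: ','
    - HEADER: true
  OUTPUT:
    - TABLE: public.customers
    - MODE: insert     # insert, update, or merge, matching Action on data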
CSV mode |
Select this check box to include CSV-specific parameters such as Escape char and Text enclosure. |
Field separator |
Character, string, or regular expression used to separate fields.
Warning:
This is gpload's delim argument. The delimiter passed to gpload must be a single character. |
Escape char |
Character of the row to be escaped. |
Text enclosure |
Character used to enclose text. |
Header (skips the first row of data file) |
Select this check box to skip the first row of the data file. |
Additional options |
Set the gpload arguments in the corresponding table. Click the [+] button as many times as required to add arguments to the table. |
 |
LOCAL_HOSTNAME: The host name or IP address of the local machine on which gpload is running. |
 |
PORT (gpfdist port): The specific port number that the gpfdist file distribution program uses. |
 |
PORT_RANGE: Can be used instead of PORT to specify a range of port numbers from which gpload can choose an available port. |
 |
NULL_AS: The string that represents a null value. |
 |
FORCE_NOT_NULL: In CSV mode, processes each specified column as though it were quoted and hence not a null value. |
 |
ERROR_LIMIT (2 or higher): Enables single row error isolation mode; if the number of badly formatted rows reaches this limit, the load operation is aborted. |
 |
ERROR_TABLE: When ERROR_LIMIT is declared, specifies an error table where rows with formatting errors will be logged. |
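In a hand-written control file, the options above correspond to gpload's YAML keys: LOCAL_HOSTNAME and PORT (or PORT_RANGE) sit under SOURCE, while the others are INPUT-level entries. A hedged sketch with placeholder values:

```yaml
# Placement of the additional options in a gpload control file (placeholder values).
GPLOAD:
  INPUT:
    - SOURCE:
        LOCAL_HOSTNAME:
          - etl-host.example.com
        PORT: 8081            # gpfdist port; PORT_RANGE can be used instead
        FILE:
          - /data/customers.csv
    - FORMAT: csv
    - NULL_AS: '\N'           # string that represents a null value
    - FORCE_NOT_NULL:
        - last_name           # CSV mode: treat this column as quoted, never null
    - ERROR_LIMIT: 25         # abort the load once 25 badly formatted rows are seen
    - ERROR_TABLE: public.load_errors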
Log file |
Browse to or enter the access path to the log file in your file system. |
Encoding |
Define the encoding type manually in the field. |
Specify gpload path |
Select this check box to specify the full path to the gpload executable. |
Full path to gpload executable |
Full path to the gpload executable on the machine in use. It is required when the gpload executable is not included in the PATH variable of that machine. |
tStatCatcher Statistics |
Select this check box to collect log data at the component level. |
Global Variables
Global Variables |
NB_LINE: the number of rows processed. This is an After variable and it returns an integer.
GPLOAD_OUTPUT: the output information when the gpload utility is invoked. This is an After variable and it returns a string.
ERROR_MESSAGE: the error message generated by the component when an error occurs. This is an After variable and it returns a string.
A Flow variable functions during the execution of a component while an After variable functions after the execution of the component.
To fill up a field or expression with a variable, press Ctrl + Space to access the variable list and choose the variable to use from it.
For further information about variables, see the Talend Studio User Guide. |
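In a Job, these After variables are read from the globalMap once the component has finished, typically in a downstream tJava or tMsgBox. The snippet below simulates that lookup with a plain HashMap; the component name tGreenplumGPLoad_1 and the stored values are illustrative only:

```java
import java.util.HashMap;
import java.util.Map;

public class GlobalVarsDemo {
    public static void main(String[] args) {
        // Simulated globalMap; in a real Job, Talend populates this map
        // after tGreenplumGPLoad finishes (the values here are made up).
        Map<String, Object> globalMap = new HashMap<>();
        globalMap.put("tGreenplumGPLoad_1_NB_LINE", 1250);
        globalMap.put("tGreenplumGPLoad_1_GPLOAD_OUTPUT", "gpload finished: 1250 rows");

        // Typical retrieval expressions, as used in a tJava component:
        Integer nbLine = (Integer) globalMap.get("tGreenplumGPLoad_1_NB_LINE");
        String output = (String) globalMap.get("tGreenplumGPLoad_1_GPLOAD_OUTPUT");

        System.out.println("Rows loaded: " + nbLine);
        System.out.println(output);
    }
}
```

The key format is the component's unique name followed by an underscore and the variable name, which is what Ctrl + Space inserts for you.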
Usage
Usage rule |
This component is mainly used when no particular transformation is required on the data to be loaded into the database. This component can be used as a standalone component or as an output component. |
Limitation |
Due to license incompatibility, one or more JARs required to use this component are not provided. You can install the missing JARs for this particular component by clicking the Install button on the Component tab view. |
Related scenario
For a related use case, see Inserting data in bulk in MySQL database.