tImpalaOutput
Executes the action defined on the data contained in the table, based on the flow
incoming from the preceding component in the Job.
tImpalaOutput connects to an Impala database (the Impala
data warehouse system) and writes data in an Impala table.
tImpalaOutput Standard properties
These properties are used to configure tImpalaOutput running in the Standard Job framework.
The Standard
tImpalaOutput component belongs to the Big Data family.
The component in this framework is available in all Talend products with Big Data
and in Talend Data Fabric.
Basic settings
Property type |
Either Built-in or Repository. |
Built-in: No property data is stored centrally. |
Repository: Select the repository file in which the properties are stored. The fields that follow are completed automatically using the retrieved data. |
Use an existing connection |
Select this check box and, in the Component List, click the relevant connection component to reuse the connection details you have already defined.
Note: When a Job contains the parent Job and the child Job, if you need to share an existing connection between the two levels, for example, to share the connection created by the parent Job with the child Job, you have to register the database connection in the parent Job and read that registered connection from the child Job.
For an example about how to share a database connection across Job levels, see Talend Studio User Guide. |
Distribution |
Select the cluster you are using from the drop-down list. The options in the list vary depending on the component you are using. Among these options, some require specific configuration. |
Impala version |
Select the version of the Hadoop distribution you are using. The available options vary depending on the distribution you have selected. |
Host |
Database server IP address. |
Port |
Listening port number of DB server. |
Database |
Fill this field with the name of the database. |
Username |
DB user authentication data. |
Use kerberos authentication |
If you are accessing an Impala system running with Kerberos security,
select this check box and then enter the Kerberos principal of this Impala system.
This check box is available depending on the Hadoop distribution you are using. |
Schema and Edit schema |
A schema is a row description. It defines the number of fields (columns) to be processed and passed on to the next component. Click Edit schema to make changes to the schema. |
Built-in: The schema is created and stored locally for this component only. |
Repository: The schema already exists and is stored in the Repository, and can be reused in various projects and Job designs. |
Table Name |
Name of the table to which you need to write data. |
Action |
Select whether you want to OVERWRITE the old data already existing in the table or APPEND the incoming data to it. |
Extended insert |
Select this check box to combine multiple rows of incoming data into one INSERT statement, which speeds up the write operation (see the sketch after this table). |
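At runtime these settings translate into a JDBC connection to Impala and INSERT statements against the target table. The sketch below is an illustration of that mapping only, not the component's generated code: the Hive JDBC driver (which must be on the classpath), the jdbc:hive2 URL format, the port 21050 and the Kerberos principal syntax are assumptions, and the database, table and credentials are made-up examples.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class ImpalaWriteSketch {
        public static void main(String[] args) throws Exception {
            // Host, Port and Database map to the Basic settings fields of the same names.
            // With "Use kerberos authentication", the principal is typically appended
            // to the URL, for example ";principal=impala/_HOST@EXAMPLE.COM" (assumption).
            String url = "jdbc:hive2://impala-host:21050/sales";

            try (Connection conn = DriverManager.getConnection(url, "talend_user", "");
                 Statement stmt = conn.createStatement()) {

                // Extended insert: several incoming rows combined into one single
                // INSERT statement instead of one statement per row.
                stmt.executeUpdate(
                    "INSERT INTO customers VALUES (1, 'Alice'), (2, 'Bob'), (3, 'Carol')");

                // The OVERWRITE action replaces the data already present in the table,
                // typically through an INSERT OVERWRITE statement:
                // stmt.executeUpdate("INSERT OVERWRITE TABLE customers SELECT ...");
            }
        }
    }

In the Job itself none of this is written by hand; the component issues the equivalent calls from the values entered in the Basic settings.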
Advanced settings
tStatCatcher Statistics |
Select this check box to collect log data at the component level. |
Global Variables
Global Variables |
ERROR_MESSAGE: the error message generated by the component when an error occurs. This is an After variable and it returns a string. This variable functions only if the Die on error check box is cleared, if the component has this check box.
A Flow variable functions during the execution of a component, while an After variable functions after the execution of the component.
To fill up a field or expression with a variable, press Ctrl + Space to access the variable list and choose the variable to use from it.
For further information about variables, see Talend Studio User Guide. |
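For instance, once the subJob containing this component has finished, the variable can be read from the globalMap in a tJava component placed later in the Job; the component name tImpalaOutput_1 below is just an example.

    // Runs inside a tJava of the same Job; globalMap is provided by the generated Job code.
    String errorMessage = (String) globalMap.get("tImpalaOutput_1_ERROR_MESSAGE");
    if (errorMessage != null) {
        System.err.println("tImpalaOutput_1 reported: " + errorMessage);
    }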
Usage
Usage rule |
This component offers the benefit of flexible DB queries and covers all of the SQL queries possible. |
Dynamic settings |
Click the [+] button to add a row in the table and fill the Code field with a context variable to choose your database connection dynamically from multiple connections planned in your Job.
The Dynamic settings table is available only when the Use an existing connection check box is selected in the Basic settings view. Once a dynamic parameter is defined, the Component List box in the Basic settings view becomes unusable.
For examples on using dynamic parameters, see Reading data from databases through context-based dynamic connections and Reading data from different MySQL databases using dynamically loaded connection parameters. For more information on Dynamic settings and context variables, see Talend Studio User Guide. |
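As a purely illustrative sketch (all names are hypothetical), the Code field of such a dynamic parameter holds a context variable whose value is the unique name of the connection component to reuse, so the same Job can switch Impala connections per execution context:

    // Inside a tJava of the same Job; context and globalMap are provided by the
    // generated Job code. The context variable impala_connection is hypothetical.
    String connectionComponent = context.impala_connection; // e.g. "tImpalaConnection_1"
    System.out.println("tImpalaOutput reuses the connection of: " + connectionComponent);

When the Job is exported, that value can then be switched per environment on the command line, for example with --context_param impala_connection=tImpalaConnection_2.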
Prerequisites |
The Hadoop distribution must be properly installed, so as to guarantee the interaction with Talend Studio.
For further information about how to install a Hadoop distribution, see the manuals corresponding to the Hadoop distribution you are using. |
Related scenarios
For a scenario about how an output component is used in a Job, see Inserting a column and altering data using tMysqlOutput.