The three ELT Hive components are closely related in terms of their operating
conditions. They are used to handle Hive database schemas in order to generate Insert
statements, including clauses, which are executed against the defined output
table.
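To illustrate the kind of statement these components generate, here is a minimal sketch (plain Python; the table, column, and query names are hypothetical examples, not Talend internals) of how an ELT-style Insert statement is assembled from schema metadata rather than from row data:

```python
# Sketch of ELT-style statement generation: the components assemble a
# HiveQL INSERT ... SELECT from schema information only; the Hive engine
# itself moves the data. All names below are illustrative assumptions.

def build_insert(target_table, columns, source_query):
    """Build a HiveQL INSERT ... SELECT statement from schema info."""
    column_list = ", ".join(columns)
    return f"INSERT INTO TABLE {target_table} ({column_list}) {source_query}"

statement = build_insert(
    "employees_out",
    ["id", "name", "dept"],
    "SELECT e.id, e.name, d.dept FROM employees e JOIN depts d ON e.dept_id = d.id",
)
print(statement)
```

Because only the statement text travels through the Job, no rows pass through the Studio; the query runs entirely inside Hive.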
Component family: ELT/Map/Hive

Function: This component executes the query built by the preceding tELTHiveMap component to write data into the specified Hive table.

Purpose: This component works alongside tELTHiveMap to write data into the Hive table defined.
Basic settings

Action on data: Select the action to be performed on the data to be written in the Hive table. With the Insert option, the data to be written is appended to the table.

Schema: A schema is a row description. It defines the number of fields (columns) to be processed and passed on to the next component.

Built-In: You create and store the schema locally for this component only.

Repository: You have already created the schema and stored it in the Repository. You can reuse it in various projects and Job designs.

Since version 5.6, both the Built-In mode and the Repository mode are available in any of the Talend solutions.
Edit schema: Click Edit schema to make changes to the schema. If the current schema is of the Repository type, three options are available: View schema, Change to built-in property, and Update repository connection.
Default table name: Enter the default name of the output table you want to write data in.

Default schema name: Enter the name of the default database schema to which the output table belongs.
Use different table name: Select this check box to define a different output table name in the Table name field that appears. If this table is related to a database schema other than the default one, specify that schema as well.
The target table uses the Parquet format: If the table in which you need to write data is a Parquet table, select this check box. Then, from the Compression list that appears, select the compression mode to be used.
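For context, compression for a Parquet-backed Hive table is commonly controlled through the `parquet.compression` table property. The sketch below (plain Python; the table name, column types, and the SNAPPY choice are assumptions for illustration, not values taken from the component) shows the shape of the DDL behind such a target table:

```python
# Sketch of the DDL behind a Parquet target table with compression.
# Names and the SNAPPY codec are illustrative assumptions.

def create_parquet_table_ddl(table, columns, compression="SNAPPY"):
    """Build a CREATE TABLE statement for a Parquet table."""
    cols = ", ".join(f"{name} {ctype}" for name, ctype in columns)
    return (
        f"CREATE TABLE {table} ({cols}) STORED AS PARQUET "
        f"TBLPROPERTIES ('parquet.compression'='{compression}')"
    )

ddl = create_parquet_table_ddl("sales_out", [("id", "INT"), ("amount", "DOUBLE")])
print(ddl)
```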
Field Partition: In Partition Column, enter the name of the column to be used as the partition column of the output table. In Partition Value, enter the value to be used for that partition.
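When a Partition Column and a Partition Value are set, the generated statement targets a static partition. A minimal sketch of the resulting clause (plain Python; the table, partition column, and value are hypothetical examples):

```python
# Sketch: add a static PARTITION clause to an INSERT ... SELECT.
# Table, partition column, and value are illustrative assumptions.

def build_partitioned_insert(table, partition_column, partition_value, source_query):
    """Build a HiveQL INSERT targeting one static partition."""
    return (
        f"INSERT INTO TABLE {table} "
        f"PARTITION ({partition_column}='{partition_value}') "
        f"{source_query}"
    )

stmt = build_partitioned_insert(
    "logs_out", "ds", "2014-07-01", "SELECT host, msg FROM raw_logs"
)
print(stmt)
```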
Advanced settings

tStatCatcher Statistics: Select this check box to collect log data at the component level.

Global Variables

ERROR_MESSAGE: the error message generated by the component when an error occurs. This is an After variable and it returns a string.

A Flow variable functions during the execution of a component, while an After variable functions after the execution of the component.

To fill up a field or expression with a variable, press Ctrl + Space to access the variable list and choose the variable to use from it.

For further information about variables, see the Talend Studio User Guide.
Usage

This component is used along with a tELTHiveMap component. If the Studio used to connect to a Hive database is operated on Windows, you must manually create a folder called tmp at the root of the disk where the Studio is installed.

Note: The ELT components do not handle actual data flow; they handle schema information only.

For a related scenario, see Scenario: Joining table columns and writing them into Hive.