Warning
This component will be available in the Palette of Talend Studio on the condition that you have subscribed to one of the Talend solutions with Big Data.
Component family
MapReduce / Output
Function
tFileOutputParquet receives records from the processing component placed ahead of it and writes the records into Parquet format files in a given distributed file system. This component, along with the MapReduce family it belongs to, appears only when you are creating a Map/Reduce Job.
Purpose
tFileOutputParquet creates Parquet format files and writes the data it receives into them.
Basic settings

Property type
Either Built-in or Repository.
Built-in: no property data stored centrally.
Repository: reuse properties stored centrally under the Hadoop Cluster node of the Repository tree. The fields that come after are pre-filled in using the fetched data. For further information about the Hadoop Cluster node, see the Talend Studio User Guide.
Schema and Edit schema
A schema is a row description. It defines the number of fields (columns) to be processed and passed on to the next component. Click Edit schema to make changes to the schema. If the current schema is of the Repository type, three options are available: View schema, Change to built-in property, and Update repository connection.
Built-In: You create and store the schema locally for this component only.
Repository: You have already created the schema and stored it in the Repository. You can reuse it in various projects and Job designs.
Folder/File
Browse to, or enter the directory in HDFS in which you need to write the data. Note that you need to define the connection to the Hadoop distribution to be used, as explained in the Hadoop Connection section below.
Action
Select an operation for writing data:
Create: Creates a file and writes data in it.
Overwrite: Overwrites the file existing in the directory specified in the Folder/File field.
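The difference between Create and Overwrite can be sketched with ordinary file-open modes. This is a local-file analogy only, not the component's actual HDFS calls, and the `write_file` helper is an illustrative name, not a Talend API:

```python
import os
import tempfile

def write_file(path, data, action):
    """Mimic the component's Action setting on a local file.

    action="create"    -> fail if the target already exists
    action="overwrite" -> replace any existing content
    """
    # "x" raises FileExistsError if the file is already there;
    # "w" truncates and rewrites it unconditionally.
    mode = "x" if action == "create" else "w"
    with open(path, mode) as f:
        f.write(data)

target = os.path.join(tempfile.mkdtemp(), "out.parquet")
write_file(target, "first", "create")       # succeeds: target is new
try:
    write_file(target, "again", "create")   # fails: target exists
except FileExistsError:
    pass
write_file(target, "second", "overwrite")   # replaces the content
```

As with the component, choosing Create against an existing target is an error, while Overwrite silently replaces it.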
Compression
By default, the Uncompressed option is active. But you can select the Gzip or the Snappy option to compress the output data. Hadoop provides different compression formats that help reduce the space needed for storing files and speed up data transfer.
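The storage saving is easy to see on repetitive record data, which is common in columnar output. The sketch below uses Python's standard-library gzip as a stand-in (Snappy needs a third-party library, so only gzip is shown); the sample record is made up:

```python
import gzip

# Repetitive row data, as typically produced by a Job writing
# many similar records (the record content is illustrative).
raw = b"2014-01-01;employee;sales\n" * 1000
packed = gzip.compress(raw)

# The gzip payload is far smaller than the raw bytes, and
# decompressing it restores the data exactly.
print(len(raw), len(packed))
```

Smaller files mean less HDFS space and less data moved between nodes, at the cost of some CPU time on write and read.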
Global Variables
ERROR_MESSAGE: the error message generated by the component when an error occurs. This is an After variable and it returns a string.
A Flow variable functions during the execution of a component, while an After variable functions after the execution of the component.
To fill up a field or expression with a variable, press Ctrl + Space to access the variable list and choose the variable to use from it.
For further information about variables, see the Talend Studio User Guide.
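In generated Job code these variables live in a map keyed by component instance name plus variable name. The sketch below simulates that lookup with a plain dictionary; the instance name `tFileOutputParquet_1` and the error text are illustrative, not values the component is guaranteed to produce:

```python
# Stand-in for Talend's globalMap (a Java Map in real Jobs); the key
# pattern <component instance>_<variable> matches what the
# Ctrl + Space variable list inserts, e.g. in Java:
#   ((String)globalMap.get("tFileOutputParquet_1_ERROR_MESSAGE"))
global_map = {}

# After a failed execution, the component stores its error message.
global_map["tFileOutputParquet_1_ERROR_MESSAGE"] = (
    "illustrative error: target directory already exists"
)

# Downstream code reads the After variable once the component has run;
# it is None (absent) if the component has not executed yet.
error = global_map.get("tFileOutputParquet_1_ERROR_MESSAGE")
```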
Usage
In a Talend Map/Reduce Job, it is used as an end component and requires a transformation component as input link. The other components used along with it must be Map/Reduce components, too; they generate native Map/Reduce code that can be executed directly in Hadoop.
Once a Map/Reduce Job is opened in the workspace, tFileOutputParquet as well as the other MapReduce components appear in the Palette of the Studio.
Note that in this documentation, unless otherwise explicitly stated, a scenario presents only Standard Jobs, that is to say traditional Talend data integration Jobs, and non Map/Reduce Jobs.
Hadoop Connection
You need to use the Hadoop Configuration tab in the Run view to define the connection to a given Hadoop distribution for the whole Job.
This connection is effective on a per-Job basis.
This component is used in a similar way to the tAvroOutput component. For a scenario using tAvroOutput, see Scenario: Filtering Avro format employee data.