Component family
Databases/Amazon Redshift

Function |
This component runs a specified query in Amazon Redshift and then unloads the result of the query to files on Amazon S3.
|
Purpose |
This component allows you to unload data on Amazon Redshift to files on Amazon S3.
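Behind the scenes, unloading the result of a query to Amazon S3 corresponds to a Redshift UNLOAD statement. The sketch below shows the general shape of such a statement; the function name, table, bucket, and credentials are illustrative placeholders, not the component's actual generated code:

```python
def build_unload_statement(query, bucket, key_prefix, access_key, secret_key):
    """Wrap a SELECT query in a Redshift UNLOAD statement targeting S3.

    Illustrative sketch only: single quotes inside the query must be
    doubled so the query survives as a quoted literal inside UNLOAD.
    """
    escaped_query = query.replace("'", "''")
    return (
        f"UNLOAD ('{escaped_query}') "
        f"TO 's3://{bucket}/{key_prefix}' "
        f"CREDENTIALS 'aws_access_key_id={access_key};"
        f"aws_secret_access_key={secret_key}'"
    )

# Hypothetical usage with placeholder values:
stmt = build_unload_statement(
    "SELECT * FROM person", "my-bucket", "person_",
    "AKIAEXAMPLE", "secretExample")
```

Options such as DELIMITER, FIXEDWIDTH, GZIP, ESCAPE, and PARALLEL (covered by the settings below) are appended to this statement.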
|
Basic settings |
Property Type |
Either Built-In or Repository. Since version 5.6, both the Built-In mode and the Repository mode are available in any of the Talend solutions.
|
|
Built-In: No property data stored centrally.
|
|
Repository: Select the repository file in which the properties are stored.
Database settings |
Use an existing connection |
Select this check box and in the Component List click the relevant connection component to reuse the connection details you have already defined.
|
Host |
Type in the IP address or hostname of the database server. |
|
Port |
Type in the listening port number of the database server. |
|
Database |
Type in the name of the database. |
|
Schema |
Type in the name of the schema. |
|
Username and Password |
Type in the database user authentication data. To enter the password, click the […] button next to the Password field, and then enter the password between double quotes in the dialog box that opens and click OK.
|
Table Name |
Type in the name of the table from which the data will be unloaded.
|
Schema and Edit schema |
A schema is a row description. It defines the number of fields (columns) to be processed and passed on to the next component. Since version 5.6, both the Built-In mode and the Repository mode are available in any of the Talend solutions.
|
|
Built-In: You create and store the schema locally for this component only.
|
|
Repository: You have already created the schema and stored it in the Repository. You can reuse it in various projects and Job designs.
|
|
Click Edit schema to make changes to the schema. Note that if you make changes, the schema automatically becomes built-in.
|
|
Query Type and Query |
Enter your database query, paying particular attention to the proper sequence of the fields in order to match the schema definition.
|
Guess Query |
Click the button to generate the query which corresponds to the table schema.
S3 Setting |
Access Key |
Specify the Access Key ID that uniquely identifies an AWS Account. |
|
Secret Key |
Specify the Secret Access Key, which together with the Access Key ID constitutes your AWS security credentials. To enter the secret key, click the […] button next to the Secret Key field, and then enter the key in the dialog box that opens.
|
Bucket |
Type in the name of the Amazon S3 bucket to which the data is unloaded. |
|
Key prefix |
Type in the name prefix for the unload files on Amazon S3. By default, the unload file names consist of this prefix followed by a slice number and a part number.
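For example, with the default naming described in the Amazon Redshift UNLOAD documentation, a key prefix such as venue_ yields one object per slice, named like venue_0000_part_00. A sketch of the expected key layout (the function name is illustrative):

```python
def unload_object_keys(key_prefix, slices, parts_per_slice=1):
    """List the S3 object keys Redshift produces for a given key prefix.

    Redshift appends a zero-padded slice number and part number to the
    prefix, e.g. 'venue_' -> 'venue_0000_part_00'. By default there is
    one part per slice; extra parts appear only for very large files.
    """
    return [
        f"{key_prefix}{s:04d}_part_{p:02d}"
        for s in range(slices)
        for p in range(parts_per_slice)
    ]
```

So a two-slice cluster unloading with prefix venue_ would write venue_0000_part_00 and venue_0001_part_00 into the chosen bucket.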
Advanced settings |
File type |
Select the type of the unload files on Amazon S3 from the list: Delimited file or CSV, or Fixed width.
|
|
Fields terminated by |
Enter the character used to separate fields. This field appears only when Delimited file or CSV is selected from the File type list.
|
Enclosed by |
Select the character within a pair of which the fields are enclosed. This list appears only when Delimited file or CSV is selected from the File type list.
|
Fixed width mapping |
Enter a string that specifies a user-defined column label and column width for each field, in the form "colLabel1:colWidth1,colLabel2:colWidth2,...". Note that the column label in the string has no relation to the table column name. This field appears only when Fixed width is selected from the File type list.
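This mapping string follows Redshift's fixedwidth specification, 'label:width,label:width,...'. A sketch of how such a mapping could be parsed and applied to a row (helper names are illustrative):

```python
def parse_fixedwidth_mapping(mapping):
    """Parse a fixedwidth specification 'label:width,label:width,...'
    into an ordered list of (label, width) pairs."""
    pairs = []
    for item in mapping.split(","):
        label, width = item.split(":")
        pairs.append((label, int(width)))
    return pairs

def format_fixedwidth_row(values, mapping):
    """Pad or truncate each field to its declared width, mimicking the
    layout of a fixed-width unload file (illustrative sketch)."""
    widths = [w for _, w in parse_fixedwidth_mapping(mapping)]
    return "".join(str(v).ljust(w)[:w] for v, w in zip(values, widths))
```

For instance, the mapping "id:8,name:12" lays each row out as an 8-character id field followed by a 12-character name field, regardless of the actual column names in the table.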
|
Compressed by |
Select this check box and from the list displayed select the compression type of the unload files.
|
Encrypt |
Select this check box to encrypt unload file(s) using Amazon S3 client-side encryption. |
|
Encryption key |
Enter the encryption key used to encrypt unload file(s). This field appears only when the Encrypt check box is selected. |
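For Amazon S3 client-side encryption, the key is typically a base64-encoded 256-bit AES master symmetric key. A minimal sketch of generating such a key (the function name is illustrative; storing and rotating the key securely is your responsibility):

```python
import base64
import os

def generate_master_symmetric_key():
    """Generate a base64-encoded 256-bit (32-byte) AES key, the format
    commonly used as a master symmetric key for Amazon S3 client-side
    encryption. Illustrative sketch only."""
    return base64.b64encode(os.urandom(32)).decode("ascii")

key = generate_master_symmetric_key()
```

The same key must be supplied again later to decrypt and read the unloaded files.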
|
Specify null string |
Select this check box and from the list displayed select a string that represents the null value in the unload files.
|
Escape |
Select this check box to place an escape character (\) before every occurrence of the linefeed, carriage return, delimiter, escape, and quotation mark characters in the unloaded data.
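Per the Amazon Redshift UNLOAD documentation, the ESCAPE option prefixes a backslash to exactly these characters. A small sketch of the resulting transformation (the function name is illustrative):

```python
def escape_field(value, delimiter="|"):
    """Mimic Redshift UNLOAD's ESCAPE option: put a backslash before
    any linefeed, carriage return, delimiter character, backslash, or
    quotation mark in the field value. Illustrative sketch only."""
    special = {"\n", "\r", delimiter, "\\", "'", '"'}
    return "".join("\\" + ch if ch in special else ch for ch in value)
```

Escaping matters when the unloaded files are later reloaded with COPY, since an unescaped delimiter inside a field would otherwise split the row incorrectly.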
|
Overwrite s3 object if exist |
Select this check box to overwrite the existing Amazon S3 object with the unloaded data if an object with the same key already exists.
|
Parallel |
Select this check box to write data in parallel to multiple unload files, according to the number of slices in the Redshift cluster.
|
tStatCatcher Statistics |
Select this check box to gather the Job processing metadata at the Job level as well as at each component level.
Dynamic settings |
Click the [+] button to add a row in the table and fill the Code field with a context variable to choose your database connection dynamically from multiple connections planned in your Job. The Dynamic settings table is available only when the Use an existing connection check box is selected in the Basic settings view. For more information on Dynamic settings and context variables, see Talend Studio User Guide.
|
Global Variables |
ERROR_MESSAGE: the error message generated by the component when an error occurs. This is an After variable and it returns a string.

A Flow variable functions during the execution of a component while an After variable functions after the execution of the component.

To fill up a field or expression with a variable, press Ctrl + Space to access the variable list and choose the variable to use from it.

For further information about variables, see Talend Studio User Guide.
|
Usage |
This component covers all possible SQL queries for the Amazon Redshift database. |
|
Log4j |
The activity of this component can be logged using the log4j feature. For more information on this feature, see Talend Studio User Guide. For more information on the log4j logging levels, see the Apache documentation at http://logging.apache.org/log4j/1.2/apidocs/org/apache/log4j/Level.html.
For a related scenario, see Loading/unloading data from/to Amazon S3.