August 15, 2023

Loading the data from the local file – Docs for ESB 6.x

  1. Double-click tHDFSPut to define the
    component in its Basic settings
    view.

    (Screenshot: Use_Case_tHDFSGet5.png — the tHDFSPut Basic settings view)

  2. Select, for example, Apache 0.20.2 from the Hadoop
    version
    list.
  3. In the NameNode URI, Username,
    and Group fields, enter the connection parameters
    for HDFS. If you are using WebHDFS, the location should be
    webhdfs://masternode:portnumber; if this WebHDFS is secured
    with SSL, the scheme should be swebhdfs and you need to use
    a tLibraryLoad component in the Job to load the library required by
    the secured WebHDFS.
  4. Next to the Local directory field, click
    the three-dot […] button and browse to the
    folder containing the file to be loaded into HDFS. In this scenario, the
    directory was specified when configuring tFileOutputDelimited:
    C:/hadoopfiles/putFile/.
  5. In the HDFS directory field, type in the
    intended location in HDFS to store the file to be loaded. In this example,
    it is /testFile.
  6. Click the Overwrite file field to expand
    the drop-down list.
  7. From the menu, select always.
  8. In the Files area, click the plus button
    to add a row in which you define the file to be loaded.
  9. In the File mask column, enter
    *.txt (between quotation marks) in place of the default
    newLine, and leave the New name
    column as it is. This allows you to extract all the
    .txt files in the specified directory without
    changing their names. In this example, the file is
    in.txt.
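Talend Studio generates the actual upload code for you, but the settings above map directly onto the WebHDFS REST API. The sketch below is only an illustration of that mapping, not tHDFSPut's implementation: the host, port, and username are hypothetical (the scenario does not specify them), and actually transferring the file would additionally require following WebHDFS's 307 redirect to a datanode, which needs a live cluster. It builds the CREATE request URL corresponding to an always-overwrite upload of /testFile, and shows that the *.txt file mask uses ordinary glob matching.

```python
from urllib.parse import urlencode
import fnmatch

def webhdfs_create_url(host, port, hdfs_path, user, overwrite=True, secure=False):
    """Build the WebHDFS CREATE URL matching the upload settings above.

    host, port, and user are placeholder assumptions; secure=True models the
    swebhdfs scheme, which rides on HTTPS transport.
    """
    scheme = "https" if secure else "http"
    query = urlencode({"op": "CREATE",
                       "user.name": user,
                       "overwrite": str(overwrite).lower()})
    return f"{scheme}://{host}:{port}/webhdfs/v1{hdfs_path}?{query}"

# Overwrite file = always, HDFS directory = /testFile (as in the scenario):
print(webhdfs_create_url("masternode", 50070, "/testFile", "hdfsuser"))
# -> http://masternode:50070/webhdfs/v1/testFile?op=CREATE&user.name=hdfsuser&overwrite=true

# The File mask "*.txt" is a plain glob, so every .txt file is selected:
print(fnmatch.filter(["in.txt", "notes.md", "out.txt"], "*.txt"))
# -> ['in.txt', 'out.txt']
```

In the real protocol, the NameNode answers this CREATE request with a redirect to a datanode URL, and the file contents are sent in a second PUT to that URL; tHDFSPut handles both steps internally.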

Source: Talend documentation, https://help.talend.com