Arranging the data flow
- In the Integration perspective of the Studio, create an empty Spark Batch Job, named rf_model_creation for example, from the Job Designs node in the Repository tree view. For further information about how to create a Spark Batch Job, see the Getting Started Guide of the Studio.
- In the workspace, enter the name of each component to be used and select it from the list that appears. In this scenario, the components are tHDFSConfiguration, tFileInputDelimited, tRandomForestModel, and four tModelEncoder components. It is recommended to give the four tModelEncoder components distinct labels so that you can easily recognize the task each of them performs. In this scenario, they are labelled Tokenize, tf, tf_idf, and features_assembler, respectively.
- Except for tHDFSConfiguration, connect the other components using the Row > Main link, as shown in the image above.
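Conceptually, the three text-processing tModelEncoder stages (Tokenize, tf, tf_idf) correspond to splitting raw text into terms, counting term frequencies, and down-weighting terms that occur in many documents. The following plain-Python sketch illustrates what those stages compute; the function names and toy documents are illustrative only, not part of the Talend or Spark API, and it assumes Spark MLlib's smoothed IDF formula log((N + 1) / (df + 1)):

```python
import math
from collections import Counter

def tokenize(text):
    # Tokenize stage: split raw text into lowercase terms.
    return text.lower().split()

def term_frequencies(tokens):
    # tf stage: count how often each term appears in one document.
    return Counter(tokens)

def tf_idf(docs_tokens):
    # tf_idf stage: weight each term frequency by its inverse
    # document frequency, idf = log((N + 1) / (df + 1)).
    n = len(docs_tokens)
    df = Counter()
    for tokens in docs_tokens:
        df.update(set(tokens))
    idf = {t: math.log((n + 1) / (df[t] + 1)) for t in df}
    return [{t: tf * idf[t] for t, tf in term_frequencies(tokens).items()}
            for tokens in docs_tokens]

docs = ["spark builds models", "spark trains a random forest"]
weights = tf_idf([tokenize(d) for d in docs])
# "spark" appears in both documents, so its idf is log(3/3) = 0
print(weights[0]["spark"])  # prints 0.0
```

The resulting per-document weight vectors are what the features_assembler stage would combine into the feature column consumed by tRandomForestModel.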
Source: Talend documentation, https://help.talend.com