Search before asking
I had searched in the issues and found no similar feature requirement.
Problem Description
Users want to be able to customize and configure Spark's --conf parameters (e.g. --conf spark.executor.memory=4g) in workflow nodes.
Description
This parameter is added only to the Spark node configurations (the SQL, PySpark, and Scala nodes); the conf input box in the UI is driven by backend database records.
Add a spark.conf input box to the node's attribute panel. If there are multiple parameters, separate them with a semicolon (;), e.g. spark.executor.memory=4g;spark.executor.cores=2. Restrict the input to 500 characters via regular-expression validation in the UI.
Note: DSS does not validate the parameters entered by the user, so a mistyped parameter name or value may cause engine startup to fail or the engine to be configured incorrectly.
Use case
No response
Solutions
Add the UI node record to the database:
Add a record to the dss_workflow_node_ui table:
Here the lable_name field (the column is spelled this way in the DSS schema) is spark.conf, the ui_type field is Text, and the position field is startup.
INSERT INTO dss_workflow_node_ui
(`key`, description, description_en, lable_name, lable_name_en, ui_type, required, value, default_value, is_hidden, `condition`, is_advanced, `order`, node_menu_type, is_base_info, position)
VALUES ('spark.conf', 'spark user-defined parameter configuration input', 'Input spark params config', 'spark.conf', 'spark.conf', 'Text', 0, NULL, NULL, 0, NULL, 0, 1, 1, 0, 'startup');
Insert records into the dss_workflow_node_to_ui table associating this UI record with the sql, pyspark, and scala Spark node types, as sketched below.
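A minimal sketch of that association, assuming dss_workflow_node_to_ui links workflow_node_id to ui_id and that the Spark node types in dss_workflow_node are the usual linkis.spark.* identifiers (verify both against your schema before running):

-- Associate the spark.conf UI record with the three Spark node types
-- (column names and node_type values assumed; check dss_workflow_node first)
INSERT INTO dss_workflow_node_to_ui (workflow_node_id, ui_id)
SELECT n.id, u.id
FROM dss_workflow_node n, dss_workflow_node_ui u
WHERE n.node_type IN ('linkis.spark.sql', 'linkis.spark.py', 'linkis.spark.scala')
  AND u.`key` = 'spark.conf';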
In the dss_workflow_node_ui_to_validate table, associate the UI record with a ui_validate record whose validate_type is a regular expression limiting the input to 1-500 characters; a sketch follows.
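A minimal sketch, assuming dss_workflow_node_ui_validate stores the rule and dss_workflow_node_ui_to_validate maps ui_id to validate_id (the column names, the 'Regex' validate_type value, and the error message are assumptions, not verified against the DSS schema):

-- Regex rule allowing 1 to 500 characters (column names assumed)
INSERT INTO dss_workflow_node_ui_validate (validate_type, validate_range, error_msg)
VALUES ('Regex', '^.{1,500}$', 'spark.conf must be 1-500 characters');

-- Map the spark.conf UI record to the new rule
INSERT INTO dss_workflow_node_ui_to_validate (ui_id, validate_id)
SELECT u.id, v.id
FROM dss_workflow_node_ui u, dss_workflow_node_ui_validate v
WHERE u.`key` = 'spark.conf' AND v.validate_range = '^.{1,500}$';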
When the workflow is saved, the spark.conf value configured by the user is added to the node's startup parameters: the key is spark.conf and the value is the semicolon-separated string.
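For illustration only (the values are hypothetical and the nesting is assumed to mirror the Linkis startup configuration structure), the saved startup parameter might look like:

"startup": {
    "spark.conf": "spark.executor.memory=4g;spark.executor.cores=2"
}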
When submitting the task to Linkis, DSS needs to split the spark.conf value into individual key-value pairs, in the same startup-parameter format as settings like spark.driver.memory.
Parse the spark.conf parameter in the BuildJobAction.getSubmitAction() method called by LinkisNodeExecutionImpl.runJob(), and set the resulting key-value pairs on the request passed to Linkis; a sketch of the splitting logic follows.
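A minimal Java sketch of the splitting step (the class and method names and the shape of the startup map are illustrative assumptions, not the actual DSS signatures):

import java.util.Map;

// Sketch only: not the actual DSS class; shows the semicolon split.
final class SparkConfResolver {

    // Expands the semicolon-separated spark.conf value into individual
    // startup parameters before the job is submitted to Linkis.
    static void resolveSparkConf(Map<String, Object> startup) {
        Object raw = startup.remove("spark.conf");
        if (raw == null || raw.toString().trim().isEmpty()) {
            return; // nothing configured
        }
        for (String pair : raw.toString().split(";")) {
            String[] kv = pair.split("=", 2); // split on the first '=' only
            if (kv.length == 2 && !kv[0].trim().isEmpty()) {
                // e.g. "spark.executor.memory=4g" -> ("spark.executor.memory", "4g")
                startup.put(kv[0].trim(), kv[1].trim());
            }
            // malformed entries are skipped: per the note above, DSS does not validate them
        }
    }
}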
The parameter format submitted to Linkis is:
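(The concrete example was not preserved in this issue. As an illustration only, assuming the standard Linkis startup configuration nesting, the split parameters might be submitted as:)

"params": {
    "configuration": {
        "startup": {
            "spark.executor.memory": "4g",
            "spark.executor.cores": "2"
        }
    }
}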
Anything else
No response
Are you willing to submit a PR?
Yes I am willing to submit a PR!