
Add a containerized mode to the ECM service #5201

Merged

Conversation

@sjgllgh (Contributor) commented Nov 12, 2024

What is the purpose of the change

Add a containerized mode to the ECM service. In this mode, specific IPs and ports for communicating with the outside world can be assigned to particular engines. For instance, a Spark engine requires at least two ports: spark.driver.port and spark.driver.blockManager.port.
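
As a rough illustration (not code from this PR; the IP and port numbers are placeholders), pinning these settings to fixed, externally reachable values with Spark's SparkConf looks roughly like this:

  import org.apache.spark.SparkConf

  // Placeholder values; in containerized mode they would come from whatever
  // address and ports the ECM exposes for the engine's container.
  val conf = new SparkConf()
    .set("spark.driver.host", "10.0.0.12")           // IP the driver advertises to executors
    .set("spark.driver.port", "30001")               // executor -> driver RPC port
    .set("spark.driver.blockManager.port", "30002")  // driver-side block manager port

Without fixed values, Spark picks random ports at startup, which cannot be exposed ahead of time from a container.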

Related issues/PRs

Related issues: #5199

Checklist

  • I have read the Contributing Guidelines on pull requests.
  • I have explained the need for this PR and the problem it solves
  • I have explained the changes or the new features added to this PR
  • I have added tests corresponding to this change
  • I have updated the documentation to reflect this change
  • I have verified that this change is backward compatible (If not, please discuss on the Linkis mailing list first)
  • If this is a code change: I have written unit tests to fully verify the new behavior.

Commit: …pecific IPs and ports for communication with the outside world to particular engines in this mode. For instance, a Spark engine requires at least two ports: spark.driver.port and spark.driver.blockManager.port.

Changed code under review:

      SPARK_DRIVER_BLOCK_MANAGER_PORT.key,
      SPARK_DRIVER_BLOCK_MANAGER_PORT.getValue(options)
    )
    sparkConfig.setConf(conf)
Reviewer (Contributor):

Does this line need to be deleted?

Author (sjgllgh):

This line is necessary because the sparkConf object does not provide a dedicated method for setting parameters such as spark.driver.port; instead, these parameters have to be assigned through sparkConfig.setConf.
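
To illustrate the point for readers less familiar with the Spark API (a sketch with placeholder values, not code from this PR): SparkConf only offers dedicated setters for a handful of settings, so driver ports have to go through the generic key/value interface, typically applied in bulk; that is the role the setConf pass-through plays here.

  import org.apache.spark.SparkConf

  // Port settings collected as plain key/value pairs and applied in bulk;
  // SparkConf has no setDriverPort-style methods for these keys.
  val extraConf = Map(
    "spark.driver.port" -> "30001",
    "spark.driver.blockManager.port" -> "30002"
  )
  val sparkConf = new SparkConf().setAll(extraConf)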

@sjgllgh (Contributor Author) commented Nov 15, 2024

Why did my integration test fail when run with linkis-cli, and how can I check the failure logs? I ran the integration test in my local environment and it passed. The exact commands were as follows:

  1. mvn install -Pdocker -Dmysql.connector.scope=compile -Dmaven.javadoc.skip=true -Dmaven.test.skip=true -Dlinkis.build.ldh=true
  2. sh ./linkis-dist/helm/scripts/create-kind-cluster.sh
  3. sh ./linkis-dist/helm/scripts/install-mysql.sh false
  4. sh ./linkis-dist/helm/scripts/install-ldh.sh true
  5. sh ./linkis-dist/helm/scripts/install-charts-with-ldh.sh linkis linkis-demo true
  6. sh ./linkis-dist/helm/scripts/remote-proxy.sh start
  7. sh ./linkis-dist/helm/scripts/login-pod.sh mg-gateway
  8. sh /opt/linkis/bin/linkis-cli -engineType shell-1 -codeType shell -code "pwd"


kubectl exec -it -n linkis ${POD_NAME} -- bash -c " \
sh /opt/linkis/bin/linkis-cli -engineType python-python2 -codeType python -code 'print(\"hello\")' "
sh /opt/linkis/bin/linkis-cli --async true -engineType shell-1 -codeType shell -code \"pwd\" ";
Reviewer (Contributor):

Asynchronous mode cannot be used here, and the task needs to complete successfully.

Author (sjgllgh):

I have rolled back this part of the code.

Commit: …_history within the configmap-init-sql.yaml file.
@sjgllgh force-pushed the issue-5199_add_support_for_containerization branch from 7713798 to e4b7ee6 on November 19, 2024 14:04
@sjgllgh force-pushed the issue-5199_add_support_for_containerization branch from 9f1e7c2 to 29005f2 on November 27, 2024 01:30
Conflicts were resolved in:
	linkis-engineconn-plugins/spark/src/main/scala/org/apache/linkis/engineplugin/spark/config/SparkConfiguration.scala
@peacewong (Contributor) left a comment:

LGTM.

@peacewong merged commit bca9fe2 into apache:master on Nov 27, 2024
12 checks passed