
There are too many Spark sessions and the session timeout parameter is not effective #6855

Open
A-little-bit-of-data opened this issue Dec 20, 2024 · 3 comments
Labels
kind:question A question will be moved to discussions

Comments

@A-little-bit-of-data

A-little-bit-of-data commented Dec 20, 2024

Code of Conduct

Search before asking

  • I have searched in the issues and found no similar issues.

Describe the bug

When using Kyuubi to run a Spark SQL engine on K8s, the number of internal sessions keeps growing. The documentation lists the parameter kyuubi.engine.user.isolated.spark.session.idle.timeout (default PT6H), so I set it to PT30M, but sessions still stay open for more than 30 minutes even after their SQL has finished executing. Is one of these sessions created per SQL statement, or how are they created? There are not that many SQL statements in my cluster, yet there are many sessions, and I don't know where they come from.

(Three screenshots attached: the session list and engine configuration; per the discussion below, one shows the USER share level in use.)
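For reference, the change described above would be made in kyuubi-defaults.conf roughly as follows (a minimal sketch, not the reporter's actual file):

    # kyuubi-defaults.conf (sketch)
    # Changed from the default PT6H to PT30M, as described above
    kyuubi.engine.user.isolated.spark.session.idle.timeout=PT30M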

Affects Version(s)

1.9.1

Kyuubi Server Log Output

No response

Kyuubi Engine Log Output

No response

Kyuubi Server Configurations

No response

Kyuubi Engine Configurations

No response

Additional context

No response

Are you willing to submit PR?

  • Yes. I would be willing to submit a PR with guidance from the Kyuubi community to fix.
  • No. I cannot submit a PR at this time.
@pan3793
Member

pan3793 commented Dec 23, 2024

You should read the docs carefully. The docs of kyuubi.engine.user.isolated.spark.session.idle.timeout say:

If kyuubi.engine.user.isolated.spark.session is false ...

and the default value of kyuubi.engine.user.isolated.spark.session is true. Further, the docs of kyuubi.engine.user.isolated.spark.session say:

... if the engine is running in a group or server share level ... Note that, it does not affect if the share level is connection or user.

while the screenshot shows you are using the USER share level.
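In other words, the isolated-session timeout only takes effect under a combination like the following (a sketch derived from the doc excerpts quoted above; the values are illustrative):

    # kyuubi-defaults.conf (sketch)
    kyuubi.engine.share.level=GROUP                                 # or SERVER
    kyuubi.engine.user.isolated.spark.session=false                 # default: true
    kyuubi.engine.user.isolated.spark.session.idle.timeout=PT30M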

@pan3793
Member

pan3793 commented Dec 23, 2024

For your case, maybe you should change kyuubi.session.idle.timeout instead.
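Under the USER share level, that would look like this (a minimal sketch; PT30M mirrors the value tried above):

    # kyuubi-defaults.conf (sketch)
    kyuubi.session.idle.timeout=PT30M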

@pan3793 pan3793 added the kind:question A question will be moved to discussions label and removed the kind:bug This is clearly a bug and priority:major labels Dec 23, 2024
@pan3793 pan3793 changed the title [Bug] There are too many Spark sessions and the session timeout parameter is not effective There are too many Spark sessions and the session timeout parameter is not effective Dec 23, 2024
@A-little-bit-of-data
Author

You should read the docs carefully. The docs of kyuubi.engine.user.isolated.spark.session.idle.timeout say:

If kyuubi.engine.user.isolated.spark.session is false ...

and the default value of kyuubi.engine.user.isolated.spark.session is true. Further, the docs of kyuubi.engine.user.isolated.spark.session say:

... if the engine is running in a group or server share level ... Note that, it does not affect if the share level is connection or user.

while the screenshot shows you are using the USER share level.

Thank you very much. Can I understand it this way: when using the USER share level, kyuubi.session.idle.timeout controls the session timeout, and when using the GROUP or SERVER share level, kyuubi.engine.user.isolated.spark.session.idle.timeout controls the session timeout? Does this apply regardless of whether the engine is Spark, Flink, or Trino?
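Summarizing the mapping as far as this thread establishes it (note that the isolated-session settings are Spark-specific by name, and the Flink/Trino part of the question is not answered here):

    Share level         Timeout setting that applies
    -----------         ----------------------------
    CONNECTION, USER    kyuubi.session.idle.timeout
    GROUP, SERVER       kyuubi.engine.user.isolated.spark.session.idle.timeout
                        (requires kyuubi.engine.user.isolated.spark.session=false)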
