There are too many Spark sessions and the session timeout parameter is not effective #6855
Labels: kind:question
Search before asking
Describe the bug
When using Kyuubi to run Spark SQL on a Kubernetes cluster, the number of internal sessions keeps growing. I found the parameter
kyuubi.engine.user.isolated.spark.session.idle.timeout=PT30M
in the documentation; its default value is PT6H. After changing it to 30 minutes, there are still sessions that stay open well past 30 minutes even though their SQL has long finished executing. I would like to ask how these sessions are created: is one session created per SQL statement, or by some other mechanism? There are not that many SQL statements running in my cluster, yet there are many sessions, and I cannot tell where they come from. A sketch of the relevant configuration is included below.
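For reference, here is a minimal sketch of the relevant settings in kyuubi-defaults.conf. The timeout is the only value I changed; the other two keys are included for context, and their values are my reading of the documented defaults, not something I have verified on the cluster:

```properties
# kyuubi-defaults.conf: minimal sketch of the relevant settings.
# Only the idle timeout was changed; the other keys are shown for
# context with what I believe are their documented defaults.

# Isolate each connection's Spark session inside the shared engine
# (my understanding of the documented default: true).
kyuubi.engine.user.isolated.spark.session=true

# How often idle isolated sessions are checked for expiry
# (my understanding of the documented default: PT1M).
kyuubi.engine.user.isolated.spark.session.idle.interval=PT1M

# Close an isolated session after this much idle time.
# Documented default is PT6H; lowered here to 30 minutes.
kyuubi.engine.user.isolated.spark.session.idle.timeout=PT30M
```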
Affects Version(s)
1.9.1
Kyuubi Server Log Output
No response
Kyuubi Engine Log Output
No response
Kyuubi Server Configurations
No response
Kyuubi Engine Configurations
No response
Additional context
No response
Are you willing to submit PR?