Docker – PySpark – MySQL is not showing the query history after the ETL is complete

I am using Docker containers for MySQL and Spark. Both containers are running on an AWS EC2 instance.

The PySpark ETL connects to MySQL over JDBC and starts extracting the data. While the ETL is running, I can see the long-running threads/queries in MySQL with the command SHOW FULL PROCESSLIST. I believe this data comes from the information_schema.processlist metadata table.
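
For context, the JDBC read is set up roughly like this (the hostname, database, table name, and credentials below are placeholders, not my actual values):

```python
from pyspark.sql import SparkSession

# Placeholder Spark session; the real job runs inside the Spark container
spark = (
    SparkSession.builder
    .appName("mysql-etl")
    .getOrCreate()
)

# Placeholder connection details; the real job points at the MySQL container
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:mysql://mysql-host:3306/source_db")
    .option("driver", "com.mysql.cj.jdbc.Driver")
    .option("dbtable", "source_table")
    .option("user", "etl_user")
    .option("password", "********")
    .load()
)
```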

The problem I am facing is that after the ETL is complete, I cannot find the queries from previous runs in any of the information_schema tables. Do I need to do any configuration? How can I solve this issue? Which schema/tables should I refer to? I appreciate your response.