I am working on a web project: a tool that lets the user submit a small amount of data (a ~30 MB text file), from which I want to create several thousand database entries. These entries will then be searched according to the user's search parameters.
Right now my plan for implementing this is: accept the text file on my server -> process the data in the file with Python -> create a MySQL table from the Python output -> search that table with the user's parameters -> drop the table when the job is finished and the results have been returned.
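To make it concrete, here is a rough sketch of the pipeline I have in mind. (I'm using sqlite3 here as a stand-in for MySQL so the snippet runs anywhere; the table and column names are just placeholders, not my real schema.)

```python
import sqlite3
import uuid

def load_submission(conn, lines):
    """Create a per-job table and fill it from the parsed text file.
    One table per submission -- this is the part I'm unsure about."""
    table = f"job_{uuid.uuid4().hex}"
    conn.execute(f"CREATE TABLE {table} (entry TEXT)")
    conn.executemany(f"INSERT INTO {table} (entry) VALUES (?)",
                     [(line,) for line in lines])
    conn.commit()
    return table

def search(conn, table, term):
    """Search the job's table with the user's parameters."""
    cur = conn.execute(f"SELECT entry FROM {table} WHERE entry LIKE ?",
                       (f"%{term}%",))
    return [row[0] for row in cur]

conn = sqlite3.connect(":memory:")
table = load_submission(conn, ["alpha one", "beta two", "alpha three"])
print(search(conn, table, "alpha"))  # ['alpha one', 'alpha three']
conn.execute(f"DROP TABLE {table}")  # drop the table once the job is done
```

In production the file would be parsed into real columns rather than one `entry` column, but the create/insert/search/drop shape is the same.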
The problem I foresee is that every submission creates a new table, so I will need some way of cleaning these tables up (dropping them) once they are no longer needed. On top of that, if I want to store user jobs, my database will get cluttered with hundreds and hundreds of tables from user submissions, so I suspect my architecture/design is wrong.
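The best cleanup idea I've come up with so far is a small bookkeeping table plus a periodic sweep that drops anything older than some age, along these lines (again with sqlite3 standing in for MySQL, and placeholder names):

```python
import sqlite3
import time

def cleanup_stale(conn, max_age_seconds):
    """Drop per-job tables older than max_age_seconds.

    Relies on a 'jobs' bookkeeping table recording each per-job
    table's name and creation time. Returns the dropped table names."""
    cutoff = time.time() - max_age_seconds
    stale = conn.execute(
        "SELECT table_name FROM jobs WHERE created_at < ?", (cutoff,)
    ).fetchall()
    for (table,) in stale:
        conn.execute(f"DROP TABLE IF EXISTS {table}")
        conn.execute("DELETE FROM jobs WHERE table_name = ?", (table,))
    conn.commit()
    return [t for (t,) in stale]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jobs (table_name TEXT, created_at REAL)")
conn.execute("CREATE TABLE job_old (entry TEXT)")          # a 2-hour-old job
conn.execute("INSERT INTO jobs VALUES ('job_old', ?)", (time.time() - 7200,))
print(cleanup_stale(conn, 3600))  # ['job_old']
```

This works, but having to maintain bookkeeping just to garbage-collect tables is what makes me think one-table-per-submission is the wrong design in the first place.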
I haven't fully implemented the database/backend yet, but I wanted to ask whether there is a more efficient way of creating, storing, and searching these 'temporary' tables. My experience is with PHP/MySQL, but if anyone knows a better way to do this, I would appreciate the info.