I found that this table has 800,000 rows and does not have any indexes.
Is there a safer and more efficient way to solve this problem?
The only really pre-emptive work:
Proper design work up-front, so there are no large tables with no indexes and no common queries that are poorly supported by the existing indexes. This might be out of your hands if you don’t work directly with the development team for the application, but as a DBA you can monitor your databases for potentially worrying structures (e.g. tables with no keys/indexes, or regularly slow-running queries if you log those). Recheck after application updates, in case a schema migration has failed silently and left out indexes the developers had added.
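As a sketch of that sort of monitoring, here is one way to list user tables with no indexes at all, assuming SQL Server (the catalog views and the heap convention of `type = 0` in `sys.indexes` are SQL Server specific; other engines have equivalent catalogs):

```sql
-- List user tables that are heaps with no indexes of any kind.
-- In sys.indexes, a heap appears as a row with type = 0,
-- so a table with only that row has no real indexes.
SELECT s.name AS schema_name, t.name AS table_name
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE NOT EXISTS (
    SELECT 1
    FROM sys.indexes AS i
    WHERE i.object_id = t.object_id
      AND i.type > 0   -- 0 = heap; anything greater is an actual index
);
```

Running something like this on a schedule (or after each deployment) catches the "migration silently dropped the indexes" case before users notice slow queries.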
Load testing on the application to make sure no such issues are likely (you often can’t rule them out entirely, but you can certainly minimise the risk). Again, this may be out of your hands.
But I think this approach is risky.
A long index build can be problematic, especially if done as an offline operation, since it could block all other access to the table; but when this situation arises, adding the index is the only real solution. Go for an online index build if possible – it will take longer, but it won’t completely block part of the application while it does the job.
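For illustration, the online variant looks something like the following; the table, column, and index names here are made up, the `ONLINE = ON` option is SQL Server syntax (Enterprise edition), and other engines spell the same idea differently:

```sql
-- SQL Server: build the index without taking long-term blocking locks
-- on the table (ONLINE = ON requires Enterprise edition).
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId
    ON dbo.Orders (CustomerId)
    WITH (ONLINE = ON);

-- PostgreSQL equivalent: build the index without blocking writes.
-- CREATE INDEX CONCURRENTLY ix_orders_customer_id ON orders (customer_id);
```

The trade-off is exactly as described above: the online build takes longer and uses more resources, but reads and writes against the table can continue while it runs.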
Also, if you have a dev copy of the system, test the new indexing changes there before applying them to production, to avoid making a change that takes ages to apply and delivers little of the benefit you need.