Block robots on a subdomain and prevent Google from indexing it

Hello,

I've created a subdomain to test a WordPress site. It's a huge site that takes up a lot of space. I want to block robots, prevent Google from indexing it, and so on. Basically, block everything except human visitors, so that I can still view the site myself.
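For the "humans only" part, I was thinking of password-protecting the subdomain with HTTP basic auth in its .htaccess, something like this sketch (assuming Apache; the .htpasswd path is just a placeholder I'd adapt):

# Password-protect the whole subdomain so only visitors with
# the credentials (i.e. me) can view the test site.
AuthType Basic
AuthName "Test subdomain"
# Placeholder path; the real file would be created with the htpasswd utility
AuthUserFile /home/example/.htpasswd
Require valid-user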

If I put the following in robots.txt and upload it to the subdomain's files, will it affect only the subdomain, or the whole domain?

User-agent: *
Disallow: /

The main domain needs to be left alone.

I know this code tells robots not to crawl the site.
But should I add anything else? Will it also prevent Google from indexing the subdomain's pages and images?
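From what I understand, robots.txt only controls crawling; Google can still index a blocked URL it discovers through links (it just shows it without a description). So I was also thinking of sending a noindex header for everything on the subdomain, roughly like this in its .htaccess (assuming Apache with mod_headers enabled):

<IfModule mod_headers.c>
    # Ask search engines not to index anything served from this
    # subdomain: pages, images, and other files alike.
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>

Though I've also read that if robots.txt blocks crawling with Disallow: /, Googlebot never fetches the pages and so never sees the noindex header. Is that right? If so, maybe the header (or the password protection) alone is the safer combination?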