Sitemap – How to tell bots to forget a site and index it from scratch

It does not work that way. You need to map your old URLs to the new ones with redirects, both for SEO and for user experience.

Google never forgets old URLs, even after a decade. If you are migrating to a new CMS, you must implement page-level redirects.
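
A minimal sketch of what page-level redirects can look like with Apache's mod_alias in an .htaccess file; the old and new paths below are hypothetical placeholders, not URLs from any real site.

    # Hypothetical one-to-one page-level 301 redirects (Apache mod_alias)
    Redirect 301 /about-us.html /about/
    Redirect 301 /products/widget.php /shop/widget/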

If there is no match for a particular page, you can let it return a 404 and Google will remove it from the index. If you serve "410 Gone" instead, Google drops the URLs from the index as soon as they are crawled, skipping the roughly 24-hour grace period Google applies to "404 Not Found" responses.
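
For pages that are intentionally gone, a couple of mod_alias directives in the same .htaccess file can return 410 instead of 404; the paths here are again only examples.

    # Return "410 Gone" for retired pages that have no replacement
    Redirect gone /discontinued-product.html
    # Or retire a whole hypothetical section by pattern
    RedirectMatch gone ^/old-blog/2009/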

There is no directive in Search Console or robots.txt that instructs bots to forget an old site.

What if you do not redirect?

Redirecting may be too time-consuming, or your new CMS may not make implementing redirects easy.

If you do not implement the redirects, the site does, in effect, start from scratch. Google notices that your legacy URLs return a 404 status and removes them from the search index.

Your new URLs may get indexed, but it can take a while. Changing all of your URLs without redirects is a strong signal that your site is not stable and cannot be trusted. All of your rankings will be lost and your website will start over.

Googlebot will keep crawling the old URLs for years to come, ever hopeful that those pages might come back someday.

If you redirect, all inbound links, users' bookmarks, and most of your current rankings will be preserved.

Why?

Why don't search engines have a "reset" button? Because there are almost always better options. In your case, redirecting is by far the better one.

And if a site has been penalized, Google is not going to offer a reset button that would wipe away all of its penalties.

How?

How do you implement the redirects? You need a list of your old URLs. You may have a sitemap from your old website that you can start with. You can also pull the list from your server logs, Google Analytics, or even Google Search Console.

If you planned ahead, your URLs will follow a similar structure in your new CMS and you can implement a rewrite rule to handle them. If there is a pattern between the old and the new URLs, a one-liner in an .htaccess file can generate the redirects for the entire website.
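
For instance, assuming the old blog URLs carried a date in the path and the new CMS drops it, one RedirectMatch line can cover every post; the pattern below is purely illustrative.

    # Hypothetical pattern: /blog/2017/05/some-post.html -> /some-post/
    RedirectMatch 301 ^/blog/[0-9]{4}/[0-9]{2}/([^/]+)\.html$ /$1/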

If you have to look up the new URLs manually and map thousands of them one by one, look into Apache's RewriteMap functionality.
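
A rough sketch of that approach, with assumed file names and locations: a plain-text map pairs each old path with its new one, and mod_rewrite looks up every request in it. Note that the RewriteMap directive itself must be defined in the server or virtual-host configuration, not in .htaccess.

    # /etc/apache2/redirect-map.txt (hypothetical), one pair per line:
    #   /old-page-one.html    /new/page-one/
    #   /old-page-two.html    /new/page-two/

    RewriteEngine On
    RewriteMap legacyurls txt:/etc/apache2/redirect-map.txt

    # Issue a 301 only when the requested path has an entry in the map
    RewriteCond ${legacyurls:%{REQUEST_URI}} !=""
    RewriteRule ^ ${legacyurls:%{REQUEST_URI}} [R=301,L]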