sitemap – How to tell bots to forget a site and index it from scratch

That does not work. You need to map your old URLs to the new ones with redirects, both for SEO and for user experience.

Google never forgets old URLs, even after a decade. If you are migrating to a new CMS, you must implement page-level redirects.

If there is no match for a particular page, you can let it return 404 and Google will remove it from the index. If you use "410 Gone" instead, Google removes the URL from the index as soon as it has been crawled, without the roughly 24-hour grace period Google applies to "404 Not Found" responses.
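As an illustration, a minimal .htaccess sketch (assuming Apache with mod_alias; the /discontinued/ path is hypothetical) that answers with 410 for pages removed on purpose:

# Hypothetical: everything under /discontinued/ was removed deliberately.
# "gone" makes Apache respond with 410 instead of 404.
RedirectMatch gone ^/discontinued/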

There is no directive in Search Console or robots.txt that tells bots to forget an old site.

What if you do not redirect?

Redirecting may be too time-consuming, or your new CMS may not make implementing redirects easy.

If you do not implement redirects, your site starts from scratch. Google notices that your legacy URLs return a 404 status and removes them from the search index.

Your new URLs will get indexed, but it may take a while. Changing all of your URLs without redirects is a big signal that your site is not stable and cannot be trusted. All your rankings will be lost and your site has to start over.

Googlebot will keep crawling the old URLs for years to come, forever hoping that those pages might come back some day.

If you redirect, all inbound links and users' bookmarks keep working, and most of your current rankings are preserved.

Why?

Why don't search engines have a "reset" button? Because there are almost always better options. In your case, redirecting is much better.

For penalized sites, Google will never offer a reset button, because it would amount to wiping out all penalties.

How?

How do you implement the redirects? You need a list of your old URLs. You may have a sitemap from your old website that you can start with. You can also pull the list from your server logs, Google Analytics, or even Google Search Console.

If you planned ahead, your URLs will be similar in your new CMS and you can implement a rewrite rule to handle them. If there is a pattern between the old and the new URLs, it can be a one-liner in an .htaccess file that implements the redirects for the entire website.
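As a sketch, assuming the hypothetical case that old posts lived under /blog/ and the new CMS serves the same slugs under /articles/, one .htaccess line covers them all:

# Hypothetical pattern: /blog/some-post -> /articles/some-post
# 301 marks the move as permanent, so search engines carry rankings over.
RedirectMatch 301 ^/blog/(.*)$ /articles/$1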

If you have to look up the new URLs manually and map thousands of them one by one, look into Apache's RewriteMap functionality.
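A minimal sketch of that approach, assuming Apache with mod_rewrite and a hypothetical map file (note that the RewriteMap directive itself must go in the server or virtual-host config, not in .htaccess):

# /etc/apache2/redirects.map (hypothetical): one "old-path new-path" pair per line:
#   old-page-1  /new/page-1
#   old-page-2  /new/page-2
RewriteMap oldtonew "txt:/etc/apache2/redirects.map"
RewriteEngine On
# Look the requested path up in the map and redirect only when a match exists.
RewriteCond ${oldtonew:$1} !=""
RewriteRule ^/(.+)$ ${oldtonew:$1} [R=301,L]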

SEO – GSC: Sitemap could not be fetched

I'm trying to submit a very simple sitemap (just for testing) to Google Search Console, but I get the following error:

Sitemap      | Type    | Submitted     | Last read | Status         | Found URLs
/sitemap.txt | Unknown | July 17, 2019 |           | Couldn't fetch | 0

Clicking on it shows a further error message: "Sitemap could not be read".
However, clicking "OPEN SITEMAP" opens the file normally.

Any idea what's going on?


Domain: world-hello.ddns.net
Sitemap file: sitemap.txt
Server: Apache (Debian)
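One way to narrow this down is to check what an external fetcher sees, since opening the sitemap in your own browser can be misleading. A sketch using curl (the https scheme is an assumption):

# Fetch only the headers, with a Googlebot user agent string.
curl -I -A "Googlebot" https://world-hello.ddns.net/sitemap.txt
# Expect HTTP 200 and a text/plain Content-Type with no redirect chain;
# also confirm that robots.txt does not block /sitemap.txt and that the
# DDNS host resolves publicly.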

Simple Grabber Youtube No API Key & Auto SITEMAP | WJunktion

Features:
Attractive and clear design
Mobile & desktop
Auto Grab Content (AGC)
Top Charts iTunes
YouTube search
YouTube video results on the search page
Recent searches
No YouTube API v3 key needed (no limit)
No database needed
DMCA page, privacy page, contact page
Bad-keyword blocking
ADS: top banner, bottom banner, popup
Automatic sitemap, keyword injection via TXT
Video & MP3 download formats
Fast loading

Requirements:
Apache / Nginx server (mod_rewrite enabled)
PHP 5.6+ (PHP cURL enabled)

Demo: https://freevideo.sch.gdn

Installation:
Upload the script to the root directory.
The script is ready to use at this point.
If you want to change the site name, email, direct-link ads, etc., see Readme.txt.

DOWNLOAD


Simple AGC MP3 No Api Key & Auto SITEMAP | WJunktion

Features:
Simple, clean, and attractive design
Mobile and desktop
Auto Grab Content (AGC)
No database
No API
Fast loading
Top songs from iTunes
Search results from YouTube
Keyword injection
Bad-word filtering
Clean code, easy to use
Easy to customize
ADS: top, bottom, popup

Requirements:
Apache / Nginx server (mod_rewrite enabled)
PHP 5.6+ (PHP cURL enabled)

Demo: https://mp3.csyoutube.com

Installation:
Upload the script to the root directory.
The script is already in use until this step. If you want to change that
Site name, email, direct link ads, etc., just edit "config_edit.php"File in the root directory The file contains all configurations with description.

DOWNLOAD


SEO – Should I use an XML Sitemap instead of a TXT sitemap for a site with deeply nested product pages?

Context

I have a B2B spare parts website with the following content:

  • 25 parent categories (hierarchically organized)
  • 150 leaf categories (models)
  • 250 products (each unique, quantity = 1)

Target visitors search for a specific spare part; as a rule, they do not
hesitate between several brands and products the way consumers do.

The website is aimed at specialists (niche market).

Despite several optimizations, the website still ranks poorly in the search results compared to its competitors.

I have to admit that I'm not a fan of social networks, so there are few links to the site that come from forums.

Publishing many products on the home page might help the site get indexed, but it would also create duplicate content with the dedicated product pages.

There is general consensus in this thread that there is no harm in using a TXT sitemap instead of an XML sitemap. However, I am not sure that holds in my case, because the pages to be indexed are buried deep in the hierarchy and search engines ignore the intermediate levels.


How pages are currently indexed

Google has indexed the leaf category and product pages via two text sitemaps (plain URL lists):

Sitemap with leaf categories:

https://example.com/A_model
https://example.com/Another_model
(...)

Sitemap with products:

https://example.com/A_product
https://example.com/Another_product
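For comparison, the XML equivalent of such a list is only slightly more verbose; a minimal sketch following the sitemaps.org schema (URLs taken from the lists above):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/A_product</loc></url>
  <url><loc>https://example.com/Another_product</loc></url>
</urlset>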

Access to the products usually goes through a search box where the visitor enters the model for which he wants to buy spare part(s). The model name is used as a friendly URL, and the .htaccess file rewrites it directly to the leaf category page.

# Currently no friendly URLs for intermediate categories.

# Friendly URLs for leaf categories (models)
RewriteRule ^A_model$ /index.php?cmd=category&cat_id=123 [L]
RewriteRule ^Another_model$ /index.php?cmd=category&cat_id=124 [L]

On the category pages there are links to the individual spare parts.

User-friendly URLs are also used there, and the rewriting is likewise done in the .htaccess file.

# Friendly URLs for unique products
RewriteRule ^A_product$ /index.php?cmd=products&prod_id=456
RewriteRule ^Another_product$ /index.php?cmd=products&prod_id=789

If only one spare part is available for a given model, the visitor is automatically redirected from the leaf category page to the unique product page, so the category URL acts as a short URL (or a gateway, if you prefer) to the product page.


If the visitor wants to browse the categories instead, he can do so via an AJAXified tree whose expanded nodes load their subcategories on the fly. (The site uses dynatree.js with lazy loading.)

So robots know the relevant sales landing pages (leaf categories and product pages), but since there is no XML sitemap, the site may appear unstructured to them (no hierarchical structure is exposed).


Why I have used .txt sitemaps rather than .xml files so far:

  • Easier maintenance: I just add a line whenever a new product or category is released.
  • The target audience are experts in their field who know from the start
    which model/part they are looking for.
  • Intermediate categories (branches) are almost irrelevant, apart from
    showing the different product families available, and therefore do not need to be indexed.

Questions:

  1. Should I create friendly URLs for the intermediate categories and add
    them to the sitemap to make the website look more structured, even
    though these pages would create duplicate content with the leaf
    categories and product pages?
  2. In this particular case, should I switch from TXT sitemaps to XML sitemaps (although maintenance would be much harder)?
  3. I plan to replace the AJAXified tree with AJAXified navigation based on tags (filters). Would this make indexing even worse?
  4. Since the home page is more or less like a search engine (i.e. with little content), would you advise adding some "bla bla bla" text, even if it is useless for visitors, to attract more of them?

How do I configure a Sitemap for a MediaWiki site?

I am mainly referring to this line:

If you want to display a human-readable sitemap, grant read access to the
sitemap.xsl file in your site configuration (.htaccess file or other).

It relates to this MediaWiki extension:

https://www.mediawiki.org/wiki/Extension:AutoSitemap

I have the extension installed, but I want to know how to view the contents of the sitemap file.
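A minimal sketch of what that grant might look like for Apache 2.4, placed in the .htaccess of the directory holding the file (the assumption being that sitemap.xsl sits next to the generated sitemap.xml):

# Allow everyone to read the XSL stylesheet so browsers can apply it.
<Files "sitemap.xsl">
    Require all granted
</Files>
# Opening the sitemap .xml file in a browser should then render it as a
# human-readable page via the stylesheet.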

Information Architecture – Are tabs and/or steps in a wizard shown as separate boxes in a sitemap chart?

I'm creating a sitemap for a business application.

For one section of the application there is a function to edit the calendar. Once you click on it, you can set up three sections, or three different types of dates:

  1. Start / end date for the entire project
  2. Excluded dates (holidays and days off, etc.)
  3. Start / end dates for specific tasks within the project

We currently use a step wizard for editing the calendar, so the user has to set up the dates in this order.

Should I mark each step as a separate box in my sitemap, or would that belong in a separate user flow chart?

(Image: sitemap sketch)

SEO – I use curl to submit a sitemap, but it is not reflected in the web portal

There are TWO things that puzzle me about submitting my sitemap.

First:
It is understood that a sitemap can be submitted with cURL: https://support.google.com/webmasters/answer/183668?hl=de
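For reference, the submission described on that page is a plain HTTP GET to Google's ping endpoint; a sketch with example.com standing in for the real domain:

curl "https://www.google.com/ping?sitemap=https://example.com/sitemap.xml"
# A 200 response only confirms that the ping was received; pinged
# sitemaps do not create an entry in the Search Console sitemaps report.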

However, the cURL submission is never reflected in the web portal. The timestamp in the web portal only shows the time at which I last submitted the sitemap through the portal itself.

Why do they differ? Does the submission via curl work at all?

Second:
I've noticed that Google only crawled my site after I submitted the sitemap for the first time. I update my website and sitemap regularly and hope Google will crawl the site again, but it seems Google never comes back.

What's wrong with it?

As the image shows, there are several peaks right after submitting the sitemap through the Search Console. If I leave it alone, the curve goes flat.

(Image: crawl statistics)

Google Search Console – Add domain / subdomain and sitemap for each

The best approach is the one mentioned in point 2.

You must add two separate properties to your search console

  1. http://example.com
  2. http://subdomain.example.com

Handling everything under the property in point 1 is not possible, because in the new Search Console you cannot add a sitemap for a subdomain under the http://example.com property.

The new Search Console offers a Domain property that automatically consolidates all properties of your domain, including all subdomains. Within the domain property you can add any sitemap that belongs to your site, whether it lives on a subdomain or on the root domain.