Google – Is a URL required when creating an image sitemap?

The reason I'm asking is that I'm writing a script that scans a folder for pictures, so I don't necessarily know where they're used or on which exact page. Example from Google


Is it possible to do that, and would Google and other search engines still read it correctly?
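For reference, this is the shape of Google's image sitemap extension: each image entry is nested inside a page's <url> element, and the sitemap protocol requires the page <loc> there. The URLs below are placeholders.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- The page the image appears on; required by the protocol. -->
    <loc>https://example.com/some-page.html</loc>
    <image:image>
      <image:loc>https://example.com/photos/photo1.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```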




How do I submit a Sitemap to Google Webmaster? | Forum Promotion

The first step is to generate a sitemap; then add your site to Google Search Console and submit the sitemap's URL there.

A sitemap is indeed a great way to get your pages indexed faster and better. If you use WordPress, there are plugins that can generate sitemaps for you. Apart from that, you can also use your RSS feed as a sitemap :)
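If you'd rather write one by hand, a minimal sitemap is just a short XML file (the domain and dates below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2018-12-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/about</loc>
  </url>
</urlset>
```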

Google Search Console – Sitemap could not be retrieved

I'm trying to submit my sitemap in Google Search Console, but every time I add it I get an error saying the sitemap could not be fetched. I've uploaded my sitemap.xml to my domain root, but Google still does not see it. How can I solve this problem? I've spent the last two days on it and I feel like I'm going in circles.

Below is my full URL

My sitemap is in the public_html folder named sitemap.xml
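One common cause of a failed fetch is a sitemap that isn't well-formed XML, even though it loads fine in a browser. A quick local sanity check is to parse it yourself; this is a sketch using only the Python standard library, with a placeholder sitemap inline:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def list_sitemap_urls(xml_text):
    """Parse a sitemap and return its <loc> URLs.

    Raises ET.ParseError if the file is not well-formed XML, which is
    one common reason Search Console reports a failed fetch.
    """
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# Placeholder sitemap; in practice, read the real file from disk.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

print(list_sitemap_urls(sample))
```

If this parses cleanly, the next things to check are that the exact URL you submitted (with or without www, http vs https) matches where the file actually lives, and that robots.txt isn't blocking it.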

sitemap – For a 404 page, should the URL change to a 404 URL, or can the content change to a 404 page while the URL stays the same?

I'm confused about the 404 page

Suppose I have a landing page

The page does not exist anymore. What should I do with it?

Case 1: Should I redirect the URL?

Case 2: Should I simply replace the content with a 404 page?
URL stays the same

Which case should be used?
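For Case 2, what matters to crawlers is the HTTP status code, not the URL: the URL can stay the same as long as the server answers 404 (or 410 Gone). A minimal WSGI sketch to illustrate the idea; the path /old-landing-page is a hypothetical example:

```python
# Hypothetical path of the removed landing page.
RETIRED = {"/old-landing-page"}

def app(environ, start_response):
    """Tiny WSGI app illustrating Case 2: same URL, 404 status."""
    path = environ.get("PATH_INFO", "/")
    if path in RETIRED:
        # The status line, not the URL, is what tells crawlers the
        # page is gone so they can drop it from the index.
        start_response("404 Not Found", [("Content-Type", "text/html")])
        return [b"<h1>404 - this page no longer exists</h1>"]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<h1>Home</h1>"]
```

Serving a "404 page" with a 200 status (a soft 404) is the one combination to avoid, since the page then stays in the index.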

Which Vimeo URLs can I include in a video sitemap that are not redirected or banned?

I'm not sure which Vimeo URLs I can use in a video sitemap, because the ones I've tried either redirect or return "forbidden", and neither would really be acceptable in a Google video XML sitemap.

How do you create video sitemaps, and which URLs do you use for Vimeo-hosted videos?

In other words, every URL I have tried:

  • … redirects elsewhere
  • … is not found
  • … is forbidden
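One approach that avoids those redirects and 403s is to point <video:player_loc> at the embeddable player URL rather than the vimeo.com watch page. A sketch of a video sitemap entry; the video ID and all URLs are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <!-- The page on your own site that embeds the video. -->
    <loc>https://example.com/page-embedding-the-video</loc>
    <video:video>
      <video:thumbnail_loc>https://example.com/thumbs/video1.jpg</video:thumbnail_loc>
      <video:title>Example video title</video:title>
      <video:description>Short description of the video.</video:description>
      <video:player_loc>https://player.vimeo.com/video/123456789</video:player_loc>
    </video:video>
  </url>
</urlset>
```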

seo – Are pages with irrelevant content in a sitemap a bad signal to search engines?

I have many WordPress sites that automatically generate a sitemap.xml file. By default, the sitemap includes pages such as "category" or "author" archives that are either empty or irrelevant.

Is it harmful to allow these pages as part of the sitemap.xml file? Is this a bad signal for Google?

Removing them from the sitemap is possible but requires some work, and I want to know whether it's worth the effort.
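If you do decide to strip them out, it can be scripted rather than done by hand. A sketch assuming the unwanted pages live under /author/ and /category/ paths (those patterns are an assumption; adjust for your sites):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the default namespace on output

# Assumed URL patterns for the empty/irrelevant pages.
UNWANTED = ("/author/", "/category/")

def filter_sitemap(xml_text):
    """Return the sitemap with unwanted <url> entries removed."""
    root = ET.fromstring(xml_text)
    for url in list(root):
        loc = url.find("{%s}loc" % NS)
        if loc is not None and any(p in loc.text for p in UNWANTED):
            root.remove(url)
    return ET.tostring(root, encoding="unicode")

# Placeholder sitemap; in practice, read the generated file from disk.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/post/hello-world/</loc></url>
  <url><loc>https://example.com/author/admin/</loc></url>
  <url><loc>https://example.com/category/uncategorized/</loc></url>
</urlset>"""

print(filter_sitemap(sample))
```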

I guess this looks like a stupid question, but I've actually done some research, and it seems that submitting a sitemap does not mean that all its pages get indexed. Google decides which ones are best, and the rest are ignored. But I could not tell whether being ignored is harmless or a bad signal.

Thank you, Mihai

seo – Is a separate sitemap per language OK? How can I tell Google about this?

You can have multiple sitemaps per site. This is an excellent example of when this makes sense.

Make sure you have a sitemap index listing each of your sitemaps. It will probably look like this:
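A sketch of such an index, assuming per-language files named sitemap-en.xml and sitemap-fr.xml (the file names and domain are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-en.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-fr.xml</loc>
  </sitemap>
</sitemapindex>
```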


Remember to link this index in your robots.txt file, for example:
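Assuming the index is named sitemap-index.xml and lives at the site root, the robots.txt line is just:

```
Sitemap: https://example.com/sitemap-index.xml
```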


There is also the option of specifying alternate-language pages in the sitemap itself. That setup is a bit more involved and goes beyond the scope of the original question.
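For completeness, those alternate-page annotations use xhtml:link elements inside each <url> entry; a sketch with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/en/page</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/page"/>
    <xhtml:link rel="alternate" hreflang="fr" href="https://example.com/fr/page"/>
  </url>
</urlset>
```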

Sitemap errors

Discussion in 'Internet Marketing' started by australiataxi, December 17, 2018 at 08:13.

How can I find and fix sitemap errors?


How to hide your sitemap from competitors, but not from search engines

The clever solution is to create two sitemaps: the first is for your competitors, the second for your favorite search engines. In military parlance, the first sitemap is a feint.

The 'feint' contains your basic website structure: homepage, contact, about us, main categories. It looks like the real thing, and it works fine for the obscure search engines you don't care about. It won't help your competitors either. Let it be found and indexed, and give it the expected name, sitemap.xml.

Now generate your real sitemap with code. Give it a name like "product-information-sitemap.xml", so that it's a sensible name but no easier to guess than a password.

In your Apache configuration, add a rule for this second sitemap so that search engines can fetch it but are told not to index it:

        <Files "product-information-sitemap.xml">
            Header set X-Robots-Tag "noindex"
        </Files>

Now write the code to keep this sitemap updated, and consider a third sitemap for your images. Pare the real one down as needed to create the 'feint'. Also pay attention to the timestamps; Google tracks these, and that matters if your sitemap is large.

Now create a cron job to submit your sitemap to Google periodically. Add an entry to your crontab to submit your real sitemap every week:

0 0 * * 0 wget

Note that the URL is URL encoded.
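Spelled out, such a crontab entry might look like this; the domain is a placeholder, and the ping endpoint shown is the one Google has offered for sitemap submission (note the percent-encoded sitemap URL in the query string):

```
# Every Sunday at midnight, ping Google with the URL-encoded sitemap location.
0 0 * * 0 wget -q -O /dev/null "https://www.google.com/ping?sitemap=https%3A%2F%2Fexample.com%2Fproduct-information-sitemap.xml"
```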

You can also gzip your sitemap if size is a problem, although your web server can serve it gzipped on the fly if you have compression enabled.

Your robots.txt need not be fancy; as long as it does not give away the location of your real sitemap, it should be fine. There is really no need to serve different robots.txt files based on user-agent strings or other such complications. Just tuck your valuable content into a separate, unadvertised file and submit it to Google via a cron job (instead of waiting for the bot). Easy.