reactjs – Adsense and sitemap.xml

I’m trying to add ads to my website (built with React). AdSense tells me that I don’t have enough content to run ads on it. So I decided to add public (or private) profiles. These profiles are accessible (or not, depending on visibility) via paths like /profile/thomas, /profile/mark, etc.

In my sitemap.xml, do I need to add EACH user profile, like /profile/thomas, etc. (every sign-up adds another profile, which could get complicated), or just the route /profile, so that Google AdSense recognizes that these profiles contain data?
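For what it’s worth, a sitemap lists concrete crawlable URLs rather than route patterns, so a single /profile entry would only describe that one page. If the individual profiles are public and meant to be indexed, each one normally gets its own <url> entry. A sketch with a placeholder domain and the paths from the question:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/profile/thomas</loc></url>
  <url><loc>https://example.com/profile/mark</loc></url>
</urlset>
```

Since each sign-up adds a profile, a file like this is usually generated on the fly (or on a schedule) from the user data rather than maintained by hand.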

Drupal conditional .htaccess rule for multisiting sitemap.xml

We have multiple sites hosted with the Domain Access module. I was planning to host sitemap.xml files like



I applied something like this:

RewriteCond %{HTTP_HOST} ^ [NC]
RewriteRule ^sitemap.xml$ /

But this rule is not universal and requires duplicating these rules every time a new domain is added. How can this rule be improved to serve sitemaps from web_root/website_name/sitemap.xml?
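A generic version can capture the host name in a RewriteCond and reuse it in the target path, so no per-domain rule is needed. A sketch, assuming each site’s sitemap lives at web_root/&lt;hostname&gt;/sitemap.xml (adjust if your directory naming or www handling differs):

```apache
RewriteEngine on
# Capture the host name (minus an optional "www.") into %1
RewriteCond %{HTTP_HOST} ^(?:www\.)?(.+)$ [NC]
# Serve /sitemap.xml from /<hostname>/sitemap.xml
RewriteRule ^sitemap\.xml$ /%1/sitemap.xml [L]
```

On the second (internal) rewrite pass the URI no longer matches `^sitemap\.xml$`, so the rule does not loop.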

Any help appreciated.

Sitemap.xml | Forum Promotion – Best Webmaster, Admin and Internet Marketing Forum

I am working on on-page SEO for my new site and I am trying to add a sitemap.xml to the root. I tested a few sitemap generators and none of them worked. Whenever I check my SEO score, one of the errors is a missing or badly formed sitemap.xml.

Which free generator do you use, or do you create it manually?

If it helps, I can share my site by private message. I’m not advertising it yet.

Sitemap.xml giving an error when accessed

Server: shared hosting (BigRock).
I am trying to rewrite sitemap.xml to sitemap.php, but when I access it, Firefox shows a blank page and Chrome shows HTTP ERROR 500.

Note: it works fine on my local server (XAMPP).

My .htaccess code is below:

Options +FollowSymLinks -MultiViews
RewriteEngine on
# ensure www.
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ https://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
# ensure https
RewriteCond %{HTTP:X-Forwarded-Proto} !https
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
RewriteRule ^places-(.*)\.php$ places.php?page=$1 [NC,L]
RewriteRule ^blog-(.*)\.php$ blog.php?page=$1 [NC,L]
RewriteRule ^sitemap\.xml$ sitemap.php [L]
RewriteRule ^sitemap1\.xml$ sitemap1.php [L]
RewriteRule ^sitemap2\.xml$ sitemap2.php [L]


I am using the same code on one of our websites on a VPS and it works fine, but not on the shared server. Please guide me on what I am missing here.

To find the error, I also put this code at the top of sitemap.php:

ini_set('display_errors', 1);
error_reporting(E_ALL);
header('Content-Type: text/xml');


But it seems the request never reaches the page, because it only ever shows a blank page or a 500 error.


Theme development – How do I add the homepage URL and modification date to sitemap.xml in WordPress?

I use this code to add a sitemap.xml to my WordPress site without a plugin. It works great, but it is incomplete. Please add the URL and the modification date of the WordPress homepage (index.php) to this sitemap. I display my recently published and recently modified posts on my homepage.

    function xml_sitemap() {
        $postsForSitemap = get_posts(array(
            'numberposts' => -1,
            'orderby'     => 'modified',
            'post_type'   => array('post'),
            'order'       => 'DESC',
        ));
        $sitemap  = '<?xml version="1.0" encoding="UTF-8"?>';
        $sitemap .= '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">';
        foreach ($postsForSitemap as $post) {
            setup_postdata($post);
            $sitemap .= '<url>';
            $sitemap .= '<loc>' . get_permalink($post->ID) . '</loc>';
            $sitemap .= '<lastmod>' . get_the_modified_date('c', $post->ID) . '</lastmod>';
            $sitemap .= '</url>';
        }
        wp_reset_postdata();
        $sitemap .= '</urlset>';
        $fp = fopen(ABSPATH . 'sitemap.xml', 'w');
        fwrite($fp, $sitemap);
        fclose($fp);
    }
    add_action('publish_post', 'xml_sitemap');
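For reference, the requested homepage entry would just be one more `<url>` element in the generated file. Hypothetically it could be built from `home_url()` and the modification date of the most recently changed post; the values below are placeholders:

```xml
<url>
  <loc>https://example.com/</loc>
  <lastmod>2024-05-01T09:30:00+00:00</lastmod>
</url>
```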

Indexing – How do I test multiple Sitemap.xml files?

Is there a way to test multiple sitemap.xml files?
The validation works flawlessly. Google accepts all subfiles, but "Server Response Checker" in Yandex returns "Document does not contain any text".

The crawl rate and overall indexing progress give me the impression that both search engines can not read content from the sitemap files. Both, because there is a large amount of "discovered – currently unindexed" = 2/3 of the entire content, they were never crawled, as well as because indexing in Yandex is low.

This website contains about 750,000 links in sitemap files. If I generate 50,000 links per file (about 11 MB), the crawl chart goes up and then drops. With 10,000 links per file, the graph sinks much faster and stays at about the same level.
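As a point of reference, the sitemap protocol caps each file at 50,000 URLs and 50 MB uncompressed, so both file sizes above are within the limits. Multiple files are normally tied together with a sitemap index (placeholder URLs):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.com/sitemap1.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap2.xml</loc></sitemap>
</sitemapindex>
```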

We've done a lot of testing and technically everything seems fine, but the indexing performance is dubious.
Robots.txt provides full access. Robot meta tags too.

  • Can anyone suggest a way to check why "Server Response Check" returns an error if the file exists?
  • Is there any way to test if the entire system of sitemap files really works – that is, if it's read correctly by the search engines?
  • Can this issue be related to the settings specified in the .htaccess file?
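One way to verify the whole chain independently of Yandex's checker is to fetch each sitemap and parse it locally: if the index and every sub-file download and parse cleanly, the files themselves are readable and the problem lies elsewhere. A minimal sketch in Python (the starting URL is a placeholder):

```python
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def extract_locs(xml_bytes):
    """Parse a sitemap or sitemap index and return the listed URLs.

    Raises xml.etree.ElementTree.ParseError if the XML is broken.
    """
    root = ET.fromstring(xml_bytes)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def check_sitemap(url):
    """Fetch a sitemap URL and report status, size, and entry count."""
    with urllib.request.urlopen(url) as resp:
        status = resp.status
        body = resp.read()
    locs = extract_locs(body)
    print(f"{url}: HTTP {status}, {len(body)} bytes, {len(locs)} <loc> entries")
    return locs

# Example usage (placeholder domain): check the index, then every sub-file.
# for child in check_sitemap("https://example.com/sitemap_index.xml"):
#     check_sitemap(child)
```

If every file passes this, the .htaccess configuration is at least serving valid XML, and the "Document does not contain any text" error is more likely about response headers (e.g. Content-Type or compression) than about the files themselves.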

Please see screenshots below.
Location of the sitemap file:
Yandex Server Check link:

Thank you in advance.

seo – Internal links to pages vs. sitemap.xml links to pages

I have a page that has no link to it, but is listed in sitemap.xml.

It is crawled and displayed in the SERP.

From an SEO point of view, would it be better to link to the page in the natural flow of the website?

For example, I have 30,000 products on my website. Apart from the search results and the sitemap.xml file, there is no way to naturally link to all 30,000 product pages.

Should I develop a way to navigate to all these pages by clicking alone (without search)?