Sitemap.xml giving an error when accessed

Server: shared hosting at Bigrock.
I am trying to rewrite sitemap.xml to sitemap.php, but when I access https://www.indiaztour.com/sitemap.xml it shows a blank page in Firefox and HTTP ERROR 500 in Chrome.

Note: it works fine on my local server (XAMPP).

My .htaccess code is as below:

Options +FollowSymLinks -MultiViews
RewriteEngine on
# ensure www.
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ https://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
# ensure https
RewriteCond %{HTTP:X-Forwarded-Proto} !https
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
RewriteRule ^places-(.*)\.php$ places.php?page=$1 [NC,L]
RewriteRule ^blog-(.*)\.php$ blog.php?page=$1 [NC,L]
RewriteRule ^sitemap\.xml$ sitemap.php [L]
RewriteRule ^sitemap1\.xml$ sitemap1.php [L]
RewriteRule ^sitemap2\.xml$ sitemap2.php [L]

I am using the same code on one of our websites on a VPS, where it works fine, but not on this shared server. Please guide me on what I am missing here.

To find the error, I also put this code at the top of sitemap.php:

error_reporting(E_ALL);
ini_set('display_errors', 1);
header('Content-Type: text/xml');

But it seems the request never reaches the page, because it shows either a blank page or a 500 error.
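
To isolate the problem, I can temporarily replace sitemap.php with a minimal stand-in like the sketch below (example.com is a placeholder). If /sitemap.xml then returns XML, the rewrite works and the fault is inside the real script; if it still returns a 500, the fault is in .htaccess or the server configuration.

<?php
// Minimal stand-in for sitemap.php, used only to test the rewrite.
error_reporting(E_ALL);
ini_set('display_errors', 1);
header('Content-Type: text/xml; charset=utf-8');

echo '<?xml version="1.0" encoding="UTF-8"?>';
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">';
echo '<url><loc>https://www.example.com/</loc></url>';
echo '</urlset>';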

Theme development – How do I add the homepage URL and modification date to sitemap.xml in WordPress?

I use this code to generate sitemap.xml for my WordPress site without a plugin. It works great, but it is incomplete. How can I add the URL and the modification date of the WordPress homepage (index.php) to this sitemap? I display my recently published and recently modified posts on my homepage.

    function xml_sitemap() {
        $postsForSitemap = get_posts( array(
            'numberposts' => -1,
            'orderby'     => 'modified',
            'post_type'   => array( 'post' ),
            'order'       => 'DESC',
        ) );
        $sitemap  = '<?xml version="1.0" encoding="UTF-8"?>';
        $sitemap .= '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">';
        foreach ( $postsForSitemap as $post ) {
            setup_postdata( $post );
            $sitemap .= '<url>';
            $sitemap .= '<loc>' . get_permalink( $post->ID ) . '</loc>';
            $sitemap .= '<lastmod>' . get_the_modified_date( 'c', $post->ID ) . '</lastmod>';
            $sitemap .= '</url>';
        }
        wp_reset_postdata();
        $sitemap .= '</urlset>';
        $fp = fopen( ABSPATH . 'sitemap.xml', 'w' );
        fwrite( $fp, $sitemap );
        fclose( $fp );
    }
    add_action( 'publish_post', 'xml_sitemap' );
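
What I have in mind is something like the sketch below (my own assumption of how it could work): since the front page shows the latest posts, its modification date can be taken from the most recently modified post, which is the first element because the query orders by modification date descending. The entry would go right after the <urlset> opening tag; home_url() and esc_url() are standard WordPress functions.

    // Possible addition (a sketch): emit the homepage entry before the post
    // loop, using the newest post's modification date as its lastmod.
    $sitemap .= '<url>';
    $sitemap .= '<loc>' . esc_url( home_url( '/' ) ) . '</loc>';
    if ( ! empty( $postsForSitemap ) ) {
        // Posts are ordered by modified DESC, so index 0 is the newest.
        $sitemap .= '<lastmod>' . get_the_modified_date( 'c', $postsForSitemap[0]->ID ) . '</lastmod>';
    }
    $sitemap .= '</url>';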

Indexing – How do I test multiple sitemap.xml files?

Is there a way to test multiple sitemap.xml files?
The validation works flawlessly: Google accepts all sub-files, but the "Server response check" in Yandex returns "Document does not contain any text".

The crawl rate and overall indexing progress give me the impression that neither search engine can read content from the sitemap files: about two thirds of the entire content sits at "Discovered – currently not indexed" and has never been crawled, and indexing in Yandex is low.

This website has about 750,000 links in its sitemap files. If I generate 50,000 links per file (about 11 MB), the crawl chart rises and then drops. With 10,000 links per file, the chart drops much faster and stays at about the same level.

We have done a lot of testing and technically everything seems fine, but in terms of crawl performance it is pretty dubious.
robots.txt allows full access, and so do the robots meta tags.

  • Can anyone suggest a way to check why the "Server response check" returns an error even though the file exists?
  • Is there any way to test whether the entire system of sitemap files really works, that is, whether search engines read it correctly? (See the sketch below.)
  • Could this issue be related to settings in the .htaccess file?

Location of the sitemap file: https://www.rusinfo.eu/sitemap.xml
Yandex Server Check link: https://webmaster.yandex.ru/tools/server-response/
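
On the second bullet, a rough self-test would be a script along these lines (a sketch, assuming allow_url_fopen is enabled and that sitemap.xml is an index whose <sitemap><loc> entries point to the sub-files): fetch the index, then request every sub-sitemap and check its HTTP status, whether the XML parses, and how many URLs it contains.

<?php
// Sketch: fetch the sitemap index and verify each sub-sitemap is readable,
// roughly what a crawler must do before it can use any of the URLs.
$index = 'https://www.rusinfo.eu/sitemap.xml';

$xml = simplexml_load_string( file_get_contents( $index ) );
foreach ( $xml->sitemap as $entry ) {
    $url    = (string) $entry->loc;
    $body   = file_get_contents( $url );
    $status = $http_response_header[0]; // e.g. "HTTP/1.1 200 OK"
    if ( substr( $url, -3 ) === '.gz' ) {
        $body = gzdecode( $body ); // sub-sitemaps may be gzip-compressed
    }
    $sub = @simplexml_load_string( $body );
    printf( "%s | %s | parses: %s | URLs: %d\n",
        $url, $status, $sub ? 'yes' : 'NO', $sub ? count( $sub->url ) : 0 );
}

If every line reports a 200 status, a successful parse, and a plausible URL count, the files themselves are fine, and the Yandex error is more likely about response headers (Content-Type, compression) than about the content.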

Thank you in advance.

seo – Internal links to pages vs. sitemap.xml links to pages

I have a page that has no link to it, but is listed in sitemap.xml.

It is crawled and displayed in the SERP.

From an SEO point of view, would it be better to link to the page in the natural flow of the website?

For example, I have 30,000 products on my website. Apart from the search results and the sitemap.xml file, there is no natural way to link to all 30,000 product pages.

Should I develop a way to navigate to all these pages by clicking alone (no search)?

seo – Can I delete sitemap.xml from my website? Does that have a negative impact?

I'll post a controversial response that many probably will not agree with. Is sitemap.xml still necessary? It is no longer decisive, no. Is it good to have and does it help? For sure. Should you delete it? I cannot say, because I do not know what your reasons are.

See this article, written by a former colleague of mine with years of SEO, UX, and web dev experience. I agree with many of its points:

https://www.imarc.com/blog/do-sites-really-need-a-sitemap-for-seo

If you do not actively update your sitemap and do not submit it to Google Search Console or Bing Webmaster Tools, you probably do not need it. A sitemap is a suggestion to search engines for how to crawl your site. Think of sites such as news sites with hundreds of thousands of pages: sitemaps have size limits (50,000 URLs or 50 MB uncompressed per file), so for those sites they cannot contain every published article. Yet those articles are still found, indexed, and ranked.

If your website is updated regularly, has a clear information architecture and category hierarchy, and contains a well-planned internal link structure, search engines will index and rank the content regardless of whether a sitemap exists.

Note, however, that you will lose the ability to manually submit the sitemap to GSC, which triggers a recrawl. You also lose the ability to suggest to search engines which parts of your website are most important or most recently updated.

Many SEOs still say that sitemaps are critical, but that is traditional SEO wisdom. What matters is not how I would define it but what actually helps, and that is what I have observed in recent years. In your case, it depends on your reasons for deleting it and on how much you currently maintain it.

seo – Temporarily hosting a sitemap.xml file to retrieve URLs

I have a GZ sitemap file that contains 4 different sitemaps, each 50 MB in size. I want to get all the URLs from these 4 sitemap files. I would like to use https://robhammond.co/tools/xml-extract for this, but the tool requires the URL of a sitemap. With sitemap.xml.gz, however, the XML files can only be downloaded, so I end up with locally saved files rather than hosted URLs (such as example.com/sitemap1.xml).

How can I do that?
1) Host the 4 sitemap.xml files for the above tool
or
2) Extract the URLs from the downloaded sitemap files (see the sketch below)
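
For option 2, a sketch along these lines (the file names are assumptions) decompresses each downloaded file and writes every <loc> URL to one text file:

<?php
// Sketch for option 2: read the downloaded .gz sitemaps locally and
// collect all <loc> URLs into urls.txt. File names are assumptions.
$files = array( 'sitemap1.xml.gz', 'sitemap2.xml.gz',
                'sitemap3.xml.gz', 'sitemap4.xml.gz' );

$out = fopen( 'urls.txt', 'w' );
foreach ( $files as $file ) {
    $xml = simplexml_load_string( gzdecode( file_get_contents( $file ) ) );
    foreach ( $xml->url as $entry ) {
        fwrite( $out, (string) $entry->loc . "\n" );
    }
}
fclose( $out );

Note that this loads each 50 MB file fully into memory; if that is a problem, a streaming parser such as XMLReader is the safer choice.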