A GSC report showing that your backlinks have halved could simply be a reporting mistake.
Fix any Google Webmaster Tools or GSC errors
To get indexed quickly in Google search results, you need to manage Google Search Console correctly. Google sends a variety of messages to site owners, which you will find in your site's messages panel. These messages can alert you to problems with your website or simply offer tips for improving it.
Google Search Console and Mobile-Friendly Test give me the following two alerts for my WordPress-based website:
- Content wider than the screen
- Clickable elements too close together
The screenshot that these tools produce of my website looks completely broken, as if no CSS were applied.
My case was different. The following is what my robots.txt file looks like, and I still receive the same alerts. I am an SEO Framework user and have created my own static robots.txt.
User-agent: *
Allow: /
Sitemap: https://*****
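An unstyled screenshot in these tools usually means Googlebot could not fetch the CSS/JS assets rather than the page itself. Below is a minimal sketch for checking that against the robots.txt above; the domain and asset URLs are placeholders for whatever your theme actually loads.

```python
# Check whether Googlebot may fetch the CSS/JS assets used for rendering.
# example.com and the asset paths are placeholders.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser("https://example.com/robots.txt")
robots.read()

assets = [
    "https://example.com/wp-content/themes/mytheme/style.css",
    "https://example.com/wp-includes/js/jquery/jquery.min.js",
]
for url in assets:
    allowed = robots.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if allowed else "BLOCKED for Googlebot")
```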
Along with the two warnings, I almost always get "Page loading issue" in the test results. Could this be a server speed issue? I am currently in Japan and my website is mainly aimed at Japanese users. However, I use a SiteGround server, not a Japanese one. I am well aware that this generally creates a speed problem for my website. Does it also affect the results of the Google tests above?
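To see whether distance to the server alone explains the "Page loading issue", you can time a full fetch yourself and compare a run from Japan with one from a machine near the server. A rough sketch, with the URL as a placeholder:

```python
# Time a full page fetch; large differences between locations point at latency,
# not at the page itself. example.com is a placeholder.
import time
import urllib.request

url = "https://example.com/"
start = time.monotonic()
with urllib.request.urlopen(url, timeout=30) as resp:
    body = resp.read()
print(f"Fetched {len(body)} bytes in {time.monotonic() - start:.2f}s")
```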
I'm trying to submit a very simple sitemap (for testing only) to Google Search Console, but I'm getting the following error:
Sitemap      | Type    | Submitted     | Last read | Status             | Found URLs
/sitemap.txt | Unknown | July 17, 2019 |           | Could not retrieve | 0
If I click on it, an additional error message is displayed: "(!) Sitemap could not be read".
However, clicking on "OPEN SITEMAP" opens it normally.
Any idea what's going on?
Sitemap file: sitemap.txt
Server: Apache (Debian)
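"Could not retrieve" while the sitemap opens fine in a browser often means the server answers Googlebot's fetch differently (User-Agent filtering, a redirect, or an error page). A quick sketch to compare the two responses; the sitemap URL is a placeholder:

```python
# Fetch the sitemap with a browser-like and a Googlebot-like User-Agent and
# compare status codes and content types; a mismatch points at server-side filtering.
import urllib.error
import urllib.request

SITEMAP = "https://example.com/sitemap.txt"  # placeholder
AGENTS = {
    "browser": "Mozilla/5.0",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, ua in AGENTS.items():
    req = urllib.request.Request(SITEMAP, headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req, timeout=30) as resp:
            print(name, resp.status, resp.headers.get("Content-Type"))
    except urllib.error.HTTPError as e:
        print(name, "HTTP error", e.code)
```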
I have been fighting for some time with what looks like a bug in Google's canonical page selection. One piece of advice I got was to set the GSC URL parameter handling for our page parameter to "Any URL" rather than "Let Googlebot decide", since we use a page-generation script with that parameter. If I do this and click "Show sample URLs", GSC displays the following recently crawled URLs:
I also attached a screenshot. Certainly none of these pages exist on our web server. As far as I can tell, our GSC account has not been hacked; at least I see no evidence that anyone other than me has made indexing requests. Requesting one of these parameter values causes our site to return a hard 404. Why would Google crawl with random page parameter values? And another question: could this affect Google's canonical page selection?
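For illustration only (this is not the poster's actual page-generation script): the behaviour described, a hard 404 for invented page values plus a self-referencing canonical on valid ones, is the usual way to keep stray parameter URLs from influencing canonical selection. A minimal sketch of that idea using Flask, with the route, domain, and page count as assumptions:

```python
# Illustrative only: reject out-of-range ?page= values with a 404 and emit a
# self-referencing canonical on valid pages, so stray parameter URLs cannot
# become the canonical version.
from flask import Flask, abort, request

app = Flask(__name__)
TOTAL_PAGES = 50  # assumed page count for the example

@app.route("/articles")
def articles():
    try:
        page = int(request.args.get("page", "1"))
    except ValueError:
        abort(404)
    if not 1 <= page <= TOTAL_PAGES:
        abort(404)  # hard 404 for invented page values
    canonical = f"https://example.com/articles?page={page}"
    return f'<link rel="canonical" href="{canonical}"><h1>Page {page}</h1>'

if __name__ == "__main__":
    app.run()
```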
GSC time to populate?
Just a quick question: how long does it take Google Search Console / Webmaster Tools to populate data for a new website?
So, my indexed page count has dropped dramatically in a live site: search, but stays constant at 14,000 pages in GSC. What causes the difference?
This is a problem that has been going on for a month and a half.
Every day, the live site: search count decreases by another 100-200 indexed pages.
There have been no messages from Google, and there is no increase in errors in GSC.
Bing's live search results show us at 34,000.
There are no warnings of manual or automated penalties in GSC.
We have not detected any increase or decrease in links to our website.
What could possibly cause this deindexing in the live site: search?
GSC's site move feature is for changing domains, not the paths within those domains. There's no way around this part: the tool works at the level of the site address, not of individual page paths, which is why Google does not allow anything else.
Just change the old domain to the new domain, without the new homepage path. On the old domain, redirect the old URLs to the new ones. This will carry over rankings and authority, and GSC will index the new content.
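Before using the site move feature, it is worth confirming that each old URL answers with a single 301 straight to its new counterpart. A small sketch; the domains and sample paths are placeholders:

```python
# Verify that old-domain URLs 301-redirect directly to the matching new-domain URLs.
import urllib.error
import urllib.request

OLD = "https://old-example.com"  # placeholder
NEW = "https://new-example.com"  # placeholder
PATHS = ["/", "/about/", "/blog/post-1/"]  # sample paths to spot-check

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # do not follow; inspect the first hop ourselves

opener = urllib.request.build_opener(NoRedirect)
for path in PATHS:
    try:
        opener.open(OLD + path, timeout=30)
        print(path, "no redirect at all")
    except urllib.error.HTTPError as e:
        target = e.headers.get("Location", "")
        ok = e.code == 301 and target == NEW + path
        print(path, e.code, "->", target, "OK" if ok else "check this one")
```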
I have a very small site with only about 70 pages. About 4 days ago, GSC showed that the number of indexed pages had dropped by 40%. When I check which pages are affected, it lists pages that still look fine to me.
Is this something to worry about? Is it a bug in GSC, or are these pages actually being removed from Google's search results?
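One quick sanity check is to confirm that the pages themselves still return 200 and have not picked up a stray noindex tag; if they are clean, a drop like this is often just GSC reporting lag. A rough sketch that reads the page URLs from a plain-text sitemap; the URL is a placeholder:

```python
# Spot-check each page: status code and whether a noindex robots meta tag slipped in.
import re
import urllib.error
import urllib.request

SITEMAP = "https://example.com/sitemap.txt"  # placeholder; one URL per line
with urllib.request.urlopen(SITEMAP, timeout=30) as resp:
    urls = [line.strip() for line in resp.read().decode().splitlines() if line.strip()]

noindex_re = re.compile(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', re.I)
for url in urls:
    try:
        with urllib.request.urlopen(url, timeout=30) as page:
            html = page.read().decode(errors="replace")
            flag = "noindex!" if noindex_re.search(html) else "ok"
            print(page.status, flag, url)
    except urllib.error.HTTPError as e:
        print(e.code, "error", url)
```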
GSC shows AMP errors in its AMP section. However, when I validate the same pages manually with the AMP validator, they pass.
Has anyone else faced this problem?
What could be the reason for it?
I'm also wondering whether GSC shows all errors.
Please let me know more about this.