I will:
- add theory around calculators
- update old pages
- add links to sources
- try to get more backlinks
Something I feel helpless about is that Google Search Console shows 500 errors for some of my pages. But I can visit these pages fine in the browser, even through a VPN. I also ran a command-line crawl test as Googlebot, and it was successful. Yet Search Console keeps finding errors on these pages. I have tried requesting reindexing and validating fixes, but validation keeps failing. I don't know what to fix, since I can't spot any errors or issues.
Can you download the individual reports from GSC (or, better yet, a full site audit from Ahrefs), upload them to Google Drive, and DM me the link? That way I can give you much better advice and help you fix the errors.
Can you also easily access the server logs? We can cross-reference when Google crawled the site and see when errors were recorded. We'll get lots of extra info from this as well.
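To speed up that cross-reference, here's a minimal sketch. It assumes the standard Apache/Nginx combined log format; the regex and field positions may need adjusting for your setup. It pulls out requests whose user-agent claims to be Googlebot and that got a 5xx response:

```python
# Sketch: scan a combined-format access log for Googlebot hits that
# returned server errors. Assumes the Apache/Nginx "combined" layout:
# ip ident user [timestamp] "request" status bytes "referer" "user-agent"
import re

LINE_RE = re.compile(
    r'^(\S+) \S+ \S+ \[([^\]]+)\] "([^"]*)" (\d{3}) \S+ "[^"]*" "([^"]*)"'
)

def googlebot_errors(lines):
    """Yield (ip, timestamp, request, status) for Googlebot requests with a 5xx status."""
    for line in lines:
        m = LINE_RE.match(line)
        if not m:
            continue  # skip lines that don't match the assumed format
        ip, ts, request, status, agent = m.groups()
        if "Googlebot" in agent and status.startswith("5"):
            yield ip, ts, request, int(status)
```

Run it over the log for the window when Search Console reported the errors; matching timestamps confirm the 500s are real, while zero matches points back at something between Google and your server.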
Is there any IP blocking, rate limiting, or bot filtering active? Any of those can make errors show up in Search Console even when the pages themselves are fine.
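A related check, sketched below: some filters match on the "Googlebot" user-agent string, which spoofed crawlers abuse, so stricter setups verify the crawler by DNS instead. Google's documented method is to reverse-resolve the IP, confirm the hostname is under googlebot.com or google.com, then forward-resolve it to make sure it maps back to the same IP. A misconfigured version of this can block the real Googlebot. This is only a sketch of that check, not your firewall's actual logic:

```python
import socket

def is_google_host(hostname):
    """Check that a reverse-DNS hostname falls under Google's crawler domains."""
    return hostname.rstrip(".").endswith((".googlebot.com", ".google.com"))

def verify_googlebot(ip):
    """Reverse-resolve the IP, check the domain, then forward-resolve to confirm it maps back."""
    try:
        host = socket.gethostbyaddr(ip)[0]
    except OSError:
        return False  # no reverse DNS record: not verifiable as Googlebot
    if not is_google_host(host):
        return False
    try:
        # gethostbyname_ex returns (hostname, aliases, ip_list)
        return ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        return False
```

If your access log shows the failing crawls coming from IPs that pass this check, a firewall or bot filter dropping or erroring those requests is the prime suspect.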
Have you already used Google Search Console's URL Inspection Tool to run a live test and compare it with what Googlebot saw on its last crawl? If there's a discrepancy, we're likely looking at a firewall issue. (It sounds like this might be the case, since Googlebot got an error but your browser didn't, but test it again just to make sure.)
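If you want to reproduce that discrepancy outside Search Console, a quick stdlib-only sketch is to request the same URL with a browser user-agent and with Googlebot's, and compare status codes (the user-agent strings here are illustrative):

```python
import urllib.error
import urllib.request

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36"

def status_for(url, user_agent):
    """Fetch url with the given User-Agent and return the HTTP status code."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as exc:
        return exc.code  # 4xx/5xx responses arrive as HTTPError

def ua_discrepancy(url):
    """Return (browser_status, googlebot_status); differing codes suggest UA-based filtering."""
    return status_for(url, BROWSER_UA), status_for(url, GOOGLEBOT_UA)
```

If the Googlebot user-agent gets a 500 while the browser one gets a 200, something in front of the app (WAF, bot filter, CDN rule) is treating crawlers differently. One caveat: real Googlebot also crawls from Google's own IP ranges, so matching codes here don't fully rule out IP-based blocking.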
If all of that checks out, then it's probably a server-side problem, so ask your hosting provider to check resource limits and look for faulty scripts.
That's good advice.
Some of these pages are:
- rref-calculator.com/calculato…
- rref-calculator.com/calculato…
- rref-calculator.com/calculato…
- rref-calculator.com/calculato…