Can you either download the individual reports from GSC (or better yet, a full site audit from Ahrefs), upload them to Google Drive, then DM it to me? I can give you much better advice and help you fix the errors that way.
Can you also easily access the server logs? We can cross-reference when Googlebot crawled the site against when the errors were recorded, which will give us a lot of extra info.
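If you can get at the logs, here's a rough sketch of the kind of cross-referencing I mean. It assumes a standard combined (Apache/Nginx) access log format and a made-up file path, so adjust both to your setup; it just pulls out requests that claim to be Googlebot and got a 4xx/5xx response:

```python
import re
from collections import Counter

# Assumed path and format -- swap in your real access log location.
LOG_PATH = "access.log"
LINE_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<ua>[^"]*)"'
)

errors = Counter()
with open(LOG_PATH) as f:
    for line in f:
        m = LINE_RE.match(line)
        if not m:
            continue
        # Only requests claiming to be Googlebot
        if "Googlebot" not in m.group("ua"):
            continue
        status = m.group("status")
        # 4xx/5xx responses to Googlebot are the ones GSC will flag
        if status.startswith(("4", "5")):
            errors[(status, m.group("url"))] += 1

for (status, url), count in errors.most_common(20):
    print(f"{count:>5}  {status}  {url}")
```

Then you can line those timestamps and URLs up against the error dates GSC is showing you.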
Is there any IP blocking, rate limiting, or bot filtering active? That could make Googlebot hit errors that a normal visitor never sees.
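One quick way to sanity-check user-agent-based filtering is to fetch one of the flagged URLs twice, once with a normal browser UA and once with a Googlebot UA, and compare the responses. The URL below is just a placeholder, and note this won't catch IP-based blocking since you're not requesting from a real Googlebot IP:

```python
import requests  # assumes the requests library is installed

# Placeholder URL -- swap in one of the URLs GSC is flagging.
URL = "https://example.com/some-flagged-page"

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

for label, ua in [("browser UA", BROWSER_UA), ("Googlebot UA", GOOGLEBOT_UA)]:
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=15)
    print(f"{label}: HTTP {resp.status_code}, {len(resp.content)} bytes")

# If the Googlebot UA gets a 403/503 (or a much smaller body) while the
# browser UA gets a 200, something is filtering by user agent.
```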
Have you already used Google Search Console's URL Inspection tool to run a live test and compare what Googlebot fetches with what you see in your browser? If there's a discrepancy, we're likely looking at a firewall issue. (It sounds like this might be the case since Googlebot got an error but your browser didn't, but test it again just to make sure.)
If all of that checks out, then it's probably on the server side, so contact your hosting provider and have them check resource limits or faulty scripts.