Question
Do you keep track of 404 errors?
When people or crawlers request a page that doesn't exist on your site, they get a 404 Not Found error. I'm wondering if I should keep track of these.
Pros:
- Lets you monitor potential issues with the site that would otherwise go unnoticed
- You can follow up with any blogs/etc. that include an invalid link to your site (e.g. adding a period at the end of the URL)

Cons:
- Lots of noise, especially with heavy crawling
- Many 404s can be user mistakes
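For context, here's the kind of tracking I have in mind: a minimal Rack middleware sketch (the class and log format are my own, not from any particular library) that logs each 404 together with its referer, so a bad inbound link can be traced to its source.

```ruby
require "logger"

# Passes every request through to the app and logs any 404 response
# along with the referer, so an invalid inbound link (e.g. one with
# a stray trailing period) can be traced back to whoever posted it.
class NotFoundLogger
  def initialize(app, logger: Logger.new($stdout))
    @app = app
    @logger = logger
  end

  def call(env)
    status, headers, body = @app.call(env)
    if status == 404
      @logger.info("404 path=#{env['PATH_INFO']} referer=#{env['HTTP_REFERER'] || '-'}")
    end
    [status, headers, body]
  end
end
```

In a Rails app this could be registered with `config.middleware.use NotFoundLogger`; in plain Rack, with `use NotFoundLogger` in config.ru.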
This seems like an elegant compromise: andycroll.com/ruby/stop-robot…
It tracks 404s, except those generated by crawlers, which may simply be following outdated links.
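One rough way to do that filtering (my guess at the mechanics, not necessarily the article's exact approach) is to skip requests whose User-Agent identifies a bot before logging:

```ruby
# Rough heuristic: well-behaved crawlers identify themselves in the
# User-Agent header. The pattern below is illustrative, not an
# exhaustive bot list.
CRAWLER_UA = /bot|crawler|spider|slurp/i

def crawler?(env)
  CRAWLER_UA.match?(env["HTTP_USER_AGENT"].to_s)
end

# In a middleware like the one above, log only when
# status == 404 && !crawler?(env)
```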
Monitoring erroneous backlinks is a great idea.
Wouldn't HTTP redirects on those links help with SEO?
Just use Webmaster Tools to see which 404s Googlebot finds, and add redirects for the important ones. You'll only handle the 404s that crawlers know about, but that should be enough.
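As a sketch, assuming a Rails app, such a redirect is a one-liner in the router (both paths here are made up for illustration):

```ruby
# config/routes.rb — hypothetical paths for illustration only.
Rails.application.routes.draw do
  # A 301 tells crawlers the move is permanent, so link equity
  # from the old URL is passed to the new one.
  get "/old-pricing", to: redirect("/pricing", status: 301)
end
```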