Time it takes for Google to update 200k links?
Hi,
I added a sitemap for ☒ a few months ago, which contains a few 100k pages (inc. user profiles) - but Google is still indexing it.
Not sure if it just takes this long, or if something is wrong.
Has anyone else seen this, is there a way to speed it up?
This is the current status: https://pbs.twimg.com/media/F39visGWsAEF4m0?format=jpg&name=medium
It will take a long time. Here's an example of a friend of mine's site with 4.7k pages: i.imgur.com/6iYLNII.png
I've been working with Martijn on this, and my initial suggestion was to focus on creating fresh, high-quality content - which we are currently putting a plan together for.
My hope is those additional positive signals will help Google index us faster, and potentially getting some more authority backlinks would help too.
But there doesn't seem to be a simple explanation for this or any indication from Google on how long this should take.
It does seem to be indexing at a faster rate than the site in @bdlowery's screenshot below, which has far fewer pages - so to some extent, more pages do seem to mean more pages indexed over a similar time period.
The joys of historic data.
Thanks for the replies! I've also been made aware that it might be crucial to set the "lastmod" timestamp on each URL to the last time that page was actually modified, rather than the date the sitemap was generated - mostly to prevent the crawl from restarting and using up the crawl budget.
I'm going to refactor the sitemap generator to use the real modification dates and see what happens. 🤞
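For anyone curious, this is roughly the shape of the change - just a sketch, with an illustrative `Page` type rather than our actual data model:

```typescript
// Sketch of a sitemap generator that uses real modification dates.
// The Page shape and how you load pages are illustrative assumptions.
interface Page {
  url: string;     // absolute URL of the page
  updatedAt: Date; // last time the page content actually changed
}

function buildSitemap(pages: Page[]): string {
  const entries = pages
    .map(
      (p) => `  <url>
    <loc>${p.url}</loc>
    <lastmod>${p.updatedAt.toISOString().slice(0, 10)}</lastmod>
  </url>`
    )
    .join("\n");

  return `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${entries}
</urlset>`;
}
```

The key difference is that lastmod now comes from each page's own record, so it only changes when the page actually changes.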
That's what we do for our sitemap. We have the lastmod date be the last time the page was actually modified (we can get that data from Prismic), rather than updating it every time the sitemap generates.
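For anyone doing the same, here's roughly how that lookup could work with @prismicio/client - a sketch, with the repository name and document type as placeholders:

```typescript
import * as prismic from "@prismicio/client";

// Sketch: pull real modification dates out of Prismic.
// "my-repo" and the "page" custom type are placeholders.
const client = prismic.createClient("my-repo");

async function getLastmodDates(): Promise<Map<string, string>> {
  const docs = await client.getAllByType("page");
  const lastmod = new Map<string, string>();
  for (const doc of docs) {
    // Every Prismic document carries last_publication_date,
    // i.e. the last time it was published/modified.
    lastmod.set(doc.uid ?? doc.id, doc.last_publication_date);
  }
  return lastmod;
}
```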
I also have 150k pages with #appwatch, but I chose not to submit them all at once. I'm wondering which is the better option.
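For reference, the sitemaps.org protocol caps a single file at 50,000 URLs anyway, so at 150k pages you'd need to split into multiple sitemaps behind a sitemap index either way - which also makes submitting them in batches easy. A sketch (file naming made up):

```typescript
// Sketch: split a large URL list into sitemap files of at most
// 50,000 URLs (the sitemaps.org per-file limit) and build an index
// that points at them. File naming and baseUrl are illustrative.
const MAX_URLS = 50_000;

function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

function buildSitemapIndex(urls: string[], baseUrl: string): string {
  const files = chunk(urls, MAX_URLS).map(
    (_, i) => `  <sitemap><loc>${baseUrl}/sitemap-${i + 1}.xml</loc></sitemap>`
  );
  return `<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${files.join("\n")}
</sitemapindex>`;
}
```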