We’ve been repeatedly asked where we got the idea that Google Webmaster Tools should be enough to perform link cleanup. After a few hours of video digging, we found it at 37:17:
John: So generally speaking, you can find all the links you need for webspam or algorithmic reasons in webmaster tools. So it’s not the case that you would need a third party or an external tool to kind of dig up all of those links to find the ones that are problematic, because usually if they are not showing up in webmaster tools then they are generally not something that you need to worry about. Often this also includes things like sitewide links, where if you find one of those links in webmaster tools then usually that’s enough to notice that there’s a sitewide link there, and you can remove that sitewide link from the other website if that’s a problematic link, and that’s enough. You don’t really need to see the thousands of individual URLs from this other website that you found linking to your website, because once that sitewide link is gone it’ll be removed from all of those anyway.
Me: This also counts for the “soft” version of the link warning?
(Explanation: Google ignores some links even when it is not penalising your website on a larger scale; do ignored links also show in webmaster tools?)
John: They should be there, yeah.
I don’t care, I’m sticking with “no, GWT is not good enough.” Just because Google doesn’t have the links listed today doesn’t mean they won’t be added at some point in the future. It’s also about reputation management if you have really bad links out there carrying your client’s brand or pointing to a client’s website.
My Grandmother taught me a long time ago to “believe none of what you hear and only half of what you see”.
You only have to look at the links surfaced by other tools to see that either these comments don’t gel, or Google’s ability to recognize the crap is way less developed than they would have us all believe.
Our rmoov users are reporting good results by following the “start by analyzing as many links as you can find” method, and I would not recommend otherwise.
All this video does for me is highlight yet another instance of a high profile Googler telling us how they believe things work at Google, when we can see for ourselves that something, somewhere is not working as they say it should 🙁
GWMT has a limit to the backlinks it reports (last I tested, 10k), so if you have a high backlink count you will not see everything Google knows. The good side is that those 10k get spread over all domains, so you should get a good domain-level picture. Also, it refreshes and changes every few weeks, so over time you can get a more complete picture.
In my opinion, data from GWMT should take priority, but if you have to dig deeper, use other sources like Majestic SEO.
Wow, thanks Dan, for digging this up!
Like the other commenters here, I’m skeptical, not only due to the 10k limit or the potential for more links being added “tomorrow”, but also because, as Sha says, “You only have to look at the links surfaced by other tools to see that either these comments don’t gel…”
We’ve all seen crappy links that we KNOW should be removed and have been puzzled why they’re not listed in GWT, haven’t we? What about links from a domain that gets penalized badly and deindexed?
Is John Mueller actually saying “It’s okay to leave your crappy links up as long as they’re not listed in GWT”? I don’t think so. When doing cleanups we’ll continue to do what I believe is the “right thing” and remove as much crap as possible, using all the tools at our disposal.
Thanks again for finding it!
Thank you for sharing. I like it.