Link Love Notes: Putting the luurve back into links #LinkLove

(Notes from LinkLove, Distilled’s link building conference in London)

Tom Anthony
from distilled.com

Back in the day, the problem of information retrieval and search hadn’t been solved. Then, in 1997, Google discovered links and changed the game.

But links very quickly began to be gamed. Now Google have to ‘pre-filter’ links; they’re not all considered equal anymore.

The belief in the SEO community is that the value of links in Google’s algorithm is decreasing. However, it’s still strong, and spam tactics can still work. Anchor text, for example, is an incredibly strong signal.

changing the Google algorithm

So what are Google waiting for before they change how they view links? We SEOs know a spammy link profile when we see one – why doesn’t Google seem to?

Link profile tool: dis.tl/link-profiler
Something like this can be built in very little time, so why can’t Google spot these profiles too?
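To make the point concrete, here’s a toy sketch of the kind of check such a tool might run. It is not the dis.tl tool (whose workings aren’t public); the backlink data and the 50% threshold are entirely invented:

```python
# Toy link-profile check: flag a profile whose anchor text is suspiciously
# concentrated on one commercial phrase. Invented data, invented threshold.
from collections import Counter

backlinks = [
    {"source": "blog-a.example", "anchor": "cheap blue widgets"},
    {"source": "forum-b.example", "anchor": "cheap blue widgets"},
    {"source": "dir-c.example", "anchor": "cheap blue widgets"},
    {"source": "news-d.example", "anchor": "Acme Widgets"},
]

anchors = Counter(link["anchor"] for link in backlinks)
top_anchor, count = anchors.most_common(1)[0]
share = count / len(backlinks)

# Natural profiles are dominated by brand and bare-URL anchors; an
# exact-match commercial phrase taking most of the profile is the
# classic spam footprint.
if share > 0.5 and top_anchor != "Acme Widgets":
    print(f"Suspicious: {share:.0%} of links use the anchor '{top_anchor}'")
```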

The problem is false positives. There are perfectly good and acceptable marketing tactics that will result in anomalous and ‘odd looking’ link profiles. It could even be that everyone in an industry niche is doing bad things, and one guy is getting good links. He’ll have a dodgy-looking link profile in comparison!

The problem is that the SEO black hats, the spammers, advance their tactics and technology at an incredibly fast rate and can keep ahead of Google. The web of links isn’t clear-cut, so it’s hard to ‘prune’ dodgy-looking areas.

Panda update

The Panda update is Google advancing into this area, effectively trimming large areas of the web that they consider ‘dodgy looking’. Overall, these changes have had a positive effect for users.

they’re coming: link algorithm updates

So far Panda has really looked at on-site content, but it’s only a matter of time until a similar process is applied to links and link signals. In fact, the SEO community believes a big algorithm update will come in 2012.

Google recently revealed that they will begin to penalise ‘over optimised’ sites, but the big news was that paid link networks such as BuildMyRank have been massively hit by an algorithm update. It effectively deindexed 20,000–30,000 domains and millions of links.

These updates will keep on coming, and warnings are now being sent out via Google Webmaster Tools to those with suspicious-looking link profiles. You’re gonna get hit sooner or later, chaps…

putting the love back into links

Tom believes the way Google will go is to add trust to certain types of links.

An example of this is the rel=”author” markup.
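For reference, the authorship markup Google documented at the time looks roughly like this – a byline or page-level link pointing at the author’s Google+ profile (the profile URL below is a placeholder), with the profile linking back to the site as a contributor:

```html
<!-- On the article page: link the byline to the author's profile -->
<a rel="author" href="https://plus.google.com/your-profile-id">Tom Anthony</a>

<!-- Or as a page-level link in the <head> -->
<link rel="author" href="https://plus.google.com/your-profile-id" />
```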

Now Google can see who is responsible for a link, and thus how trustworthy it is. For Tom, the whole idea of rel-author is about valuing links; it’s as much about defending search as it is about entering social.

You’ll find author stats being drawn through into Webmaster Tools – if Google include them there, they clearly think authorship is important.

Google want to fix the web graph. Google+ and author markup give Google access to social signals. While shares etc. are valuable signals, Tom doesn’t believe they’re enough, and they simply won’t replace links. Authorship, however, is far more fundamental: it combines the social graph and the link graph into a new, more valuable metric, AuthorPageRank.

AuthorPageRank

AuthorRank x PageRank = AuthorPageRank

Basically, a fairly trusted author will mean more link equity than an unrecognised author, and a superstar author will mean even more link equity, and thus more AuthorPageRank.
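AuthorPageRank is Tom’s speculation rather than a confirmed Google metric, so the following is purely a toy illustration of his formula, with invented scores:

```python
# Toy illustration of Tom's speculative AuthorRank x PageRank idea.
# All numbers are invented; Google has published no such metric.
def author_page_rank(author_rank: float, page_rank: float) -> float:
    """Link equity scaled by how trusted the linking author is."""
    return author_rank * page_rank

PAGE_RANK = 0.6  # strength of the linking page itself

for author, author_rank in [("unknown", 0.1), ("trusted", 0.5), ("superstar", 0.9)]:
    print(f"{author:>9}: {author_page_rank(author_rank, PAGE_RANK):.2f}")
```

The same page passes 0.06 with an unknown author but 0.54 with a superstar – the author multiplies the value of the link.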

But can it be spammed?

Well, the amount of data Google has is pretty frightening, and for social signals it’s pretty extensive. It means there’s a high barrier to entry for abusing these signals: building a fake social profile that carries the signals of a trusted author, for example, will be very difficult.

focus on authors

  1. Become a trusted author
  2. Target trusted authors for links

Tom thinks we need to shift from thinking about “Where is a link coming from?” to “Who is a link coming from?”.

Of course it needs to be scalable to become an executable SEO strategy, and we need the tools to analyse the information and make decisions. Well, by using author markup and APIs we can – we can see an author’s social activity, their followers, their other profiles, what sites they write for… it goes on.
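As a minimal sketch of the first step, you can pull rel=author links straight out of a page with nothing but the Python standard library (the sample HTML is invented, and the follow-up social API calls are left out):

```python
# Minimal sketch: extract rel="author" links from a page's HTML.
# From each profile URL you'd then query the social APIs Tom mentions.
from html.parser import HTMLParser

class AuthorLinkParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.author_urls = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("a", "link") and attrs.get("rel") == "author" and "href" in attrs:
            self.author_urls.append(attrs["href"])

html = '<a rel="author" href="https://plus.google.com/your-profile-id">Tom</a>'
parser = AuthorLinkParser()
parser.feed(html)
print(parser.author_urls)  # ['https://plus.google.com/your-profile-id']
```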

Tools such as the Link Intersect tool from SEOmoz enable you to cross-reference sites that are linking to multiple competitors but not to your site. That’ll help you find places that are writing about your niche, ripe for the ‘befriending’.
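Under the hood, an intersect check like that is simple set arithmetic, as this toy sketch with invented domains shows:

```python
# Toy link-intersect: domains linking to both competitors but not to you
# are outreach prospects. All domains are invented.
competitor_a = {"niche-blog.example", "trade-mag.example", "forum.example"}
competitor_b = {"niche-blog.example", "trade-mag.example", "news.example"}
my_links = {"forum.example"}

prospects = (competitor_a & competitor_b) - my_links
print(sorted(prospects))  # ['niche-blog.example', 'trade-mag.example']
```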

the final word

There’s so much information available about authors; we need to use it when working with people.

There might be something out there that would help find out more about authorship in content, but unfortunately it’s a secret for us LinkLove folks for now 🙂

What SEOs do need to do is think about the future effects of authorship and social connections on links. Although at the moment rel-author only lives in the closed Google environment and isn’t hugely widespread, it could be that signals such as OpenID will be incorporated too.

The link algorithm will change to consider people and social linking; that is clear.