Last week, Corey’s impeccably timed post informed us of the introduction of Google application search. He pointed out that every page in the results includes Schema.org markup for software. So it’s confirmed: 2013 will be the year of markup, metadata, schema.org, and authorship. But for some strange reason, I can’t help but feel like we’ve been here before.
That’s right, now I remember…
That’s Infoseek in 1997 (courtesy of SixRevisions). Infoseek was one of several popular search engines in the 1990s that used the meta keywords attribute to identify what pages were about, and to rank them accordingly. The keywords attribute caught on in 1995, and by 1997 it was so heavily abused by spammers that the web started looking very ugly.
By 1998, most search engines were already dropping support for the keywords attribute, and by the early 2000s it was a thing of the past.
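For readers too young to have seen it, the tag in question was trivially easy to stuff. A minimal sketch of how it was typically used (the keyword list here is a hypothetical example, not taken from any real page):

```html
<head>
  <!-- The meta keywords tag, as commonly (ab)used in the late 1990s.
       Engines like Infoseek read this list directly when ranking pages,
       so spammers simply stuffed it with whatever terms they wanted. -->
  <meta name="keywords" content="search, free, cheap, best, search engine">
</head>
```

Because the keywords were invisible to visitors and trusted by the engines, there was no cost to lying in them, which is exactly why the tag died.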
I can’t help but wonder if the same thing is going to happen to rich snippets, schema.org, and, dare I say it, Google+ authorship.
“But It’s Not About Rankings This Time”
I know, and this is not a minor point. Spammers can’t use the new wave of markup to improve their rankings. This markup is all about click-through rates, building familiarity with memorable profile pictures, and so on.
I certainly share Corey’s opinion that you should “get as many of these tags on your sites as you can without spamming / being irrelevant.” It would be flat out irrational to leave that revenue on the table. There are justifiable marketing reasons for doing it.
I mean it. Set it up, now.
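As of this writing, Google authorship is a two-step link exchange between your pages and your Google+ profile. A minimal sketch, with a placeholder profile URL standing in for your own:

```html
<!-- Step 1: on each article page, link to your Google+ profile
     with rel="author". The profile ID below is a placeholder. -->
<a href="https://plus.google.com/112345678901234567890" rel="author">
  About the Author
</a>

<!-- Step 2: on the Google+ profile itself, add the site to the
     "Contributor to" section so the link is reciprocal. Google only
     shows the author photo when both directions are in place. -->
```

The reciprocal link is what’s supposed to keep spammers from claiming other people’s profiles, which is worth remembering when reading the abuse scenarios below.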
But I still think it’s worth asking the question: how long will the benefits last? Consider this post from UnMaskParasites, which details how spammers used a combination of structured review data, website hacking, and cloaking to make this happen:
These pages then work as doorways to sites they promote, and reference other doorways to boost their PageRank.
And consider some of the other ways that markup could be abused:
- Using authorship to put a photo of a famous celebrity next to your search result, fooling users into thinking you are that person, at least until users stop paying attention to the photos altogether.
- Using authorship to get product photos into the search results, where they can be confused with shopping results.
- Using schema.org to attach physical locations to websites that really have no physical location in order to instill a false sense of trust.
- Using Events schema markup to push extra links below your main search result, whether they’re genuine events or not.
- Posting the same low quality video on every page of your site with a different file name and using VideoObject Schema to get a thumbnail in the search results.
- Using markup to get regular web pages listed as “applications,” “recipes,” and so on in order to compete for higher competition keywords in search categories where they are less saturated.
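To make those scenarios concrete, here is what legitimate schema.org review markup looks like in microdata form; the product name and figures are hypothetical:

```html
<!-- schema.org Product with an AggregateRating, in microdata.
     All names and values here are made up for illustration. -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5
    based on <span itemprop="reviewCount">27</span> reviews.
  </div>
</div>
```

Note that nothing in the markup itself proves the reviews exist. The ratings are just attributes the page asserts about itself, which is exactly what made the cloaked review-snippet doorways in the UnMaskParasites post possible.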
I wouldn’t advise doing any of this on any site you hope will last, but of course that isn’t what spammers care about. Opinions vary on how much manual review, if any, is involved in the approval of rich snippets, but it seems clear that if markup is ever going to take off, we can’t hope for manual approval of every case.
What do you think? Can Google effectively police these kinds of manipulation, or will spam eventually kill off this new wave of structured metadata?