Posted on 28 April 2007 by Andrew Lang

Blue sky thinking about search engines

When people talk about search engines in 2007, they might give a passing reference to Yahoo! and MSN, but really they are talking about Google, because that's where the majority of searchers are to be found.

When one search engine dominates the market and uses an algorithm to determine its rankings, it naturally follows that everyone is interested in divining the nature of that algorithm. This is where we are in 2007, in terms of search engine marketing, or search engine optimization.

Yes, it's a simplistic view, but it's quite an accurate reading of what's happening right now: everyone is trying to get more 'Google love', either by swapping links with relevant sites in a deliberate effort to gain higher rankings, or by creating 'link bait': sowing the seeds of one-way links with compelling content. The overall feeling is that people are obsessing over website traffic, writing their content in a very self-conscious way for the sake of attracting as much of it as possible. The content is sometimes a means to an end, not an end in itself.

If there were no algorithm to divine, would they write the same things? What if unbiased human editors were used as a 'filter' on rankings determined initially by an algorithm? This would raise the bar of quality considerably. More signal, less noise - and right now, the internet is very noisy. Some people even call it a spam repository. When searching for certain things, it's easy to sympathise with that point of view.

But it's also true that the internet is full of incredibly useful, interesting, entertaining websites, many of which are run by owners who know nothing about algorithms or search engine optimization. These fools are too busy writing interesting, unique content. Meanwhile, the spammers and the SEO-aware are tweaking their sites to beat the enthusiasts in the rankings.

Yes, this is a polarised view of the way things are, but there's a lot of truth to it.

The blue sky thinking here is imagining a search engine that still uses an algorithm as the first 'pass' for determining rankings. However, there would be a second 'pass': human editors. This second filter doesn't determine the order of the rankings per se, but the editors could flag the obviously low-quality sites that are really just props for advertising space. Searchers could then choose to have these types of sites filtered out of their results entirely. A number of other filters could be applied too: a filter to show more frequently updated sites, sites that accept particular payment methods, and so on. Better still, you could then filter these results further with your own editor's eye - telling the search engine more and more about what you like through the results you click and actually STAY on. The search engine learns what you like too (personalised results).
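To make the idea concrete, here is a minimal Python sketch of that two-pass pipeline. Everything in it is hypothetical - the class and field names (`Site`, `BlueSkyEngine`, `editor_flagged`), the 30-second dwell threshold, and the simple additive affinity boost are all illustrative assumptions, not a real search engine's design:

```python
from dataclasses import dataclass

@dataclass
class Site:
    url: str
    score: float                   # first pass: algorithmic relevance score
    editor_flagged: bool = False   # second pass: human editor marked it as an ad-prop site
    last_updated_days: int = 9999  # days since last update, for a freshness filter

class BlueSkyEngine:
    def __init__(self):
        # Personalisation state: url -> learned preference boost
        self.affinity = {}

    def record_visit(self, url, dwell_seconds):
        # Reward results the user clicked and actually STAYED on
        # (30 seconds is an arbitrary illustrative threshold)
        if dwell_seconds >= 30:
            self.affinity[url] = self.affinity.get(url, 0.0) + 0.1

    def search(self, sites, hide_flagged=True, max_age_days=None):
        results = []
        for s in sites:
            if hide_flagged and s.editor_flagged:
                continue  # filter editor-flagged sites out entirely
            if max_age_days is not None and s.last_updated_days > max_age_days:
                continue  # optional "frequently updated" filter
            results.append(s)
        # Final order: algorithmic score plus the learned personal boost
        return sorted(results,
                      key=lambda s: s.score + self.affinity.get(s.url, 0.0),
                      reverse=True)
```

The point of the sketch is the layering: the algorithm proposes, the editors veto, the user's filters and click behaviour refine - quality signals stack on top of the ranking rather than replacing it.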

These kinds of measures would truly put the emphasis on quality content, and less on optimizing for an algorithm (speaking as a web-design company that does well in the search engine rankings!).


