This week has been a good week for ArcticStartup, if you quantify it by the traditional media metrics of pageviews and visits. Our critical piece on the Finnish government's decisions regarding an ICT strategy group was read by some 25,000 people in just a couple of days. Yesterday, our in-depth post on Denmark's questionable Entrepreneur Tax also gained traction around the world, mostly thanks to Hacker News.

On Tuesday, as the ICT strategy group story spread to new social circles, I was watching Google Analytics' real-time view (which shows how many people are on your site and how traffic develops moment to moment), and I noticed something interesting in how Google treated us as a site: as more people found our story, Google also began sending more visitors our way through search results.

It was interesting. It was almost as if the surge of traffic to our current story somehow validated our other content in Google's algorithms and lifted its overall ranking quite a bit. However, the effect was temporary, lasting only about half a day: search traffic quieted down later in the evening as overall traffic returned to our usual levels.

Later that Tuesday evening I was talking to a journalist from one of the most respected media companies in Finland, and she said they update their site some 50 times a day for exactly this reason. While we're unable to push out that many articles, we do notice similar effects when our publishing frequency changes.

This is fascinating for a couple of reasons.

First of all, if this is really true, it implies that Google and other search engines have a significant effect on how media companies work.

Some of this thinking is also put forward by DuckDuckGo, another search engine; more of the background is explained in their "Don't Bubble Us" presentation.

Secondly, if search engines like Google validate the quality of earlier content based on current traffic, it pushes the metrics of bigger, search-engine-optimised sites even higher. I would go as far as saying that those sites are over-represented.

This of course has big implications for sites that don't follow the high-frequency publication cycle, and we are a good example. Even though I'm biased, I'm quite confident our piece on the Danish Entrepreneur Tax was strong in both brevity and detail, but since we're unable to support it with a huge volume of other stories, fewer people will find it through search engines.

For quite some time now, I've understood that distribution is one of the biggest challenges for smaller indie media sites and companies like us. That's an area we focus on heavily after putting together a killer article. There are a couple of ways to go about it, but that's a story for another article.

I guess the lesson in all of this is that there are huge opportunities in democratising traffic and visibility for sites that don't want to follow the high-frequency publishing cycle, giving them the visibility they want among the "big boys".

As long as this is broken, sites that simply churn out a lot of content, regardless of quality, are over-represented in search results, while less frequently updated sites (usually with more in-depth and detailed content) have a harder time getting readers for their work.

This might also explain why media sites push out so much content, usually favouring quantity over quality.
