Rating NZ blogs

I have had a go at rating New Zealand blogs along the lines used by Tumeke and Halfdone. It’s an interesting exercise because it raises the questions of what one is measuring – and why.

Obviously we want to get some idea of a blog’s influence or “reach.” But really that would take complex sociological surveys and would produce only vague results. So, inevitably, the blog rank must be defined by the methodology used. It’s no surprise, then, that Tumeke’s and Halfdone’s surveys produce different rankings for most blogs, especially below the top 20.

One can resort to the rankings provided by sites like Alexa and Technorati – but all the available rankings have their own quirks, which can radically influence individual results. Applying them to New Zealand blogs also requires a bit of work, so one may as well include more information.

So my solution is to diminish the “quirk effect” of individual methods by using several and aggregating them to produce a final rating. I have used two web activity rates (Alexa and Technorati), four measures of linking (Alexa, Technorati, Google links from blogs and Google links from the web) and two “subscription” numbers (Technorati Authority and Google Reader subscriptions).
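The basic aggregation idea can be sketched in a few lines of Python. This is a hypothetical illustration only – the blog names and scores are invented, and the actual weighting used for the list may differ – but it shows the principle: convert each raw measure to a rank, then average the ranks across measures so that no single method’s quirks dominate.

```python
# Hypothetical sketch of rank aggregation: each measure is converted to a
# 1-based rank, then ranks are averaged across measures. The blogs and
# numbers below are made up for illustration, not the data from the post.

def ranks(values, higher_is_better=True):
    """Return 1-based ranks for a list of scores (1 = best)."""
    order = sorted(range(len(values)), key=lambda i: values[i],
                   reverse=higher_is_better)
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

blogs = ["Blog A", "Blog B", "Blog C"]
measures = {
    "web_activity":    ([120, 80, 30], True),   # higher activity is better
    "inbound_links":   ([400, 500, 90], True),
    "rss_subscribers": ([250, 300, 40], True),
}

# Average the per-measure ranks; the lowest average wins.
per_measure = [ranks(vals, hib) for vals, hib in measures.values()]
avg_rank = [sum(col) / len(per_measure) for col in zip(*per_measure)]
final = sorted(zip(blogs, avg_rank), key=lambda t: t[1])
print(final)
```

Averaging ranks (rather than raw scores) sidesteps the problem that Alexa, Technorati and Google report numbers on wildly different scales.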

The resulting New Zealand Blog Popularity List is available as a PDF download. I have summarised the results for the top 60 in the table below and compared these with the most recent ratings from Tumeke and Halfdone.


Aggregate Ranking

| Blog Name | Open Parachute | Halfdone | Tumeke |
|---|---|---|---|
| Kiwiblog | 1 | 1 | 1 |
| Public Address | 2 | 4 | 3 |
| No Right Turn | 3 | 11 | 9 |
| The Standard | 4 | 2 | 2 |
| Not PC | 5 | 5 | 4 |
| Whale Oil Beef Hooked | 6 | 3 | 6 |
| New Zeal | 7 | 14 | 13 |
| TUMEKE! | 8 | 8 | 10 |
| frogblog | 9 | x | 7 |
| No Minister | 10 | 6 | 5 |
| Poneke’s Weblog | 11 | 15 | 11 |
| The Inquiring Mind | 12 | 16 | 18 |
| Cactus Kate | 13 | 12 | 17 |
| roarprawn | 14 | 19 | 16 |
| New Zealand Conservative | 15 | 9 | 27 |
| The Wellingtonista | 16 | 22 | x |
| Halfdone | 17 | 13 | 23 |
| The Dim-Post | 18 | 7 | 8 |
| Capitalism is bad | 19 | 65 | 36 |
| Just Left | 20 | 30 | 43 |
| Radical Cross Stitch | 20 | 42 | 21 |
| The Hand Mirror | 22 | 17 | 15 |
| Homepaddock | 23 | 10 | 12 |
| Jack Yan: The Persuader | 24 | 25 | 33 |
| Kotare | 25 | 66 | 45 |
| The Fundy Post | 26 | 45 | 26 |
| Keeping Stock | 27 | 24 | 35 |
| Barnsley Bill | 27 | 20 | 24 |
| Big News | 29 | 21 | 37 |
| The Hive | 30 | 58 | x |
| The visible hand in economics | 31 | 18 | 22 |
| Open Parachute | 32 | x | 19 |
| Clint Heine and Friends | 33 | 34 | 44 |
| Aotearoa: a wider perspective | 33 | 60 | 50 |
| Kiwipolitico | 35 | 26 | 14 |
| Oswald Bastable’s Ranting | 36 | 31 | 39 |
| Anti-Dismal | 37 | 28 | 30 |
| Lindsay Mitchell | 38 | 32 | 42 |
| MandM | 39 | 27 | 54 |
| Hot Topic | 40 | 63 | 20 |
| Liberty Scott | 41 | 35 | 53 |
| Hitting Metal With A Hammer | 42 | 41 | 38 |
| Reading the Maps | 43 | 59 | 25 |
| Policy Blog | 44 | 33 | 46 |
| TBR.cc | 45 | 29 | 32 |
| Put ’em all on an island | 46 | 81 | 28 |
| Alliance Party of New Zealand | 47 | 49 | 62 |
| Mulholland Drive | 48 | 44 | 56 |
| MonkeyWithTypewriter | 49 | 37 | 48 |
| Liberation | 50 | 56 | 80 |
| John Key | 51 | 40 | 52 |
| g.blog | 52 | 43 | 49 |
| Socialist Aotearoa | 52 | 50 | 60 |
| MacDoctor Moments | 54 | 39 | 31 |
| goNZo Freakpower | 55 | 46 | 89 |
| RobiNZ Personal Blog | 56 | 61 | 41 |
| Bioblog | 57 | x | x |
| Pundit | 58 | 142 | 57 |
| Henry | 59 | x | x |
| Workers Party | 60 | 77 | 40 |
| Physics Stop | 61 | x | x |

17 responses to “Rating NZ blogs”

  1. I still can’t fathom out why a ranking such as this is relevant, what it indicates or what it is useful for.


  2. The results are quite interesting. MandM vary greatly depending on the method, whereas others don’t so much.


  3. Madeleine – plotting the data shows that at ratings above 20 the scatter is quite large – not just for MandM.

    It’s probably what you would expect for the sort of distribution that exists where the ratings for the most popular sites are just so much greater than for the less popular. Here the distributions have been compressed by the method used.

    Uroskin – I guess it doesn’t really tell you much. It may be an ego thing for some bloggers. Alternatively, it is a way of evaluating how one’s own blog is evolving. But, as I say, an improvement in rating doesn’t necessarily correlate with increased influence.

    Apart from that – I have found it a good exercise (learning something new) for an ageing brain.


  4. Here’s something new – my site is Halfdone – not halfdome!

    I’ve been thinking about widening my data take to new sources, and considered the RSS subscriptions in Google Reader. The problem is they’re too easy to manipulate, but then that goes for Alexa too, as Lyn at The Standard has posted.

    In fact, I showed a while ago that with the 3 measures I use, simply changing the formula can lead to wildly varying results, so the whole thing is something of an arbitrary exercise.


  5. Woops – I’ll correct that. Just another example of my brain getting in the way of my perception.


  6. Ken

    Interesting, I like the fact that it goes some way to taking account of RSS readership, as the site meters do not.


  7. Hey, thanks for that. Not sure why I get a 1 from Alexa or what it means – maybe that’s cos I’m blonde


  8. Suggest you do a find and replace… you missed some!

    I should also point out that there are multiple RSS feeds for some sites – I checked one this morning, and one of the feeds had ten times (40 vs 4) the subscribers.


  9. Pingback: Open Parachute’s Blog Rankings « The Inquiring Mind

  10. Interesting that one blog in your list is outside the top 30 but in the top 20 in other lists, while one in your top 20 is not even in Halfdone’s top 50! That’s some variance!


  11. bustedblonde – as I said, each measure has its quirks. One that occurs for some (including Alexa) is that it refers to the domain holder rather than the blog itself. Blogs based at institutions can therefore score highly. Their influence can be reduced by aggregating a number of measures (or possibly choosing the measures more wisely – which requires more knowledge than I have).

    Scrubone – please let me know details of those I have missed. This will be useful if I do the exercise again. (If I do I’ll try to include some other measures which might include more RSS feed numbers).

    Dave – the variance between any 2 of the 3 rankings listed is about the same (R^2 about 0.5). I guess this is what we would expect from the nature of the exercise.

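The R² comparison mentioned in the reply above can be sketched as follows. This is a hypothetical illustration – the rank lists below are invented, not the actual rankings from the post – and it simply computes Pearson correlation on the rank numbers, which (with no ties) is equivalent to Spearman’s rho.

```python
# Sketch of comparing two blog rankings: Pearson correlation computed on
# rank numbers (equivalent to Spearman's rho when there are no ties).
# The rank lists below are invented for illustration.

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

list_a = [1, 2, 3, 4, 5, 6, 7, 8]   # e.g. ranks in one survey
list_b = [1, 4, 3, 2, 6, 5, 8, 7]   # e.g. ranks of the same blogs elsewhere

r = pearson_r(list_a, list_b)
print(round(r ** 2, 2))  # → 0.73
```

An R² of about 0.5 between two ranking methods, as reported in the thread, means only around half the variation in one list is “explained” by the other – hence the large scatter outside the top 20.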

  12. Pingback: What’s in a number? « Homepaddock

  13. Just seen this, sorry Ken… One reason my blog does better at Tumeke! than elsewhere is that I supply my Statcounter figures to Tim (as he encourages everyone to do). I also get occasional posts that attract a lot of comments (500+ in one instance), so I do relatively well. Better than #63, at least… 😉


  14. Yes Gareth, it’s not surprising that each attempt produces its own different (and quirky) results. I guess one just uses what one thinks is the most useful.

    I don’t know any simple way of automatically getting Statcounter or comment data. Tumeke must do a lot of work for his list.

    I think I will have another go at the ranking methodology I tried here. However, I now have a better idea which measures to use, as a statistical analysis shows that some are essentially duplicates of others.

    Also, by ranking the different groups (rank, links, RSS feeds) separately we can get an idea of which blogs do which things better.


  15. Pingback: maxzoneblogoblivionmetricidal « The Inquiring Mind

  16. Pingback: Ranking methods for NZ blogs « Open Parachute

  17. Pingback: NZ blog ranks - March ‘09 « Open Parachute
