So which AU sites are “engaging”?


I love the word ‘engagement’ … it’s thrown around recklessly by anyone who wants to illustrate how awesome and important their work is to others.

Problem is – generally the term isn’t backed up with any real rationale. “Our users are engaged” is a phrase that has lost all meaning. Often, volume of users is confused with engagement. Often a site that claims its users are engaged has users who spend 2-3 mins max on the site and don’t return.

I had a dig through July’s Market Intelligence numbers and wanted to see the top 20 sites in the two areas I think really show engagement:

1/  Frequency – ie, how often a user returns on average
2/ Session Duration – ie, how long a user spends on the site per session on average

The caveats were that the sites included needed to have more than 30k AU users per month and needed to be sites, not applications. The figures are site-specific; no network figures have been included.
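(For the analytically minded: a minimal sketch of how these two metrics fall out of raw visit data. The log format here is hypothetical and purely illustrative – it is not how Nielsen derives its figures.)

```python
# Minimal sketch: frequency and average session duration from a
# hypothetical visit log. Illustrative only -- not Nielsen's method.
from collections import defaultdict

# Each record: (user_id, session_duration_seconds) for one visit in the month.
visits = [
    ("u1", 300), ("u1", 420), ("u2", 180),
    ("u2", 240), ("u2", 600), ("u3", 150),
]

sessions_per_user = defaultdict(int)
total_duration = 0
for user, duration in visits:
    sessions_per_user[user] += 1
    total_duration += duration

frequency = len(visits) / len(sessions_per_user)  # avg visits per user
avg_session = total_duration / len(visits)        # avg seconds per visit

print(f"Frequency: {frequency:.2f} visits/user/month")                        # 2.00
print(f"Session duration: {avg_session // 60:.0f}m {avg_session % 60:.0f}s")  # 5m 15s
```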

So what are the sites?

Frequency (visits on average per month per user)
I believe this is important as it suggests these sites are trusted by users and used as a starting point, or a place they can call on directly for the information they need.

1/ ninemsn homepage – 8.42
2/ SMH homepage – 7.71
3/ The Age homepage – 7.68
4/ Bigpond homepage – 6.23
5/ OptusZoo homepage – 5.98
6/ News.com.au homepage – 5.87
7/ Yahoo!7 – 5.87
8/ Brisbane Times homepage – 5.86
9/ RSVP – 5.66
10/ Elders Weather – 5.5
11/ Herald Sun homepage – 5.5
12/ ESPN Soccernet – 5.48
13/ Trading Room – 5.41
14/ The West homepage – 5.39
15/ WA Today Homepage – 5.38
16/ The Courier Mail homepage – 5.26
17/ ASX – 5.2
18/ The Daily Telegraph homepage – 5.18
19/ Four Four Two – 5.12
20/ Habbo – 5.09

These figures are based on the month of July, sourced from Nielsen Market Intelligence

The list seems to be dominated by default homepage choices (Bigpond, Optus, ninemsn) which users have kept … plus sports sites (particularly soccer), stock trading and local news (the strong showing of The Age, SMH, Courier Mail, WA Today etc shows that these brands are still getting a stack of direct, loyal traffic).

Session Duration
This is important as it demonstrates that when users arrive at these sites, the information is compelling and interesting enough for them to stay and take it all in. They aren’t in/out …

1/ IT Wire – 21m, 49s
2/ Lifehacker – 17m, 51s
3/ Weatherzone – 17m, 18s
4/ Gizmodo – 15m, 13s
5/ Trading Post Automotive – 12m, 55s
6/ Carsales – 12m, 54s
7/ Cricinfo – 12m, 49s
8/ Zoo weekly – 12m, 47s
9/ RSVP – 12m, 44s
10/ Essential Baby – 12m, 29s
11/ Business Spectator – 11m, 59s
12/ Real Estate.com.au – 11m, 42s
13/ Kotaku – 11m, 37s
14/ Take40.com – 11m, 27s
15/ SMH homepage – 11m, 10s
16/ Hot Hits – 10m, 51s
17/ REIWA – 10m, 30s
18/ Sportal – 10m, 27s
19/ Carpoint – 10m, 26s
20/ RAA (SA) – 10m, 19s

These figures are based on August data to date, sourced from Nielsen Market Intelligence

Key trend here is that technology users will spend large amounts of time on a site they trust/respect. Auto and real estate sites do well (as is to be expected with such involved purchases). It is interesting to see Essential Baby users spend almost 12m per session whilst Kidspot users on average spend around 3m, 45s (or about 30% of the time). Also interesting to see that Business Spectator is the only finance news site in the top 20; the next stickiest is The Australian‘s Business section, with users spending on average around 7m, 30s per session.

So next time someone blurts out “engaged audience” ask them how long users are spending on their sites and how often they’re coming back.

33 responses to “So which AU sites are “engaging”?”

  1. I am guessing that you could not get any data on the Australian audience of global sites? Doesn’t really paint the whole interweb picture, does it?

    Would another measure of engagement be interaction with the site? I don’t know how you would measure this, but that is one of the first things I think of with engagement.

    The Allure Media stable is punching above its weight with its session duration times! It’s massive for a blog of sorts to have 15min+ average time spent! They must be doing something right!

  2. talkingdigital

    good points jules. i must point out this list is limited to sites on AU Nielsen Market Intelligence.

    It’s extremely difficult to do the same task through Netview to get a picture of AU and o/s sites, as you can’t filter on these two variables.

    interaction with the site i think can be looked at via pages viewed – particularly as most of these sites are content destinations rather than interactive sites.

  3. Hmmm …

    Without responding to the assertions here about competitors etc, I will say that my primary measurement of engagement with sites is whether users are commenting on the site or not.

    I’m going to blow our own horn a bit and point out that ZDNet.com.au’s audience is among the most vocal. Most of our articles these days attract comments, and often we get dozens on a single yarn. You can see here, for example, a 48-comment discussion on a relatively innocuous story:

    http://bit.ly/xP2uC

    You can see the wider comment stream here:

    http://bit.ly/2uSBk7

    Cheers,

    Renai LeMay
    News Editor
    ZDNet.com.au
    renai.lemay@zdnet.com.au

  4. Lots of great posts about engagement lately!

    I’ve recently run a similar analysis of AU MI data for July but instead have used an Engagement Index combining scores from 4 different engagement metrics:
    – Frequency
    – ASD
    – Page Depth
    – Popularity

    Which sites rate the highest in terms of overall engagement? Here are a few highlights:

    * Cricinfo 9.75/10
    * Masterchef 9.25/10
    * realestate.com.au 9/10
    * SMH 8.5/10

    Itwire may have the highest ASD but overall it scores 6/10 due to below-average growth and page depth. 6 is still a very solid score though.

    This is all unofficial (with my Digital WatchDog hat off) since none of these example sites are audited yet (so they could have measurement errors impacting the figures). Also note: outliers, non-ad supported sites, sites with flawed measurement etc excluded from the dataset to create more accurate market averages.
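    (As a rough illustration of how four metrics could collapse into one score – the weighting and normalisation below are invented for the example, not the actual unpublished WatchDog methodology:)

```python
# Toy sketch: combining four engagement metrics into a single 0-10 index.
# Weighting and normalisation are made up for illustration -- not the
# actual (unpublished) methodology discussed above.

def engagement_index(site, market_avg):
    """Score each metric relative to the market average, cap at 2x market,
    then combine with equal weights into a 0-10 score."""
    metrics = ["frequency", "asd", "page_depth", "popularity"]
    score = 0.0
    for m in metrics:
        ratio = min(site[m] / market_avg[m], 2.0)  # cap outliers at 2x market
        score += 0.25 * (ratio / 2.0) * 10         # equal weights, 0-10 scale
    return round(score * 4) / 4                    # quarter-point steps

market = {"frequency": 4.0, "asd": 420, "page_depth": 8, "popularity": 1.00}
site   = {"frequency": 5.5, "asd": 769, "page_depth": 12, "popularity": 1.05}
print(engagement_index(site, market))  # -> 7.25
```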

  5. I am wondering whether I should get my site FanFooty audited. My relevant July numbers from Analytics:

    Frequency: 4.8 (1.05M visits from 218k uniques)
    Session duration: 22 min 35 sec
    Page depth: 7.36 pages per visit
    Popularity: Not exactly sure what this refers to… ranking in the sector perhaps? The site has grown 100% year-on-year the last two years if that’s what you mean.

    I have had bad experiences with Nielsen, they charge too much for my traffic levels. Why shouldn’t I rely on Google?

  6. talkingdigital

    Paul – first of all I must admit I am an unashamed FanFooty fanboy. I use the site Fri/Sat/Sun multiple times to get SC scores – so thank you for providing it.

    Re relying on Google: you can rely on it … problem is the numbers aren’t in the public domain, so potential clients can’t access them when pulling category numbers.

  7. Good to hear Ben. Maybe you can help me monetise it next season, because I haven’t done a very good job of it so far. 🙂

    Nielsen seems to me to be a racket, and I can’t afford to pay their extortion rates just to get access to premium advertisers at this stage.

  8. Nice analysis Ben. I think total time on site during a month is possibly a better measure of engagement than session duration. Some sites serve their users by delivering information quickly, easily and with a minimum of clicks, hence shorter sessions.

    Andrew Hunter
    Head of News, Sport and Finance, ninemsn

  9. Paul,
    Those are excellent figures, probably in the 8-10 range.

    It’s good to see another niche site thinking about engagement. That’s the aim of this model – to emphasize and simplify engagement (over reach metrics) so that the conversation can turn to who has the better audience, not just the biggest.

    Popularity is UB growth from the previous month. But since the *real* market growth was quite low, any growth will provide you with a good Pop score (for July at least). Outliers in MI (e.g. with misleading 200,000% growth) are impacting the reported average.

    One more note on Google Analytics: after auditing/comparing 50 websites in the past 2 months that are using both Nielsen & GA, GA is consistently overcounting by 10-20%, because it seems to be letting through more bot traffic.
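    (To make the bot point concrete: panel-based meters count recruited people by construction, whereas page-tag analytics has to filter bots out of raw hits using signals like the user-agent string. A crude sketch of that kind of filter – the pattern list is illustrative, not any vendor’s actual rules:)

```python
# Crude illustration of user-agent-based bot filtering -- the kind of
# cleanup page-tag analytics can miss. Pattern list is illustrative only.
import re

BOT_PATTERN = re.compile(r"bot|crawler|spider|slurp", re.IGNORECASE)

def looks_like_bot(user_agent: str) -> bool:
    return bool(BOT_PATTERN.search(user_agent))

hits = [
    "Mozilla/5.0 (Windows NT 6.1) Firefox/3.5",
    "Googlebot/2.1 (+http://www.google.com/bot.html)",
]
human_hits = [ua for ua in hits if not looks_like_bot(ua)]
print(f"{len(human_hits)} of {len(hits)} hits counted")  # 1 of 2
```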

  10. But is this irrelevant to brands if they can’t figure out how to tap ‘engagement’ and become a synergistic part of that trusted environment?

  11. Hi Ben,

    Re: AU Nielsen Market Intelligence – “It’s extremely difficult to do the same task through Netview to get a picture of AU and o/s sites, as you can’t filter on these two variables.”

    How long do you think until we as an industry can do this?

    I find it incredibly frustrating that I can’t research stats and engagement for sites that Nielsen doesn’t track. What about sites like Last.FM, Facebook, etc etc. Global monoliths that are just a black hole from a research perspective.

    More often than not you have to wait days for someone from the site’s sales dept to respond to a query for this info. I’ve been amazed by how bad this is recently.

    Any thoughts on who could provide this info, or build a research system to rival Nielsen that actually provides AU traffic stats for all websites around the world? UB, PIs, ASD, Frequency and Pages Per Visit are all vital when making research decisions.

    Declan

  12. Hi Ben,

    I really like reading your blog and I agree that “engagement” is a good metric on which to evaluate a site. I know it’s something you’ve been talking about for a while.

    However, more than half of the top sites on your session duration list have an automatic page refresh, at various intervals (see the list below, and the sketch after it). This has *got* to be affecting their Average Session Duration numbers, as each time a page reloads, it extends the length of the session. It gives an unrealistic picture of how long people are staying on the site, as pages auto-refreshing in inactive browser tabs are counted the same in Nielsen as pages loaded by the user clicking around a site.

    (As an example of the effect this can have – and as much as I don’t want to focus on a competitor – have a look in MI at the top site in the list, iTWire, and look at the session times recently. They have more than doubled between June and August, whilst the average daily UBs have not changed. The site itself hasn’t changed or relaunched, the content mix is the same, there is no new functionality, so how do you explain such a jump in such a short period? Is it just coincidence that their auto refresh changed to 15mins across all pages?)

    For a good measure of engagement, I agree with Renai’s post above – user comments and forum activity are a very powerful measure. However it’s harder to aggregate and report on, as it’s not tracked in MI or GA. Alexx, I’d love to see an official “engagement index” with comments included somehow…

    1/ IT Wire – auto refresh of 15mins, all pages
    2/ Lifehacker – auto refresh of 15mins, all pages
    3/ Weatherzone – auto refresh of 10mins
    4/ Gizmodo – auto refresh of 15mins, all pages
    5/ Trading Post Automotive – no refresh that I can see
    6/ Carsales – no refresh that I can see
    7/ Cricinfo – no refresh that I can see
    8/ Zoo weekly – auto refresh of 15mins
    9/ RSVP – auto refresh of 60mins
    10/ Essential Baby – auto refresh of 5mins
    11/ Business Spectator – auto refresh of 4mins
    12/ Real Estate.com.au – auto refresh of 5mins
    13/ Kotaku – auto refresh of 15mins, all pages
    14/ Take40.com – no refresh that I can see
    15/ SMH homepage – auto refresh of 5mins
    16/ Hot Hits – no refresh that I can see
    17/ REIWA – no refresh that I can see
    18/ Sportal – auto refresh of 6mins
    19/ Carpoint – no refresh that I can see
    20/ RAA (SA) – no refresh that I can see
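    (For anyone wanting to reproduce a check like this, a quick sketch that looks for the <meta http-equiv="refresh"> tag – one common auto-refresh mechanism; JavaScript-driven reloads won’t show up this way:)

```python
# Quick check for a <meta http-equiv="refresh"> tag -- one common
# auto-refresh mechanism. JS-driven reloads need deeper inspection.
import re
import urllib.request

def find_meta_refresh(url):
    """Return the refresh interval in seconds, or None if no tag is found."""
    html = urllib.request.urlopen(url, timeout=10).read(200_000).decode(
        "utf-8", errors="replace")
    match = re.search(
        r'<meta[^>]+http-equiv=["\']?refresh["\']?[^>]*content=["\']?(\d+)',
        html, re.IGNORECASE)
    return int(match.group(1)) if match else None

seconds = find_meta_refresh("http://example.com")  # placeholder URL
print(f"refresh every {seconds}s" if seconds else "no meta refresh found")
```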

  13. talkingdigital

    hi rhys,

    how does auto refresh change dwell time? my understanding would be it only impacts page impressions.

    my feeling on this – and it’s more a feeling than anything scientific – is that auto refresh’s impact on these metrics would be small. what % of users stick around idle long enough for the auto refresh to kick in?

  14. Rhys, the ABA’s Digital WatchDog committee will be locking down the guidelines for auto-refresh in the next couple of months. The committee will have representation from ABA audited publishers and the MFA, so the issue can be tackled from both sides. Now that the industry is beginning to actually monitor rules compliance, it will be harder for sites to game the stats and keep it under the radar.

    An official engagement index may or may not be in the works. 😉 haha. Stay tuned.

  15. Hi Ben,

    I think the key is that pages are auto-refreshing even when they are not “active” to the user. Think about how people use the web. They have multiple instances of a browser like Firefox or IE open, with different websites up in each. More often than not, they will also have multiple tabs open (and since IE7 launched, all browsers now support tabbed browsing), with multiple websites in each.

    Generally, you look at one website, then move on to another and another and so on. Throughout your day, you’ll probably leave a few websites open in tabs or browsers because you answer an email, write a document, answer the phone, go get lunch, or whatever.

    For sites that are still open and have an auto refresh, the page will continue to reload automatically on its timer even when the site is not “active” – ie: when the user is not explicitly looking at it. As long as it’s open in a browser or tab, the auto refresh behaves as if you’re still looking at it, even if it is “idle”, and will keep firing. The auto refresh doesn’t care whether it’s idle or not.

    Each time it refreshes, it tells Nielsen that the user is still there, and extends the session time that’s recorded.

    I think that for tech sites the problem is probably amplified, as tech-savvy users probably have more tabs and browsers open. (3 out of the top 5 on your ASD list are tech sites.)

    My feeling is that this can have quite a significant impact on PIs and ASD/dwell time. If you look closely at the sites that have changed their auto refresh timing, and then look at the PI and ASD numbers, you can see a pretty clear correlation.
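    (To put rough numbers on that, a toy model of what an idle tab does to measured session time – assuming, as described above, that any page load keeps the recorded session alive:)

```python
# Toy model: how an auto-refreshing idle tab stretches a measured session.
# Assumes the meter keeps a session alive as long as pages keep loading.

def measured_session_mins(active_mins, idle_tab_mins, refresh_mins):
    if refresh_mins is None:
        return active_mins  # session ends when the user stops clicking
    # every refresh that fires during idle time extends the recorded session
    refreshes = idle_tab_mins // refresh_mins
    return active_mins + refreshes * refresh_mins

# user actively reads for 4 minutes, then leaves the tab open for an hour
print(measured_session_mins(4, 60, None))  # -> 4  (no auto refresh)
print(measured_session_mins(4, 60, 15))    # -> 64 (15-min refresh keeps it alive)
```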

  16. Hi Ben,
    Nice analysis. FYI – Habbo.com.au is normally measured at 18 to 20 minutes average time on site – however since we launched a new Flash client in mid June, these visits have not been counted. This is being rectified and you should see Habbo’s figures rise back towards the top of the list.
    Incidentally, as you probably know, there is a huge discrepancy between Nielsen and both Google Analytics and Hitwise when it comes to time-on-site measurement. Comparing May figures (prior to the Habbo Nielsen measurement issue) across the three systems: Nielsen MI – 18 mins 11 sec, Google Analytics – 42 mins 43 sec, Hitwise – 57 mins 35 sec.

    Jeff

  17. @Dex, re non-Nielsen site tracking: isn’t that what Hitwise does, track all sites via ISPs without having to engage with publishers?

  18. How is it that auto-refreshing is still common practice? Surely we can do better!

    Many of the sites in that list would be people’s homepages. In theory you could leave your browser untouched for a whole work day and, if SMH (with its 5-minute refresh) was your homepage, you’d have generated 96 page impressions (480 minutes ÷ 5) and an 8-hour session time without a single ad being seen.

  19. So what is good engagement vs bad engagement?

  20. and to add to that somewhat short post… do you think there is any validity in coming up with a single figure or index to represent the engagement factor – as you put it, ‘how often’ and ‘how long’ (i.e. by multiplying frequency by average session duration)?

  21. Good read, good chatter!

    I tend to agree with Hunter. I think tracking monthly minutes per visitor (calculated average) is a more useful metric of engagement than minutes spent in one individual visit.
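    (That figure is easy to derive from the post’s own numbers – here mixing its July frequency and August duration periods, so treat the result as illustrative only:)

```python
# Monthly minutes per visitor = frequency x average session duration.
# Uses the SMH homepage figures quoted in the post above; the two come
# from different reporting periods, so the result is illustrative.
frequency = 7.71         # visits per user per month (July)
asd_mins = 11 + 10 / 60  # 11m 10s per visit (August to date)
print(f"~{frequency * asd_mins:.0f} minutes per visitor per month")  # ~86
```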

  22. Hi Rhys,

    I know it must have pained you terribly to name and comment on a competitor – especially when your site didn’t rate a mention.

    Just to correct a factual error on your part: in fact the content mix of iTWire has changed considerably since June, as has our talent line-up.

    Best regards and keep punching
    Stan Beer
    Editor

  23. Renai, interesting point about measuring engagement through comments. Again, though, the number crunching could be taken past the superficial count. For example, this post’s 22 comments held my attention (and motivated me to leave a note) for much longer than 150 comments on other sites ever would – especially when the popularity of some of those sites means that most commenters are just looking for a link back to their blogs.

  24. talkingdigital

    it’s interesting looking at engagement via comments. i guess it depends on the definition of engagement … is it people actively contributing and interacting or is it people spending time with the site and reading the content passively? mandi is right – there are a lot of sites where the comments are motivated by linkbacks or traffic etc

  25. Hi Ben,

    I have been following your blog and finally had the time to write in. This topic brought me out. At comScore there are a number of metrics that could be used for engagement, including Frequency (avg visits per visitor) and Duration (avg minutes per visit). Others include “avg minutes per usage day”, “avg pages per usage day”, “avg visits per visitor”, etc. We have these because different types of sites use different metrics for their own engagement. Not to get into a debate on what the best engagement metric is, but I would like to share some numbers from our Australia panel:
    Rank – Web Domain – UV – avg minutes/visitor – avg visits/visitor

    1 – GOOGLE.COM.AU – 10,022 – 2.4 – 21.0
    2 – LIVE.COM – 7,822 – 6.6 – 13.8
    3 – NINEMSN.COM.AU – 6,876 – 3.1 – 14.2
    4 – FACEBOOK.COM – 6,102 – 8.7 – 16.8
    5 – YOUTUBE.COM – 5,782 – 15.9 – 7.7
    6 – GOOGLE.COM – 5,658 – 2.7 – 8.4
    7 – YAHOO.COM – 4,261 – 4.2 – 5.8
    8 – EBAY.COM.AU – 3,855 – 14.9 – 6.2
    9 – WIKIPEDIA.ORG – 3,833 – 3.2 – 4.0
    10 – MSN.COM – 3,556 – 1.4 – 6.1
    11 – MYSPACE.COM – 3,531 – 12.4 – 10.5
    12 – YAHOO.COM.AU – 3,269 – 4.7 – 9.0
    13 – MICROSOFT.COM – 3,096 – 2.5 – 2.3
    14 – BLOGGER.COM – 2,991 – 1.9 – 2.8
    15 – BING.COM – 2,724 – 1.3 – 4.3
    16 – EBAY.COM – 2,530 – 2.0 – 4.9
    17 – COMMBANK.COM.AU – 1,999 – 5.0 – 4.9
    18 – NEWS.COM.AU – 1,993 – 10.6 – 4.8
    19 – APPLE.COM – 1,866 – 2.6 – 3.9
    20 – WORDPRESS.COM – 1,748 – 1.4 – 1.9
    21 – ASK.COM – 1,729 – 1.2 – 2.9
    22 – BIGPOND.COM – 1,704 – 2.8 – 7.5
    23 – ABOUT.COM – 1,560 – 2.0 – 1.6
    24 – ADOBE.COM – 1,495 – 2.5 – 1.7
    25 – BEBO.COM – 1,475 – 13.0 – 8.5
    26 – MOZILLA.COM – 1,460 – 0.5 – 2.1
    27 – IMDB.COM – 1,241 – 3.6 – 2.3
    28 – WIKIANSWERS.COM – 1,235 – 1.5 – 1.8
    29 – REALESTATE.COM.AU – 1,224 – 16.6 – 3.6
    30 – ABC.NET.AU – 1,187 – 4.6 – 2.5

    This is comScore’s top web domain report for June 2009 from our panel of 30,000 Australians under measurement.

    (Not sure how the formatting will look on the above. Apologies in advance)
    – Joe

  26. Yeah there is definitely a big difference between sites which have comments from people mainly looking for linkback etc, and those with a “real” discussion between readers.

    I don’t think it’s too hard to quantify or qualify this kind of engagement though. You can do stats on frequency of posting per user, or total comments per day in general. And as for qualitative measurements … well it’s pretty easy for readers, marketers and everyone in general to see which sites have readers talking and which don’t.

    My 2c,

    Renai

    PS Ben when are you going to turn on threaded talkback on your blog? Jeez!

  27. talkingdigital

    what is threaded talkback? once i know what it is i can investigate!

  28. Threaded comments means you can reply to a specific comment and your comment will appear under that comment in a separate ‘thread’. At the moment all of the comments here are just in one, massive, chaotic thread 😉

    Should be a WordPress plugin for it if you’re using WP.

    Renai

  29. Hi Ben,

    Great blog – would you know where I can locate the full Nielsen report?

    Regards,
    Aylin

  30. talkingdigital

    Hi Aylin – GroupM have a subscription to Nielsen and this is how we cut the data. Sensis should have access to the same numbers, as I’d imagine they are also a Nielsen client.

  31. Just came across this discussion. I’d really have to question Stan’s idea of metrics. As wonderful as your mix of content and talent line-up may be, it does not account for the fact that your site is listed at number 1, ahead of real estate and car sales sites, while CNET – proven to be the number 1 tech website by unique users and page impressions – doesn’t even make an appearance. Seriously, look at the 20 listed: surely common sense prevails that NO ONE in their right mind would spend 17 minutes on a weather site.

    It also makes sense that people looking for cars, new partners or real estate will stay on those sites for a much longer time. So what does that say about the top 4 sites? ALL of them have an auto refresh on pretty much every page, at between 10 and 15 mins.

    Stan, in honour of your boxing analogy to Rhys about punching, how about a challenge for you.

    The challenge to you, and to all the others that are auto-refreshing: if you wish to prove that your site is worthy of its number 1 spot because of its engaging mix of content and new talent, please turn off your auto refresh on all pages for a month and let’s check back after that to see where you are.

    Round 1. Ding, Ding. 🙂

    Nick

  32. Hi Nick,

    For a start, where do you get “my idea of metrics” from?

    I didn’t post this blog – take it up with the author (and Nielsen) if you want to question the metrics and the interpretation thereof.

    My comment to Rhys was merely to correct his inaccurate assertion that our content mix hadn’t changed.

    As for your challenge, we have nothing to prove.

  33. iTWire still using page refresh to boost their figures
