Australian ISP Market Share, March 2012

Back in 2010 I had a go at estimating the relative market shares of Internet Service Providers (ISPs) in Australia. A reader has asked if I have more recent data, so I thought it was high time I revisited these estimates.

I’ve applied the same method as last time: Roy Morgan make ISP customer profiles available for purchase. At the bottom of each report’s synopsis, you’ll see that a sample size has been included (for example, the Internode customer profile is based on a sample of 404 customers). Combining the sample sizes from these ISP profiles could, I think, provide a reasonable basis for estimating overall market share.

My results are presented in Table 1 below. Market share is ALL business, government and home subscribers with ANY kind of internet access including dialup, DSL, cable, fibre, satellite and wireless (fixed & mobile).

The Roy Morgan samples behind these estimates were taken between April 2011 and March 2012 — except where marked with an asterisk. Marked samples were taken between April 2010 and March 2012 (i.e. over two years instead of one), so please regard estimates based on these data with extra caution. Market share in these cases could actually be much smaller than calculated here.

Several ISPs listed individually in Table 1 are actually subsidiaries  of larger groups, particularly the iiNet group. As a group, iiNet now has a combined market share of 13.2%. Please refer to the notes under the table for more details.

Table 1: Estimated Australian ISP market share, March 2012

Internet Service Provider / Roy Morgan sample size (no.) / Estimated relative market share (%)
3 Internet 243 1.8%
AAPT 255 1.9%
Adam 185 1.4%
Chariot 135 * 1.0% *
Dodo 348 2.6%
Exetel 119 0.9%
iiNet 660 4.9%
Internode 404 3.0%
iPrimus 237 1.8%
Netspace 164 1.2%
Optusnet 2,367 17.6%
TADAust Connect 144 * 1.1% *
Telstra Bigpond 6,607 49.2%
TPG 749 5.6%
Unwired 111 * 0.8% *
Virgin 163 1.2%
Vodafone 252 1.9%
Westnet 297 2.2%
TOTAL 13,440 100.0%

* Treat these data/estimates with extra caution. The time period for these samples is two years, April 2010 – March 2012. All other time periods are one year, April 2011 – March 2012.

Important notes:

  1. AAPT, Netspace, Westnet, and Internode are owned by iiNet. Estimated market share for the whole iiNet group = 13.2%.
  2. Adam Internet serves South Australia and Northern Territory only. Also, Adam are in the process of being acquired by Telstra (at time of writing).
  3. Chariot is owned by TPG.


Queuing Theory and iiNet, Part II

It’s an interesting read, but the author makes a lot of basic errors. Unfortunately, customers refuse to line up and call at regular intervals and spend the average amount of time on the call. The reality is obviously more bursty than that and needs non-linear modelling.

- Michael Malone, CEO of iiNet

The feedback from Michael Malone above was in response to my previous blog post on Applying Queuing Theory to iiNet Call Centre Data. I don’t accept that I made “a lot of basic errors”, but I did make a lot of assumptions. Or perhaps the statistician George E. P. Box said it better, “Essentially, all models are wrong, but some are useful.”

But Michael is correct – customers don’t line up and call at regular intervals, and the reality is more “bursty” (i.e. Poisson). My model is inadequate because it doesn’t take into account all the natural variation in the system.

One way of dealing with, or incorporating, this random variation into the model is by applying Monte Carlo methods.

Take the iiNet Support Phone Call Waiting Statistics for 6 February 2012, specifically for the hour 11am to noon. I chose this time block because the values are relatively easy to read off the graph’s scale – (a bit over) 664 calls and an average time in the queue of 24 minutes.

Now if we assume Average Handling Time (AHT), including time on the call itself followed by off-phone wrap-up time, was 12 minutes, then my model says there were 664*(12/60) / (24/60 + 1) = 95 iiNet Customer Service Officers (CSOs) actually taking support calls between 11am and noon on 6 February 2012. That’s an estimate of the average number of CSOs actually on the phones and taking calls during that hour, excluding those on a break, performing other tasks, and so on. Just those handling calls.

But there will be a lot of variation in conditions amongst those 664 calls. I constructed a little Monte Carlo simulation and ran 20,000 iterations of the model with random variation in call arrival rates, AHT, and queue wait times.


Assuming that:

  • Little’s Law applies
  • 664 calls were received that hour (at a steady pace)
  • the average time in the queue was 24 minutes
  • AHT (time on the actual call itself plus off-call wrap-up) was 12 minutes

then the result of the 20,000 Monte Carlo runs is a new estimate of 135 iiNet CSOs taking support calls between 11am and noon on 6 February 2012.
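A minimal sketch of such a simulation, using the earlier Little’s-Law arithmetic per run. The input distributions here are my own illustrative choices (normal-approximated Poisson arrivals, lognormal spreads on AHT and wait), not necessarily those behind the 135 figure:

```python
import random

def csos_on_calls(calls_per_hour, aht_min, wait_min):
    # Point estimate from the model: N = lambda*T / (W + 1), times in hours
    return calls_per_hour * (aht_min / 60) / (wait_min / 60 + 1)

def simulate(runs=20_000, seed=42):
    # Assumed input distributions (illustrative only):
    # arrivals around 664/hr, lognormal spread on AHT and queue wait
    rng = random.Random(seed)
    total = 0.0
    for _ in range(runs):
        calls = rng.gauss(664, 664 ** 0.5)      # normal approximation to Poisson(664)
        aht = rng.lognormvariate(2.44, 0.30)    # median roughly 11.5 min handling time
        wait = rng.lognormvariate(3.13, 0.30)   # median roughly 23 min queue wait
        total += csos_on_calls(calls, aht, wait)
    return total / runs
```

With these particular spreads the average lands near the earlier point estimate; how far the Monte Carlo answer moves away from 95 depends entirely on the variation you assume.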

I ran a few more simulations, plugging in different values for number of CSOs handling calls (all else remaining equal – i.e. 664 calls an hour; AHT=12 minutes) to see what it did for average time in the queue. The results are summarised in the table below:

Modelling suggests that if iiNet wanted to bring the average time in the phone call support queue down to a sub-5 minute level during that particular hour of interest, an additional 85% in active phone support resourcing would need to be applied.

The table of results is graphically presented below (y-axis is time in queue, x-axis is CSOs).

Looks nice and non-linear to me :) You can see the law of diminishing returns start to take effect around the point of the graph corresponding to 160 CSOs / 16.5-minute average queue wait time.


Applying Queuing Theory to iiNet Call Centre Data

In previous posts I’ve talked about queuing theory, and the application of Little’s Law in particular, to Internet Service Provider (ISP) customer support call centre wait times. We can define Little’s Law, as it applies to a call centre, as:

The long-term average number of support staff (N) in a stable system is equal to the total number of customers in the queue (Q) multiplied by the average time support staff spend resolving a customer’s technical problems (T), divided by the total time waited in the queue (W); or expressed algebraically: N=QT/W.

Thinking things through a bit more, the total number of customers in the queue (Q) at a point in time in a stable system should be equal to the rate at which people joined the queue (λ), minus the rate at which the support desk dealt with technical problems (i.e. N/T) over the period of observation. Obviously Q>=0.

So N=QT/W and Q=λ-N/T which all comes out in the wash as:

N = λT / (W + 1)
I thought it might be a bit of fun to see if this could be applied to the customer support call centre waiting statistics published by one of Australia’s largest ISPs, iiNet.

iiNet make some support centre data available via their customer toolbox page. Below is a screenshot of call activity and wait times graphed each hour by iiNet on 10 January 2012. The green line (in conjunction with the scale on the left hand side of the graph) represents the average time (in minutes) it took to speak to a customer service representative (CSR), including call-backs. The grey bars (in conjunction with the right hand scale) represent the total number of incoming phone calls to iiNet’s support desk.

It may be possible to use the formula derived above to estimate how many CSRs iiNet had on the support desk handling calls that day. For example, during the observed peak period of 8am to 1pm on Tuesday, 10 January 2012, the iiNet support desk was getting around 732 calls per hour on average. The expected wait time in the queue over the same period was around 11 minutes.

If we assume that the average time taken for a CSR to resolve a technical problem is, let’s say, 12.5 minutes, then we can estimate that the number of CSRs answering calls in a typical peak-hour between 8am to 1pm on 10 January 2012 as:

732*(12.5/60) / (11/60 + 1)

= 129 CSRs actively handling calls.

Sounds sort of reasonable for a customer service-focussed ISP the size of iiNet. But if iiNet wanted to bring the average time in the queue down even more – to a more reasonable 3 minutes, for example – they’d need 145 CSRs (all else remaining equal) during a typical peak-hour answering calls.
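The arithmetic above is easy to wrap in a small function (the 12.5-minute resolution time is the assumption stated earlier):

```python
def csrs_needed(calls_per_hour, resolve_min, queue_wait_min):
    # N = lambda*T / (W + 1), with T and W converted from minutes to hours
    return calls_per_hour * (resolve_min / 60) / (queue_wait_min / 60 + 1)

print(round(csrs_needed(732, 12.5, 11)))  # 129 CSRs at the observed 11-minute wait
print(round(csrs_needed(732, 12.5, 3)))   # 145 CSRs for a 3-minute target
```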


“iiNet fares best in TIO complaints…”

Or do they?

The above headline and corresponding story appeared in ARN on 4 May 2011. The article was reporting on the Telecommunications Industry Ombudsman’s (TIO) January to March 2011 statistics published here.

Now before I go any further, let me say I have grave concerns with the source data itself, let alone the way it’s being reported in the press. When talking about TIO complaint statistics, it is worthwhile to familiarise yourself with this recent research report from the University of Technology, Sydney (UTS).

Findings of the UTS research include (from the executive summary):

  • Over 80% (of carriage service providers surveyed) assert that the TIO accepts complaints which are out of jurisdiction, frivolous or vexatious.

Recommendations include:

  • Amend TIO policy and procedure to cease the multiple counting of complaints in statistics and recommence reporting disposition of complaints.
  • Amend TIO policy and procedure to refer to Level 1 Complaints as Contacts rather than Complaints.

So fair to say the TIO isn’t exactly highly regarded by the ISP industry. Indeed, Exetel have commenced legal action against the TIO for acting outside its charter. Or, in Exetel CEO John Linton’s own words, “Exetel is now going to take a court action to have the TIO closed as being a ‘criminal’ organisation based on a level of incompetence, lack of knowledge and unconstitutional actions that even Australians, who are mostly as apathetic as the governments they tolerate, shouldn’t have imposed on them.”

Well OK then…

Now if you still want to accept that all “complaints”, as reported by the TIO, are real, legitimate complaints, then the fairest way to present the numbers is as a proportion of overall customer base. That’s going to be a bit tricky, because not all ISPs publish accurate customer numbers in the public domain. So I’ve had to estimate relative market share using Roy Morgan survey data and the methodology from a previous blog post. When I refer to market share I mean ALL business, government AND home subscribers with ANY kind of internet access including dialup, DSL, cable, fibre, satellite and wireless (fixed & mobile). The Roy Morgan surveys were conducted between January 2010 and December 2010.

Then, to be consistent, I only looked at TIO complaints made against a provider’s internet services. That is, I excluded complaints made against landline and mobile phone services. I considered all internet services TIO complaints made between October 2010 and March 2011.

Then I derived a normalised score by dividing each ISP’s relative market share by its proportion of complaints (and multiplying by 1000). The benchmark is a score of 1000 and a higher score is better. A score over 1000 means that an ISP recorded relatively fewer complaints as a proportion of its customer base than its peers. Less than 1000 means that an ISP was over-represented in TIO complaints relative to its size.
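That normalisation step, in code (a sketch of the calculation as just described, with percentages as inputs):

```python
def tio_score(market_share_pct, complaints_pct):
    # Benchmark = 1000; above 1000 means fewer complaints than the ISP's size predicts
    return round(1000 * market_share_pct / complaints_pct)

tio_score(10, 5)   # an ISP with 10% of the market but only 5% of complaints scores 2000
tio_score(5, 10)   # the reverse situation scores 500
```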

The TIO complaints leaderboard comes out as follows:

Table 1: TIO internet services complaints leaderboard, 2010/2011

iiNet did indeed do very well. With roughly an 11% market share, but only 6% of complaints, it gets a score of 1843. That’s well above the expected benchmark of 1000. But it didn’t fare best. The ISP that recorded the lowest number of TIO complaints as a proportion of number of customers was Internode, with a massive normalised score of 3448.


The NBN, CVC and burst capacity

Late last year, NBN Co (the body responsible for rolling out Australia’s National Broadband Network) released more detail on its wholesale products and pricing. You can download their Product and Pricing Overview here. The pricing component that I wanted to analyse in this post is NBN Co’s additional charge for “Connectivity Virtual Circuit” (CVC) capacity.

CVC is bandwidth that ISPs will need to purchase from NBN Co, charged at the rate of $20 (ex-GST) per Mbps per month. Note that this CVC is on top of the backhaul and international transit required to pipe all those interwebs into your home. And just like backhaul and international transit, if an ISP doesn’t buy enough CVC from NBN Co to cover peak utilisation, its customers will experience a congested service.

The problem with the CVC tax, priced as it is by NBN Co, is that it punishes small players. By my calculations, an ISP of (say) 1000 subscribers will need to spend proportionally a lot more on CVC than an ISP of 1,000,000 subscribers if they want to provide a service that delivers the speeds it promises.

Here come the statistics.

Consider NBN Co’s theoretical 12 megabit service with 10GB of monthly quota example that they use in the document I linked to above. 10GB per month, at 12Mbps gives you 6,827 seconds (a bit under 2 hours) at full speed before you’re throttled off. There’s 2,592,000 seconds in a 30-day month, so if I choose a random moment in time there is a 6827/2592000 = 0.263% chance that I’ll find you downloading at full speed.

That’s on average. The probability would naturally be higher during peak times. But let’s assume in this example that our theoretical ISP has a perfectly balanced network profile (no peak or off-peak periods). It doesn’t affect the point I’ll labour to make.

A mid-sized ISP with (let’s say) 100,000 subscribers can expect, on average, to have 100,000*0.263% = only 263 of those customers downloading at full speed simultaneously at any particular second. However, the Binomial distribution tells us that there’s a small but non-trivial probability (about 5%, i.e. significant at the alpha=0.05 level) that 290 or more customers could be downloading at the same time.

So a quality ISP of 100,000 subscribers will plan to buy enough CVC bandwidth to service 263 customers at any one time. But a statistician would advise the ISP to buy enough CVC bandwidth to service 290 subscribers, an additional (290-263)/263 = 10%, or find itself with a congested service about one day in every 20.

This additional “burst headroom”, as a percentage, increases as the size of the ISP decreases. From above, an ISP of 10,000 can expect to have 26 customers downloading simultaneously at any random moment in time. But there’s a statistically significant chance this could be 35+. This requires them to buy an additional (35-26)/26=33% in CVC over and above what was expected to cover peak bursts.
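The binomial arithmetic behind these headroom figures can be sketched as follows, using a normal approximation to the binomial’s 95th percentile:

```python
import math
from statistics import NormalDist

# ~0.263%: chance a 12Mbps/10GB subscriber is downloading at full speed
FULL_SPEED_PROB = 6827 / 2_592_000

def burst_headroom(subscribers, p=FULL_SPEED_PROB, quantile=0.95):
    # Expected simultaneous downloaders, and the 95th-percentile peak above it
    mean = subscribers * p
    sd = math.sqrt(subscribers * p * (1 - p))
    peak = NormalDist(mean, sd).inv_cdf(quantile)
    return (peak - mean) / mean   # extra CVC needed, as a fraction of the expected amount
```

`burst_headroom(100_000)` comes out near 10%, and `burst_headroom(10_000)` in the low 30s of percent, consistent with the worked examples above.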

The table below summarises, for ISPs of various sizes, how much additional CVC would need to be purchased over and above the expected amount, to provide an uncontended service 95%+ of the time.

Graphically it looks a bit like this…

As you can see, things only really start to settle down for ISPs larger than 100,000 subscribers. Any smaller than that and your relative cost of CVC per subscriber per month is disproportionally large.


Further reading:

Rebalancing NBNCo Pricing Model

NBN Pricing – Background & Examples, Part 1


Australian ISP Market Share, 2009-2010

The market research firm, Roy Morgan, has released its latest ISP satisfaction data, with an overwhelmingly positive result recorded for Internode and iiNet.

According to the latest Roy Morgan Internet Satisfaction data, Internode (93.4%) is still the top performer for customer satisfaction while iiNet (89.9%) appears to be closing the gap from 5.6% points in the 6 months to April 2010 to 3.5% points in the 6 months to May 2010.

Scrolling further down the Roy Morgan press release page, you’ll find individual ISP customer profiles available for purchase.  At the bottom of each report’s synopsis, you’ll see that a sample size has been included [for example, the Internode customer profile is based on a sample of 305 customers].  Combining the sample sizes from these ISP profiles could, I think, provide a reasonable basis for estimating market share.

My results/estimates are presented in the table below.  Market share is ALL business, government and home subscribers with ANY kind of internet access including dialup, DSL, cable, fibre, satellite and wireless [fixed & mobile].  The Roy Morgan samples were taken between April 2009 and May 2010.

Table 1: Estimated Australian ISP market share, 2009-2010

Internet Service Provider / Roy Morgan sample / Est. market share 2009-2010 (%)
3 Internet 322 2.6%
AAPT 342 2.8%
Adam 134 1.1%
Chariot 134 1.1%
Dodo 321 2.6%
Exetel 206 1.7%
iiNet 509 4.2%
Internode 305 2.5%
iPrimus 284 2.3%
Netspace 166 1.4%
Optus 2,099 17.3%
Primus-AOL 153 1.3%
TADAust 119 1.0%
Telstra 5,710 46.9%
TPG 539 4.4%
Unwired 175 1.4%
Virgin 158 1.3%
Vodafone 103 0.8%
Westnet 384 3.2%
TOTAL 12,163* 100.0%
  • AAPT, Netspace and Westnet are owned by iiNet
  • Chariot is owned by TPG
  • Adam offers residential internet access in South Australia & Northern Territory only


ISP Customer Service in 2009

A few weeks ago Whirlpool’s Australian Broadband Survey 2009 Report was released.  Last year I used the 2008 report to analyse the survey results specifically as they pertained to ISP customer service; so I thought it would be a good idea to update my analysis, and see just how much the ISP customer service landscape has changed over the 12 month period.

My objective, as it was last year, was to take results from these three survey questions related to customer service

  1. When calling customer support, how long did you have to wait on the phone (or talk to an operator) before you spoke to the right person?
  2. How quickly have technical support issues typically taken to resolve?
  3. How would you rate their customer service?

and distil them down to a single score that can be used to rank providers.  I arbitrarily set the benchmark score across the whole industry to be 1000, with each individual ISP’s customer service ranked relative to that benchmark.  So an ISP score higher than 1000 is above the industry average.  Lower than 1000 is below average.

The methodology employed was exactly the same as last year, so no need to go into the details.  Without further ado, here are the updated results:

Stan’s Top Five ISPs for Customer Service in 2009 [2008 rank in brackets]

  1. Adam Internet [3]
  2. Westnet [1]
  3. Amnet [2]
  4. Internode [4]
  5. iiNet [6]

Congratulations go to local Adelaide-based outfit, Adam Internet.  Number 1 with a bullet in 2009.  Westnet (purchased by iiNet in 2008) has always prided itself on providing subscribers with a premium customer service experience, so it was very surprising to see them knocked off their coveted number 1 spot.  Also surprising to see aaNet slip out of the Top Five altogether, replaced by iiNet.

The overall results from the three customer service questions (equally weighted) are as follows:

Table 1: Australian ISP customer service scores
(<1000: below average; 1000: average; >1000: above average)

ISP 1. Time in queue 2. Speed of resolution 3. Rating of service OVERALL CUSTOMER SERVICE SCORE
Telstra Cable 513 556 731 586
Telstra DSL 488 523 724 562
Optus Cable 605 886 812 748
Optus DSL 669 668 809 710
iiNet 1389 1410 1101 1284
Internode 1636 1803 1176 1488
TPG 757 665 812 739
Westnet 2249 2601 1220 1820
Adam 2796 2544 1139 1842
Exetel 1125 954 919 991
Netspace 559 958 973 777
aaNet 903 863 904 889
iPrimus 950 1273 1025 1066
Amnet 2792 2242 1128 1774
Telstra NextG 389 365 645 437
AAPT 879 604 846 755
Other ISPs 844 725 938 826
TOTAL 1000 1000 1000 1000

It’s important to keep in mind that Whirlpool’s Australian Broadband Survey isn’t scientific.  Although it gets tens of thousands of responses, it only reflects the opinions of those who are aware of the Whirlpool site and motivated to express an opinion.  It is a self-select survey and, as such, the respondents’ attitudes may not be statistically representative of the ISP’s customer base.   In other words, take with a grain of salt.

Ranked from highest to lowest the results are as follows:
ISP (2009 score) (2008 score):

  • Adam Internet (1842) (1727)
  • Westnet (1820) (2132)
  • Amnet (1774) (1735)
  • Internode (1488) (1348)
  • iiNet (1284) (1081)
  • iPrimus (1066) (903)
  • Exetel (991) (992)
  • aaNet (889) (1204)
  • Netspace (777) (912)
  • AAPT (755) (808)
  • Optus Cable (748) (736)
  • TPG (739) (963)
  • Optus DSL (710) (672)
  • Telstra Cable (586) (711)
  • Telstra DSL (562) (676)
  • Telstra NextG (437) (562)
  • Other (826) (959)
  • AVERAGE (1000)


Can I get a Little MORE support around here?

In November last year, I blogged about the phone queue reporting and graphing page beta-released by my ISP, Internode.  The aim was to use the data presented on that page, with some basic queuing theory (Little’s Law), to determine the size of their helpdesk.  I theorised that a rough estimate for how many Internode support staff are on duty at any particular point in time could be given by:

Calls in Queue x 12.5 / Wait Time

Looking at the hourly averages, I concluded that, on the Saturday of my analysis, Internode helpdesk had 8 or so people on hand to assist with customers’ technical problems.  I have been informed that my estimate for that period was surprisingly accurate.

The graphs and hourly averages data were taken offline for a little bit, but they’ve recently been reinstated.  I thought it would be timely and interesting to have another look and see what’s changed over the intervening months.  Last Saturday evening I went through and analysed the hourly averages covering the time period from 8pm Friday (17 July 2009) to 8pm Saturday (18 July 2009).  Note that Internode’s residential technical support helpdesk is staffed from 7am to midnight, 7 days a week.  I then applied the same methodology from Can I get a Little support around here to estimate the number of support staff on duty (last column).

Table 1: Internode helpdesk phone queue – hourly averages

Time period / Avg. wait time / Avg. calls queued / Support staff on duty
Friday, 8pm-9pm 00:21 0.1 4
9pm-10pm 00:22 0.1 3
10pm-11pm 00:21 0.0 not enough data
11pm-midnight 00:22 0.0 not enough data
Saturday, 7am-8am 00:22 0.1 3
8am-9am 00:30 0.2 5
9am-10am 00:22 0.2 7
10am-11am 00:26 0.3 9
11am-noon 00:22 0.2 7
noon-1pm 00:22 0.2 7
1pm-2pm 00:23 0.2 7
2pm-3pm 00:23 0.2 7
3pm-4pm 00:58 0.7 9
4pm-5pm 00:22 0.2 7
5pm-6pm 00:23 0.2 7
6pm-7pm 00:22 0.1 3
7pm-8pm 00:21 0.1 4
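The last column above can be reproduced with a one-liner, assuming the same 12.5-minute average resolution time as before:

```python
def staff_on_duty(avg_calls_queued, avg_wait_sec, resolve_min=12.5):
    # Little's Law: N = Q * T / W, with the wait converted to minutes
    return avg_calls_queued * resolve_min / (avg_wait_sec / 60)

staff_on_duty(0.1, 21)  # about 3.6, which rounds up to the 4 staff shown for Friday 8pm-9pm
```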

Looking through my small window of analysis, it appears that Internode have largely resolved any problems they were experiencing late last year/early this year in terms of extraordinarily long wait times.  Time spent in the phone queue has collapsed from around 10 minutes to less than 30 seconds.  However, this dramatic improvement doesn’t appear to be due to any significant increase in staff numbers.


Estimating Adam Internet’s Broadband Customer Numbers

Executive Summary

I estimate that Adam Internet is somewhere between 27% and 30% the size of Internode, which equates to between 42,249 and 46,934 broadband customers.  This estimate does not include other services such as 3G, VoIP and dialup customers.


Adam Internet is an Internet Service Provider (ISP) based in my home town of Adelaide, Australia.  They are tightly focussed in terms of the markets in which they operate – only providing residential broadband services (apart from 3G) to South Australia and the Northern Territory.  I’ve never used them, but by all accounts they are a very reputable company, with high levels of customer satisfaction and a strong parochial following in the Whirlpool forums.  Adam Internet is also the eternal, bitter, blood-rival of Internode.  Well, that might be a bit of an exaggeration.  I wanted to inject a bit of drama.

Nothing like a bit of drama.

Anyway, as a privately owned company, Adam Internet is not required to make public how many customers it has.  Indeed, when it comes to operational data, they play their cards very close to their chest indeed.  That’s fine.  It’s not really any of our business.  However, when the company last chose to talk about subscriber numbers, it was “about 75,000 customers”.  It’s not clear whether this is just residential broadband customers, or all customers including VoIP, dialup, and other services.

The Challenge

I thought it would be fun to look at some ratios available in the public domain and then, keeping with the Adam-Internet-vs.-Internode theme of this post, use Internode as a base line to estimate just the number of Adam Internet broadband customers.

The Data

Adam Internet and Internode both attract a very loyal following among the ne’er-do-wells of the Whirlpool forums.  With that in mind, the first metric I looked at was the ratio between respondents to the last three annual Whirlpool surveys.  Assuming that there’s a correlation between Whirlpool survey respondents and number of customers this may be a useful measure of relative size.  The results are summarised in the table below:

Whirlpool survey year Adam Internet respondents Internode respondents Respondent Ratio
2006 707 3,156 0.224
2007 1,003 2,981 0.336
2008 673 2,649 0.254
Average 794 2,929 0.271

Source: Total respondents taken from the question, “Would you recommend your ISP to other people?”

Results from the last three Whirlpool surveys suggests that Adam Internet is 27.1% the size of Internode.

The second metric I used was Google Trends.  Assuming that there’s a correlation between Google searches and customer numbers, Google Trends may be a useful indicator of a company’s market share.  I compared average traffic of “Internode” to “Adam Internet” (without the quotes) from Australia over the last 30 days.  At the time of writing, Adam Internet’s Google traffic was 27% of Internode’s, a remarkably close match to the average Whirlpool survey ratio above.


Results from the Google Trends comparison suggests that Adam Internet is 27% the size of Internode.

Finally, I turned to Wikipedia.  At the time of writing, Internode had 300 staff to Adam Internet‘s 100 staff.  Assuming that both companies adhere to roughly the same staff-to-customers ratio, Wikipedia suggests that Adam Internet is about 30% of the size of Internode.

The Results

Of course I’m more than likely totally wrong in all my assumptions.  And probably the entire approach is wrong.  However, the three sources of Whirlpool, Google and Wikipedia all seem to suggest that Adam Internet is somewhere between 27% and 30% the size of Internode.  If wrong then at least it’s consistently wrong.

But if I’m confident that the percentage market share is in the ballpark, how would this translate to actual numbers of broadband customers?

Back in September last year I had a crack at modelling Internode’s growth rate.  The formula that I came up with then was:

Internode broadband customers (in thousands) = 463.047 * loge(0.626 * loge[year])

At the time of writing (22 May 2009), “year” is 9.389.  Plugging that value into the formula suggests that Internode currently have 156,447 broadband customers.  Admittedly a lot has changed at Internode since September 2008.  They’ve released many new products onto the market including (in no particular order) Chumby, Tivo, ADSL “TwoPlus” and even (rather belatedly) a 3G product of their own.  Not to mention the obvious problem that my model was a big dodge right from the start.  But the last media release I read from Internode quoted “more than 150,000 (broadband) customers” nationally.  I guess if Internode had more than 160,000 broadband customers they’d say so?  So I think it’s fair to say the number is between 150-160k.  Perhaps Internode are just being modest.  Or, God forbid, perhaps my model is proving to be not only correct but remarkably resilient!
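As a check on that arithmetic (note that reading the model’s output as thousands of customers is my interpretation, made so the plugged-in value matches the quoted 156,447):

```python
import math

def internode_customers_thousands(year):
    # The fitted growth model from the September 2008 post; output in thousands
    return 463.047 * math.log(0.626 * math.log(year))

internode_customers_thousands(9.389)  # roughly 156.4 thousand customers
```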


If Adam Internet is somewhere between 27% and 30% the size of Internode as indicated, and if Internode have 156,447 broadband customers as modelled, it follows that Adam Internet have between 42,249 and 46,934 broadband customers of their own.  If correct then the difference between that figure and the publicly quoted number of 75,000 customers could be made up of 3G, VoIP, dialup and other Internet services.

This analysis is just a bit of fun.  Don’t take it too seriously.  Having said that, constructive feedback is always welcome.  Do you think I’m in the ball park or so off base it’s not even funny?  Would a senior Adam Internet rep like to comment?

Which Australian ISP has the best customer service?

Which Australian Internet Service Provider (ISP) has the “best” customer service?

I’m glad you asked.

By my measure, Westnet utterly dominates the ISP customer service landscape.  All other ISPs simply pale in comparison.  Perhaps there’s something special in the Perth drinking water supply that makes businesses more people-focussed, because my second ranked ISP, Amnet, is also based in that beautiful city.  I’m chuffed to see that two ISPs operating out of my home town of Adelaide finished strongly – Adam Internet and Internode.  The relatively small outfit aaNet filled out my top 5, with iiNet romping home to a respectable, above-average result.  Finally, for reasons that I explain at the end of this post, I believe it’s worth giving an honourable mention to Exetel.

In summary, Stan’s Top Five Customer Service ISPs are:

  1. Westnet (by a Western Australian mile)
  2. Amnet
  3. Adam Internet
  4. Internode
  5. aaNet

But this isn’t just my opinion.  It’s based on hard science.  And when I say hard science I obviously mean my back-of-the-envelope statistical doodling.  My objective was to take results from the Australian Broadband Survey 2008 Report related to customer service, and distil them down to a single score that can be used to rank providers.  I arbitrarily set the benchmark score across the whole industry to be 1000, with each individual ISP’s customer service ranked relative to that benchmark.  So an ISP score higher than 1000 is above the industry average.  Lower than 1000 is below average.

I started by looking at the question:

When calling customer support, how long did you have to wait on the phone (or talk to an operator) before you spoke to the right person?

I used the resulting percentages (and some statistical shenanigans) to estimate the average time respondents to the survey spent waiting in the customer support phone queue:

ISP <1 min 1-4 mins 5-9 mins 10-20 mins >20 mins est. average time in queue
Telstra Cable 2.4% 19.4% 27.4% 26.2% 24.6% 19.9 mins
Telstra DSL 1.3% 18.4% 25.9% 26.6% 27.7% 21.5 mins
Optus Cable 1.7% 16.6% 27.7% 31.8% 22.2% 19.3 mins
Optus DSL 2.7% 19.3% 23.5% 28.3% 26.2% 20.8 mins
iiNet 10.0% 37.9% 25.0% 14.2% 12.9% 12.0 mins
Internode 8.4% 39.2% 27.6% 15.9% 8.9% 10.2 mins
TPG 2.7% 31.6% 33.2% 22.9% 9.6% 11.8 mins
Westnet 33.5% 48.7% 12.1% 3.6% 2.1% 3.9 mins
Exetel 12.1% 39.2% 24.1% 13.6% 11.0% 10.8 mins
Adam 20.3% 50.0% 19.2% 7.1% 3.4% 5.6 mins
aaNet 7.2% 45.9% 31.1% 9.6% 6.2% 8.2 mins
Netspace 2.2% 26.0% 27.9% 24.7% 19.2% 16.9 mins
Amnet 19.4% 50.2% 18.9% 9.7% 1.8% 5.1 mins
iPrimus 3.8% 20.3% 32.3% 26.6% 17.1% 16.2 mins
Telstra NextG 3.6% 12.3% 22.5% 28.3% 33.3% 24.5 mins
AAPT 2.6% 26.6% 23.4% 21.4% 26.0% 19.8 mins
Other ISPs 9.8% 33.1% 25.7% 18.0% 13.4% 12.7 mins
TOTAL 9.7% 33.3% 25.2% 18.1% 13.7% 12.9 mins

So, for example, Westnet customers spent just 3.9 minutes, on average, waiting in the phone queue before speaking to a Customer Support Officer.  Not too shabby.  The average across all ISPs I estimated to be 12.9 minutes.  Of course it’s possible that these estimates might be a bit off.  But it doesn’t matter too much, because the idea is to rank each ISP relative to all ISPs.  So (to 3 decimal places) Westnet gets a relative score of (12.895/3.927)*1000=3284 for wait time.
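A plausible reconstruction of those “shenanigans”: assign each answer bucket a midpoint and take the weighted average, then scale each ISP against the overall average. The midpoints below are my guesses, so the averages won’t match the table exactly, but the relative-score step is as described:

```python
# Assumed bucket midpoints in minutes (my guesses, not the author's exact values)
MIDPOINTS = {"<1": 0.5, "1-4": 2.5, "5-9": 7.0, "10-20": 15.0, ">20": 27.5}

def est_avg_wait(bucket_shares):
    # bucket_shares: bucket label -> fraction of respondents (summing to ~1)
    return sum(bucket_shares[b] * mid for b, mid in MIDPOINTS.items())

def relative_score(isp_avg_wait, overall_avg_wait, benchmark=1000):
    # Shorter-than-average waits score above the 1000 benchmark
    return round(benchmark * overall_avg_wait / isp_avg_wait)

westnet = {"<1": 0.335, "1-4": 0.487, "5-9": 0.121, "10-20": 0.036, ">20": 0.021}
est_avg_wait(westnet)          # about 3.3 minutes with these midpoints, vs 3.9 in the table
relative_score(3.927, 12.895)  # 3284, Westnet's wait-time score
```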

And so on.  I have omitted the calculations here to keep things concise, but I applied the same basic technique to the two remaining questions in the survey related to customer service:

How quickly have technical support issues typically taken to resolve?

How would you rate their customer service?

Again, each ISP was scaled comparatively to the overall index of 1000.

Finally, I generated a total score by taking the harmonic mean of the three individual categories.  My overall results from the three customer service questions (equally weighted) are as follows:

ISP Time in queue Time to resolution Rating of service TOTAL SCORE
Telstra Cable 649 696 808 711
Telstra DSL 600 657 802 676
Optus Cable 667 751 804 736
Optus DSL 620 653 759 672
iiNet 1077 1074 1093 1081
Internode 1260 1664 1203 1348
TPG 1089 991 843 963
Westnet 3284 3183 1268 2132
Exetel 1192 909 922 992
Adam 2290 2433 1125 1727
aaNet 1571 1219 966 1204
Netspace 764 1045 978 912
Amnet 2519 2320 1110 1735
iPrimus 797 974 961 903
Telstra NextG 527 479 740 562
AAPT 650 998 853 808
Other ISPs 1012 896 976 959
TOTAL 1000 1000 1000 1000
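The harmonic-mean combination can be checked against Westnet’s row in the table above:

```python
from statistics import harmonic_mean

round(harmonic_mean([3284, 3183, 1268]))  # 2132, Westnet's overall score
```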

Westnet completely blitzed the field across all three categories.  Hence it ranks overall as the number 1 ISP for customer service in Australia, with a total score of 2132.  So Westnet is more than twice as “good” as the industry average (=1000).  In fact, one surprising result was just how well some of the smaller outfits in general did against the 900-pound gorillas, Telstra and Optus.  At the other end of the customer support spectrum, Telstra’s NextG service hobbled in with a miserly 562.  In customer support, size does not matter.

But the really surprising result for me was Exetel.  I have a bit of a soft spot for Exetel.  Their whole raison d’etre is to provide the lowest cost broadband services in Australia.  Remarkably, they apparently do so without compromising customer service all that much.  Exetel’s overall score was only a smidgeon below the industry benchmark index, and they comprehensively trounced many of their bigger competitors.

Ranked from highest to lowest the results are as follows (with scores):

  • Westnet (2132)
  • Amnet (1735)
  • Adam (1727)
  • Internode (1348)
  • aaNet (1204)
  • iiNet (1081)
  • Exetel (992)
  • TPG (963)
  • Netspace (912)
  • iPrimus (903)
  • AAPT (808)
  • Optus Cable (736)
  • Telstra Cable (711)
  • Telstra DSL (676)
  • Optus DSL (672)
  • Telstra NextG (562)
  • Other (959)
  • AVERAGE (1000)

“Your call is important to us”

