Lonsdale Exchange

It was the late 1980s.

My employer at the time ran two data centres in Melbourne.  There was the production site in the Melbourne CBD, and the backup site at Clayton.  They were linked together with data links provided by Telecom Australia.

The disaster recovery planners thought that wasn’t redundant enough.  And since there was a new player in the leased data line market, Optus, offering more competitive rates (i.e. cheaper) for the same service, our backup data links were switched to Optus.

All went well for several months, until the day the Telecom Australia Lonsdale Telephone Exchange failed.  Failed as in the new digital exchange technology crashed.

No problem, we thought: we would just switch to the Optus-provisioned backup lines.

Except they were dead too.  You can probably guess why.

In the post-mortem of what went wrong, we found that the data comms links were:
a) physically separate links,
b) terminated at the same exchange, and
c) actually Telecom Australia links that Optus resold to us.

The fix was to terminate the backup link at a different telephone exchange (North Melbourne).  I don’t remember whether it was Optus providing these backup links.  I do remember it cost rather a lot of money.  I think Telecom Australia was very happy with the money we hurled at them.

I’d like to think our disaster recovery planners learnt that day that:
1. cheapest is not always best; and
2. if you are buying a data communications product, you need to check that the provider (Optus in this case) is not just reselling the product from the actual vendor (Telecom Australia).
