YourSuper Comparison – More than meets the Eye
Australia’s worst super funds have finally been placed in the village stocks for all to see. The YourSuper comparison tool, released earlier this year, lets investors sort through super funds that offer a MySuper default option and compare fees and returns over a seven-year period.
While the tool has been available for a few months, it was only last week that funds were categorised on their investment performance as either “Performing” or “Underperforming”. This kicked off a flurry of excitement in the media about who the worst funds were and how their members will soon receive a letter outlining the underperformance.
As with most things in the media, there is no nuance, no context and plenty of confusion as they try to quickly paint the blacks and whites of a tool that has a lot of grey. We even received a phone call from a concerned client after he heard his super fund was in the worst category; he rightfully wondered what we were doing with his money. That left us having to explain that his funds were on a platform run by that company, not invested in the super fund managed by that company. There is a big difference. As the client responded, “they didn’t explain that on the news”.
It is good that investors in some of these funds will get a wake-up call that they are underperforming. But our client’s experience highlights that some investors will be bamboozled by what they’re confronted with. This is where the next round of problems begins.
Investors who do not know they are in a poor fund may be initially spooked by the letter they receive, and then very confused about what they should do next, if anything. They’ll likely want to take action and rectify the problem. The first place to visit is the YourSuper comparison tool.
We work in this area for a living, and we can say the tool, which is essentially a list, is a mess. Just trying to compare options can be very confusing unless you methodically wade through it and then visit the funds’ websites to attempt a comparison. Even then it’s not particularly clear, because of how opaquely the funds present their portfolio construction and the risks they take. The tool links back to the funds, but rarely does a link go directly to a page showing the breakdown of the assets within the fund.
This information seems to be buried deep in fund websites, because it takes real effort to find. If you can find it, there’s no standardisation in how any of it is presented. If the government is going to run a tool like this, it should require participating funds to publish a clear, standardised breakdown of the percentages of asset classes and sub-asset classes in their funds.
One bank fund had several small pie charts showing how its funds were constructed, but no actual percentages. They were nice to look at, but not much use for delivering accurate information. Anyone interested had to hunt around the website for the right product disclosure statement to find the actual figures. Would the average super fund member know where to look? Doubtful.
Analysing the list, the process has clearly identified some genuine dud funds. However, among the underperformers are funds that, while not great, are actually doing better than some funds on the performing list. We assumed this might be due to adjustments for asset allocation and the risk taken, but the funds in question appear to fall into similar risk categories.
We’ll keep the funds anonymous, but as an example: fund A’s option for those born in the 1990s has an 8.28% return and is marked as underperforming, while fund B’s option for those born in the 1990s has a 7.83% return and is marked as performing. According to the information both provide, they are near identical in their levels of defensive and growth assets, which means they’re taking very similar levels of risk to achieve those returns. Both are high-growth options. Yet the higher-returning fund is marked as underperforming, and the lower-returning fund is marked as performing. Either way, using the performance of their benchmark equivalents as a guide, both should be classified as underperforming.
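That odd-looking result only makes sense if each fund is judged against its own tailored benchmark rather than its raw return. A minimal sketch of how a test of that shape can flip the ranking, assuming (as we understand the APRA test) a tolerance of roughly 0.5 percentage points per annum against the benchmark; the benchmark figures below are invented purely for illustration:

```python
# Hypothetical sketch: why a higher-returning fund can still be labelled
# "Underperforming". Each fund is measured against its OWN benchmark,
# with an assumed tolerance of 0.5 percentage points per annum.

THRESHOLD = 0.005  # assumed tolerance: 0.5 percentage points p.a.

def performance_label(fund_return: float, benchmark_return: float) -> str:
    """Label a fund by its margin against its own tailored benchmark."""
    margin = fund_return - benchmark_return
    return "Performing" if margin >= -THRESHOLD else "Underperforming"

# Fund A earns more, but we assume its tailored benchmark is higher.
print(performance_label(0.0828, 0.0890))  # 8.28% vs assumed 8.90% benchmark
print(performance_label(0.0783, 0.0800))  # 7.83% vs assumed 8.00% benchmark
```

Under these invented benchmarks, fund A trails its benchmark by more than the tolerance and fails, while fund B scrapes through, despite the lower raw return.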
And some funds were quick with excuses to explain why they are underperforming.
One fund in the underperforming category was in the media trying to hide behind its ethical branding, claiming this virtue was the reason for its underperformance. It shows the cynical behaviour and excuses found in the traditional investment space have quickly found their way into the ethical investment space. Quite often we’re told there’s no sacrificing performance by investing according to your ethical beliefs. Some have even boasted that investors can do better by excluding or screening on ethical grounds.
By our calculations, after fees, this fund underperformed an index-based equivalent by 1.7% per annum over seven years. It’s quite sneaky to wield virtue as a shield to excuse poor performance. You could almost guarantee that had this fund topped the charts, it would be boasting that ethical investors don’t have to sacrifice performance to do the right thing.
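A 1.7% annual gap sounds small, but it compounds. A rough sketch of what it costs over seven years, using assumed return figures (an 8.5% p.a. index-based equivalent versus 6.8% p.a. after the gap) and a hypothetical $100,000 balance:

```python
# Illustrative only: the dollar cost of 1.7% p.a. of underperformance
# compounding over seven years. All figures are assumptions.

balance = 100_000        # hypothetical starting balance
index_return = 0.085     # assumed index-based equivalent return p.a.
fund_return = 0.068      # assumed fund return, 1.7% p.a. behind the index
years = 7

index_final = balance * (1 + index_return) ** years
fund_final = balance * (1 + fund_return) ** years
shortfall = index_final - fund_final

print(f"Index equivalent: ${index_final:,.0f}")
print(f"Underperformer:   ${fund_final:,.0f}")
print(f"Shortfall:        ${shortfall:,.0f}")
```

On these assumed numbers, the gap works out to a shortfall in the order of $18,000 on a $100,000 balance, and it widens every additional year the underperformance persists.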
There’s nothing wrong with investing according to your beliefs, and investors should be free to do so, but this raises another issue with the performance tool. If this fund continues to underperform, it will be barred from accepting new members. Unlike other underperforming funds, an ethical fund is more likely to have been actively chosen by its investors rather than being a default choice made by employers. That choice will be taken away, and the fund will probably die out if it can’t lift its performance.
That would seem to be the government’s objective here. Long term, more money in investors’ pockets should mean less spent on the pension, but past performance isn’t a guarantee of future performance. Which brings us to what is glaringly absent from this government-built performance comparison tool.
A past performance disclaimer!
Past performance may not be indicative of future results.
A very strange oversight. While there is a disclaimer (as shown below), it makes no mention of past performance. Any product provider, or anyone recommending financial products, is expected by the government to offer a performance disclaimer so consumers aren’t misled. Even individuals and companies independently running product comparisons have the good sense to include one. Why does an authoritative government comparison tool forget it?
For someone with no experience, how would they decide which super fund to switch to? Likely scroll to the top of the list and pick the fund with the highest return!
For all the meetings, consultants, working groups and financial resources likely expended on the YourSuper comparison tool, it’s a very rough start.