Retirement Funds from around the World

Here are some super fun facts about retirement funds from around the world for you [yes, that super pun was intended 😉].

On average, Australia has $72,823 saved per person for retirement in our superannuation system, the largest private pension per person in the world.

This figure is from a Wikipedia article here (which probably isn’t a complete picture, because another Wikipedia article says we are the 4th-largest holder of pension fund assets in the world).

Here is a table breaking down the average funds per person by country and population (roughly sorted by funds under management):

| Country       | Assets (US$ billions) | Population (millions) | Average funds per person |
|---------------|----------------------:|----------------------:|-------------------------:|
| United States | 5,662                 | 331                   | $17,105.74               |
| Australia     | 1,857                 | 25.5                  | $72,823.53               |
| Norway        | 1,046                 | 5.42                  | $192,988.93              |
| Japan         | 1,103                 | 126                   | $8,753.97                |
| Canada        | 792                   | 37.7                  | $21,007.96               |
| South Korea   | 462                   | 51.3                  | $9,005.85                |
| Netherlands   | 388                   | 17.1                  | $22,690.06               |
| China         | 251                   | 1,439                 | $174.43                  |
| Singapore     | 208                   | 5.85                  | $35,555.56               |
| Malaysia      | 185                   | 32.4                  | $5,709.88                |
| Netherlands   | 183                   | 17.1                  | $10,701.75               |
| Chile         | 160                   | 19.1                  | $8,376.96                |
| India         | 128                   | 1,380                 | $92.75                   |
| France        | 126                   | 65.3                  | $1,929.56                |
| Russia        | 125                   | 146                   | $856.16                  |
| Denmark       | 119                   | 5.79                  | $20,552.68               |
| South Africa  | 112                   | 59.3                  | $1,888.70                |
| Brazil        | 80                    | 212.6                 | $376.29                  |
| Ireland       | 30                    | 4.94                  | $6,072.87                |
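If you want to sanity-check the table, the per-person column is just assets divided by population, times 1,000 (because assets are in billions and populations are in millions). A quick sketch of that arithmetic, using two rows from the table:

```kotlin
// Average funds per person = assets (US$ billions) / population (millions) * 1,000,
// since a billion divided by a million leaves a factor of a thousand.
fun averagePerPerson(assetsBillions: Double, populationMillions: Double): Double =
    assetsBillions / populationMillions * 1_000

fun main() {
    println(averagePerPerson(1857.0, 25.5)) // Australia: ~72,823.53
    println(averagePerPerson(1046.0, 5.42)) // Norway:    ~192,988.93
}
```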

Wow, look at Norway

Norway has the most saved per person, but its fund is publicly owned: it was formed when Norway made lots of money from oil, and it’s now invested in ethically run companies. Individuals don’t contribute to it, and the government can only access 3% of the fund each year. You can watch this YouTube video to find out why Norway is so rich.

A few other countries of note

The US has the most funds under management, but due to wealth distribution the average per person is lower. Pensions in the UK seem to be a confusing affair: it’s up to the employer to set up a pension plan and to make contributions on your behalf.

Singapore once had employer contributions set as high as 25%, before its recession in the 1980s.

In New Zealand, the pension is also called superannuation. It’s a lot simpler than our system and isn’t means-tested. Here is an interesting article comparing them.

Why is Australia so wealthy?

The main reason we have so much saved is the compulsory employer contributions: currently an extra 9.5% on top of your salary goes towards retirement savings. This was established in the 1990s by the Keating government. Here is a YouTube video of Keating ranting about super.

The average working Australian

I’ve told my brother, who’s 22 and works in a supermarket as a fruit and veg manager, that he’ll likely have $400k in super when he retires, even if he does nothing with it. You can use Moneysmart’s super calculator to play around with some numbers.

The average supermarket employee makes around $50k a year (plus or minus around $5k). $400k in savings feels like an insane amount of potential wealth for someone in my family (previously a low socio-economic but still very bogan family; we are now upper-middle bogans 😉).
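For the curious, here’s a minimal sketch of the kind of compounding the Moneysmart calculator does. Every number in it is an illustrative assumption (a flat $50k salary, the 9.5% contribution rate, 15% contributions tax, a 4% return after fees and inflation, 45 working years), not financial advice and not the calculator’s actual model:

```kotlin
// Toy projection of a super balance at retirement. Every number here is an
// illustrative assumption, not financial advice or the calculator's model.
fun projectSuper(
    salary: Double = 50_000.0,        // assumed flat salary in today's dollars
    contributionRate: Double = 0.095, // compulsory employer contribution
    contributionsTax: Double = 0.15,  // tax taken out of contributions
    realReturn: Double = 0.04,        // assumed yearly return after fees and inflation
    years: Int = 45                   // e.g. working from 22 to 67
): Double {
    val yearlyContribution = salary * contributionRate * (1 - contributionsTax)
    var balance = 0.0
    repeat(years) { balance = (balance + yearlyContribution) * (1 + realReturn) }
    return balance
}

fun main() {
    println("Projected balance: \$%,.2f".format(projectSuper()))
}
```

With these toy numbers it lands just over $500k in today’s dollars; dial the return down for extra fees, insurance premiums, or career breaks and you’re in $400k territory.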


Metrics and Quality

The superannuation and investment mobile app I’ve been working on over the last year has finally been released. It’s been on the app store for just over a month now*, and this post is about how we are using metrics to help keep tabs on the quality of our app.

*You can download the app via Google Play or the Apple App Store; you can also create an account here.

Average app store rating

The average app store rating is one useful metric to keep track of. We are aiming to keep it above 4 stars, and we are also monitoring the feedback raised for future feature-enhancement ideas. I did an analysis of the average app store reviews of other superannuation apps here to get a baseline for the industry average. If we are above that average, it’s a reasonable sign we have a good app.

Analytics in mobile apps

We are using Adobe Analytics to track page views and interactions for our web and mobile apps. On previous mobile app teams I’ve used mParticle and Mixpanel. The framework here doesn’t matter; I’ve found Adobe Workspace to be a great tool for insights, once you know how to use it. Adobe also has tons of online tutorials for building out your own dashboards.

App versions over time

Here’s our app usage over time broken down by app version:

We have version 1.1 on the app store and released 1.0 nearly 2 months ago. We did an internal beta release with version 0.5.0. If anyone on the old versions tries to log in they’ll see a forced update view.
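As a sketch, a forced-update gate like that is just a version comparison against a minimum supported version. The names and values below are hypothetical, not our actual implementation:

```kotlin
// Sketch of a forced-update gate. The version numbers and the idea of a
// backend-supplied minimum version are hypothetical, not our actual code.
data class Version(val major: Int, val minor: Int) : Comparable<Version> {
    override fun compareTo(other: Version): Int =
        compareValuesBy(this, other, { it.major }, { it.minor })
}

fun shouldForceUpdate(installed: Version, minSupported: Version): Boolean =
    installed < minSupported

fun main() {
    val minSupported = Version(1, 0)                        // e.g. fetched from the backend
    println(shouldForceUpdate(Version(0, 5), minSupported)) // true -> show the update view
    println(shouldForceUpdate(Version(1, 1), minSupported)) // false -> carry on to login
}
```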

Crash Rates

Crashes are a fact of life for any mobile app team; there are so many different variables that go into app crashes. However, keeping track of them and aiming for low rates is still worth doing.

With version 1.1 we improved our crash rate on Android from 2.77% to 0.11%. You can also use a UI exerciser called Monkey from the command line against your Android emulator to try to find more crashes. With the following command I can send 1,000 pseudo-random UI events to the emulator:

adb shell monkey -p {mobile_app_package_name} -v 1000
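Monkey also accepts a seed and a throttle: the same seed replays the same event sequence, which is handy when you want to reproduce a crash, and the throttle slows the events down so you can watch what it’s doing (the values here are just examples):

adb shell monkey -p {mobile_app_package_name} -s 42 --throttle 250 -v 1000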

Crashes in App Center

We can dive a bit deeper into crashes in App Center (a Microsoft platform that integrates with our TeamCity continuous integration pipeline and manages all of our test builds).

When exploring stack traces, you want to find the lines that reference your own app instead of getting lost in all the framework code: look for lines that start with your app’s package name.
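For example, here’s a quick toy way to filter a raw trace down to just your own frames; the package name is made up:

```kotlin
// Filter a raw stack trace down to frames from our own code.
// "com.example.superapp" is a made-up package name.
fun appFrames(stackTrace: String, packageName: String = "com.example.superapp"): List<String> =
    stackTrace.lines()
        .map { it.trim() }
        .filter { it.startsWith("at $packageName") }

fun main() {
    val trace = """
        java.lang.NullPointerException
            at com.example.superapp.login.LoginViewModel.onSubmit(LoginViewModel.kt:42)
            at androidx.lifecycle.ViewModel.someFrameworkMethod(ViewModel.java:101)
    """.trimIndent()
    appFrames(trace).forEach(::println) // only the LoginViewModel frame survives
}
```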

App Center also gives reports broken down by device and operating system:

With analytics set up, you can even dig into an individual report and see the page views that happened before that crash occurred.

What’s a good crash rate?

That depends on the context of your app. Ideally zero is best, but perfect software is a myth we keep chasing. As long as the rate is trending downwards, you are making progress towards improving it. Here’s a good follow-up blog post if you are interested in reading more.

Error Tracking

I can also keep an eye on how many error messages are seen. The spike in the Android app’s error messages was me throwing the chaos monkey at our production build for a bit. But when there is a spike in both Android and iOS, I know I can ask, “was there something wrong with our backend that day?”

Test Vs Prod – page views

If every page fires one tracking event, we can compare our upcoming release candidate against production: say 75 distinct page events were triggered on the test build, against the 100 distinct page events we can see in production. We can then say we’ve tested 75% of the app and haven’t seen any issues so far.
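Under the hood this is just set arithmetic over the page-event names; here’s a toy sketch with invented event names:

```kotlin
// Page coverage of a test build, measured against the pages seen in prod.
// The page-event names are invented for illustration.
fun pageCoverage(testPages: Set<String>, prodPages: Set<String>): Double =
    testPages.intersect(prodPages).size.toDouble() / prodPages.size

fun main() {
    val prodPages = setOf("login", "dashboard", "balance", "insurance")
    val testPages = setOf("login", "dashboard", "balance")
    println("Coverage: %.0f%%".format(pageCoverage(testPages, prodPages) * 100)) // Coverage: 75%
}
```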

This is great for measuring the effectiveness of bug bashes and exploratory testing sessions, and it gives you an answer to “how much testing did you/the team do?”

Hang on, why 75%?

There’s no need to aim for 100% coverage. Our unit tests do cover every screen, but because they run on the internal CI network those events are never sent to Adobe. We have over 500 unit/UI tests on both Android and iOS (not that the number of tests is a good metric; it’s an awful one, by the way).

But if you’ve tested the main flows through your app and that’s gotten you to 50% or 75% coverage, you are approaching diminishing returns. What are the chances of finding a new bug? Or a new bug that someone cares about?

You could spend that extra hour or two getting to 90-95%, but you could also be doing more useful things with your time. You should read my risk-based framework if you are interested in finding out more.

Measuring Usability

If you are working on a new feature or flow in your app, you can measure how many people actually complete the task. E.g. for first-time log in: how many people actually log in successfully? How many people lock their accounts? If you are trying to improve the process, you can track whether those rates improve or decline.
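As a sketch, those rates fall straight out of the raw event counts (the event names here are invented):

```kotlin
// First-time-login funnel computed from raw analytics events (names invented).
fun rate(part: Int, whole: Int): Double =
    if (whole == 0) 0.0 else part.toDouble() / whole

fun main() {
    val events = listOf("login_attempt", "login_success", "login_attempt",
                        "account_locked", "login_attempt", "login_success")
    val attempts = events.count { it == "login_attempt" }
    println("Success rate: %.2f".format(rate(events.count { it == "login_success" }, attempts))) // 0.67
    println("Lock rate: %.2f".format(rate(events.count { it == "account_locked" }, attempts)))   // 0.33
}
```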

You could also measure satisfaction after a task is completed and ask for feedback: a quick out-of-5 score along the lines of “did this help you? Was it easy to achieve?”. You can put a feedback section somewhere in your app.

The tip of the iceberg

The metrics and insights I’ve shared here are just a small subset of everything we are tracking, and only a small part of our overall test strategy. Adobe has been useful for digging into device and operating-system breakdowns too. There are many ways you can cut the data to provide useful information.

What metrics have you found useful for your team and getting a gauge on quality? What metrics didn’t work as well as you had hoped?

This is not financial advice, and the views expressed in this blog are my own. They are not reflective of my employer’s views.


Bugasura and Exploratory Mobile Testing

A few weeks back, Pradeep Soundararajan (founder of Moolya Testing) and I were having a conversation on Twitter about test strategies for mobile apps. He suggested trying Bugasura for running a bug bash.

Bugasura is an Android app and a Chrome extension. It helps with keeping track of exploratory testing sessions and comes with screenshot annotation and Jira integration.

Here are a couple of screenshots of the Android app in action, being used for an exploratory session on our test app.

Bugasura Flow

First I selected the testing session:

While I’m testing I see this Bugasura overlay which I can tap to take a screenshot and write up a bug report on the spot:

Here’s their bug-reporting flow:

And here’s a testing report after I finished my exploratory testing where I can push straight to Jira if I want:

Here’s the sample report link (caveat: the screenshots attached to the bug are now public information on the internet, so there’s a privacy concern right there). But OMG, the exploratory session recorded the whole flow too, so a developer could see exactly what I did to find that bug.

Flow Capture

Here’s that bug report in chrome paused at screen 13 out of 18:

On a side note, I love having these quotes from Jerry Weinberg (who wrote Perfect Software and Other Illusions About Testing) and Elisabeth Hendrickson (who wrote Explore It!: Reduce Risk and Increase Confidence with Exploratory Testing) sprinkled throughout the app.

Some caveats I’ve found so far: the test report is public (not private by default), so you wouldn’t want to include screenshots of private or confidential information.

Bugasura only works on Android/Chrome. There isn’t an iOS version, but I guess it could work with some remote device access running through Chrome? We use Gigafox’s Mobile Device Cloud at work to access a central server of mobile devices, and I imagine Bugasura could work with it.

Also I think they may have misspelt Elisabeth’s name in her quote.

This blog post reflects my opinions only and does not reflect the views held by my employer.