This week, I’ll share the continuous improvement we’ve been doing on our startup and what I learned from it.
Previously, I shared that after updating our landing page for demand validation, we saw the conversion rate double, and we were quite satisfied with the improvement. The next step was to wait until we had enough conversions to move on. For a while, I was indulging in the “success”, naively of course 😉
I grew up with this Chinese idiom stuck in my head: 学如逆水行舟，不进则退. It translates literally to “Learning is like rowing a boat upstream; if you don’t keep pushing forward, you’ll fall behind”. I’m also a DevOps advocate, and continuous improvement is part of that trade. Yet for a while, it didn’t occur to me that I needed to continuously improve our ads, not until I stumbled across the “Experiments” feature of Google Ads. ¯\_(ツ)_/¯
That reminded me of what I used to see in our company’s weekly updates: which A/B tests were run, their results, and what changes were made to improve further. It was a strong reminder that I also need to continuously improve our business. I’ll go into the why shortly; let’s look at the what for now.
Google Ads Experiment
Google Ads Experiments offers a guided, automated workflow, though the same comparison can also be done manually. I’ve tried both approaches; the limitations of each method are the deciding factor in which one to use.
For an experiment, Google Ads can be instructed to apply draft changes that are performing better than before. Those changes can be anything from keywords down to ad wording. The limitation is that only one experiment is allowed at a time, but experiments can be queued, making this a useful feature for ongoing improvements.
When done manually, we can create a new Ad Group or Campaign and compare the results ourselves. I went with this for our branding test because we had 7 brand names to test. Although it’s still possible to use Experiments, testing each brand sequentially might introduce seasonality bias into the data, and we wanted more accurate data from this test.
I set up the branding test to run for 4 weeks, and there’s one week left. I’ll share the results next week, but we can already observe improvement from some of the more favorable brands.
Our CTR grew from 1.16% to 2.76%, averaged across all 7 brands. Our top brand actually fetches more than 5% CTR, and we’re just hitting the industry average according to Instapage.
Our conversion rate went from 3.36% up to 5.24%, better than the industry average of 4.40%. Our top brand managed to push the number further to 6.19%. I’m really satisfied with the result, but it’s not the time to be complacent.
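As a sanity check on lifts like these, a two-proportion z-test can tell you whether an improvement is likely real or just noise, since percentages alone hide the sample size. Here’s a minimal sketch in Python; the click and impression counts are made-up illustrations (our real totals differ), only the CTR percentages come from the test above:

```python
from math import sqrt, erf

def two_proportion_z_test(hits_a, n_a, hits_b, n_b):
    """Two-sided z-test for a difference between two proportions
    (e.g. clicks/impressions for two ad variants)."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    # Pooled proportion under the null hypothesis of no difference
    p_pool = (hits_a + hits_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical volumes: 116 clicks / 10,000 impressions (1.16% CTR)
# vs. 276 clicks / 10,000 impressions (2.76% CTR)
z, p = two_proportion_z_test(116, 10_000, 276, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

At these volumes the lift is comfortably significant; with only a few hundred impressions per variant, the same percentages could easily be chance, which is why running the brands in parallel for the full 4 weeks matters.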
Although the experiments in this post were about demand validation and branding test, continuous improvement applies to all areas of the business, or any business for that matter.
Every business, no matter who runs it, faces ongoing change.

“The only constant in business is change. The only constant in life is change.” — Mark Cuban
The world is changing constantly, and so are consumers’ needs. Look at the closing of Speedy Video after 3 decades: while consumers’ needs had changed, they kept serving them the same way. In contrast, the company I work at is 10 years old and has never stopped improving its customers’ satisfaction, which is why it’s one of the top 3 in its industry right now.
Change can also come from the environment, in this case the competitive environment. We’re expecting strong competition after we launch, either from existing players or from newcomers who spot the same opportunity we did. We’re now in the training pool learning our way, hoping that one day we can swim alongside the big players in the competition pool. Without continuous improvement, we’ll be out of the game in no time 😉
In the next post, I’ll wrap up the branding test and share more about biases in business decisions and what we can do about them.