Synthetic data – Fake numbers, Real Insight?

The data now available to businesses – should they choose to engage with it all – is staggering! Like drinking from a fire hydrant, the danger is that one is quickly swamped by the sheer volume of data, concepts, tools and ideas floating around this wealth of information and the potential it holds. With all this data available, why on earth is synthetic data – artificially generated data to be used instead of, or in conjunction with, ‘real’ data – of interest to anyone? The amount of real data being generated is already larger than we can comprehend, so why would we want to add to the confusion by creating synthetic data?

It’s difficult to pin down a precise definition, since its exact nature varies according to the specific function it is generated for, but broadly speaking synthetic data can be thought of as any data not derived from direct observation or measurement. A synthetic dataset may be fully synthetic, meaning it does not contain any original data, or partially synthetic, meaning that only sensitive or missing data has been replaced. It is typically generated through machine learning models, such as decision trees or deep learning, that use the statistical relationships between the variables in the data of interest to generate an entirely new artificial dataset based on these relationships. These statistical relationships may be derived directly from observing an existing dataset or built from the ground up by a data scientist with prior knowledge of these relationships.
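To make that concrete, here is a minimal sketch in Python of the ‘derive the relationships, then sample from them’ idea, assuming a purely numeric dataset; the variables and figures are invented for illustration, and a real project would typically use a richer generative model (decision trees, deep learning, copulas) rather than a simple multivariate normal.

```python
import numpy as np
import pandas as pd

# Illustrative only: a toy "real" dataset with two related variables
rng = np.random.default_rng(42)
income = rng.normal(30_000, 8_000, 500)
spend = 0.2 * income + rng.normal(0, 1_000, 500)
real = pd.DataFrame({"income": income, "spend": spend})

# Capture the statistical relationships (here: the means and covariance)
mean = real.mean().to_numpy()
cov = real.cov().to_numpy()

# Generate a fully synthetic dataset that preserves those relationships
synthetic = pd.DataFrame(
    rng.multivariate_normal(mean, cov, size=1_000),
    columns=real.columns,
)
print(synthetic.describe())
```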

The potential utility of synthetic data is likely familiar to anyone who has been involved in a data analysis project. This might range from simply facilitating more reliable data analysis – by filling gaps in the data – to tackling security and privacy concerns, but synthetic data can also play a vital role in the field of machine learning.

Better Data Analysis

Some of the key applications of synthetic data are based around overcoming problems associated with data quality and quantity. Looking at data quality issues first – when working with real data, various skews and biases in the collection process can lead to difficulties in maximising the value of the data later down the line. These may be down to sampling issues – it’s notoriously difficult to obtain a truly random sample – but there may also be incomplete entries, incorrect formatting, nonsensical outliers and various other quirks that anyone who has worked with real life data is no doubt familiar with.

Being able to generate a fresh dataset with all of the insight of the original but without the inherent messiness that comes with real data – outliers and missing data being just a couple of examples – has the additional benefit of making it much easier to work with, and can even help provide a more navigable data source for those less familiar with the idiosyncrasies of real world data.

Building upon your freshly created, easy-to-use synthetic dataset, you can also expand it to deal with issues arising from the lack of a truly random sample. Imagine you have a dataset used for a segmentation project, where on the back end you want to develop a classification tool that can be used to assign current and potential customers to one of the segments you have created. Whilst there are many powerful classification algorithms that can be used for this purpose, they will all be subject to some level of bias based on the respective sizes of the segments in the sample. By creating a synthetic version of the dataset that brings your sample up to, say, 1,000 entries for each segment, however, you can develop a classification tool safe in the knowledge that it will be free from any sizing bias in the original sample.
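As a rough illustration of that balancing step, the sketch below simply re-samples each segment up to the same size using scikit-learn; the dataframe and column names are hypothetical. Bootstrapped duplication is the crudest possible stand-in – a genuinely synthetic approach would generate new, plausible rows (for example via SMOTE-style interpolation or a fitted generative model) rather than repeat existing ones.

```python
import pandas as pd
from sklearn.utils import resample

def balance_segments(df: pd.DataFrame, segment_col: str, target_size: int = 1000,
                     random_state: int = 0) -> pd.DataFrame:
    """Up-sample every segment to the same size so a classifier sees no size bias."""
    balanced = []
    for _, group in df.groupby(segment_col):
        balanced.append(
            resample(group, replace=True, n_samples=target_size,
                     random_state=random_state)
        )
    return pd.concat(balanced, ignore_index=True)

# balanced_df = balance_segments(survey_df, "segment")  # survey_df is hypothetical
```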

Security & Privacy

No matter how much data you have, it’s likely that some of it contains sensitive information – be that internal information or personal details from your client base. This data likely holds a lot of potential with regards to developing business intelligence, but, due to its sensitive nature, actually digging through it to find and develop insight can often be something of a security nightmare, particularly in a post-GDPR world. Whilst any organisation with a CRM database can benefit from additional security for their data, this has particular benefits in fields such as financial services and healthcare, where customer or patient data can be especially sensitive.

Methods to circumvent this traditionally involved either removing or anonymising the sensitive data, but both of these approaches have their pitfalls. Removing the data altogether risks essentially deleting a source of potential insight, undermining the process of analysing the data in the first place. Anonymising is generally effective for small scale analysis, but often papers over the cracks from a security perspective as sensitive information can often be reverse engineered in the case of a hack or data breach.

Using synthetic data, however, provides an excellent way of preserving the key insights contained within a dataset without any risk of exposing sensitive information. This is particularly useful for those in the financial services industry, who tend to hold vast swathes of highly sensitive information about many of their customers, but it can also be useful to any company looking to embark on a large-scale data analysis project with sales or research data that may contain sensitive information.

Consider, for example, a scenario in which you want to share data with a third party. This may be a third-party service such as cloud storage or computing, or an agency or freelancer you are collaborating with on a data analytics project. These scenarios can often, at the very least, leave you with several hoops to jump through before you can safely share your data, which can ultimately limit the potential of your data. Generating a synthetic version of your data – which would retain all the important statistical properties of the original dataset – allows you to make the most of what you have without having to worry about the exposure of sensitive information.

Machine Learning

As well as streamlining the regular data analysis process, synthetic data has been found to have several powerful benefits in the field of machine learning. This is typically to overcome data scarcity issues – machine learning algorithms typically take in massive amounts of data, which can be expensive and time consuming to collect in a traditional sense, and so feeding in synthetic data can be a cost-effective way of improving the performance of the algorithm.

Consider self-driving cars – the AI software that drives these cars is fed data of real cars being driven to learn and improve how to drive, but the volume of data required to ensure that a self-driving car is 100% safe far outweighs what is realistic to collect. To circumvent this, Google have their cars drive around 3 million simulated miles per day, which provides their algorithms with the data they need to train their cars to drive themselves safely.

The growing importance of synthetic data is reflected in the wave of new organisations focusing on generating synthetic data for their customers. DataGen, for example, is an Israeli start-up that uses machine learning to produce artificially generated still images and videos that their customers can use to train their own AI. Last year, a synthetic text generator co-authored an article for The Guardian. Synthetic data has even been leveraged as part of COVID-19 research collaborations.

So, as we can see, even in our data saturated world, there is very much a place for synthetic data. Next time you feel that you have reached an impasse related to your data usage, consider whether synthetic data might help. From helping you circumvent security and privacy concerns, to boosting the quantity and quality of your data, and enhancing the development of more complex machine learning products, the applications of synthetic data are both plentiful and diverse.

A renewed focus on growth is the antidote to these ‘interesting times’.

“May you live in interesting times” – delivered with a heavy dose of irony, this figurative English translation of a Chinese curse (“Better to be a dog in times of tranquility than a human in times of chaos”) sums up how many of us may be feeling!

Back in 2006 the EU presciently published a paper on the recessionary effects of a pandemic in Europe. Its baseline scenario suggested a first-year fall in GDP of 1.6% with additional effects pushing this up to between -2% and -4%.

Two years later a financial crisis rather than a pandemic gave us a whole new perspective on recession. UK GDP shrank by more than 6% between the first quarter of 2008 and the second quarter of 2009. This recession was the ‘deepest’ (in terms of lost output) in the UK since quarterly data were first published in 1955 and the economy took five years to get back to the size it was before the recession. A ‘once in a lifetime’ downturn.

Fast forward to 2020 and anybody whose lifetime is more than a ‘baker’s dozen’ will have now experienced two ‘once in a lifetime’ economic events. The latest baseline forecast envisions a 5.2% contraction in global GDP in 2020 – European real GDP is projected to contract by 7% and the UK by more than 11%. That 2006 EU prediction seems positively benign! We know it’s bleak, and only going to get bleaker.

Yes, there will be some ‘winners’ during this healthcare disaster but the impact on many, many businesses around the world is chilling. That marketing spend gets slashed in times of economic uncertainty is just a truism, but businesses need a growth strategy now more than ever – hunkering down or going into hibernation until ‘winter’ passes is not a serious option.

The focus for businesses must be how to grow in economies experiencing unprecedented headwinds and for which – a vaccine notwithstanding – the coronavirus hangover is likely to be felt for some years yet. There are however causes for optimism.

The good news is that ‘the arithmetic of growth’ is pretty straightforward – encapsulated in the ‘where to play’ and ‘how to win’ questions: maintain your own customer base (and potentially increase the lifetime value of each customer), poach customers from competitors, and/or attract new customers to the category, then engineer the marketing Ps to deliver. Strategy, at whatever level, is about making choices. It’s deceptively simple!

But how should you translate this into an actual growth strategy – again when faced with that truism that marketing spend, and especially consumer insight budgets, get cut in times of economic uncertainty? Deepening our understanding of our consumer isn’t optional or a ‘nice-to-have’ – but with a greater focus on cost and budgets we need to be very sure of the ROI. How does one develop a growth strategy when under such unprecedented commercial pressure?

Firstly, think ‘agile’! And secondly, don’t reinvent the wheel.

The last 9 months have demonstrated how ‘agile’ organisations can be – process bureaucracy has gone out of the window in response to this crisis. Agile organisations are fast, resilient, and adaptable, and putting in place the building blocks of a growth strategy should similarly not be a long, drawn-out, ‘drains-up’ exercise. It should also be a pragmatic and flexible process.

And the customer insight required is already, largely, there – if you choose to look for it. The amount of data available to your organisation has increased exponentially and shows no sign of slowing. You must have the ability to integrate and synthesize the plethora of information that is now available to you, to harness this to the task of solving the problem.

And while there may be some sort of quasi-scientific method that would deliver a solution, there isn’t the time there used to be to digest all this new information. Organisations are driving an agile agenda – faster, leaner decision-making – so rethinking the classic approach to problem solving means becoming comfortable with leaner, more creative ways of adding value: engaging in effectual reasoning. This is about having a broad notion of where you are heading and adjusting en route. It is not worrying too much about the ‘how’ but focusing on the ‘end game’. This contrasts with the more causal approach to business problem solving, which requires us to evaluate risk and proceed in a more cautious way.

Early in 2020 we were embarking on a growth strategy project for a UK-based multi-national. Lock-down #1 threw everything up in the air – timetable, budget and process.

It would have been easy to just pull the rug on the entire project, but to their credit the leadership team decided that a growth strategy would provide a pathway through the coming recession and set the business up for success longer term. And it would get proper focus – the CEO mandated that “this is not an ‘at-the-side-of-the-desk’ effort”.

However, compromises had to be made. Instead of traditional qualitative and quantitative research and a process taking circa 6 to 8 months, we quickly arranged a wider pool of internal stakeholder interviews and a deep review of internally held data, while in parallel arranging 50 qualitative interviews with their own customers and non-customers identified through ‘connections’. Our work is always highly collaborative, which appealed to a client team very keen to get involved in all aspects of the process.

Was it ‘perfect’? No. Did it work? Yes. It took 3 months to conduct the interviews, integrate this information with the organisation’s existing data, develop the initial frameworks, identify future growth opportunities and agree on customer priorities. And with these foundations the business could hone its ‘brand benefit’ and develop its capability road map – those elements needed to be in place to deliver its differentiated product offer to its target customers.

“The work has gone down brilliantly with the board” CEO

“A great team effort and great team result” Chief Commercial Officer

In developing this key framework, everybody bought into the fast, adaptable and resilient mantra – and in this same spirit redefined what was acceptable vis-à-vis the data, information and knowledge that they would work with – looking with fresh eyes at the value of what we might call non-traditional data.

Developing a growth strategy doesn’t always need ‘new’ data – in our experience organisations sit on a treasure trove of under-exploited data, information and knowledge, conventional and unconventional. This ranges from previously collected quantitative and qualitative data, to behavioural and sales data, to the relevant experience and market knowledge of key stakeholders within the business.

Tapping into this enables you to unearth nuggets of insight at a fraction of the cost of a new research project. Organisations should ‘squeeze this orange’ for all it is worth.

Now, in today’s environment, the ground may have fundamentally shifted, and so what we learnt yesterday may not help us understand today – but by marshalling what we know, we can ensure that any new data that needs to be collected, or areas to be explored, are focused laser-like on the issues at hand – short and sharp – we are not spending any time reinventing the wheel. This saves both money and time.

The organisations that succeed (and maybe thrive) during and after this pandemic will be those organisations which truly ‘know what they know’ and pragmatically apply this knowledge in a fast, resilient, and adaptable way to identify future choices and a winning strategy.

Riding the data avalanche! An opportunity for everyone.

Data-driven knowledge businesses are 23 times more likely to acquire customers than those which are not[1] and insight-driven businesses are growing at an average of 30% each year; and this year (pandemic aside) they would be taking $1.8 trillion annually from their less-informed industry competitors[2]. Businesses big and small sit on a mountain of underused data – the good news is that tapping this potential is not hard.

As the common refrain goes, data is the new oil. It’s easy to see the parallels – data, like oil once was, is a vast, largely untapped asset. In its raw form it can be messy, confusing and difficult to work with, but is immensely valuable once it has been extracted and processed.

However, the analogy can only be taken so far. Data, unlike oil, is effectively an infinite resource – with the total volume of data out there roughly doubling every couple of years. While known oil deposits are projected to run out in around 50 years, at the current rate the volume of data is expected to increase by a factor of over 30 million in that same time period.
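That ‘factor of over 30 million’ follows directly from the doubling assumption – 50 years of doubling every two years is 25 doublings:

```python
# Back-of-the-envelope check: doubling every two years for 50 years
doublings = 50 / 2
growth_factor = 2 ** doublings
print(f"{growth_factor:,.0f}")  # 33,554,432 – a factor of over 30 million
```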

Here’s an interesting thought – more data has been generated in the last ten minutes than even existed before 2003.

There is no sign of this data avalanche slowing down either. The “internet of things” means that in the not-so-distant future we may have every single electronic object generating data, and there may even be new types of data we haven’t even considered yet. It’s more important than ever for businesses to understand their data and, more importantly, what they can do with it.

Data has fostered new innovation, and new innovation has created more data! Amazon have changed the way we consume by using sophisticated AI recommender systems, which take data from all of their users to tell each individual what they should be buying next, replacing the human curators they used in their earlier years.

Some companies have gone even further, turning the data they collect from their customers into their primary asset. We all know that Facebook and Google do not charge you money to use their various features, but instead collect data about you, which, when aggregated with millions of other users and sold to third parties, is a far more sustainable and profitable business than trying to get people to pay a subscription to the services they’ve used for free, for years.

These companies are huge and therefore benefit from a massive user base who are constantly providing vast amounts of data which they can work with – and they have developed many cool tools to interrogate it.

The volume of available data means that even the simplest of operations can benefit from at least dipping a toe into the data universe and tapping the value of the insight that lies within it. How an organisation uses its data will be a key point of difference between organisations that maximise their potential and those that do not. Really successful organisations will be knowledge-based organisations.

And you don’t necessarily need the latest technologies, or an army of data scientists, to take advantage of this data bounty. In our experience organisations sit on a treasure trove of under-exploited data, information and knowledge. This ranges from previously collected hard research data, and behavioural or transaction data collected from a range of sensors, to the relevant experience and market knowledge of key stakeholders within the business. Tapping into this enables an organisation to unearth nuggets of insight at a fraction of the cost of a new research/data collection project.

As far as existing data is concerned, you should ‘squeeze the orange’ for all it is worth.

From a purely consumer research/insight perspective – all too often businesses keep outputs and work product from previous projects but don’t see the value in the raw data – but it’s yours, you have paid for it and it can be put to work. Now, in today’s environment, the ground may have fundamentally shifted, and so what we learnt yesterday may not help us fully understand today – but by marshalling what we know, we can ensure that any new data that needs to be collected, or areas to be explored, are focused laser-like on the issues at hand – short and sharp – we are not spending any time reinventing the wheel. This saves both money and time.

An open mind, an inquisitive attitude and some creative thinking are all that’s needed to get started – as well as a capability that we like to refer to as ‘sense-making’ (Pirolli and Card 2005): the ability to quickly get to grips with the wide and potentially overwhelming panorama of data, information and knowledge available, bringing this together into a coherent whole to inform choices and decisions.

Knowledge-based organisations must have the ability to integrate and synthesize the plethora of information that is now available, to harness this to the task of solving the problem. And while there may be some sort of quasi-scientific method that would deliver a solution, there isn’t the time there used to be to digest all this new information; the agile agenda means faster, leaner decision-making, and so rethinking the classic approach to problem solving means becoming comfortable with leaner, more creative ways of adding value.

Whilst we’ve touched on the benefits data has brought to big organisations, no business should feel that size is a barrier to entry in exploring the opportunities in data. In fact, it is the smaller organisations that might reap the biggest rewards – in relative terms at least – since intelligent use of data is more likely to offer a competitive advantage.

A longitudinal interventionist study conducted by UC Berkeley on a café based in Copenhagen[3] demonstrated how detailed data collection – leading to data-driven decision making – could deliver a sixfold increase in turnover over a seven-year period. This did not require any fancy software or expert data scientists – it was simply a matter of recording the numbers – having the right sensors in the ground – to gather actionable data that clearly expressed how the café was performing against various KPIs, and ensuring that the findings – and the implications of those findings – were acted upon. By embedding this ‘wide-angled lens’ approach into the ‘DNA’ of the organisation, the impact of decision-making was consistently understood in terms of data.

“For Sokkelund the implementation led to real-time data on operations e.g. sales per seat per day, or per customer, as well as customer retention data. Understanding this data is crucial; for example, when the chef decides to change the menu, or when tracking the effects of marketing. Having access to such small data makes it possible to track small improvements to the business model – these changes not only led to cost reductions, but lowered customer acquisition costs, improved customer retention and added new revenue channels”

This process was termed a “small data transformation”, which highlights how almost anyone can utilise data to their advantage. ‘Small data’ can be just as powerful as ‘big data’ in the right context, and it’s about finding the right approach for your business, rather than chasing the latest and greatest trend.
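As a purely illustrative sketch of the kind of ‘small data’ bookkeeping involved, the snippet below computes daily sales per seat and per customer from a hypothetical till export; the figures and column names are invented and simply echo the metrics mentioned above.

```python
import pandas as pd

# Hypothetical till export: one row per transaction (names are illustrative)
sales = pd.DataFrame({
    "date": pd.to_datetime(["2019-06-01", "2019-06-01", "2019-06-02", "2019-06-02"]),
    "customer_id": [101, 102, 103, 101],
    "amount": [45.0, 120.0, 60.0, 80.0],
})
SEATS = 40  # assumed fixed seating capacity

daily = sales.groupby("date").agg(
    revenue=("amount", "sum"),
    customers=("customer_id", "nunique"),
)
daily["sales_per_seat"] = daily["revenue"] / SEATS
daily["sales_per_customer"] = daily["revenue"] / daily["customers"]
print(daily)
```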

Collecting and collating corporate knowledge isn’t hard. To help our clients on the journey to becoming knowledge-led organisations, Decision Architects have developed the Surge3 platform – a knowledge ecosystem bringing together what an organisation knows into a single hub so that it can be efficiently and effectively collated, interrogated and analysed.

  • It offers a system for collecting, collating and integrating raw market research data and the ‘product’ from market research projects, to provide a central hub where the ‘corporate research memory’ can be accessed, making ‘corporate knowledge’ visible and searchable
  • It fosters collaboration between individuals and groups on areas of shared interest – by tagging ‘who has worked on what’. Users can identify colleagues with topic expertise, learning from their experience rather than ‘reinventing the wheel’
  • It creates new recombinant value by combining disparate data (from previous research studies), to create new information, knowledge and insight

Effective data integration and management is at the core of Knowledge Organisations – at a minimum actioning internally generated data – making it integral to decision making. Once you have this foundation, adding in additional streams of external data provides a powerful platform from which you can build sustainable competitive advantage.

To find out more email neil.dewart@decision-architects.com

 

[1] https://www.mckinsey.com/business-functions/marketing-and-sales/our-insights/data-driven-marketing

[2] https://www.forrester.com/report/InsightsDriven+Businesses+Set+The+Pace+For+Global+Growth/-/E-RES130848

[3] https://cmr.berkeley.edu/2019/11/small-data/

The Ongoing Battle For Your Mind

Another in our occasional series of blogs in which we will revisit some of the articles that we have found most useful over the years. This week we are going to focus on Ries & Trout’s three 1972 Ad Age articles ‘The Positioning Era Cometh’, ‘Positioning Cuts Through Chaos’ and ‘How To Position Your Product’ and their subsequent 1981 book ‘Positioning: The Battle for Your Mind’.

“I studied [‘The Battle for Your Mind’] as an undergraduate in the 1980s and the fact that it is one of the few books I can remember nearly 35 years later – both for its content and its cover, iconic brands bursting out of a man’s head in glorious technicolour – is a testament to its impact, and that it is still relevant today speaks to the importance of the message” Adam Riley, Founder, Decision Architects

Ries and Trout opened their 1972 articles with an observation that “[in today’s market] there are just too many products, too many companies [and] too much marketing ‘noise’”. Yes, that was 1972 (plus ça change, plus c’est la même chose).

To succeed, they argued, a brand had to create and own ‘a position’ in the consumer’s mind, and they cited Wednesday April 7, 1971 as the day the marketing world changed. David Ogilvy had taken out a full-page ad in the New York Times to set out his new advertising philosophy and outlined 38 points that started with ‘the results of your campaign depend less on how we write advertising than on how your product is positioned’.

Ries and Trout charted the evolution from the 1950s product era to the 1960s image era to the 1970s positioning era – which addressed the challenges posed by the ‘new’ over-communicated-to, media-blitzed consumer. Their work described how brands should take or create a “position” in a prospective consumer’s mind – reflecting or addressing its own strengths and weaknesses, as well as those of its competitors, vis-à-vis the needs of the consumer.

And while positioning begins with the product, the approach really talks to the relationship between the product and the consumer – by ‘owning’ some space in the mind of the consumer, by being first or by offering something unique or differentiated, a brand can make itself heard above the clamour for attention – in their words, wheedling its way into the collective subconscious.

Ries and Trout laid out ‘How To Position Your Product’ with six questions:

  1. What position do we own? The answer is in the marketplace
  2. What position do we want? Select a position that has longevity
  3. Who must we ‘out-gun’? Avoid a confrontation with market leaders

“You can’t compete head-on against a company that has a strong position. You can go around, under or over, but never head-on”

  4. Do we have enough money (to achieve our objective)?
  5. Can we stick it out (in the face of pressure to change / compromise)?
  6. Does what we are saying about ourselves match our position?

It’s then easy to see the DNA of these questions in the ‘choices cascade’ set out by Monitor alumnus Roger Martin and P&G CEO (President & Chairman) A.G. Lafley in their 2013 book ‘Playing to Win’ – the ‘where to play’ and ‘how to win’ questions culminate in a positioning – whether you call it a ‘benefit edge’ or a USP, it is a statement of position and is at the core of our growth methodology.

Just re-reading the 1972 articles, you get both a fascinating insight into the marketing environment of 50 years ago and a sense of how relevant these articles are to contemporary marketing practice.

www.ries.com/wp-content/uploads/2015/09/Positioning-Articles002.pdf

“For anybody interested in the psychology of consumer behaviour, Ries and Trout’s work really made the marketing discipline a lot more interesting, and as a student in the 80s their book really ignited my interest in marketing strategy and insight – what we used to call market research!” Adam Riley, Founder, Decision Architects

NPS and the Customer Satisfaction Jigsaw

Since it was first introduced by Fred Reichheld in 2003 (“The One Number You Need to Grow”, Harvard Business Review) to help measure customer satisfaction, the Net Promoter Score (NPS) has grown to become a crucial tool in the marketer’s arsenal.

The reason that NPS has been so successful and, in many ways, become the go-to customer satisfaction benchmark is its simplicity – the ease and speed with which it can be asked. This means that it can be asked more regularly, limits the time that customers need to spend providing feedback (because, as much as we would like them to, not everyone wants to take part in a 20-minute survey once a month!), and does not require any kind of maths degree or statistical tool to analyse.
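For reference, the standard calculation really is that simple: respondents scoring 9–10 count as promoters, 0–6 as detractors, and the score is the percentage of promoters minus the percentage of detractors. A minimal sketch:

```python
from typing import Iterable

def net_promoter_score(ratings: Iterable[int]) -> float:
    """Standard NPS: % promoters (9-10) minus % detractors (0-6), on a 0-10 scale."""
    ratings = list(ratings)
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

print(net_promoter_score([10, 9, 9, 8, 7, 6, 3, 10]))  # 25.0
```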

While many companies, big and small, use NPS as a performance index to evaluate the state of their brand, it is not without its critics, for whom it is not specific enough, relied on too heavily by companies and difficult to translate into action. Used properly it can be a helpful metric, but there are a few noticeable caveats which, to garner maximum value from any customer satisfaction program, need to be understood and accounted for.

So what should marketers be doing when implementing customer satisfaction programs to ensure maximum value?

A key starting point is to create a customer interaction plan. Engagement with the right audiences at the right times during the customer journey, as well as at the right frequency to obtain an accurate read on customer sentiment, is crucial. Mapping and understanding the key stages of this process will enable us to see where we are doing well and where there is the most room for improvement.

To harness the full power of NPS we then need to link up the scores provided with any other information that we hold on customers. This may not always be possible, but a well-managed CRM system can enable us to dive into satisfaction scores much more deeply, looking at where the brand is performing well and how we are performing within key segments, cohorts or demographics. Another key benefit of this approach is that we can link satisfaction scores to other key business metrics, allowing us to assess the tangible value that an increase (or decrease) in satisfaction score might bring.
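A rough sketch of what that linkage might look like in practice, using pandas; the tables and column names here are entirely hypothetical and simply stand in for an NPS export joined to a CRM extract:

```python
import pandas as pd

# Hypothetical exports: column names are illustrative, not from any real system
nps_responses = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "score": [10, 6, 9, 3],
})
crm = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "segment": ["Loyalists", "Bargain hunters", "Loyalists", "Bargain hunters"],
    "annual_spend": [1200, 300, 950, 150],
})

merged = nps_responses.merge(crm, on="customer_id")
merged["group"] = pd.cut(merged["score"], bins=[-1, 6, 8, 10],
                         labels=["Detractor", "Passive", "Promoter"])

# NPS and average spend broken out by segment
by_segment = merged.groupby("segment").agg(
    nps=("group", lambda g: 100 * ((g == "Promoter").mean() - (g == "Detractor").mean())),
    avg_spend=("annual_spend", "mean"),
)
print(by_segment)
```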

We need to look at how we can ‘move the bar’ and increase scores – a point which many a marketer we have spoken to has found challenging. Beyond purely looking at scores by stage of the customer journey, this is the point at which text responses can provide real additional value. By focusing these on what the company could be doing better for all customers (rather than a traditional and not always that useful ‘Why did you give this score?’), we can look to explore key themes for improvement among both promoters and detractors of the brand. Many companies with successful customer satisfaction programs will also take the opportunity at this stage to engage directly with customers to understand their scores, and this can add an additional layer of value by tapping further into the voice of the consumer.

Finally – and arguably the most crucial stage of the process – these responses must be fed back into the business, to create a Satisfaction Cycle that empowers the broader business to make tactical changes and improvements to products and services based on 4 key elements:

  • Which improvements are we able to make?
  • Which audiences would benefit most from these changes?
  • How valuable are they to us as a business? (linked to key business metrics)
  • What can we do in the short term and what are our longer-term goals?

So is this ‘The One Number You Need to Grow’? That is certainly up for debate. To implement growth plans, any business needs much more information than the NPS on its own provides. It is no golden goose and gives us only a point-in-time satisfaction score. To harness the power that a well-formed customer satisfaction program can bring, we must use it in the right way as part of a range of tools that can measure, identify and, most importantly, enable us to improve the customer journey. Knowing that customers would recommend our brand is great, but knowing what we can do if they don’t, and how to improve the scores across key audiences, is where NPS can deliver its true value.

In praise of the ‘Diffusion of Innovation’

Another in our occasional series of blogs in which we will revisit some of the articles that we have found most useful over the years – these are the articles that can always be found on desks in the Decision Architects office. This week we are looking at Everett Rogers’ 1962 work on Diffusion of Innovations (and yes, it’s a book not an article).

“This model provides an intuitively simple lens through which we can look at how consumers approach any sort of NPD – it is not without its critics but it is a useful short-hand which we often apply to (say) a segmentation framework to talk to the propensity of different segments to try or adopt a new product or service, be that a new delivery format for hot drinks, a new insurance concept or some form of health tech” James Larkin, Decision Architects

The foundations of this now ubiquitous framework are interesting. Rogers’ 1962 book was based on work he had done some years earlier at Iowa State University with Joe Bohlen and George Beal. Their ‘diffusion model’ was focused solely on agricultural markets and tracked farmers’ purchase of seed corn. It was Iowa, and Rogers was a professor of rural sociology!

The diffusion model’s signature bell curve identified five adopter groups (the standard proportions are sketched after this list):

Innovators:             Owned larger farms, were more educated and prosperous and were more open to risk

Early Adopters:      Younger and although more educated were less prosperous but tended to be community leaders

Early Majority:       More conservative but still open to new ideas – active in their community and able to influence others

Late Majority:        Older, conservative, less educated and less socially active

Laggards:               Oldest, least educated, very conservative. Owned small farms with little capital
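Strictly as an aside, the familiar proportions usually quoted for these groups (roughly 2.5%, 13.5%, 34%, 34% and 16%) fall out of treating time-to-adopt as normally distributed and cutting the curve at two standard deviations before the mean, one before, the mean itself, and one after. A quick sketch of that derivation:

```python
from scipy.stats import norm

# Rogers' adopter categories as slices of a normal distribution of adoption time,
# cut at -2sd, -1sd, the mean and +1sd
boundaries = [-2, -1, 0, 1]
labels = ["Innovators", "Early Adopters", "Early Majority", "Late Majority", "Laggards"]

edges = [float("-inf")] + boundaries + [float("inf")]
for label, lo, hi in zip(labels, edges[:-1], edges[1:]):
    share = norm.cdf(hi) - norm.cdf(lo)
    print(f"{label:15s} {share:.1%}")
# Innovators ~2.3%, Early Adopters ~13.6%, Early/Late Majority ~34.1% each, Laggards ~15.9%
```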

Between 1957 and 1962 Rogers expanded the model to describe how new technology or new ideas (not just seed corn) spread across society. But whilst Rogers’ initial work assumed that technology adoption would spread relatively organically across a population, in practice there are barriers that can derail mainstream adoption before it has begun. The expansion to this frame discussed in Geoffrey Moore’s 1991 book ‘Crossing the Chasm’ highlighted a critical barrier to widespread adoption. This chasm exists between the early adopter and early majority phases of the framework, and navigating it successfully requires an understanding of the personality types that form the five fundamental building blocks of the model.

Innovators are happy to take a risk and try out products and services that may be untested or ‘buggy’. They look at the potential, do not expect things to be perfect and are happy to work with companies to improve initial offerings – a fertile testing ground for new technology. Early adopters, in contrast, are more tactical in their adoption. They want to be at the forefront of new technology but will have conducted their own research to evaluate the likelihood that the product will offer them tangible value. They are also more fickle, and are more likely to leave a product or service that is not living up to what was promised, creating a potential void between them and the early majority.

Once we start to look at the early majority and beyond there is a marked shift towards using something that ‘just works’. They are less interested in something new or shiny and more interested in, as the Ronseal advert would put it, something that ‘does exactly what it says on the tin’.

Seth Godin put it well in his 2019 blog when he said:

“Moore’s Crossing the Chasm helped marketers see that while innovation was the tool to reach the small group of early adopters and opinion leaders, it was insufficient to reach the masses. Because the masses don’t want something that’s new, they want something that works, something that others are using, something that actually solves their productivity and community problems.”

At its most basic level the innovation adoption curve is a model that can be used to critically assess the appetite to adopt something new within a particular audience (be that a segment, a cohort or a more general population). This provides us with a crucial framework element addressing the ‘where to play’ question – which we talk about so often with our clients – and enables the prioritisation of resources where they will have the biggest impact on growth and revenue.

To go beyond the innovator and early adoption phases, products and services must deliver on their early promise and be built around customer needs, improving on what went before. Getting there first can be a huge commercial advantage, but failing to understand your audience and adapt accordingly can be the difference between wide-scale adoption and, at best, obscure appeal.

 

Why ‘new normal’ will look a lot like ‘old normal’

Since March my mailbox has been inundated with new surveys, trackers, consumer trend evaluations, and ‘thought pieces’ on the ‘new normal’. The world we live in from this point on will look nothing like the world we have known – so says their collected wisdom! If one was a cynic, one might argue that sowing doubt and uncertainty about the future reinforces the need to spend budget on consumer insight at a time when client businesses are looking to conserve cash and agencies are feeling the pinch – and this is my business as well, so I am not going to argue with the importance of maintaining ‘sensors in the ground’!

But if you believe all that you read, we are facing a foreign landscape with consumer behaviour turned on its head! But, with some trepidation, can I be the small voice in the crowd that says actually I believe the future is going to look much more like the past than many would have us think?

Now I will caveat that: the future, when viewed from the pre-Covid world, was always going to look different (that’s just a truism) – the migration from the high street to the virtual street perhaps being the most notable trend – and the pandemic has moved this process on (if for no other reason than such a precipitous fall in revenue would be difficult for any business to cope with, especially those with a poor online presence).

Received wisdom is that the pandemic moved digital migration forward 5 years, as people have been forced to shop online, socialise with friends and family members online, bank online, see their doctor online, etc. And some of these behaviours are here to stay, as sub-optimal customer experiences in a pre-pandemic world can now be seen as such by a wider group of consumers – who really wants to queue for 20 minutes in a bank branch or sit next to (other) sick people in a doctor’s waiting room? OK, some people will, but broadly speaking the pandemic has shown those of us who are not innovators and early adopters a better way in some areas.

However, the ‘new normal’ is not actually ‘normal’ and will meet the headwinds of behavioural inertia – the tendency to do nothing or to remain unchanged. The majority of us will go back to an office, and probably 5 days a week. We will start shopping in stores again – because we like physical (as opposed to virtual) shopping – and so the home will become less of (not more of) “a multi-functional hub, a place where people live, work, learn, shop, and play” (‘Re-imagining marketing in the next normal’, McKinsey, July 2020). We will want to travel again as soon as possible – the ‘staycation’ was fine, but we won’t want to make a habit of it – and our new-found sense of ‘community’ will wane when the pressures and time requirements of everyday life kick back in.

I am not saying that there won’t be any change, and I am not just sticking my head in the sand and hoping the current crisis will just go away. But consumer behaviour is akin to an elastic band – Covid-19 has pulled it in all sorts of different directions, but fundamentally it wants to ‘ping back’. When we have a few years of post-pandemic perspective, I suspect Covid-19 will be seen to have caused a mild bump in the overall evolution of consumer behaviour – there won’t be a ‘new normal’ that looks very different from the ‘old normal’.

Katy Milkman, a behavioural scientist at Wharton, was reported in The Atlantic as saying that new habits are more likely to stick if they are accompanied by “repeated rewards”. So if the threat of the virus is neutralised, the average person will go back to a routine; at the moment the pandemic looms large because it’s our everything. While there will be some behavioural stickiness, it’s easy to overestimate the degree to which future actions will be shaped by current circumstances.

 

 

In praise of ‘Marketing Myopia’

In this occasional series of blogs we will revisit some of the articles that we have found most useful over the years – these are the articles that can always be found on desks in the Decision Architects office. The first of these is Marketing Myopia, published in the Harvard Business Review in 1960, chosen by Adam Riley.

“I love this article – it talks to the ‘where to play’ and ‘how to win’ calculations that we have at the heart of our work – and I reference it time and time again. And when we use examples of obsolescence – Kodak, Nokia, Blackberry etc. – you can see in their downfall a failure to heed the lessons of Marketing Myopia. Levitt was one of the giants of our trade”

In 1960 Theodore Levitt – economist, Harvard Business School professor and editor of the Harvard Business Review – published ‘Marketing Myopia’ and laid the foundations for what we have come to know as the modern marketing approach. Levitt, one of the architects of our profession, popularised phrases such as globalisation and corporate purpose (rather than merely making money, it is to create and keep a customer). The core tenet of his ‘Marketing Myopia’ article is still at the heart of any good marketing planning process or submission. In this article Levitt asked the simple but profound question … “what business are you in?”

He famously gave us the ‘buggy whips’ illustration:


“If a buggy whip manufacturer defined its business as the “transportation starter business”, they might have been able to make the creative leap necessary to move into the automobile business when technological change demanded it”.

Levitt argued that most organisations have a vision of their market that is too limited – constricted by a very narrow understanding of what business they are in. He challenged businesses to re-examine their vision and objectives; and this call to redefine markets from a wider perspective resonated because it was practical and pragmatic. Organisations found that they had been missing opportunities to evolve which were plain to see once they adopted the wider view.

Markets are complex systems. The ability to successfully define, and to some extent ‘shape’, the market you compete in today – and will compete in tomorrow – is the foundation of good marketing. It is a critical first step to maximising business opportunities and identifying those competitive threats that may imperil the long-term prospects of the business – or change the rules of the game to make its products or services irrelevant.

Senior management must ask, and marketers must be able to answer, the question … as a business, ‘where should we play’? This means defining the market in which we will compete – and being able to give size, scope, growth rates, competitive landscape, key drivers and barriers to success within it, as well as an appreciation of customers’ needs today – which are being fulfilled – and those unmet needs which may shape the definition of the market tomorrow. Market definition is not the same as ‘segmentation’ – but it is a necessary precursor.

When identifying ‘where to play’, marketers must address how to redefine the market to create a larger opportunity, or one which we are better positioned than the competition to take advantage of. How could our competitors reshape the market to their advantage, and what impact would this have on us? And how will external trends – be they political, technological, social, economic etc. – reshape the market and affect our success? Many marketing questions then flow from this market definition – what attractive customer segments exist, how do we develop and deploy our brands against attractive market opportunities, what capabilities do we have today that give us competitive advantage, and what capabilities will we need tomorrow to sustain this?

At the time of his death in 2006, Levitt (alongside Peter Drucker) was the most published author in the history of the Harvard Business Review, and in an interview he gave about his published work, he said: “In the last 20 years, I’ve never published anything without at least five serious rewrites. I’ve got deep rewrites up to 12. It’s not to change the substance so much; it’s to change the pace, the sound, the sense of making progress – even the physical appearance of it. Why should you make customers go through the torture chamber? I want them to say, ‘Aha!’”

SMEs … from dodo to lifeblood of the economy

SME. It’s a term that conjures up images of elbow grease, pull-yourself-up-by-the-bootstraps business – small teams, ambitious entrepreneurs, businesses that begin at the kitchen table. But SMEs are big business.

ONS data suggests that there are over 1.6m SMEs in the UK – that’s 99% of all UK businesses – and over 70% of these are classified as micro businesses. SMEs account for about 50% of the British economy and over 60% of private sector employment – some 14.4 million people. The figures are very similar in Europe as a whole, where SMEs represent 99.8% of all businesses, employing 93 million people and generating 58% of GDP.

The SME sector is often described by government as the ‘lifeblood’ of the UK economy, but it wasn’t always this way. The definition of an SME that we are familiar with can trace its origins back to The Bolton Report of 1971, but then the prevailing sentiment was that the SME sector was in terminal decline …

 “The small firm’s sector was in a state of decline in both number and in contribution to output and employment and in a few years would cease to exist. Economies of scale would make the remaining 800,000 small firms uncompetitive and doomed to extinction”. The Bolton Report of 1971

Yes, the government of the early 1970s seriously thought the SME sector was “doomed to extinction”. Fast forward 45 years, and the government was arguing that “a sustained recovery of the UK economy will rely on the private sector with small and medium-sized businesses taking the lead”. The 2015 Report on Small Firms was the first official report on the health of the sector since the Bolton Report, and its author Lord Young found:

“This shift in the number and importance of small businesses has not been simply a linear trend over 40 years. But within this Parliament alone I have seen a transformation. The business population has increased by 17 per cent since 2010. In 2011 we saw a record number of start-ups, and the beginning of 2014 saw a record increase in the number of firms.

This, in part, could be due to the rising number of self-employed people in the UK. And due to a culture change, more Brits find themselves “driven not by necessity but by desire”.

There is lots of fighting talk about the success of the UK economy being driven by the SME sector, but the term SME is woefully misunderstood and often mischaracterised – it has become a convenient label for motivational soundbites. The reality is that unless we actually grasp the complexity of what it is to be an SME, the economy is going to struggle.

Changing aspirations and huge advances in technology are driving/enabling a complete reshaping of the world of work:

  • The corporate sector will need fewer employees (and offer fewer opportunities)
  • The ‘independent economy’ will continue to expand aggressively (necessity or desire)
  • Entrepreneurship will become a more well-trodden pathway
  • Out of this eco-system, more firms will be created

Since the dark days of 1973, technology has gone a long way to levelling the playing field between huge internationals and the typical start-up. Communication, administration, marketing and management have all become more affordable and less labour intensive, while your next-door neighbour may now have a pool of clients that stretches around the world.

Government initiatives have also gone some way to powering SMEs forward. The Start Up Loans company, a government-backed financial package, has lent £131m to 25,000 businesses and created 33,000 jobs, while apprenticeship grants have provided £1,500 to firms taking on their first apprentice. Then there’s the employment allowance, which hands businesses and charities a £2,000 tax cut off their National Insurance Contributions, as well as a £1.1bn package of business rates measures (although arguably the recent nationwide hikes in business rates are placing a stranglehold on SME growth).

But if SMEs are to continue to thrive, we must now take a step back and ask, what is an SME really? Only from a deep-seated place of understanding can we ensure that they receive the support they need to continue as the UK’s backbone, especially as Brexit looms.

It’s time to rethink ‘SME’. Out with the outdated understanding. In with a more nuanced approach.

It’s time to stop tarring all SMEs with the same brush; this is a landscape of mobile hairdressers, digital agencies, garages, fishing trawlers – to understand them, it’s essential to look beyond revenue and number of employees. The government has begun to use different lenses when thinking about SMEs, focusing on business owners in terms of:

  • Ethnicity
  • Gender (focusing on female entrepreneurs)
  • Senior citizens
  • New university graduates
  • High tech, ex-corporate employees
  • Immigrant statistics
  • Those recently made redundant

Although helpful, it’s not enough.

When it comes to business owners, the government must identify and understand the emotional journey of those spear-heading the businesses in this sector and understand the differences in motivations and thinking. They must also move away from over-focusing on the Gazelles (companies that have experienced 20% growth pa in the last three years). While these businesses are obviously attractive, they emerge from a positive, thriving, dynamic SME sector of over 1.5 million businesses – all of which have a part to play.

The SME sector is fragmented, messy, diverse and distinct. We must dig deep if we’re to fully understand it.