Laser-sharp focus without ignoring knowledge from elsewhere brings new insights to light


Christian Stadler


Last year the McKinsey Global Institute (MGI) turned 30. The stated intention of the consulting firm’s business and economic research arm is to offer fact-heavy insights without taking deliberate policy positions.

It certainly cemented the firm’s reputation as an intellectual powerhouse. No small feat in an industry marked by causal ambiguity. As the late Harvard Business School professor Clayton Christensen explained, companies typically don’t know what they are looking for when shopping for advice. Nor can they be sure whether subsequent success or failure is linked to that same advice.

What they do know, however, is that McKinsey is often considered to be the world’s most prestigious consulting firm. The work of the institute is helpful in this regard, as it is widely used by corporate leaders and policy makers of all colors. Both Mitt Romney and Barack Obama, for instance, cited the same report (presumably not the exact same passage) during their election battle in 2012.


And while the firm would never formulate it that way, MGI represents exactly the sort of image the firm strives for: super smart and highly relevant.

To better understand how the magic happens I interviewed James Manyika, one of the co-chairmen, as well as a partner and a more junior consultant on secondment to the institute. Further, I reviewed a number of MGI reports.

How it works: finding the right balance between depth and breadth

Marcel Hirscher is a national hero in my native Austria. Skiing is for us what football is for Brazilians. And Hirscher truly prevailed. He won 67 World Cup races, two Olympic gold medals and seven World Championships. He also brought home the overall season title eight times in a row.

His domain was the technical disciplines. But despite his dominance, he never won a downhill race, which requires a slightly different set of skills. The lesson seems clear: you need depth and specialization to excel at the highest levels.

MGI’s slalom equivalent is economics, but of a very particular kind, which it refers to as micro-to-macro. The idea is to paint a picture of macro-economic trends based on micro-economic, industry-level insights. There is, however, a twist. In applying the micro-to-macro approach, MGI researchers also leverage knowledge from other disciplines. This is in fact a fundamental part of the whole approach.

Manyika explains: “We’ve been doing a lot of work on AI impact on jobs and automation. When we first began to do that work, we’d primarily two senior fellows involved. One is Susan Lund, who has a PhD in economics, the other one’s Michael Chui and he’s a PhD in computer science. I was the director involved – my background as you may know, I have a PhD in AI and robotics. And then we also had another senior partner who was kind of a challenger who’s also an economist.”

New knowledge is usually generated at the crossroads of disciplines. MGI knows this and has an open ear for cross-pollination, occasionally relying on unexpected sources such as Twitter to keep track of current thinking. “I actually find Twitter to be useful to point to other things. There are authors who are there. There are people who say, look, I just read something,” notes Chui.

Back to Hirscher. In giant slalom, athletes benefit from experience in the more speed-centric downhill and super-G races. And one of his Olympic gold medals came in the combined event, where one run is downhill and the other slalom. Depth in your core area is vital, but breadth will enhance your ability to excel.

The process: academic rigor, kind of

Most management scholars fall into one of two buckets. There is the qualitative research crowd who construct new insights from interview data and field notes. The ‘opposing’ team starts the journey from existing ideas, using quantitative data to test them.

MGI is not bound by academic orthodoxy, taking a more flexible approach to knowledge creation. It readily mixes qualitative and quantitative data, and theory informs the work rather than setting its boundaries. The process itself goes through three phases.

In phase one the chairmen and directors decide whether a particular idea is worth exploring, i.e. whether it fits the MGI micro-to-macro pathway and offers an opportunity to say something new. “We’ll try and understand how much has been looked at already, and what can we add that’s empirical and fact-based.

And quite often in many cases, we’ll say quite frankly we’ve nothing to add here. Often it’s topics that get into either societal things or areas where we don't think we can bring this micro-based macro.”

To figure this out, MGI typically spends a few weeks on an exploratory study and engages with its advisory body. And with McKinsey being McKinsey, this includes several Nobel laureates.

In phase two the actual research happens. The MGI partner in charge starts with recruiting a team of McKinsey consultants on secondment. As the reputation of the institute has grown, so has the demand for secondments. “I actually wanted to work with the MGI before I wanted to work with McKinsey,” Anneke Maxi Pethö-Schramm notes.

The new team members then reach out to McKinsey’s own industry practices, building a repertoire of relevant cases. They also reach out to academia and other companies. What they collect is a mixture of quantitative data and stories, for example, creating a table showing the potential of different AI tools in different industries.

This rather laissez-faire approach would not fly in hardcore academic circles, but it has the distinct advantage of combining qualitative and quantitative data. It is not unusual for a report to build on 400 cases.

To some extent this feels like academic research on steroids. While MGI does not reveal any numbers, it is obvious that the manpower behind each project is substantial. With three directors and eight partners, my estimate is that at times more than 100 consultants will be working on projects in a given year.

On top of this there is a sizable econometrics group supporting MGI projects, making sure that, when necessary, some serious modelling and regression analysis is unleashed.

Two-thirds of the way into a project, the team enters the last phase, presenting to the advisory board. In these sessions the main trends are nailed down. A story emerges. Importantly, this does not include any policy recommendations. Only then does the writing start, which is itself part of doing research: ideas are ironed out as they are put to paper.

It’s hard for other consulting teams or even think-tanks to copy the MGI process, as it clearly requires a thick checkbook. Three things, however, can be imitated quite easily. First, only pursue projects that really fit your area of expertise. Second, borrow from academia without being religious about it. Finally, use an advisory board to keep things ‘real’.

The secret sauce: unique data

In 2016 almost all pre-election polls showed Hillary Clinton winning. As Nate Cohn points out in a New York Times op-ed, there are at least three reasons why everyone got it wrong: “Undecided voters broke for Mr. Trump in the final days of the race, or in the voting booth. Turnout among Mr. Trump’s supporters was somewhat higher than expected.

And state polls, in particular, understated Mr. Trump’s support in the decisive Rust Belt region, in part because those surveys did not adjust for the educational composition of the electorate.”

In short: they sampled the wrong data. In the management field, getting the right kind of data is also a key ingredient for teasing out new insights. Sampling is often an issue, but the even more fundamental problem is that for many questions there is simply not enough data. Manyika recognizes that this is an issue for MGI as well.

“Yeah, so sometimes we just cannot get the data,” says Manyika. “It’s interesting, we have a method, but we just can't do that in that country. Africa’s a good example, there’s so many things we’d love to do.”

While this is true, McKinsey’s legendary networks often give it access to data others can’t get. “So, we did a big meta-study where we’re looking at what are all the things of economic value that move around the world basically, and it’s all the usual stuff, trade and all the rest of it and services,” explains Manyika, “but we also included cross-border data flows.”

Information on data flows other than trade is not part of official government statistics. So where to turn? Manyika adds: “A lot of companies who are infrastructure providers who are providing connectivity across countries actually have that data, and they’re comfortable giving it to us because we’re not the government.”

Similar efforts have recently been underway to capture the career paths of African Americans. Again, the data is distributed among many companies, and McKinsey is in a position where many are comfortable sharing it.

The search for the right kind of data really sits at the heart of MGI’s research. With great effort, individual knowledge-seekers (meaning academics or other consultants) can also hunt for such unique datasets. But to be fair, this is a true competitive advantage of MGI.

What happens with all the information

MGI also spends considerable time disseminating the insights it develops. The chairmen and directors brief heads of state, senior policy makers, and CEOs of Fortune 500 companies. The institute is structured so that there is no pressure to turn any of this into business.

But of course a nice little (un)intended consequence is that firms want to solve the issues that emerge. And who better to turn to than the firm that identified those issues in the first place? The consultants seconded during the project return to their industry practices armed with new expertise, ready to roll.
