Reasonable people can disagree about the usefulness of the World Bank’s country rankings. But after the Chief Economist resigned amidst a controversy about the index, the Bank has made a number of misleading claims, including defending numbers in the press that its researchers have quietly repudiated.
In the past few weeks, my colleague Divyanshi Wadhwa and I published two pieces examining the controversy over the World Bank’s Doing Business index, which measures the regulatory burden facing businesses around the world. Our first piece examined the case of Chile, investigating the World Bank Chief Economist’s own revelation that methodological changes to the index had caused Chile’s rank to fall during Michelle Bachelet’s Socialist government, rise under her conservative successor, and fall again when she came back to power. Leaving conspiracy theories aside, our analysis—for which we released data and code—confirmed that this pattern of fluctuations was far greater than what we could reproduce using a consistent methodology over time.
Our second piece questioned the results for India, whose sudden rise in the World Bank rankings has been celebrated by Narendra Modi’s government. Once again, holding methodology as constant as possible, the sudden surge under Modi's government became a modest rise.
Last week, CGD published a response to our analysis by Shanta Devarajan, Senior Director for Development Economics at the World Bank and a member of CGD's Advisory Group. He concluded that our analysis was "neither enlightening nor useful." Read it for yourself, but my quick summary of Shanta's reply is:
1) Rankings are relative, so yes they change due to others' actions.
2) The methodological changes are purposeful improvements, not flaws.
3) India has genuinely reformed.
I accept all three points. Nevertheless, Shanta's response—cleverly titled "Wrong Criticisms of Doing Business"—does not actually say we're wrong. It doesn't address the core substantive flaws in Doing Business discussed in our first post or, in my view, rebut the core technical claim we made: that changes over time in Doing Business rankings rely on apples-to-oranges comparisons, and that using a consistent methodology shows much smaller changes in the ranking for both Chile and India over time.
Three misleading statements in the World Bank response
Senior World Bank staff have demonstrated the confusion and misinterpretation that can result. Last week, the World Bank Country Director for India, Junaid Ahmad, appeared on CNBC-TV18 to answer a journalist's questions about the Doing Business fracas.
Watch for yourself:
[Embedded video: CNBC-TV18 (@CNBCTV18News), February 9, 2018]
The Bank is under pressure to defend Doing Business, and Dr. Ahmad is eager to keep warm relations with a major client government. But unfortunately, a lot of what he suggests in his statement is at best misleading.
Here are three cases:
1. "[A]ll the shifts that you’re seeing [are] based on ground realities, of policymaking and impact on the ground. And these are feedback directly from the companies themselves, from the small and medium enterprises."
The Doing Business index is not, and never has been, based on reports from small and medium enterprises "on the ground." It is based on a survey of a very small sample of experts, mostly lawyers, management consultants, and government officials. They are asked not to report the reality on the ground, but rather the de jure regulations that would apply to a hypothetical firm if the law were applied strictly. Analysis by the Bank's own researchers shows these de jure rules bear little resemblance to the de facto reality on the ground in many client countries.
2. "[The improvements in India's ranking] are not an artefact of methodology, or an artefact of data."
In this case there is something of a smoking gun, as the Doing Business team altered the previously published numbers for India and other countries after the fact.
The 2017 Doing Business report is no longer linked from the Doing Business website as far as we can tell, but it is still available online here. If you go to page 213 you'll see that India ranks 130th with an overall "distance to frontier" (DTF) score of 55.27. But if you look at the current Doing Business website, you'll see that the 2017 score has been changed to 56.05.
The World Bank has insisted that it used a methodology comparable to 2017's in calculating the 2018 score. And in fact, it does now report 2017 and 2018 DTF scores using what appears to be a comparable methodology. But note that these were added after the fact—they are not what was reported in 2017, and not the basis for the 2017 ranking of 130th. That ranking is based on a different methodology. If you use the data currently on the website, India's 2017 rank would have been 123, not 130.
So even accepting all of the methodological choices the Bank made and taking them at face value, the idea that India's jump is "not an artefact of methodology, or an artefact of data" is wrong.
Figure 1: Gaps between the Doing Business report and the data online
The graph shows the change in rankings between the 2017 and 2018 reports, minus the change in rankings using the revised data on the World Bank website.
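To make the arithmetic behind this gap concrete, here is a minimal sketch using India's published numbers (the function name is ours; the ranks of 130, 123, and 100 are those discussed above):

```python
def ranking_gap(rank_2017_report: int, rank_2017_revised: int, rank_2018: int) -> int:
    """Rank change per the published reports, minus the rank change
    implied by the revised data now on the World Bank website.
    A lower rank number is better, so improvement = old rank - new rank."""
    reported_change = rank_2017_report - rank_2017_revised + (rank_2017_revised - rank_2018)
    reported_change = rank_2017_report - rank_2018   # improvement per the reports
    revised_change = rank_2017_revised - rank_2018   # improvement per revised data
    return reported_change - revised_change

# India: ranked 130th in the 2017 report, 123rd under the revised data,
# 100th in the 2018 report.
print(ranking_gap(130, 123, 100))  # → 7
```

Under these numbers, 7 places of the celebrated 30-place jump reflect the after-the-fact data revision rather than any change between 2017 and 2018.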
Most of the Doing Business rise for India is fragile to recent methodological revisions (as we showed in our earlier post), and it is now clear that some of the 2018 jump—7 places of the 30-place rise—is due to an even simpler, more objective error. The World Bank revised India's data in mid-2017 but used numbers it had already deemed incorrect in its October 2017 press release, and it now refuses to publicly acknowledge the discrepancy.
3. "[E]veryone is worried about the ranking, but everyone has forgotten that ultimately what we look at is something we call distance to the frontier."
On October 31, 2017 the World Bank in India issued a press release with the headline: "India Jumps Doing Business Rankings with Sustained Reform Focus." The opening paragraph of the press release also stresses the rankings, and India's move from 130 to 100. The distance to the frontier score appears only later in the press release as "one of the key indicators" in the survey.
The 2014 external review of Doing Business led by former South African finance minister Trevor Manuel recommended that the Bank discontinue the overall country rankings as part of the Doing Business report. The Bank chose to disregard that recommendation, because rankings are a powerful marketing tool. But the India and Chile cases show the pitfalls of a focus on marketing rather than substance.
Paul Romer's departure doesn't erase the facts he uncovered
It is worth returning to Paul Romer’s initial allegations about Doing Business:
1) Romer documented clear cases where the Bank made methodological changes that affected the rankings, then went back and altered earlier data after the fact. He announced that the Bank would produce revised rankings for recent years to correct for these changes.
2) Romer then went further, implying—in a Wall Street Journal interview and in emails shared with the Financial Times—deliberate, politically motivated manipulation of the Doing Business rankings, and wholesale fabrication of data by World Bank researchers. After a public rebuke from the World Bank CEO, Romer walked back most of the accusations in category (2), and Jim Kim accepted his resignation on January 24.
Most people we've spoken to consider the second set of allegations unfounded, and we've seen no direct evidence of malicious intent in the data revisions. However, Romer's departure does nothing to erase concerns around the first set of allegations, and these remain unaddressed. The World Bank's reaction to our posts suggests that the institution would rather sweep these concerns under the rug. For an organization that prides itself on producing quality data to inform the global development community, that's not good enough.
This blog post was originally published by the Center for Global Development here.