Today’s interview is with Simon Anholt, one of the world’s leading thinkers and practitioners on ultra-widescale human engagement. His work over the last thirty years has focused on creating and leading new fields that measure, understand and influence attitudes, culture and activity at the global scale.
In 2014 he launched the Good Country and the Good Country Index: the world’s first study of how much each country on earth contributes to the rest of humanity and to the planet. The Good Country was launched in June during Anholt’s TED Talk, which attracted one of the fastest-growing audiences of any TED Talk to date. This started a global discussion about how countries and companies can balance their duty to their own people with their responsibility to the wider world. The Place Brand Observer caught up with Simon Anholt to learn about the second version of the Good Country Index, released last week.
Learn about:
- Recent changes regarding ‘good country’ performance;
- Why some countries like Ireland have dropped in their position on the Good Country Index;
- Why some categories presented in the Index are difficult to rank;
- The Global Vote and why it is important.
Simon, which findings surprised you most in the latest edition of the Good Country Index?
No surprises really: I was disappointed that Kenya dropped so far in the overall rankings (from 26th to 93rd), but a certain degree of volatility is unavoidable in this kind of ranking, especially for countries whose external engagements are quite limited. In any case, that fall is just as likely to be the result of the changed methodology of this year’s Good Country Index as any significant change in Kenya’s behavior.
What are the main changes in ‘good country performance’, comparing this version of the Index with the first one?
Reporting trends between the two editions of the Good Country Index is not a straightforward matter because my colleague Robert Govers and I have made a number of improvements to the study. The methodology is exactly the same but we’ve replaced seven of the thirty-five datasets with ones that do a better job of measuring each country’s global impact:
- In the Peace and Security category, we’re using a new dataset to measure Internet Security.
- In the World Order category, we’ve replaced Population Growth with Birth Rate.
- In the Planet and Climate category, we’ve replaced four out of the five indicators:
  - Ecological Footprint (per GDP$) has replaced Biocapacity Reserve
  - Reforestation Since 1992 replaces Hazardous Waste Exports
  - Hazardous Pesticides Exports replaces Water Pollution
  - Consumption of Ozone-Depleting Substances replaces Other Greenhouse Gas Emissions
- In the Health and Wellbeing category, International Health Regulations Compliance replaces Drug Seizures.
For this reason, any changes in a country’s ranking in these four categories since the previous edition of the Good Country Index may be wholly or partly the result of these modifications, rather than any change in the country’s real performance.
In future editions of the Good Country Index, we will continue to include better data whenever we find it, which means that direct comparisons between one edition and another won’t be straightforward: but we feel that this is a worthwhile price to pay for a constantly improving study.
This is why we’ve called the latest edition Good Country Index 1.1 and not the 2015 or 2016 Good Country Index: it’s as much an upgrade as an update.
Ireland topped the first ranking but is now down to 11th position. Any explanations why?
Part of the explanation lies in the new datasets. Another factor is that Ireland’s economy started to grow again in 2011, the target year for most of the data in the latest edition of the GCI (the big global databases collected by United Nations agencies and other international bodies that we use to calculate our rankings are generally published three or more years after data collection begins, so the Index is unavoidably a retrospective view of the world). Since we divide most of our results by GDP in order to create a more level playing field, the same performance from a bigger economy will produce a lower rank.
However, apart from the new datasets, GDP growth is certainly not the only reason for Ireland’s drop. Ireland improved its rank in Science and Technology and in Planet and Climate, but fell in the other categories. The explanations vary: some indicators actually improved while others declined, even as GDP grew. In any case, it’s logical that smaller economies tend to be more volatile in the Index.
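The GDP effect Anholt describes can be seen with toy numbers (all figures below are hypothetical, purely for illustration):

```python
# Toy illustration (hypothetical figures): the same absolute external
# contribution divided by a larger GDP yields a smaller per-GDP score,
# and hence a lower rank, even though real performance is unchanged.
contribution = 100.0                # some external contribution, held constant
gdp_2010, gdp_2011 = 200.0, 250.0   # the economy grows between the two years

score_2010 = contribution / gdp_2010  # 0.5
score_2011 = contribution / gdp_2011  # 0.4  (lower score despite same behaviour)
```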
Do you think the need for a Good Country Index has increased over the last few years? Why?
The need for the Good Country increases all the time, especially with the rise in nationalist sentiment in so many parts of the world. It’s inevitable that as people feel more and more threatened by the rising tide of globalized crises – climate change, terrorism, migration, economic instability, pandemics and so forth – so they become more vulnerable to being seduced by opportunist politicians like Trump, Farage, Le Pen and Hofer who are so adept at echoing their fear and anger. At the moment there is no cohesive counter-narrative to these messages: the only alternative is the traditional politics of left, right and centre, and unfortunately it’s partly because ‘politics as usual’ is increasingly and deservedly discredited in so many countries, that the extremists, the fundamentalists and the nationalists are doing so well.
The Good Country philosophy, which stresses our interdependence and takes a global view, aims to provide a counter-narrative to ‘politics as usual’ and this may be why so many people are attracted to it.
Which categories presented in the Index did you find the most difficult to rank?
The ranking process is relatively simple, or at least straightforward, in all categories: the 35 datasets we use are combined into a common measure, which gives an overall ranking, a ranking in each of the seven categories, and a balance-sheet for each country that shows at a glance how much it contributes to the world and how much it takes away.
The performance indicators are measured per GDP dollar, to correct for the size of each country’s economy, and create a level playing field (we also tried normalizing for population, GNI and other measures, but it doesn’t actually make a lot of difference).
Countries receive a score on each indicator as a fractional rank (0 = top rank, 1 = lowest) relative to all countries for which data is available. The category rankings are based on the mean fractional rank across the five indicators in each category (allowing a maximum of two missing values per category). The overall rank is based on the average of the category ranks.
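As a rough illustration, the scoring described above could be sketched like this (illustrative function names and toy data; this is not the Index’s actual code, and it assumes a higher raw value means a better contribution):

```python
import numpy as np

def fractional_ranks(values):
    """Rank countries on one indicator: 0 = best, 1 = worst.
    Missing values (NaN) stay NaN and are ignored in the ranking."""
    v = np.asarray(values, dtype=float)
    ranks = np.full(v.shape, np.nan)
    mask = ~np.isnan(v)
    order = v[mask].argsort()[::-1]          # descending: higher value = better
    r = np.empty(order.shape, dtype=float)
    r[order] = np.arange(mask.sum())         # 0 for the best country, n-1 for the worst
    ranks[mask] = r / max(mask.sum() - 1, 1) # scale to the [0, 1] interval
    return ranks

def category_score(indicator_matrix, max_missing=2):
    """Mean fractional rank over a category's indicators (countries x indicators).
    A country with more than `max_missing` missing values gets no score (NaN)."""
    m = np.asarray(indicator_matrix, dtype=float)
    missing = np.isnan(m).sum(axis=1)
    score = np.nanmean(m, axis=1)            # mean over the available indicators
    score[missing > max_missing] = np.nan
    return score
```

The overall rank would then simply be the ordering of each country’s average category score, mirroring the equal-weight averaging the Index applies.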
Finding the right data to feed into that algorithm is incredibly difficult in every category, and that search has consumed the vast majority of our time and efforts over the last five years.
The simple fact is that there really aren’t very many reliable, robust, regular surveys that cover our minimum list of 125 (and now 163) countries – in total, perhaps no more than forty or fifty, and of those, the ones that genuinely measure a purely external impact of each country, that is unequivocally beneficial or harmful to the rest of humanity or the rest of the planet, are a small subset.
Critics of the Good Country Index sometimes point out that even though the data we use to drive the Index, and the way we combine it, may be reasonably neutral and objective, there is always subjectivity in the choice of data. They are right in principle, but the simple fact is that the 35 datasets we chose are literally the only suitable ones we could identify anywhere.
Of course we could have pruned the number of datasets down to the least controversial ones, but this would narrow the scope of the study and unbalance it towards certain areas, which we wanted to avoid.
None of the datasets we use are beyond criticism, and indeed we and our colleagues and expert advisors are engaged in a constant process of challenging every aspect of the index.
‘Planet and Climate’ is, in theory, one of the easier ones, because unless you’re a climate change doubter (which we’re not), it’s pretty much beyond argument that polluting the atmosphere, cutting down forests, etc., are bad for the planet: but finding really robust and appropriate data is no easier in this section than any other. Indeed, it’s somewhat harder, which is why we’ve made such significant changes to that section in the new edition.
Incidentally, we apply no weighting to any dataset or category because of the impossibility of doing this objectively. Everybody has their own view about whether emitting a ton of carbon dioxide does more harm to humanity than exporting a ton of cocaine, or whether hosting a certain number of refugees does more good than sending out a certain number of peacekeeping troops.
And I don’t really put much faith in ‘expert panels’ to decide on these huge questions – such an approach would only make things more objective by an infinitesimal fraction. At some future stage, perhaps when we get some funding, we might be able to include a way for people to produce their own ‘personalised’ GCI on the basis of their own views about the relative importance of the global issues the Index measures. It’s an idea I’m quite attracted to.
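A ‘personalised’ GCI along these lines could be as simple as replacing the equal-weight average of category scores with a user-weighted one. A minimal sketch (hypothetical function, not an existing feature of the Index):

```python
def personalised_rank_score(category_scores, weights):
    """Weighted mean of fractional category scores (0 = best, 1 = worst).
    `weights` expresses the user's view of each category's relative importance;
    equal weights reproduce the Index's current behaviour."""
    total = sum(weights)
    return sum(s * w for s, w in zip(category_scores, weights)) / total
```

A user who cared three times as much about Planet and Climate as about another category would simply pass a weight of 3 for that category, and countries would be re-ordered by the resulting scores.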
It would probably also be a good idea to publish an alternative ranking in which countries are ordered by their absolute contribution, i.e. not divided by GDP, so people can compare the two if they wish. And I’d also like to show the GCI alongside some compatible ranking of domestic ‘good behaviour’, so people can make that critical comparison too: perhaps the Human Development Index, the Social Progress Index, or some combination of similar studies would be suitable.
In the meantime, I’m constantly telling people to look first at the individual data sets, then the categories, and only finally at the overall ranking: but of course they do the exact opposite!
Perhaps the most difficult category of all, and certainly the most controversial one, is ‘Contributions to International Peace and Security’, because remaining scrupulously objective about how you penalise countries for exercising violence outside their own borders is a real challenge, as is making the somewhat unusual – even artificial – distinction between exercising violence at home and abroad.
As you would expect, we receive a great many comments about the peace and security category. This is partly a natural consequence of the fact that most of us are just not accustomed to seeing countries measured in terms of their external impacts. It’s never been done before, and it produces many results which people find strongly counter-intuitive (such as very unstable countries coming near the top of this category, simply because they do less harm outside their borders than many more stable and developed countries).
So one of the reasons why some ostensibly peaceful and otherwise high-ranking countries like Sweden rank relatively low in this category in both editions of the study is that, during the period when most of the data was collected, they were involved in international conflicts, such as the ISAF actions in Afghanistan (activity in this theatre diminished significantly between 2010 and 2011, which is one reason why the Peace and Security rankings of many Western countries improved in the 1.1 edition of the Good Country Index).
For reasons of impartiality and objectivity we don’t distinguish between “aggressor” and “victim” or “good war” and “bad war” – I take the very simple view that killing people is wrong, and consequently doing it for any reason should count against a country’s contribution to international peace and security.
Every war is a failure of diplomacy – to paraphrase von Clausewitz – and I for one can live quite happily with the idea that, on many if not most occasions, even the ‘righteous’ combatant needs to pay a part of the price for the lives lost in consequence.
When a government takes the decision to send troops abroad, it must of course be prepared to accept the full consequences of those troops causing loss of life amongst enemy combatants and perhaps innocent civilians too: the fact of these deaths being recorded in an international statistical survey is one of the least serious of these consequences.
Weapons exports are, of course, a significant part – up to 20% – of the reason why several ostensibly peaceful Western democracies like Sweden tend to rank rather low in this category, so one way or another, a hint of the real complexity of the issue is revealed for those who care to look for it.
By contrast, we ‘reward’ countries for participating in UN Peacekeeping missions, since these are at least in principle a unified international attempt to prevent or limit armed conflict by predominantly non-violent means, and that’s much more ‘good country’ than invading another country, no matter how urgent or clear-cut the justification for war may appear to be.
You’ve just released the Global Vote. What’s that all about?
The Global Vote empowers people all over the world – anyone with internet access – to vote in the elections of a wide range of other countries. On our voting platform (www.globalvote.org) we will present one or two elections or referenda (we’re currently offering the Icelandic Presidential Elections and the UK’s EU Membership Referendum) each month, and anyone who registers on the site can cast their vote in these contests.
The information we give our Global Voters about each candidate is restricted to their international intentions: we deliberately avoid any consideration of the candidate’s domestic agenda, since this is clearly the sole concern of the country’s electorate. We always ask each candidate to provide our voters with an answer to our two standard questions:
1. If you are elected, what will you do for the rest of us, around the world?
2. What is your vision for your country’s role in the world?
We ask our Global Voters to consider each candidate purely from this perspective.
Why are you doing this?
Because to make the world work, we need a world of good leaders. Leaders who consider the needs of every man, woman, child and animal on the planet, not just their own voters. We will achieve this aim by reminding each candidate that we’re here, we care, and we’re watching. We need them to do the right thing for their own country and for the whole of humanity, if they are elected.
By asking each candidate about their international intentions, election after election, that question will eventually become accepted as part of the normal election process for any Head of State or Head of Government. No leader will be able to stand for election unless they have a clear policy for their country’s role in the world and a vision of how they will co-operate and collaborate with other leaders and other populations. One country at a time, we can build a world of good leaders.
What are your targets?
Our aim is to have more Global Voters outside each country voting in the election than there are ‘official’ voters inside the country. Every time we achieve this, I honestly believe that we can say the world has changed a little bit: we have become more aware of our essential interdependence, more united as a species, and we have started to change the culture of governance worldwide. This is the basic aim of the whole Good Country project.
Thank you, Simon.
Learn more about the Good Country Index here or follow Simon Anholt on Twitter.
About Simon Anholt
Simon Anholt has worked with the Heads of State and Heads of Government of more than fifty countries over the last twenty years, helping them to engage more productively and imaginatively with the rest of the world. It was from this experience of working with so many different countries, cities and regions, that the idea of the Good Country was born.
Simon has published five books (most of them part of our list of recommended reading) about countries and their role in the world. He is founder and Editor Emeritus of an academic journal on the same subject (Place Branding and Public Diplomacy), and each year since 2005 has published two major global surveys tracking public perceptions of countries and cities.
Simon is an Honorary Professor at the University of East Anglia in the United Kingdom and holds a number of other honorary and advisory positions around the world.
Liked our interview with Simon Anholt on the Good Country Index and the need to raise awareness about the impact of countries on the global community? Spread the word!