We’ve mentioned the odd world of Russian opinion polls here before.
If you’d like to read more on the subject, here’s a fascinating piece by the Levada Centre’s Denis Volkov, translated by Meduza:
“I don’t believe that Putin’s approval rating is 86 percent!” We hear this phrase endlessly from commentators in Russia and abroad. But interpreting these ratings isn’t a question of faith; it requires a detailed analysis of all the available sociological data collected throughout Vladimir Putin’s time in power. When non-sociologists discuss the president’s ratings, they usually look at indicators from the past few months, selecting the most dramatic of these figures—Putin’s 86-percent approval rating as president—without taking into account many other related questions. In doing so, they arrive at the flawed conclusion that Russians ardently support any decision by the authorities.
These misinterpretations usually go like this: instead of digging into the details and trying to reconcile the entire mass of conflicting data, commentators simply express their doubts about the honesty of the sociologists who conducted the poll, the validity of the poll’s sociological methodology in the Russian context, or the candor of the poll’s respondents. At the same time, they automatically classify the 10 percent of the population that doesn’t support Putin and opposes Russia’s reunification with Crimea as “the democratic minority,” which exists in opposition to a majority that’s composed, of course, of philistines and bellicose patriots. This interpretation crumbles, however, under the scrutiny of a careful analysis of the data.
First, let’s say a few words about whether respondents are afraid to answer our questions. Respondent dishonesty is difficult to assess, but the important thing to remember is that it is a constant. Most of the surveys conducted by the Levada Center (or by any other polling company) hold to the same methodology: they make use of personal interviews conducted at the homes of respondents. People’s accessibility (that is, their willingness to take part in surveys) hasn’t changed in the past 20 years. The same number of people open their doors today as did two or five years ago; as before, almost everyone shares their contact information at the end of the interview, so it’s possible to verify that the survey was carried out. Routine, multi-level controls (statistical, analytical, and by telephone) are standard procedure in any large research agency, and they allow us to monitor the quality of the work done by the interviewers.
Much depends on how the interview is designed: if questions about support for the president are taken out of context, they can make respondents uncomfortable. But placed among questions about the economy and about the state of affairs in the country, in a person’s city, or in their household, they work quite smoothly.