After all, between August 5 and 17, the non-profit research team she is a part of had surveyed 1,378 Singapore citizens aged 21 and above – a sample weighted for age, race and gender to reflect the demographic composition of the country’s citizen population – to determine how they decided which political party to vote for, as well as their perceptions of socio-economic issues. “We wanted to do a snapshot of sentiments one month before Polling Day,” she explained. When respondents were asked in the same survey which party they would vote for, 70 per cent said they would have voted for the People’s Action Party (PAP).
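The article does not describe the team's exact weighting scheme, but cell-based post-stratification – reweighting each demographic cell so its weighted share matches its share of the citizen population – is the standard approach. A minimal sketch in Python, with illustrative age bands and made-up population targets (the real survey weighted across age, race and gender):

```python
from collections import Counter

# Hypothetical respondents tagged by an age-band cell; the actual survey
# would cross age, race and gender against official population figures.
sample = ["21-34", "21-34", "35-54", "35-54", "35-54", "55+"]

# Assumed population shares for the same cells (illustrative numbers only).
targets = {"21-34": 0.30, "35-54": 0.40, "55+": 0.30}

counts = Counter(sample)
n = len(sample)

# Post-stratification: weight each respondent so that the weighted cell
# shares match the population targets.
weights = {cell: targets[cell] / (counts[cell] / n) for cell in counts}

for cell in sorted(weights):
    print(cell, round(weights[cell], 3))
```

Each respondent in an under-represented cell counts for more than one, and vice versa, so weighted tallies reflect the population rather than the raw sample.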
On Polling Day, the PAP secured 69.86 per cent of the popular vote.
Yet beyond the prediction itself, the survey by Quad Research highlighted two related observations: first, understanding the political motivations and positions that guide the choices Singaporeans make can enrich public discourse; and second, modern data collection tools can be used rigorously for these purposes. In this vein, raising the data literacy of the general populace should also be an important objective in the near future.
For instance, the research showed that 54 per cent of Singaporeans based their vote on three top reasons: the best local candidate regardless of party, trust in the party’s motives and values, and the most competent government based on the party’s senior members. In addition to aggregating national beliefs – 57 per cent of respondents believe that Singapore is meritocratic, and 66 per cent believe in social safety nets – the team measured differences between voters with dissimilar party affiliations: 68 per cent of respondents who would have voted for the PAP believe that Singapore is meritocratic, whereas only 32 per cent of those who would have voted for the opposition parties believe so.
Despite their varied professional backgrounds, the team of four at Quad Research has “always shared the common interest of building Singapore’s resilience as a community”. Financial sustainability is the main obstacle, the team told The Middle Ground. They financed the survey from their own pockets, and besides putting the data up for sale and exploring more community-driven models, the team is currently seeking feedback on how to improve their methodology as well as on future research topics.
Here is our full interview with the Quad Research team, with a focus on the research methodology as well as the design of the survey questions:
The Middle Ground (TMG): Who mooted the idea for the survey on the recent general elections? When did the team start on the survey?
Quad Research (QR): The 2015 general elections in the United Kingdom triggered our group’s curiosity. Jeremy Au, an MBA candidate at the Harvard Business School, sparked the discussion in early-June by sharing that polling in the United Kingdom had powerfully illuminated how citizens saw their country. We were impressed by the rigour of online polls and depth of voter insights.
At our first meeting in mid-June, we decided to pioneer online self-administered surveys for socio-political polling in Singapore. Blackbox Research and the Institute of Policy Studies (IPS) had generated many valuable insights on government satisfaction and voter profiles.
However, the vast majority of socio-political polling was done either face-to-face or over the phone. We saw that the prevailing methodology suffered from high rejection rates and respondents’ concerns over anonymity. On the other hand, most online polling was either not well-designed, did not have a sufficiently large sample size, or was primarily circulated amongst friends. Furthermore, we wanted to glean valuable insights from the community around political choice, party brand value, voter rationales and national beliefs.
TMG: In Singapore, the publication of either opinion polls during an election or exit polls on Polling Day before the results are declared is illegal. Given how closely your pre-election polling matched the actual vote share, would the team make a case for pre-election polls? Why or why not?
QR: We were pleasantly surprised that our secondary finding of vote share closely matched the actual electoral vote share. Our mid-August snapshot showed that if the election were held at that time, the PAP would win 70.0 per cent of the vote share (±2.6 per cent margin of error). In our internal discussions then, we expected a substantial change due to a month of intense political activity, election campaigning, and social media actions between our poll and the general elections. However, the mid-September election results showed that the PAP won with 69.9 per cent of the vote share.
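The quoted margin of error can be sanity-checked with the textbook formula for a proportion. A small Python sketch, assuming simple random sampling (the published ±2.6 per cent is slightly wider than this, which would be consistent with a design effect from the demographic weighting):

```python
import math

n = 1378   # survey sample size (from the article)
p = 0.70   # PAP vote share in the mid-August snapshot
z = 1.96   # z-score for a 95 per cent confidence level

# Standard margin of error for a proportion under simple random sampling.
moe = z * math.sqrt(p * (1 - p) / n)
print(f"±{moe * 100:.1f} percentage points")  # prints ±2.4 under SRS
```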
We believe there is still more opportunity for further research on why electoral campaigning was not as impactful as we expected. Had most voters already made up their mind by mid-August due to earlier groundwork? Did campaigning by all political actors create a net negligible effect? Was campaigning effective – yet handicapped by Singapore’s shorter electoral campaign season?
There are pros and cons for pre-election polls. On the one hand, we recognise their value in assisting citizens in understanding Singapore and helping collective decision making for the common good. On the other, we also note that pre-election polling has an observer effect, that is, the act of observation changes the behaviour of the electorate. The act of pre-election polling can also generate claims of the pollster being politically biased, depending on the findings of the poll.
QR: We would like to clarify the timing of our research. According to the Parliamentary Elections Act, no person is allowed to publish exit polls between the time that the writ for an election is called and the close of Polling Day. We ran the fieldwork for our survey between August 5 and 17, 2015, and published findings on national-level beliefs on August 21, 2015. The Writ of Election was issued on August 25, 2015. Polling was restricted till Polling Day on September 11, 2015. We published our survey results on party vote share and voter rationales on September 15, 2015, four days after Polling Day.
TMG: How did the team design the survey, and how did you choose the seven reasons which may influence how a voter decides?
QR: We first benchmarked surveys globally. We knew that the results would be more insightful if they were internationally comparable. We were particularly interested in similar surveys done in the United Kingdom, as they had a similar parliamentary system, rather than the United States’ electoral system. We then picked two pollsters (Lord Ashcroft Polls and Ipsos Mori) and extracted relevant and nonpartisan questions.
We then localised the survey through testing and iteration. Generally, we preserved the questions as originally presented and tested them with peers experienced in community discourse. Based on their feedback on the questions’ relevance and understandability, we tweaked the questions accordingly.
For example, we extracted a question on how voters of the United Kingdom chose their vote from Lord Ashcroft Polls. Voters could choose the option “I voted tactically to stop another party from winning”. We received feedback that tactical voting was uncommon in Singapore, yet party discontent was a relevant rationale. We thus rewrote the option as “I would vote to register my discontent with another party”.
By design, our findings can be used as a supplement to prior socio-political polling by IPS and Blackbox. IPS ran post-election surveys in 2006 and 2011, and will be releasing their 2015 report in due course. Issues they measured for GE2011 included perceptions of efficient government, the need for checks and balances, and the cost of living. Furthermore, Blackbox measured approval for government policies prior to GE2015.
TMG: Why did Quad Research use a ranking approach – for respondents to rank the reasons which guide their voting preferences – as opposed to Likert scales for example?
QR: Using the same example as above, the original question in the United Kingdom requested an implicit ranking by asking: “Which three [reasons] were the most important in your [voting] decision?” As Quad Research sought to distil the electorate’s most important reasons for how they would choose to vote, we asked the respondents to rank the reasons in descending priority.
For each respondent, the order of the options presented was randomised to avoid bias.
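Per-respondent randomisation of option order is straightforward to implement. A minimal Python sketch (the reason wordings below are paraphrased from the article, not the actual questionnaire text; the seed parameter exists only so a shuffle can be reproduced for testing):

```python
import random

# Illustrative subset of the survey's reasons, paraphrased from the article.
reasons = [
    "Best local candidate regardless of party",
    "Trust in party motives and values",
    "Most competent government based on senior party members",
]

def presentation_order(options, seed=None):
    """Return a per-respondent shuffled copy of the options, so no option
    benefits from a fixed position in the list (primacy/recency effects)."""
    rng = random.Random(seed)
    order = list(options)
    rng.shuffle(order)
    return order

print(presentation_order(reasons))
```

In production each respondent would get a fresh, unseeded shuffle; the rankings are then mapped back to the canonical option list before analysis.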
The ranking approach encourages respondents to reflect and thoughtfully weigh the relative importance of each reason. A survey respondent using Likert scales (that is, a rating scale of one to seven, where one is “not at all important” and seven is “very important”) for their voting reasons could potentially give the same number to all the reasons, which would limit possible insights. However, we understand that there are trade-offs in every survey design methodology and encourage further research to test the intensity of these voting reasons.
TMG: The main comparison is between voters of the PAP and the opposition parties. Did the team identify significant differences between voters of the opposition parties? Do you have insights on intra-Opposition differences?
QR: For our national insights where we had to compare respondents by party affiliation, we chose to consolidate voters for the opposition parties as “Opposition” in order to further improve statistical rigour. As the smaller opposition parties had proportionally fewer respondents who would have chosen them, their sample sizes were too small to see meaningful differences between opposition parties.
We welcome feedback on how we can improve this survey methodology moving forward.
TMG: Based on your overall findings for the survey, what is the most important advice – to prepare for the next general elections – the team would give the PAP? And relatedly, what is the most important advice for the opposition parties?
QR: Our advice to all socio-political actors: measure, improve, and share. Better data-driven insights that illuminate community issues for all sides of the debate are positive for our country. Singapore is a beautiful home for each and every one of us.
TMG: In all, what do you think is the greatest strength of the survey thus far? Conversely, what is the greatest weakness?
QR: We believe that the greatest strength of the survey is the groundswell of support from Singaporeans for more non-partisan polling. We received thoughtful and positive comments about how our research helped add a data-driven perspective to the public discourse on the election results. We also received constructive feedback on how to improve, as well as offers to join us in our work. This was heartening for the team and strengthened our commitment to reflecting Singapore.
We would have loved to repeat the survey after the elections to see whether voter preferences and rationales had changed over the course of election campaigning. Unfortunately, we were unable to do so due to the costs. We hope to explore this for future runs.
TMG: What has been the toughest challenge for Quad Research?
QR: Our drive to continue polling in Singapore has one major obstacle to overcome – financial sustainability. Among the general public, there is a common misconception that socio-political polling is cheap. Many people use online tools such as Google Forms to run informal surveys. However, rigorous socio-political polling requires thoughtful survey design, the assembling of a commercial-grade nationally representative panel and data processing. These high financial and manpower costs are normally borne by the government, profitable commercial polling companies or institutions of higher learning.
TMG: As a team, where does Quad Research see itself in five years?
QR: In July, the four of us agreed to take on the financial risk together. We pooled our funds as private individuals, with the understanding that we may never recoup the costs. These funds covered substantial survey and set-up costs. Even so, we committed as a team to publish all statistically rigorous findings, regardless of whether we were able to recover the costs and how many man-hours it would take us.
Today, we are fulfilling our commitment. As volunteers, we are taking time out from our day commitments to push these results out slowly but rigorously. We have put the data up for sale in the hope of at least breaking even and covering our costs. We emphasise that the data contains zero personal identifiers of individual respondents. Any profits from sales will be set aside for future research. We are also exploring more community-driven models and welcome any feedback that you have.