Research Publications

We consider it our duty to inform you fully about the advantages as well as the disadvantages of online fieldwork. That is why we co-operate closely with independent partners who monitor the use of our online panel (e.g. data representativeness, data integrity and response rates).

The following white papers are available:

This paper describes the set-up and results of a parallel study conducted by Research International, in which the quality of Online fieldwork was compared to telephone interviewing (CATI) and CAPI (Computer-Assisted Personal Interviewing). The three methods of data collection were compared on various quality aspects, such as data representativeness, integrity and (non)response. The study shows that the quality of the data collected by Online fieldwork is often higher than that of the data collected by CATI and CAPI.

This paper presents the results of a study conducted by the Eindhoven University of Technology, in which the response rates to various email invitations were analysed. Are people more willing to participate if they receive the survey results afterwards, or does that discourage them? Does the value of an incentive affect response rates? Does a pre-survey description of the study - including who is conducting it and why, and the importance of participation - increase response rates? The study found that response rates are significantly lower when a pre-survey description is included, that the option to receive the survey results afterwards does not affect response rates, and that the value of an incentive affects response rates only in part.

In this study we compare the results of social-science laboratory experiments with comparable Online fieldwork, focusing on the decisions participants take in so-called trust games. Our study shows that the online results are largely comparable to the lab results.

In this study we analyse response quality in relation to questionnaire layout. The following layout variations were applied: closed questions (drop-down lists vs. radio buttons), scale questions (radio buttons vs. sliders) and multiple-choice questions (the use of several answer rows). For inexperienced respondents, response quality appears to depend on whether radio buttons or drop-down lists are used. For scale questions, the results showed significant differences between radio buttons and slider controls. The column in which an answer is displayed also appears to affect response quality.

In this study we identify factors that lead respondents to abandon a survey, focusing in particular on the use of progress bars. We analysed the effects of five different ways of communicating progress to respondents: (1) no progress bar, (2) a header or footer containing "page X out of Y" on each page, (3) a standard progress bar, (4) a progressive progress bar, and (5) a degressive progress bar. The main conclusion of the study is that showing a progress bar is of little use.
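
The difference between conditions (3) to (5) can be illustrated with a small sketch. The interpretation below - a progressive bar that advances quickly at first and a degressive bar that advances slowly at first - is our own reading of those terms, and the function and exponent values are illustrative assumptions rather than parameters taken from the study.

    # Illustrative sketch only: maps a respondent's actual completion to
    # the progress shown on screen. The exponents are assumed values
    # chosen to illustrate the idea, not figures from the study.

    def displayed_progress(page: int, total_pages: int, style: str) -> float:
        """Return the progress fraction shown to the respondent."""
        actual = page / total_pages
        if style == "standard":
            return actual           # shown progress equals actual progress
        if style == "progressive":
            return actual ** 0.5    # assumed: fast at first, slower towards the end
        if style == "degressive":
            return actual ** 2.0    # assumed: slow at first, faster towards the end
        raise ValueError(f"unknown style: {style!r}")

    # Example: what a respondent sees halfway through a 20-page survey.
    for style in ("standard", "progressive", "degressive"):
        print(style, f"{displayed_progress(10, 20, style):.0%}")

Halfway through, the standard bar shows 50%, the progressive bar about 71%, and the degressive bar 25%, which is the kind of contrast the study's conditions were designed to test.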

Online data collection has become a widely used method in market and policy research. It is therefore interesting to find out whether and how the attractiveness of a survey affects future response. This white paper addresses the following key question: does the attractiveness of a survey increase future response rates? The study shows a significant correlation between a positive evaluation of a survey's attractiveness and future response rates.

In 2006 the Netherlands had more than 25 Online fieldwork panels. Not only research agencies, but also large organisations and municipalities are discovering the possibilities of Online fieldwork and building their own panels in order to measure citizens’ opinions. This study examines the differences and similarities between the results from private panels and access panels: does self-selection bias cause undesired panel effects because respondents may apply for a private panel or an access panel out of very specific motivations?