Response in online surveys

Chris Snijders and Uwe Matzat (Technical University Eindhoven), Bart Pluis (PanelClix) and Wiggert de Haan (Isiz)



Online data collection is increasingly used in market and policy research. This study looked at the response to online surveys for various invitations people receive (by e-mail). Are people more willing to participate if they are sent the most important results of the survey? Or does that actually deter people? Is the size of the reward for participating a decisive factor for a higher response? Does information about the survey, such as the reason for the survey, who carries it out and why participation is important, yield a higher response? The results of this study are surprising, to say the least. The response is statistically significantly lower if people are given information about the background of the survey before they start. The opportunity to be kept informed about the results has no effect on the response rate. Varying the size of the reward for participation has only a limited effect on the response rate.


1 - Introduction


A high response rate is important for (online) surveys. If the response rate is too low, the survey yields less information, or more selective information, about the population, which can threaten its representativeness and quality. A high response rate, by contrast, is a sign of commitment, improving the integrity and representativeness of the data. But why do people participate in (online) surveys? In previous surveys, respondents indicated they participated because they were interested in the subject, for the reward, to influence decisions, because they like doing it, or because they see it as their duty. But is there a difference between what people say and what they do?

In this document, the results are presented of a study carried out by the Technical University of Eindhoven. For this survey, members of the PanelClix panel were approached. The web questionnaire was programmed by Isiz. The study investigated whether the response to online surveys depends on the type of invitation people receive. The following aspects differed between the various invitations:

  • Announcement that the respondents will get feedback about the results
  • Justification of why the survey is carried out and by whom, and why it is important to take part
  • The size of the reward for participation.

The results show that the number of completes from the PanelClix panel is significantly higher (by around 6 percentage points) when no information is given about the background of the survey, such as who carries out the survey and why it is being done. The size of the reward has a limited effect. With a low reward, an increase in the reward results in a statistically significant increase in the response rate of around 7 percentage points. A further increase in the reward no longer has an effect on the response rate, and actually shows a slight decrease (not statistically significant). The promise of feedback of results does not encourage people to participate in the survey. We can conclude that there are differences between what respondents say about their reasons to participate and the behaviour they actually show in an online survey.


2 - The study


2.1 The invitations

Twelve different invitations were sent out to a random group of people from the PanelClix panel. The panel functions as a type of sample group. The following elements of the invitations were varied:

  • Neutral (short and simple) or extensive (with justification) invitation
  • Feedback of the results versus no feedback
  • The size of the reward for participation: 50, 100 or 150 points. The points are part of the EuroClix savings program. The values of 50, 100 and 150 points correspond to 0.50, 1 and 1.50 Euros respectively.

The most important differences in the invitation are shown in appendix 1. In total, there were 12 variants (2 types of texts x 2 types of feedback x 3 steps in the reward), which are also shown in appendix 2. The numbers for the different invitations, with a schematic description of the various variants, are given in the following table:

Code  Number of invitations sent  Neutral or extensive  Feedback  Reward (in Clix)
A1    603                         Neutral               No        50
A2    598                         Neutral               No        100
A3    603                         Neutral               No        150
A4    601                         Neutral               Yes       50
A5    601                         Neutral               Yes       100
A6    601                         Neutral               Yes       150
B1    602                         Extensive             No        50
B2    604                         Extensive             No        100
B3    602                         Extensive             No        150
B4    604                         Extensive             Yes       50
B5    601                         Extensive             Yes       100
B6    602                         Extensive             Yes       150
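The 12 variants form a full 2 x 2 x 3 factorial design. As a sketch (the variant codes follow the table above; the code itself is an illustration, not part of the original study), the design can be enumerated as follows:

```python
from itertools import product

# Reconstructing the 2 x 2 x 3 factorial design of the 12 invitation
# variants (codes A1-A6 and B1-B6, as in the table above).
texts = {"A": "Neutral", "B": "Extensive"}
feedback = ["No", "Yes"]
rewards = [50, 100, 150]  # reward in Clix points; 100 points = 1 Euro

variants = {}
for letter, text in texts.items():
    # Within each text type, variants 1-3 have no feedback and 4-6 do.
    for i, (fb, reward) in enumerate(product(feedback, rewards), start=1):
        variants[f"{letter}{i}"] = (text, fb, reward)

print(variants["A4"])  # ('Neutral', 'Yes', 50)
print(variants["B3"])  # ('Extensive', 'No', 150)
```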

The field work was carried out in the periods 7-14 July 2005 and 4-11 August 2005. On 7 July, a first batch of around 1,200 invitations was sent (equally divided over the 12 groups). On 4 August, a second batch of around 7,000 invitations was sent. All the invitations were sent out to the 12 random groups at the same time on 7 July and 4 August. Reminders were not sent. After 14 July and 11 August, it was no longer possible to respond and take part in the survey; later reactions were not included in the response.

The respondents for the 12 groups were selected at random from the PanelClix population file, to ensure the groups were (nearly) identical with regard to gender, age, region and education level.

The web questionnaire respondents had to fill in (duration around 20 minutes) was also identical for the various groups, with the exception of a few conditions which varied per questionnaire. These conditions were also randomly assigned per respondent, so any variation in response can only be traced back to the variation in the invitations sent.


2.2 Different response levels

If people wanted to participate in the online survey, they first had to click a link in the invitation e-mail. After clicking this link, respondents had to complete a web questionnaire.

A response analysis was carried out into the different invitations on the following levels:


  • Click or not? The percentage of people who click on the link in the invitation.
  • Complete or not? The percentage of respondents who, after clicking on the link, have completed the questionnaire.

Based on the combination of these two response analyses, it was possible to determine, for each invitation type, the percentage of respondents completing the questionnaire.

For all differences in response rates per invitation, it was then determined whether the difference was statistically significant. Fisher's exact test, at the 5% level, was used for the direct comparison of response rates. In addition, a multivariate test based on logistic regression analysis was carried out, taking into account that the various characteristics of the invitation can each have a separate effect on the response rate and can influence each other; this was also tested at the 5% level. Apart from one exception, which is mentioned in the text, the multivariate analysis did not give any insights beyond the simple comparison of percentages, so that comparison suffices.
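As an illustration of the kind of pairwise test involved: the study used Fisher's exact test, while the sketch below uses the closely related two-proportion z-test (a good normal approximation at these sample sizes). The cell counts are reconstructed from the reported totals and percentages, not the original raw data.

```python
from statistics import NormalDist

# Click rates: neutral invitations (53% of 3,607 sent) versus extensive
# invitations (42% of 3,615 sent). Counts below are approximations
# reconstructed from the rounded percentages in the report.
n1, n2 = 3607, 3615
x1, x2 = round(0.53 * n1), round(0.42 * n2)

p1, p2 = x1 / n1, x2 / n2
p_pool = (x1 + x2) / (n1 + n2)           # pooled proportion under H0
se = (p_pool * (1 - p_pool) * (1 / n1 + 1 / n2)) ** 0.5
z = (p1 - p2) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value

print(f"z = {z:.1f}, p = {p_value:.3g}")  # well below the 5% level
```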


3 - Results


The different response levels for the various invitation variants are shown in the following table.

Type of invitation            Click or not  Complete or not  From invitation to complete
Neutral (short and simple)    53%           80%              42%
Extensive with justification  42%           85%              36%
No feedback of results        46%           83%              38%
Feedback of results           47%           83%              39%
50 point reward               44%           80%              35%
100 point reward              50%           83%              42%
150 point reward              47%           84%              40%
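The last column is simply the product of the first two: for example, 53% x 80% gives roughly 42%. A minimal check for three rows of the table (the published percentages are themselves rounded, so the products for the other rows can be off by one point):

```python
# Conversion from invitation to complete = click rate x completion rate.
# Rates are taken from the table above (rounded, published values).
rates = {
    "Neutral (short and simple)":   (0.53, 0.80),
    "Extensive with justification": (0.42, 0.85),
    "50 point reward":              (0.44, 0.80),
}

conversion = {name: round(click * complete * 100)
              for name, (click, complete) in rates.items()}
print(conversion)
```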

In the following paragraphs, these results are further explained.


3.1 Neutral (short and simple) or extensive (with justification) invitation

Click or not?
In the case of the neutral invitation without a detailed explanation of the survey (invitations A1-A6), 53% of people clicked on the link. For the extensive invitation, including the TU Eindhoven logo (invitations B1-B6), this was 42%. This is a statistically significant difference.

Complete or not?
Of the group of people who received a simple invitation without explanation (invitations A1-A6), 80% of the respondents who started the questionnaire also completed it. Of the people who clicked on the link in the invitation with explanation and the TU Eindhoven logo (invitations B1-B6), 85% completed the web questionnaire. This is a statistically significant difference. Of those who start, more people complete the web questionnaire if they received an extensive invitation explaining why the survey is carried out and by whom, and why it is important to participate.

From invitation to completed web questionnaire?
Combining the two response analyses shows that, of all invitations sent, those with a neutral, simple text yielded a statistically significantly higher response (42%) than the extensive invitations (36%). The higher response rate for the simple invitation is mainly caused by relatively more people clicking on the link in the invitation. The assumption that giving more background information about the survey increases its relevance for respondents, and thus improves the response rate, appears to be untrue. In fact, the opposite effect was observed.

A more detailed analysis also looked at the experience of the respondents. The figure below shows the number of times respondents filled in questionnaires for PanelClix in the previous 12 months. This distribution relates to the ca. 7,000 panel members who were invited for this survey and is representative of the entire PanelClix database. A bit more than half (around 55%) of the members participated in 2 or fewer online surveys in the previous 12 months.

[Figure: number of questionnaires filled in for PanelClix in the previous 12 months]

Those members who participated 3 times or more in the previous 12 months have a 62% chance of completing the questionnaire, while this is only 13% for those who participated twice or less. We also see that an extensive explanation has a negative effect on the more experienced respondents: their percentage of completes is 5 percentage points lower if an extensive justification is given. For less experienced respondents, the effect is the other way around: they are more willing to complete the questionnaire (a difference of 3 percentage points). Because the response rate for more experienced respondents is so much higher, sending out an extensive invitation has a negative 'net' result.
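This 'net' result can be illustrated by weighting the two subgroup effects by the subgroup shares. The 55%/45% split is taken from the text; the sketch below illustrates the reasoning and is not the paper's exact decomposition.

```python
# Why the extensive invitation hurts overall even though it helps the
# inexperienced subgroup: the -5 point effect among experienced members
# outweighs the +3 point effect among inexperienced members.
share_inexp, share_exp = 0.55, 0.45      # subgroup shares, from the text
effect_inexp, effect_exp = +0.03, -0.05  # change in complete rate per subgroup

net = share_inexp * effect_inexp + share_exp * effect_exp
print(f"net effect: {net:+.3f}")  # negative overall
```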


3.2 Feedback of the results versus no feedback

Click or not?
46% of people clicked on the invitation which did not offer the possibility to receive feedback about the results (invitations A1-A3 and B1-B3). For the invitation which stated that people could receive feedback about the results (invitations A4-A6 and B4-B6), this was 47%. This is not a statistically significant difference.

Complete or not?
Independent of whether the possibility of feedback about the results is offered, 83% of respondents who start the questionnaire also complete it.

From invitation to completed web questionnaire?
The combination of the two response levels shows that 38% of all invitations sent without the offer of feedback were converted into completed web questionnaires. Of all the invitations sent with the option of receiving feedback of the results, 39% were converted into completes. This is not a statistically significant difference. We can conclude that it does not matter whether people are given the option to receive feedback about the results of the questionnaire.

Again, when we look at this in more detail, we do see some differences depending on the experience of the respondent. The more experienced respondents (3 or more questionnaires in the previous 12 months) are sensitive to the offer of feedback (an effect of 4 percentage points), while it does not matter for inexperienced respondents.


3.3 Size of the reward

Click or not?
44% of people clicked on the link in the invitation e-mail (invitations A1, A4, B1 and B4) promising a reward of 50 Clix (equivalent to 0.50 Euros) for filling in the questionnaire (of around 20 minutes). This was 50% for the invitations (A2, A5, B2 and B5) offering a reward of 100 Clix (equivalent to 1 Euro). 47% of people clicked on the link in the invitation e-mail (A3, A6, B3 and B6) with the highest reward of 150 Clix (equivalent to 1.50 Euros). There is a statistically significant difference between the rewards of 0.50 and 1 Euro, but not between 0.50 and 1.50 Euros. The difference between 1 and 1.50 Euros, where the click percentage is actually lower, is marginally statistically significant.

Complete or not?
Of the respondents who were offered 50 Clix (0.50 Euro) for participating and who clicked on the link in the invitation, 80% completed the questionnaire. Of the respondents who were offered 100 Clix (1 Euro), 83% completed the questionnaire. For the highest reward of 150 Clix (1.50 Euros), 84% of respondents who clicked on the link completed the questionnaire. There is a significant difference between the group offered 0.50 Euro and the groups offered 1 or 1.50 Euros.

From invitation to completed web questionnaire?
If the click and completion rates are combined, there is a statistically significant difference in response rates between the 50-point reward and the 100- and 150-point rewards. For the lowest reward of 50 points (0.50 Euro for 20 minutes), the conversion from invitation to complete is 35%. For the two higher rewards of 1 and 1.50 Euros, this is 42% and 40% respectively. The response rate improves from 35% to 42% if the reward is increased from 50 points (0.50 Euro) to 100 points (1 Euro). However, the same does not happen if the reward is increased even further (to 1.50 Euros); the response rate actually drops back from 42% to 40%, although this difference is not statistically significant. This suggests that the assumption that response can simply be bought is incorrect. People are initially encouraged by a relatively modest reward, in this case an effect of 7 percentage points, but the percentage of completes does not improve any further if the reward becomes even higher.

Again, it is worth splitting up the respondents according to their experience with completing (PanelClix) questionnaires. The less experienced respondents are the least sensitive to the number of Clix offered (12, 15 and 13% generated completes for 50, 100 and 150 Clix). The more experienced respondents are more sensitive to the number of Clix offered (53, 67 and 68%), although raising the reward to 150 Clix has little extra effect.


4 - Conclusion and discussion


We can draw several surprising conclusions based on the results of this study.

  • Giving an extensive justification of the survey in the invitation lowers the response rate by around 6 percentage points, mainly because fewer people click on the link in such an invitation.
  • Sending respondents feedback about the results has no overall effect on the response rate. The hypothesis that respondents are more involved if they get feedback about the results, or that respondents participate because they are interested in the results, can be rejected if we take the entire PanelClix panel as our sample group. For more experienced respondents, however, there is a positive effect when this option is offered.
  • The size of the reward for participating in the survey has only a limited impact on the response rate. The moderate reward (1 Euro for 20 minutes) yields the highest response rate (42%), and the response rate actually drops slightly if the reward is increased further. The results can be interpreted in different ways. The most direct explanation is that being rewarded for participating in an online survey is not the dominant reason for people to participate. A second explanation could be that offering a lot of Clix leads people to assume the questionnaire will be long, which can make them less willing to participate.

Finally, we have to stress that the results and conclusions presented in this document cannot simply be applied to every type of online survey. There can be differences per panel, for instance in the frequency with which members participate in online surveys, as we saw in some of the possible explanations above. PanelClix has a panel of more than 200,000 members, of whom roughly half participate in online surveys more than 3 times per year. This can be totally different for other panels.

Nor can we ignore that for other types of surveys (e.g. regional surveys or surveys about sensitive subjects) the effect of an extensive justification of why the survey is carried out can differ from what we found in this general survey.

This white paper is part of a series of white papers, which can be read online at Research publications.

5 - Appendices


Appendix 1: Explanation of differences in the invitation.

Neutral (short and simple) or extensive (with justification) invitation.
Difference between the neutral (short and simple) invitation (left) and the extensive invitation (right) with an explanation of who is carrying out the survey (with the logo), why the survey is done and why it is important to take part. For both invitations, respondents receive 50 points (0.50 Euro) for participating.

[Image: neutral versus extensive invitation]

Feedback of the results versus no feedback.
Difference between a neutral invitation without feedback (left) or with feedback (right) about the results. For both invitations, respondents receive 50 points (0.50 Euro) for participating.

[Image: invitation without versus with feedback of the results]

The size of the reward for participation.

Difference between a neutral invitation with a 50-point (left) and a 100-point (right) reward for participating. The points are part of a savings program. The values of 50 and 100 points correspond to 0.50 and 1 Euro respectively.

[Image: invitation with a 50-point versus a 100-point reward]


Appendix 2: Images of all 12 different invitations.

[Images: the 12 invitation variants A1-A6 and B1-B6]