Survey Methodology Bulletin

No. 53

January 2004

Contents

Incentive Payments on Social Surveys: a Literature Review
Eleanor Simmons & Amanda Wilmot, page 1

The Business Survey Front Page - First Impressions
Sarah-Jane Williams & Stephanie Draper, page 12

Focus Group Recruitment of Business Survey Respondents
William Davies & James Rushbrooke, page 23

Measuring Employment in the Annual Business Inquiry
Steven Marsh, page 31

Forthcoming Conferences, Seminars and Courses, page 40

The National Statistics Methodology Series, page 43

The National Statistics Quality Review Programme, page 45

The Survey Methodology Bulletin is produced primarily to inform staff in the Office for National Statistics (ONS) and the Government Statistical Service (GSS) about the work on survey methodology carried out by the ONS. It is produced by the ONS, and staff in the ONS are encouraged to write short articles about methodological projects or issues of general interest. Articles in the bulletin are not professionally refereed, as this would considerably increase the time and effort needed to produce the bulletin: they are working papers and should be viewed as such. The bulletin is published twice a year, usually in January and July, and is free to staff within the ONS and the GSS. It is made available to others with an interest in survey methodology at a small cost to cover production and postage.

The Office for National Statistics works in partnership with others in the Government Statistical Service to provide Parliament, Government and the wider community with the statistical information, analysis and advice needed to improve decision-making, stimulate research and inform debate. It also registers key life events. It aims to provide an authoritative and impartial picture of society and a window on the work and performance of government, allowing the impact of government policies and actions to be assessed.

Edited by: Jacqui Jones & Abigail Dewar
Prepared by: William Davies

Incentive Payments on Social Surveys: a Literature Review

Eleanor Simmons & Amanda Wilmot

1. Introduction

In recent years, response rates to social surveys, where participation is voluntary, have fallen. Survey organisations have therefore increased their efforts to gain public co-operation by a variety of means, including providing respondent incentives. Incentives may be given before the survey takes place, regardless of participation, or retained as a reward for those who complete the survey. They may take the form of a monetary or non-monetary gift.

Currently, several surveys conducted by the Office for National Statistics (ONS) offer incentives to encourage response and to demonstrate to respondents that their time is appreciated. The Omnibus Survey, Family Resources Survey, Expenditure and Food Survey and General Household Survey send out books of six first-class postage stamps with the advance letter to all sampled addresses. The Expenditure and Food Survey also gives a pen for respondents to use when completing the diary and pays £10 to each adult, and £5 to 7-15 year olds, on completion of the diary.

This paper is a continuation of a previous ONS Survey Methodology paper (Dodd, 1998). It summarises the main findings from ongoing research on incentives and provides information for decision-makers considering whether the use of incentives would be appropriate. Thinking on this subject has broadened in recent years, particularly with respect to the effects on response bias, data quality, cost and interviewer confidence. The use of differential incentive payments, and the possibility that their use could create an environment where payment is expected for survey participation, are also considered.

2. Incentives and overall response rates

The general finding from the literature is that the use of incentives, however small in monetary terms, is effective in increasing response rates in postal, telephone and face-to-face surveys. This seems to be the case for all types of survey, not just those that place a high burden on the respondent, and it also appears to hold for panel surveys.

2.1 Conditional v unconditional

The prevailing opinion is that an unconditional, pre-paid incentive is more effective than the conditional promise of a reward on completion (for example, Church 1993; Hopkins & Gullickson 1992; Goyder 1994). This holds both for postal surveys and for interviewer-administered surveys (face-to-face or telephone).

SMB 53 1/04


In a meta-analysis of the effects of unconditional and conditional incentives in postal surveys, Church (1993) found that, compared with the 'no incentive' group, response rates increased by an average of 19.1 percentage points when incentives were unconditional, while conditional incentives produced an average increase of 4.5 percentage points. A meta-analysis of the effects of incentives in face-to-face and telephone surveys conducted by Singer et al (1999) verifies this finding for interviewer-administered surveys.

In 2000, the ONS conducted split-sample experiments with pre-paid unconditional incentives on the Family Resources Survey. A book of postage stamps was sent to half the sample along with the advance letter. Response to the survey was 67% among the group not receiving the stamps, compared with 70.4% among the group that did (McConaghy and Beerten, 2003).

Qualitative research into the effect of incentives was conducted within a follow-up to the Expenditure and Food Survey (Betts et al, 1999, unpublished). As part of a series of in-depth interviews, respondents were asked for their reaction to the postage stamps. Their feedback was generally favourable, for example:

“…I think it made me read the letter, cos, you receive a lot of junk mail in the post and I thought ‘Oh I’m getting free stamps here, lets read the letter’.”

“…stamps are useful. I thought it was very nice to get the stamps in the letter. It was encouraging to start with. Did make me think there is more to this than meets the eye.”
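The FRS split-sample result above (67% without stamps, 70.4% with) can be assessed with a standard two-proportion z-test. The sketch below is illustrative only: the arm sizes of roughly 2,000 addresses each are assumed, not taken from the source, and this generic textbook test is not necessarily the analysis McConaghy and Beerten actually ran.

```python
from math import sqrt

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Z statistic for the difference between two sample proportions,
    using the pooled estimate for the standard error."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical arm sizes of 2,000 each; 70.4% vs 67.0% response.
z = two_proportion_z(1408, 2000, 1340, 2000)
print(round(z, 2))  # 2.32: beyond the 1.96 two-sided 5% threshold
```

With arms of this size, a 3.4 percentage point gap is comfortably significant; with much smaller arms the same gap might not be.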

The ONS uses the Postcode Address File as a sampling frame for most household surveys. This makes sending an incentive with an advance letter problematic, as approximately 10% of addresses on the file are ineligible[1]. Assuming that the incentive is not retrieved, sending it to ineligible addresses may not be seen as a good use of the client's or public money. Additionally, at multi-household addresses, or for surveys where only one person in the household is interviewed, it is impossible to indicate in the advance letter who the incentive is intended for.

One way around this might be to ask the interviewer to give the payment directly to the respondent on first contact, irrespective of participation. Mack et al (1998) reported on the effect of paying a $20 incentive at wave one of the U.S. Survey of Income and Program Participation. Interviewers gave vouchers at the doorstep immediately after verifying the address, to non-interviewed as well as interviewed households. When this incentive was given, non-response rates remained significantly lower (3 percentage points) than in the 'no incentive' group by wave six, even though no further incentive payments were made between waves.

[1] Institutions, non-residential addresses and residential addresses not classified as main addresses (e.g. second homes).


Conditional incentives tend to be used in surveys that are more burdensome for respondents, e.g. those involving diary-keeping. These payments tend to be of a higher monetary value than unconditional payments made in advance of the interview. In an experiment carried out by the National Centre for Social Research (NatCen)[2], £10 was offered as a reward for completing both components of a time use survey (an interview and a diary). Completion rates for the diaries were significantly higher in households where the incentive was promised than in households where it was not (Lynn and Sturgis, 1997).

NatCen has also experimented with promissory notes, given by the interviewer directly to the respondent for completing the British Social Attitudes Survey. A promissory note is essentially a contract promising the recipient that the money will be sent to them. This method improved response rates from 56% to 63.3% when a £5 promissory note was used (Lynn et al, 1998).

2.2 Monetary v non-monetary

As well as monetary incentives paid directly to respondents, survey organisations have offered other types of incentive, including donations to charities made on behalf of respondents, lottery draws, and gifts such as pens or fridge magnets. Both monetary and non-monetary incentives have been found to increase response rates. However, most research points to monetary incentives paid directly to respondents as being more effective, even controlling for the value of the incentive (Singer et al, 1999).

Charitable donations: the literature on the effectiveness of promising to make a charitable donation on behalf of the respondent is inconsistent. Some studies have found this type of incentive to have no beneficial effect on response (Warriner et al, 1996; Tzamourani, 2000). Conversely, Robertson and Bellenger (1978) found that a higher response rate was achieved with the promise of a charitable donation than with a monetary incentive paid directly to the respondent. In these experiments the amount offered was very low (one or two pounds); it would be interesting to experiment with higher amounts.

Lottery: research into the use of lottery draws as an incentive has tended to focus on postal surveys. Again, the literature on their effectiveness is inconsistent. Warriner et al (1996) concluded that entering respondents into a prize draw failed to increase response rates, whereas Harkness and Mohler (1998) found that this type of incentive improved response rates. However, the authors noted that the depressed German economic climate at the time of the experiment might have acted as a motivating influence.

Non-monetary gifts: again, much of the work in this area has focussed on postal surveys, and it is generally accepted that non-monetary gifts are effective in improving response rates. Willimack et al (1995) found that giving a pen as an unconditional incentive for an interviewer-administered survey increased response by 3.7 percentage points over the group not receiving a pen. In 2001, the Family Resources Survey carried out an experiment in which half of the sample was sent a book of postage stamps and the other half a 'UK in Figures' booklet[3]. The group that received stamps showed significantly better co-operation rates than the group that received the booklet (71.9% compared with 66.7%) (McConaghy and Beerten, 2003).

[2] Formerly Social and Community Planning Research (SCPR).

The effectiveness of different types of incentive is likely to depend on the target group and nature of the survey.

2.3 Value of incentive

Some studies have shown a positive linear relationship between the value of the incentive offered and the increase in response rates (Yu and Cooper, 1983; Hopkins and Gullickson, 1992). Other research has concluded that the relationship fits a model of diminishing returns (Armstrong, 1975; Fox et al, 1988).

A recent experiment conducted by NatCen tested the effect of offering a monetary incentive to every household member, conditional on full co-operation from the whole household, in the National Travel Survey. A split-sample experiment compared £10 and £5 incentives with a control group offered no incentive; in this experiment interviewers knew which group households belonged to. Although the incentives significantly improved response rates compared with the control (59.3% and 50.7% respectively), no statistically significant difference was found between the £10 and £5 groups (60.6% and 57.9% respectively) (Stratford et al, 2003).
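The diminishing-returns pattern can be seen directly in the National Travel Survey figures: the £5 incentive bought more response per pound than the £10 one. A trivial back-of-envelope calculation, using only the percentages reported by Stratford et al (2003):

```python
# Response rates from the National Travel Survey incentive experiment
# (Stratford et al, 2003): control 50.7%, £5 group 57.9%, £10 group 60.6%.
control, five_pound, ten_pound = 50.7, 57.9, 60.6

# Percentage-point gain per pound of incentive offered.
gain_per_pound_5 = round((five_pound - control) / 5, 2)
gain_per_pound_10 = round((ten_pound - control) / 10, 2)

print(gain_per_pound_5, gain_per_pound_10)  # 1.44 0.99
```

Each pound of the £5 incentive bought roughly 1.4 percentage points of response; doubling the incentive nearly halved the return per pound.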

2.4 Incentives and respondent burden

Although intuitively one might assume that incentives are especially useful in improving response rates where respondent burden is high, e.g. long interviews or diary-keeping, researchers have not generally been able to find a significant interaction between burden and incentive (Singer et al, 1999). However, Lynn et al (1997) experimented with offering an unconditional £5 incentive on later rounds of the British Election Panel Study. At wave 8 this small monetary incentive halved the non-response rate, from 7.9% to 3.7%. This indicates potential for further investigation into the use of incentives on panel surveys.

2.5 Differential incentives

One way of using incentives to best effect while keeping a check on cost is to pay only those who initially refuse to take part, in order to convert them. Alternatively, incentives can be offered only in areas where response rates are traditionally low.

[3] This booklet contains information about the statistics ONS provides and the surveys it carries out.


Few studies appear to have embraced this approach, possibly because of the underlying ethical issues. However, a U.S. longitudinal study on welfare reform found that the use of targeted incentives to reluctant respondents was helpful in levelling off the longitudinal sample loss (Ward et al, 2001).

In thinking about this approach, concerns have been raised that respondents who are not paid will perceive the use of incentives in this way as inequitable, which may have a negative effect on their attitude towards surveys and their future participation. Singer et al (1999) investigated these issues as part of an experiment on the U.S. Detroit Area Study. Respondents who agreed to be interviewed were split into four groups, those who:

i) did not receive an incentive;
ii) received $5 with an advance letter;
iii) received no incentive with an advance letter but were persuaded by a subsequent offer of $25;
iv) received $5 with an advance letter but were only persuaded by a follow-up offer of $25.

Three-quarters of all respondents considered the practice of paying differential incentives unfair. However, disclosure of differential payments, even among respondents who felt the practice to be unfair, had no significant effect on reported willingness to participate or on actual participation in future surveys.

2.6 Effect of incentives on interviewers

It has been argued that the positive effects of incentives on response may be attributable to their impact on interviewers rather than on potential respondents. Interviewers may expect those who have received an incentive to be more co-operative and may therefore be more confident in their approach. However, this factor does not appear to have a direct impact on response rates.

The sample for the U.S. Survey of Consumer Attitudes was randomly divided into three groups. The first group was sent an advance letter and $5, which interviewers were unaware of. The second group was sent an advance letter and $5, which interviewers were aware of. The third group was sent only the letter. A significant increase in response was shown in groups one and two, but no interviewer effect could be demonstrated (Singer, Van Hoewyk and Maher, 1998). This accords with other experiments in which interviewers were unaware of whether incentives had been paid and response rates nevertheless improved.

Stratford et al (2003) reported that National Travel Survey interviewers felt that the £10 incentive, rather than the £5, was essential for improving response in London, where response is notoriously difficult to obtain. However, this was not apparent in the results: no significant difference between the £10 and £5 incentives was found, even in London.

Lynn (2001) combined qualitative feedback from interviewers with quantitative data analysis to examine whether the perceptions of interviewers concurred with the survey results in relation to response. This study reported that respondent incentives had a significant effect on response rates without this being apparent to the interviewers; in fact the interviewers reported that the incentive had had effects that were either neutral or negative. Lynn concludes that the positive effect of the incentive on response rates was achieved via a direct impact on the respondents, and not via the interviewers as an intermediary.

Although there does not seem to be any direct interviewer effect, there is some anecdotal evidence to suggest that respondents are likely to remember receiving the advance letter if it was accompanied by an incentive. This may help interviewers during their initial approach. Some examples of comments from respondents to the 2001 Omnibus Survey (unpublished):

"was that the letter that had the stamps with it?"

"oh yes the letter with the 1st class stamps."

3. Incentives and data quality

Concern has been expressed that respondents who complete a survey solely to reap the benefit of the incentive may reduce the quality of the data collected by providing substandard responses. It is also feared that respondents may view the sponsor in a more favourable light, resulting in response bias towards the sponsor's perceived desires. There is little evidence to substantiate these fears. In fact, many studies point to incentives improving data quality in terms of greater response completeness, greater accuracy, reduced item non-response and more comments to open-ended questions (James and Bolstein, 1990; Brennan, 1992; Willimack et al, 1995). Shettle and Mooney (1999) confirm these findings. Furthermore, they found that individuals receiving an incentive were more co-operative in providing information needed to track their whereabouts for successive waves of the survey, an important consideration in longitudinal studies.

However, there is some indication that providing incentives may attract people with a fear of disclosing information on some sensitive issues, who perhaps would not have participated without the incentive. Tzamourani and Lynn (2000) reported greater item non-response among the incentive groups for sensitive questions on income and political affiliation, and lower item non-response for less sensitive questions. These respondents may have felt more obliged to answer the less sensitive questions, having refused the more sensitive ones.

3.1 Effect of incentives on sample composition

Incentives have repeatedly been found to increase co-operation rates among certain groups: low-income and low-education groups, larger households and households with dependent children, minority ethnic groups, and younger respondents. James (1996) found that respondents defined as being in poverty who received an incentive were 1.6 times more likely to co-operate than similar respondents who did not; for respondents classified as not in poverty, the incentive barely increased the odds of co-operating. Mack et al (1998) found that a $20 incentive was more effective in recruiting and retaining black households and households in poverty than non-black and non-poverty households in the U.S. Survey of Income and Program Participation. Analyses by Singer, Van Hoewyk and Maher (2000) found that a $5 incentive attracted a disproportionate number of low-education respondents. Stratford et al (2003) found that Black and Indian minority ethnic groups, people living in larger households, people living in households with dependent children (lone-parent households with dependent children in particular), people aged 0-20, and single people were more influenced by incentives on the National Travel Survey. There is also some evidence that incentives might appeal more to men than to women (Tzamourani and Lynn, 2000).

Incentives have also been found to have a greater motivational impact on those who thought the survey was of little salience or interest to them. An experiment carried out as a follow-up to the 1996 U.S. Detroit Area Study, which included questions on political and community involvement, demonstrated that the effect of incentives was smaller for those with high community involvement: a $5 incentive increased response by 15.9 percentage points among respondents with high community involvement, but by far more (41.9 percentage points) among respondents considered to have low community involvement (Groves et al, 2000). Similarly, in a U.S. survey of electrical utility customers, a $4 payment improved response by 2 percentage points among respondents who had volunteered for a special electric rate programme and by 15 percentage points among those who had not, compared with the 'no incentive' group (Baumgartner and Rathbun, 1996). Berlin et al (1992) found that respondents with higher scores on an assessment of adult literacy, who would be expected to have a greater interest in the National Adult Literacy Survey, were more likely to agree to participate without an incentive than those with lower scores.

There is concern that incentives could increase response bias, as their motivational effect is greater in some groups of the population than in others. However, it can be argued that, since the groups who are more motivated by incentives tend to be those usually under-represented in surveys, incentives can actually reduce response bias. For example, when incentives were used in the National Travel Survey 2002, they improved the sample composition compared with population figures derived from the 2001 Census.
The only exception was the overrepresentation of lone-parent households with dependent children (Stratford et al, 2003).

3.2 Changing respondent behaviour

Dodd (1998) suggested that pre-paid monetary incentives could cause people to change their behaviour. In particular, in diary surveys of consumption, respondents' spending power is increased by a pre-paid incentive. The extent of this problem will vary according to the size and nature of the incentive.

4. Cost of incentives

Opinion is currently divided on whether the cost of paying incentives is justified, and many sponsors of research are yet to be convinced. Incentives have been found to reduce the cost of administering a survey. In face-to-face surveys, this is due to the reduced number of visits to an address required to gain an interview, which reduces the time, travel and expenses incurred by interviewers. For postal surveys, fewer follow-up mailings are required, as are fewer calls in a telephone survey.

In an experiment carried out on the British Social Attitudes Survey, Lynn et al (1998) showed that the mean number of visits per eligible address was lower in the incentive groups than in the control group, and lower with a £5 incentive than with a £3 incentive. An experiment conducted on the 2001 U.S. National Household Survey on Drug Abuse compared the impact of $20 and $40 incentives on the total cost per interview with a $0 control group. For the $20 group the cost per interview was $9 less, and for the $40 group $7 less, than for the control. This was attributed to the reduced number of interviewer visits required to complete an interview. An experiment on the U.S. National Adult Literacy Survey showed that while a $20 incentive reduced the cost per interview by $11.45 compared with the 'no incentive' group, a $35 incentive reduced it by only $1.18 (Berlin et al, 1992).

It could be argued that the use of incentives might reduce the cost of data collection on a wider scale: with improved response come improved interviewer morale and staff retention, so recruiting and training costs may fall. This is an area for further investigation.
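The diminishing returns show up in cost terms too. The National Adult Literacy Survey savings quoted above are already net of the incentive itself, so the marginal effect of raising the incentive can be read off directly; a minimal sketch of that arithmetic, using only the figures reported by Berlin et al (1992):

```python
# Net cost-per-interview reductions (in $) versus the 'no incentive' group,
# as reported for the National Adult Literacy Survey (Berlin et al, 1992).
net_saving = {20: 11.45, 35: 1.18}

# Raising the incentive from $20 to $35 makes each interview dearer overall.
marginal = round(net_saving[35] - net_saving[20], 2)
print(marginal)  # -10.27: each interview costs $10.27 more at $35 than at $20
```

In other words, the larger incentive still beats no incentive, but the extra $15 per respondent more than wipes out the additional fieldwork savings it generates.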

5. Incentives and long-term expectation effect

There is concern that the increasing use of incentives may lead to an expectation among the general public that payment should be made for completing any survey. As a result, the sense of civic duty people feel towards participating in a survey, particularly a government-sponsored survey, could be diminished. It is feared not only that response to surveys not offering incentives will fall, but also that the quality of response from those who have received payment in the past will decline.

As part of an experiment using the U.S. Survey of Consumer Attitudes, it was found that respondents who had received an incentive in the past were more likely to agree with the statement 'People should get paid for doing surveys like this'. Despite this, these respondents were also more likely to complete a subsequent wave of the survey without payment. In terms of declining data quality, such respondents were no more likely to answer 'don't know' to a series of 18 key questions on the survey than those who did not receive an incentive (Singer et al, 1998).

While this is certainly a positive indication that the fears surrounding expectation effects may not be realised, the experiment addressed only one survey, with a gap of six months between the two waves. It would be interesting to see how these respondents would react to a request to complete an entirely unrelated survey. Furthermore, if more surveys were to offer incentives in future, the expectation of payment might become more apparent.

6. Conclusion

The literature provides some guidance on how incentives can be used most effectively to improve response and data quality, but there remain aspects where further investigation is required.

On the whole, unconditional, pre-paid incentives are more effective in improving response rates than those promised as a reward for participation. While both monetary and non-monetary incentives improve response, cash incentives paid directly to the respondent have a greater effect. When considering the amount to offer, one should bear in mind the model of diminishing returns: a higher monetary payment does not guarantee a higher response rate than a more conservative one. Indeed, a lower amount paid in advance may elicit a similar improvement in response to a higher amount conditional on completion.

More research is required into the effect of incentives on more burdensome surveys, although there is some evidence to suggest that survey attrition in longitudinal studies may be countered through the use of incentives. Incentive payments can enhance interviewer confidence, but this has not been demonstrated to have a direct effect on response. The use of incentive payments might improve data quality in terms of completeness and accuracy, and might be particularly important in maintaining contact with respondents in panel surveys.

Certain groups of the population are more attracted by incentives than others, and this should be considered with respect to the sample composition and the population under consideration. The effect that incentives might have in changing respondent behaviour should also be taken into account when designing a survey.

Although considerably more investigation is required in Great Britain into the effect of incentives on the overall cost of a survey, the indications are that their use can reduce the amount of interviewer time needed to elicit a response and is therefore cost-effective.

There is concern that, should the use of incentives become more commonplace, it will erode the feeling of civic duty, particularly with respect to participation in government surveys, and that payment will come to be expected by potential survey respondents. The research to date suggests that this fear may not be borne out, but this too is an area requiring further investigation, particularly as public reaction to the use of incentives may differ in Britain compared with the United States.
From the literature it is clear that a multitude of factors, including the population under investigation, the sample size and design, and the subject matter and its relevance to the respondent, should be considered when deciding whether or not to offer an incentive.

A final issue not discussed in this paper is an ethical one. Bodies involved in providing ethical approval for surveys, where the subject matter is considered particularly intrusive or where invasive procedures are carried out, do not always look favourably on incentive payments, which can be viewed as coercive. This being the case, perhaps we can only ever really offer a 'token of our appreciation' for participation?

References

Armstrong, J. (1975) "Monetary Incentives in Mail Surveys". Public Opinion Quarterly, 39(1): 111-116.

Baumgartner, R. and Rathbun, P. (1997) "Prepaid Monetary Incentives and Mail Survey Response Rates". Paper presented at the annual conference of the American Association for Public Opinion Research, Norfolk, Virginia.


Berlin, M., Mohadjer, L., Waksberg, J., Kolstad, A., Kirsch, D., Rock, D. and Yamamoto, K. (1992) "An Experiment in Monetary Incentives". Proceedings of the Survey Research Methods Section of the American Statistical Association, 393-398.

Betts, P., Ursachi, K., Mitra, A. and White, A. (1999) "Expenditure and Food Survey: Report on the diary and questionnaire development programme conducted by the Question Testing Unit of the Social Methodology Unit". Unpublished.

Brennan, M. (1992) "The Effect of Monetary Incentives on Mail Survey Response Rates: New Data". Journal of the Market Research Society, 34(2): 173-177.

Church, A. (1993) "Estimating the Effect of Incentives on Mail Survey Response Rates: a Meta-analysis". Public Opinion Quarterly, 57(1): 62-79.

Dodd, T. (1998) "Incentive Payments on Social Surveys: A Summary of Recent Research". Survey Methodology Bulletin, 43: 23-27.

Fox, R., Crask, M. and Kim, J. (1988) "Mail Survey Response Rate: A Meta-analysis of Selected Techniques for Inducing Response". Public Opinion Quarterly, 52(4): 467-491.

Goyder, J. (1994) "An Experiment with Cash Incentives on a Personal Interview Survey". Journal of the Market Research Society, 36(4): 360-366.

Groves, R., Singer, E. and Corning, A. (2000) "Leverage-Saliency Theory of Survey Participation: Description and an Illustration". Public Opinion Quarterly, 64(3): 299-308.

Harkness, J. and Mohler, P. (1998) "Two Experiments with Incentives on Mail Surveys in Germany". Survey Methods Newsletter, 18(2): 15-20.

Hopkins, K. and Gullickson, A. (1992) "Response Rates in Survey Research: A Meta-analysis of the Effects of Monetary Gratuities". Journal of Experimental Education, 61(1): 52-62.

James, T. (1997) "Results of Wave 1 Incentive Experiment in the 1996 Survey of Income and Program Participation". Proceedings of the Survey Research Section, American Statistical Association, 834-839.

James, T. and Bolstein, R. (1990) "The Effect of Monetary Incentives and Follow-up Mailings on the Response Rate and Response Quality in Mail Surveys". Public Opinion Quarterly, 54(3): 346-361.

Lynn, P. (2001) "The impact of incentives on response rates to personal interview surveys: role and perceptions of interviewers". International Journal of Public Opinion Research, 13(3): 326-337.

Lynn, P. and Sturgis, P. (1997) "Boosting Survey Response through a Monetary Incentive and Fieldwork Procedures: An Experiment". Survey Methods Centre Newsletter, 17(3): 18-22.

Lynn, P., Taylor, B. and Brook, L. (1997) "Incentives, Information and Number of Contacts: Testing the effects of these factors on response to a panel survey". Survey Methods Newsletter, 17(3): 13-17.

Lynn, P., Thomson, K. and Brook, L. (1998) "An Experiment with Incentives on the British Social Attitudes Survey". Survey Methods Newsletter, 18(2): 12-14.

10

SMB 53 1/04


Mack, S., Huggins, V., Keathley, D. and Sundukchi, M. (1998) "Do Monetary Incentives Improve Response Rates in the Survey of Income and Program Participation?" Proceedings of the Section on Survey Research Methods, American Statistical Association, 529-534.
McConaghy, M. and Beerten, R. (2003) "Influencing Response on the Family Resources Survey by Using Incentives". Survey Methodology Bulletin, 51: 27-35.
Robertson, D. and Bellenger, D. (1978) "A New Method of Increasing Mail Survey Responses: Contributions to Charity". Journal of Marketing Research, 15: 632-633.
Shettle, C. and Mooney, G. (1999) "Monetary Incentives in US Government Surveys". Journal of Official Statistics, 15(2): 231-250.
Singer, E. (2002) "The Use of Incentives to Reduce Non-Response in Household Surveys", pp. 163-177 in: Groves, R., Dillman, D., Eltinge, J. and Little, R. (eds) Survey Nonresponse. New York: Wiley.
Singer, E., Groves, R. and Corning, A. (1999) "Differential Incentives: Beliefs about Practices, Perceptions of Equity, and Effects on Survey Participation". Public Opinion Quarterly, 63(2): 251-260.
Singer, E., Van Hoewyk, J., Gebler, N., Raghunathan, T. and McGonagle, K. (1999) "The Effects of Incentives on Response Rates in Interviewer-Mediated Surveys". Journal of Official Statistics, 15(2): 217-230.
Singer, E., Van Hoewyk, J. and Maher, M. (1998) "Does the Payment of Incentives Create Expectation Effects?" Public Opinion Quarterly, 62(2): 152-164.
Singer, E., Van Hoewyk, J. and Maher, M. (2000) "Experiments with Incentives in Telephone Surveys". Public Opinion Quarterly, 64(2): 171-188.
Stratford, N., Simmonds, N. and Nicolaas, G. (2003) "National Travel Survey 2002: Incentives Experiment Report". http://www.dft.gov.uk/stellent/groups/dft_transstats/documents/page/dft_transstats_023300.doc
Tzamourani, P. (2000) "An experiment with promised contribution to charity as a respondent incentive on a face-to-face survey". Survey Methods Newsletter, 20(2): 13-15.
Tzamourani, P. and Lynn, P. (2000) "Do respondent incentives affect data quality? Evidence from an experiment". Survey Methods Newsletter, 20(2): 3-7.
Ward, K., Boggess, S., Selvavel, K. and Mahon, M. (2001) "The use of targeted Incentives to Reluctant Respondents on Response Rate and Data Quality". Annual Meeting of the American Statistical Association, August 5-9, 2001.
Warriner, K., Goyder, J., Gjertsen, H., Hohner, P. and McSpurren, K. (1996) "Charities, No; Lotteries, No; Cash, Yes: Main Effects and Interactions in a Canadian Incentives Experiment". Public Opinion Quarterly, 60(4): 542-562.
Willimack, D., Schuman, H., Pennell, B. and Lepkowski, J. (1995) "Effects of Prepaid Non-Monetary Incentives on Response Rates and Response Quality in Face-to-Face Surveys". Public Opinion Quarterly, 59(1): 78-92.
Yu, J. and Cooper, H. (1983) "A Quantitative Review of Research Design Effects on Response Rates to Questionnaires". Journal of Marketing Research, 20: 36-44.


The Business Survey Front Page - First Impressions

An Expert Review of the Business Survey Front Page used at the Office for National Statistics (ONS)

Sarah-Jane Williams & Stephanie Draper

1. Introduction
The Data Collection Methodology Centre of Expertise at the Office for National Statistics (ONS) has just started a project that will review and develop different types of written communication with business respondents. The project is called the Respondent Orientated Communication Project (ROCP). The questionnaire itself is not included within this work. Initially, the main types of communication to be reviewed will be:
• the business survey front page;
• first reminder letters;
• second reminder letters;
• final reminder letters;
• flyers; and
• Chief Executive letters.
The first phase of ROCP aims to review the front page used for ONS business survey questionnaires. This phase is currently in progress. The objectives of this paper are to:
• outline the importance of respondent orientated communication;
• describe how the ONS business survey front page will be reviewed; and
• provide a summary of the expert review that has been carried out on the ONS business survey front page.

2. The importance of respondent orientated communication
Respondent orientated communication, other than the questionnaire itself, can have an important impact on reducing non-response error and measurement error. Non-response error occurs when the non-responders, if they had responded, would have provided different answers to the survey questions than those who did respond to the survey. For some business respondents the front page is the first piece of written communication they receive from the ONS. The front page can therefore play a key role in determining whether or not the respondent decides to respond to the survey.



Furthermore, if the respondent decides to respond, the front page will also have an impact on reducing measurement error, as it will affect how diligently they approach the response task.
"The questionnaire is only one element of a well-done survey. Moreover, no matter how well constructed or easy to complete, it is not the main determinant of response to mail or other self-administered surveys. Implementation procedures have a much greater influence on response rates. Multiple contacts, the content of letters, appearance of envelopes, incentives, personalisation, sponsorship and how it is explained, and other attributes of the communication process have a significantly greater collective capability for influencing response rates than does the questionnaire itself" (Dillman 2000, p149).
Measurement error occurs from inaccurate responses that arise from poor wording and design of the data collection instrument. For example, if the respondent misinterprets the instructions on the front page, they may complete the survey questionnaire for the incorrect business unit. Also, if the respondent finds the layout of the front page cluttered and confusing, they may overlook the instructions and not notice certain key reference points, such as the time period that the response is required for.

3. The review process for the ONS business survey front page
The ONS business survey front page will be reviewed and developed by working through the stages set out in the "Framework for Reviewing Data Collection Instruments in Business Surveys" (Jones 2003). There are six stages in this framework:
• Stage 1: expert review, identification of data requirements and evaluation review.
• Stage 2: redesign of the data collection instrument.
• Stage 3: pre-field testing and evaluation.
• Stage 4: field implementation of the redesigned data collection instrument.
• Stage 5: post-field implementation evaluation.
• Stage 6: identify and disseminate best practice.

4. The structure of the existing ONS business survey front page
The ONS conducts approximately 90 statutory business surveys. For these surveys, the ONS has attempted to standardise the design of the front page to meet the needs of all ONS business survey questionnaires. An example front page from the Annual Business Inquiry (ABI) is provided overleaf. The ABI collects data on a wide range of variables, including employment, turnover, stocks and capital expenditure. These data are then used as part of the inputs for compiling the National Accounts.
The information included on the business survey front page can be classified into two types. First, "fixed text", included on all business survey front pages. Second, "multi-function variable text", included where appropriate for survey specific requirements.



The following "fixed text" information is included on all business survey front pages:
• National Statistics logo;
• survey title;
• contact name and postal address of the business respondent;
• business address for which the questionnaire should be completed;
• ONS office address;
• a box for respondents to provide amendments to address details;
• ONS contact name, telephone and fax number;
• notice given under the appropriate section of the Statistics of Trade Act;
• statement about the penalties incurred if the questionnaire is not returned;
• an assurance about data confidentiality;
• instructions for:
  • writing in "black ink";
  • where to write any changes to name and address details;
  • returning the questionnaire by a specified date; and
  • providing informed estimates;
• a statement explaining that guidance notes on completing the questionnaire are enclosed;
• a brief explanation about what the data collected will be used for;
• assurances about how the response burden is being kept to the minimum possible;
• a statement explaining where further information about data confidentiality can be found;
• a Minicom number for respondents who are deaf;
• a free phone telephone number for further information about the survey;
• respondent reference number;
• a statement thanking the respondent for their co-operation;
• reference numbers required for internal processing;
• scanning markers; and
• a bar code.


[The example ABI front page referred to in section 4 is reproduced here in the original bulletin.]

Sarah-Jane Williams & Stephanie Draper

The Business Survey Front Page First Impressions

The following "multi-function variable text" information can be included on the business survey front page:
• the time period to be covered by the data returned on the questionnaire;
• a statement explaining how small businesses are rotated out of the sample after a certain period of time;
• a statement explaining how the respondent can request a Euro version of the questionnaire; and
• for businesses with Welsh address details, a statement providing a phone number for the respondent to use if they would like to request a copy of the questionnaire in Welsh.

5. Summary of the expert review
An expert review involves comparing the existing front page against recognised principles of design: analysing the wording, layout, format and content of the front page.

5.1 General
The ONS business survey front page contains a considerable amount of information. Dillman (2000) described the purpose of the cover page as: "an opportunity to motivate respondents rather than a place to put information that doesn't seem to fit anywhere else, such as detailed directions or background on why the survey is being done" (p135). Cluttering (overloading) a questionnaire can be detrimental to survey response. This is a weakness that many survey organisations overlook, and the ONS is not alone in trying to compress a great deal of information onto the survey front page. It is relatively straightforward to find examples of front pages used by other survey organisations which contain a comparable degree of information density.
An important part of the next process in this review involves identifying and prioritising the key messages to include on the ONS business survey front page. We need to consider whether all this information is required on the front page and whether it can be more appropriately placed on the questionnaire itself or on a different type of written communication. For example, research has demonstrated that it is helpful to include instructions at the point in the questionnaire where they will be used (Jenkins & Dillman, 1997). This reduces the processing time and the amount of short-term memory required by the respondent to combine separately located questions and instructions. Furthermore, it increases the chance of the respondent remembering to use the instruction in answering the question.
The ONS business survey front page provides instructions at a point where the respondent is not yet in a position to use them: for example, how to return the questionnaire, what reporting period to provide data for and providing informed



estimates. It will be important to consider whether these instructions are most appropriately placed on the front page. Conversely, we also need to consider whether there is any additional information that is missing from the ONS business survey front page. For example, the development of the ONS Survey Charter may require that certain items of information are made known to the respondent at the first point of communication.

5.2 Structure
With the current ONS business survey front page, the respondent's attention is simultaneously drawn to different parts of the page. This seems to occur partly as a result of competing types of visual emphasis and partly as a result of the varied spatial arrangement of information.

Types of visual emphasis
Visual emphasis is applied through red text, red backgrounds, bold text, bullet points, uppercase, different font styles and lines. It can be helpful to use visual emphasis of brightness, colour and shape to highlight important points or questions. Such visual effects can be helpful if they are used sparingly, consistently and in a way that helps to improve questionnaire structure (Dillman & Jenkins, 1997). On the ONS business survey front page, there are areas where this structure has been made unclear. For example, should the respondent consider information typed in red text more, less or of equal importance as information included in a box with a red shaded background?
Headings, some instructions, the office name and a note of thanks are all typed in bold font face. Is there any rationale for grouping these elements together and writing them in bold font face? By grouping this seemingly unconnected set of elements together, it becomes unclear what the visual cue of bold font face actually means or adds to the structure of the page. It may only confuse the respondent. If all the headings or all the instructions were typed in bold font face, it might help the respondent recognise a consistent pattern which they could use to help process the information on the page. There is also a more fundamental consideration that respondents who are colour blind may not be able to read text typed in red.
Further, the use of visual emphasis like uppercase can make it more difficult for the respondent to read and process information. This is because the reader becomes unable to recognise words by their shape; all the words become rectangles. It is helpful to minimise the use of uppercase.
Finally, in terms of visual emphasis, the ONS business survey front page uses a variety of font styles. This has occurred for practical reasons. For example, the business address is written in Arial post office standard script. The general body of the text is written in a mixture of standard Arial font and Gill Sans font. Trebuchet font is used for some of the reference numbers, as it has proved more effective for reference numbers that are scanned. From a methodological perspective, if different font families are required, it is helpful to distinguish between the text read by the respondent and the text read by ONS staff. Respondents will recognise this pattern and use it to help them follow through the information. Ideally, we would use one type of font face for the text read by the



respondent, for example, Arial, and a different font face for the reference numbers required for internal processing, for example, Trebuchet. This pattern could be further emphasised by using a lighter shade font for the reference numbers required for internal processing. This would have the effect of reducing the colour contrast between the numbers and the white background, making them less visible to the respondent, who is not required to read them.

Layout
We will now discuss the spatial arrangement of the ONS front page. The ONS business survey front page is split into two styles. The top half of the page, with the address and contact details, is almost in a column format. The bottom half of the page has the information reading across the entire width of the page. Respondents are likely to find it easier to process and progress through the information presented on the front page if one layout style is used. The layout can be further improved by placing the reference numbers, used for internal processing, in the lower right-hand quadrant of the front page. Research carried out on eye movement (Brandt 1945, cited in Dillman & Jenkins 1997) has shown that the top left-hand quadrant of the page is where the eye is most likely to focus (if all quadrants are of equal visual interest). The bottom right-hand quadrant is where the eye is least likely to focus.

5.3 Identity
The information included on the front page may not be clear enough to allow the respondent to determine who sent the questionnaire. The front page includes the National Statistics logo and the ONS address. However, we do not describe who the ONS is, what the ONS does or what National Statistics are. This is an important consideration, as research has shown that respondents are more likely to respond to a survey if they understand who the authorising survey organisation is. White et al (1998) conducted cognitive interview research that looked at the role of advance letters in the survey process. This showed that: "people's positive expectations could be increased in order to raise positive feelings about participating in the survey. Examples included… emphasising the ONS logo as a symbol of authority" (p36). Dillman (2000) has stated that: "Identifying the questionnaire clearly as being from a well known and legitimate source is desirable for fostering trust that the survey is legitimate and useful" (p139). In a study on the Labour Force Survey leaflet (2000), respondents commented on "the need for ONS to raise its public profile, in particular making it clear that we are an official organisation, not a market research company" (Corbin & Eldridge, p9).



Finally, Cialdini (1984; cited in Dillman & Jenkins 1997) also argued that a person's decision about whether or not to comply with a requested task will depend on a number of factors, one of these being: "the tendency to comply with requests given by those in positions of power" (p176). It is important to find out what respondents currently understand about the ONS from the information given on the existing business survey front page. This will be examined during the pre-field testing stage.

5.4 Language

Comprehension
It is helpful to explain in more detail what some of the statements actually mean. For example, at the top of all ONS business survey front pages the following statement is made:
• Notice is given under section 1 of the Statistics of Trade Act 1947.
From reading this statement, the respondent may not understand what notice is being given, who is giving the notice and what the implications of this notice are. A further example:
• Failure to make a return can incur penalties under section 4 of the Statistics of Trade Act 1947.
Respondents may find this statement threatening if it is given without further explanation. Assurances about data confidentiality are given in the following statement:
• It is illegal for us to reveal your data or identify your business to unauthorised persons.
Research carried out on the confidentiality pledge included on front pages, covering letters and advance letters has shown that it is important to keep the statement easy to read, concise and definite. Detailed or unclear explanations may create unnecessary concerns and put the respondent off responding to the survey (Dillman 2000; Corbin & Eldridge 2000). During the pre-field cognitive interviewing stage it will be important to find out how respondents comprehend the language used.

Consistency
We should aim to be consistent in the terminology that we use, to help ease the respondent's response task. On the ABI 2 front page illustrated earlier in this article there are a couple of terminology inconsistencies. For example, the words "questionnaire" and "form" are both used to mean the same thing.



5.5 Order
Intuitively, it would seem preferable to start the front page with a positive message: for example, a succinct description of the purpose of the survey. The description could explain what the data collected are to be used for, why the data submitted by the business respondent are important and how the survey outputs may be of interest to the respondent. Dillman (1978) described how respondents are more likely to complete a questionnaire if they understand the usefulness of the survey to themselves and to groups that they identify with. A weakness of the current ONS business survey front page is that a brief description of the purpose of the survey is most often buried in the last information section at the bottom of the page. In the case of the ABI 2 front page shown earlier in this article, the following statement is included:
• The results act as a basis for National Accounts and various economic indicators.
The order of the information given on the front page of ONS business surveys is different to that of the last ONS census and of front pages used by other National Statistical Institutes (NSIs). The Australian Bureau of Statistics front page typically begins by describing the purpose of the survey. This is followed by details of the collection authority. The front page finishes on a positive note by including a statement of thanks.

5.6 Personalisation
Research has demonstrated that incorporating elements of personalisation in communication with respondents might help to improve the chance of respondents participating in the survey. Methods of personalising communications include: signatures from named people within an organisation (for example, a survey manager); providing details of a named person within an organisation who can be contacted for help or other queries; addressing the questionnaire to a specific person at the respondent organisation; and including a date. Dillman (2000) proposes that using personalisation elements in communications with survey respondents can: "Achieve a collective impact of five to eight percentage points… to the final response rate" (p165). Research carried out by Martin et al. (1998) found that: "including the survey manager signature on the letter can raise positive feelings about participating in a survey" (p36). Survey front pages from Statistics New Zealand contain a personalised thank you statement from their Government Statistician. This may also increase the perceived legitimacy of the survey.
The ONS business survey front page contains a number of personalisation elements. For example, the respondent name is included (when these details are available) and a named ONS contact is provided for respondent queries. However, a signature and date are not included.



6. Conclusions
The next steps in this review will be to identify the information requirements with users and to collate feedback from internal staff on the views and opinions that respondents have about the existing ONS business survey front page. We will then analyse the common themes emerging from the expert review, the information requirements and the evaluation review with internal staff. The information collated from these three sources will be used to redesign the ONS business survey front page. Following the redesign, the ONS business survey front page will go through the stages of pre-field testing, field testing and post-field evaluation before full implementation.

References
Australian Bureau of Statistics (2000) Forms Development Procedures and Design Standards.
Corbin, T. and Eldridge, J. (2000) The Labour Force Survey Leaflet: Testing the Wording of the Confidentiality Pledge. Social Survey Division, Office for National Statistics.
Dillman, D.A. (2000) Mail and Internet Surveys. John Wiley & Sons.
Gower, A.R. (1994) Questionnaire Design for Business Surveys. Design Source Centre, Statistics Canada.
Groves, R.M. (1989) Survey Errors and Survey Costs. John Wiley & Sons.
Jenkins, C.R. and Dillman, D.A. (1997) "Towards a Theory of Self-Administered Questionnaire Design". In: Lyberg, L. et al (eds) Survey Measurement and Process Quality. New York: Wiley.
Jones, J. and Scott, L. (2003) The Review Programme for Business Survey Data Collection Instruments. Survey Methodology Bulletin, 52: 1-4, ONS, London.
Jones, J. (2003) A Framework for Reviewing Data Collection Instruments in Business Surveys. Survey Methodology Bulletin, 52: 4-9, ONS, London.
Jones, J. (2003) Improving Business Survey Data Collection Instruments: Governance and Methodologies. QUEST.
Landreth, A. (2001) Survey of Income and Program Participation (SIPP) Advance Letter Research: Cognitive Interview Results, Implications & Recommendations. Center for Survey Methods Research, Statistical Research Division, U.S. Census Bureau.
Martin, J., Bennett, N., Freeth, S. and White, A. (1997) Improving Advance Letters for Major Government Surveys. Survey Methodology Bulletin, 41: 1-17, ONS, London.



White, A. and Freeth, S. (1997) Improving Advance Letters. Survey Methodology Bulletin, 40: 23-29, ONS, London.
White, A., Martin, J., Bennett, J. and Freeth, S. (1998) Improving Advance Letters for Major Government Surveys: Final Results. Survey Methodology Bulletin, 43: 36-43, ONS, London.


Focus Group Recruitment of Business Survey Respondents

William Davies & James Rushbrooke

1. Background
The Office for National Statistics (ONS) has used focus groups in the development and testing of data collection instruments for social surveys and the census, but never for business surveys. This was partly due to a belief that business survey respondents would be less willing to participate in such an activity.
During 2003, Data Collection Methodology in the Methodology Group (ONS) has been evaluating, redesigning and testing a revised New Earnings Survey (NES) questionnaire. This work was initiated by the recommendations of the National Statistics Quality Review of the Distribution of Earnings Statistics¹. The NES questionnaire was redeveloped as part of the business survey full review project. This development involved expert review, evaluation review, redevelopment and pre-field testing (Jones, 2003).
As part of the pre-field testing of the NES questionnaire it was decided to attempt to conduct a focus group with business survey respondents. The objective of the focus group was to explore business survey respondents' perceptions of, and attitudes to, the current and redesigned NES questionnaires. This research was also conducted within the wider scope of a European project to research guidelines for reducing business response burden.
The objective of this paper is to outline the methodology used to recruit business survey respondents for the focus group. Success was measured by recording the outcomes of the recruitment. The paper provides an overview of the stages involved in the focus group recruitment:
• objectives of the focus group;
• sampling strategy; and
• recruitment.
A brief overview of how the focus group was conducted is also included.

The New Earnings Survey
The New Earnings Survey (NES) is an annual sample survey of the earnings of employees in Great Britain. The main purpose of the survey is to produce annual data on the level, composition and distribution of earnings of full-time employees (ONS 2003). It has been running since 1970 and is based on a 1 per cent sample of employees who are members of Pay-As-You-Earn (PAYE) income tax schemes.

¹ The National Statistics Quality Review of the Distribution of Earnings Statistics report was published on 10 October 2002 and is available on the National Statistics website.



The employers of approximately 245,000 sampled employees are sent NES questionnaires. These employers are legally obliged (under the Statistics of Trade Act 1947) to complete and return questionnaires on behalf of the sampled employees. The response rate in 2002 was 84 per cent.
The sampling strategy for most ONS business surveys is that businesses are selected based on the number of people employed and the economic sector they operate in. The NES is different in that employees are sampled. Over 90 per cent of businesses will not receive an NES questionnaire in any particular year. Of those that do, most will receive between one and 10 NES questionnaires, although in general larger businesses can expect to receive more, as they have more employees registered for PAYE. A small number of businesses receive over 100 NES questionnaires. In 2003 the largest number of NES questionnaires received by any one business was 1,937.
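The skew described above — most sampled businesses receiving only a handful of questionnaires while a few receive over 100 — follows directly from sampling employees rather than businesses. The mechanism can be sketched with a small simulation; this is purely illustrative (the function name and employee counts are our own, and it approximates the design by sampling each employee independently at 1 per cent, which is not a description of the actual NES selection procedure):

```python
import random

def questionnaires_received(num_employees, sampling_rate=0.01, seed=None):
    """Count how many of a business's PAYE-registered employees fall into
    a 1 per cent employee sample, treating each employee as independently
    sampled. (Illustrative approximation only, not the real NES design.)"""
    rng = random.Random(seed)
    return sum(1 for _ in range(num_employees) if rng.random() < sampling_rate)

# A 50-employee business expects 0.5 questionnaires (usually none or one),
# while a 20,000-employee business expects around 200.
small = questionnaires_received(50, seed=1)
large = questionnaires_received(20_000, seed=1)
```

Under this sketch the expected number of questionnaires a business receives is simply its PAYE employee count multiplied by the sampling rate, which is why the burden concentrates on the largest employers.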

2. Focus groups

2.1 What is a focus group?
Focus groups can be used in conjunction with another research method or on their own. The central purpose of a focus group is to collect information on a specific topic in a group environment. Morgan (1996:130) defines focus groups as: "a research technique that collects data through group interaction on a topic determined by the researcher. In essence, it is the researcher's interest that provides the focus, whereas the data themselves come from the group interaction."
Focus groups are commonly characterised by focusing on a specific stimulus, in this case the New Earnings Survey; other examples may include a newspaper article or film. A trained moderator guides the discussion using a topic guide. The moderator is not in a position of power or influence and may encourage comments of all types, both positive and negative. The moderator is careful not to make judgements about the responses and to control their own body language, which might otherwise communicate approval or disapproval (Krueger, 1998). An assistant to the moderator may be present to observe the interactions between the group members, taking notes and dealing with housekeeping tasks. The focus group is ideally set in a comfortable, convenient and non-threatening location.
The ideal composition of a focus group is between six and 10 participants, as this ensures that the group is small enough for everyone to have the opportunity to share insights, yet large enough to provide diversity of perceptions. A key aim of focus groups is to determine the perceptions, feelings and manner of thinking of participants. Attitudes and perceptions are developed in part by interaction with other people. By maintaining a focused discussion, the topics in a focus group are predetermined and sequenced.


SMB 53 1/04

William Davies & James Rushbrooke

Focus Group Recruitment of Business Survey Respondents

2.2 Why use focus groups as a research method?
The decision to use focus groups as a research method was taken for a number of reasons. Morgan (1998) suggests that focus groups uncover not only what participants think, but also the rationale behind their thoughts. Oates (2000) outlines some key advantages of focus groups:

• Participants can react to and build on the responses of other group members.
• They allow the researcher to interact directly with respondents.
• They allow the researcher to obtain rich data in the participants' own words.
• They provide data from a group of people more quickly than interviewing individuals.
• The results of a focus group are accessible and understandable.

The group dynamic in focus groups is a primary advantage of their use as a research method. A group dynamic encourages explanation and exploration through interaction among participants. The data produced by a focus group draw on the participants' own frames of reference and often emphasise the ways in which respondents understand issues (Oates, 2000).

3. Recruitment

3.1 Sampling strategy for the focus group recruitment
The target members of the focus group were those who have overall responsibility for ensuring that the questionnaires are completed and returned. This distinction was made because an employee who has overall responsibility for the completion and return of the NES questionnaire does not necessarily complete the questionnaires. For example, the actual respondent completing the questionnaire may be a clerk in the payroll department, while the person with overall responsibility for its completion and return may be a payroll manager. Targeting these business survey respondents centred on the hypothesis that different burdens may be identified according to the position of the respondent in the business. The composition of the focus group was designed to be homogeneous in terms of job role (those responsible for completing and returning NES questionnaires). A further consideration was that, by focusing on one specific questionnaire, the participants would be similar to each other in that they represent the same area of business, such as payroll or human resources. Morgan (1998) suggests that a key strategy for selecting focus group participants is commonality, not diversity. People tend to disclose more about themselves to people who resemble them in various ways than to people who differ from them. In terms of the businesses that respondents worked in, the number of NES questionnaires received was used as the selection criterion. The aim was to recruit businesses that received a relatively large number of questionnaires per year. The Inter-Departmental Business Register (IDBR) was used as the focus group sampling frame. The IDBR is a register of all the businesses in the United Kingdom that have either registered with Customs and Excise for VAT or registered with the Inland Revenue to operate a PAYE scheme for their employees.


From the IDBR a list was taken of all businesses in the London area that received NES questionnaires during 2003. These were then sorted according to the number of questionnaires received, so that the 200 largest NES respondents could be identified. Finally, the sample was cross-referenced back against the IDBR to ensure that contact details were up to date and that all the businesses were still in operation. To summarise:

Sampling frame: Inter-Departmental Business Register (IDBR)
Target population: Respondents who are responsible for completing and returning NES questionnaires, working in London-based businesses that receive a relatively large number of NES questionnaires per year.
Sample size: 200 businesses.
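The sort-and-select step described above can be sketched in a few lines. The extract below is invented and the field names (`name`, `nes_forms_2003`) are hypothetical; the real exercise kept the 200 largest recipients rather than three.

```python
# Invented miniature IDBR extract: one record per London business with the
# number of NES questionnaires it received in 2003.
idbr_extract = [
    {"name": f"Business {i}", "nes_forms_2003": count}
    for i, count in enumerate([1937, 450, 12, 3, 880, 25, 7, 150])
]

# Sort descending by questionnaire count and keep the top N recipients.
TOP_N = 3  # 200 in the actual exercise; 3 here to keep the example small
largest = sorted(idbr_extract, key=lambda b: b["nes_forms_2003"], reverse=True)[:TOP_N]
print([b["nes_forms_2003"] for b in largest])  # → [1937, 880, 450]
```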

3.2 Recruiting the focus group respondents
The project aimed to recruit 10 focus group participants, allowing for some respondents to drop out on the day. Whilst the primary intention of the recruitment was to fill one focus group, an alternative date (for a future focus group) was offered to those contacted who could not attend the first group. Personalised invitation letters were sent to the persons with overall responsibility for completing NES questionnaires (please see Annex A). The letter highlighted that participation was voluntary, that travel costs would be reimbursed and that the information gathered would be treated as confidential. A copy of the 2003 NES questionnaire and an extract from a press article which used NES data were included with the invitation letter. Two ONS staff, using a telephone script as a guide, carried out follow-up telephone calls three days after the initial send-out. The sample was contacted by telephone in descending order, i.e. from those who received the most questionnaires to those who received the least. Unfortunately, a postal strike affecting the London area occurred during the post-out stage of the recruitment. This significantly delayed the delivery of the invitation letters, with the consequence that only a small proportion of businesses had received the letter when contacted in the follow-up calls. Whilst cold calling is not an ideal recruitment strategy, it was the only practical option in light of this unforeseen event. Once the required number of attendees for the focus group was achieved, no more calls were made: given the postal strike, it was thought highly unlikely that letters would be received before the date of the focus group. No calls were made to ONS from the businesses that were not contacted in the follow-up calls.

3.3 Recruitment data capture
The data from the recruitment process were recorded, including the time and date of each call to a business and its outcome. The purpose of this exercise was to examine the overall success of the recruitment strategy. Table 1 shows the results of the telephone recruitment exercise. ONS failed to gain access to, or contact with, the correct person in 61 per cent of the businesses. This was


either because there was no number on the IDBR or because the relevant person could not be contacted. When contact was made there were slightly more acceptances (21 per cent) than refusals (18 per cent). The ten businesses that declined the invitation did so because they had prior engagements and were unable to attend. Overall there were 12 acceptances, two of which were for future focus groups (these are included in Table 1 below). Of the ten that had accepted for the first focus group, eight arrived on the day. Of the two absentees, one contacted ONS to confirm non-attendance due to illness.

Table 1 Results of telephone recruitment

Key:
No access: Wrong number/no number on IDBR
Access - no contact: No answer/left message/no contact with correct person
Contact - decline: Contact made with the correct person, unsuccessful outcome
Contact - accept: Contact made with the correct person, successful outcome

The call outcomes categorise the various levels of access that were achieved in the telephone recruitment. Where a phone number provided by the IDBR was incorrect, or no number was held on the register, the call was recorded as No access; no attempt was made to find the number through alternative means (such as an Internet search engine). Three further categories represent variations of successfully accessing the business. The first (Access - no contact) covered calls where access to the business was gained but contact with the correct person was not made; this category also included calls where there was no answer or a voicemail message was left.


The final two categories represent calls where contact was made with the correct person (i.e. those with overall responsibility for completing and returning the NES questionnaire). They are divided between successful and unsuccessful outcomes.
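The four outcome codes can be summarised as a small decision rule. The function and argument names below are hypothetical; the categories themselves come from the key to Table 1.

```python
# Sketch of the call-outcome coding described above. Argument names are
# invented: usable_number covers both "wrong number" and "no number on IDBR".
def code_call_outcome(usable_number, answered, reached_correct_person, accepted):
    if not usable_number:
        return "No access"            # wrong number / no number on IDBR
    if not (answered and reached_correct_person):
        return "Access - no contact"  # no answer, voicemail, or wrong person
    # Contact was made with the person responsible for the NES questionnaires.
    return "Contact - accept" if accepted else "Contact - decline"

print(code_call_outcome(True, True, True, True))  # → Contact - accept
```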

3.4 Was the recruitment a success?
In summary, the recruitment for the first focus group of business respondents was successful, given that this was the first time Data Collection Methodology had used this research method with business survey respondents. Nevertheless, there were important lessons learnt from this recruitment strategy:
• unforeseen events, such as a postal strike, can occur;
• the date of the focus group may coincide with other events that the business is committed to; and
• businesses may not turn up on the focus group date.

Many of the items captured in the data were not useful for analysis. For example, recording the time of the call did not prove a useful variable, as most of the calls were made in the morning. A category that could usefully have been included was whether the respondent had received the invitation letter; it would then have been possible to examine the relationship between call outcome and receipt of the letter.

4. How the focus group was conducted
An independent outside researcher was commissioned to moderate the group. This ensured a clear separation between the sponsor (ONS), which outlined the objectives of the project, and the outside researcher. The primary intention was to have a neutral person conducting the focus group and carrying out the analysis for the final report. The focus group lasted just over two hours. It was conducted at the ONS offices in Pimlico and was tape-recorded for the purpose of analysis. An ONS host was also present and acted as an assistant to the moderator. A topic guide was used as a framework to explore key elements of business survey response burden with the participants. At various points in the focus group participants were asked to work in pairs and complete exercises. For example, respondents were asked to scale the burden they perceived on certain topics and to rank concepts alongside each other. In some of these exercises it was possible to quantify the data by producing a mean score for each response burden concept. Just over a week after the focus group took place, participants were sent a brief summary of the key findings, presented as bullet points on one sheet of A4 paper. This was designed to be informative and accessible.
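The quantification step mentioned above amounts to averaging the scores participants assigned to each concept. A minimal sketch, with invented concept names and scores:

```python
# Invented example scores: each list holds the burden ratings that pairs of
# participants gave to one response-burden concept.
burden_scores = {
    "finding the figures": [4, 5, 3, 4],
    "understanding definitions": [2, 3, 2, 2],
}

# Mean score per concept, as described in the text.
mean_burden = {
    concept: sum(scores) / len(scores)
    for concept, scores in burden_scores.items()
}
print(mean_burden)  # → {'finding the figures': 4.0, 'understanding definitions': 2.25}
```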


5. Future plans
A future focus group could change the sampling criteria to businesses that receive a comparatively small number of NES questionnaires per year. Further changes might include altering the profile of the participants to those who complete the survey but are not responsible for its completion and return. This may identify a new area of response burden specific to job role, as opposed to the quantity of questionnaires received per year.

References
Bradburn, N. (1978) Respondent Burden, in Health Survey Research Methods, DHEW Publication No. (PHS) 79-3207.
Fisher, S. & Kydoniefs, L. (2001) Using a Theoretical Model of Response Burden (RB) to Identify Sources of Burden in Surveys, paper presented at the 12th International Workshop on Household Survey Nonresponse, Oslo.
Haraldsen, G. (2002) Identifying and Reducing the Response Burden in Internet Business Surveys, paper presented at the International Conference on Questionnaire Development, Evaluation and Testing Methods (QDET), Charleston, South Carolina.
Jones, J. (2003) A Framework for Reviewing Data Collection Instruments in Business Surveys, Survey Methodology Bulletin, 52, pp. 4-9.
Morgan, D. L. (1996) Focus Groups, Annual Review of Sociology, 22, pp. 129-152.
Morgan, D. L. (1998) It's All About Relationships: Working Together, in The Focus Group Guidebook (Focus Group Kit 1). London: Sage.
Morgan, D. L. (1998) Deciding on the Group Composition, in Planning Focus Groups (Focus Group Kit 2). London: Sage.
Krueger, R. A. (1998) Guiding Principles of Moderating, in Moderating Focus Groups (Focus Group Kit 4). London: Sage.
Oates, C. (2000) The Use of Focus Groups in Social Science Research, in Burton, D. (ed) Research Training for Social Scientists. London: Sage.

The Web
The National Statistics Quality Review of the Distribution of Earnings Statistics Report (No. 14) was published on the National Statistics website in October 2002.
http://www.statistics.gov.uk/methods_quality/quality_review/downloads/DOER_Final_Report.doc


Measuring employment in the Annual Business Inquiry
Steven Marsh

1. Introduction
The National Statistics Employment and Jobs Quality Review was initiated in 2003. The need for a National Statistics Quality Review focusing on statistics of employment, jobs and hours of work was recommended in the 2002 National Statistics Quality Review of the Framework for Labour Market Statistics. The objective of the Employment and Jobs Quality Review was to investigate ways to improve the coherence of information on the labour market. A main area of concern was the difference between the business survey based estimates of employment, derived from the Office for National Statistics (ONS) Annual Business Inquiry, Part One (ABI/1), and the household based estimates, derived from ONS's Labour Force Survey (LFS). As part of the quality review, the Data Collection Methodology Centre of Expertise, in the Methodology Group of ONS, was asked to undertake qualitative research with previous respondents to the ABI/1. This paper provides an overview of the methodology used and the results of the research.

1.1 Annual Business Inquiry, Part One
The Annual Business Inquiry (ABI) is a statutory business survey that is split into two parts. ABI/1 collects employment data from UK businesses and ABI/2 collects financial data from UK businesses. ABI/1 feeds into national and regional employment estimates, while ABI/2 feeds into the National Accounts. The ABI is therefore crucial in providing information about the structure of UK industry. Most businesses receive both parts of the survey; a small proportion receive only ABI/1. In total there are six ABI/1 questionnaires. All the questionnaires are similar, and each is regarded as either a long or a short questionnaire. The short questionnaires have questions relating to total employment, whilst the long questionnaires ask for breakdowns by male and female employees and full- and part-time employees. Larger businesses receive the long questionnaires. The long questionnaire also asks additional questions relating to "other workers" and the number of employees who are operatives, depending on the Standard Industrial Classification (SIC) of the business. A long questionnaire was used in the qualitative research, as it contains all the questions that appear within the other five questionnaires.

2. The objectives of the qualitative research
The main objectives of the qualitative research were to investigate respondents' comprehension of the:
• questions;
• definitions; and
• guidance notes.

In addition, specific areas of interest identified by Sykes (1997) and reinforced by the ABI/1 Results, Analysis and Publications (RAP) branch were also addressed. These were:
• whether respondents are providing data for the correct reference date; and
• whether respondents are including:
  - agency staff;
  - the self-employed;
  - under-16s;
  - employees temporarily absent; and
  - unpaid and voluntary workers
  in their employee responses.

3. Methodology used in the qualitative research
Structured in-depth interviews were conducted with twenty businesses that had previously responded to the ABI/1. Due to time and cost constraints interviews were limited to twenty: ten face-to-face interviews and ten telephone interviews. A standardised interview schedule was used to ensure consistency, whilst attempting to reduce bias between interviewers and between the face-to-face and telephone modes. Structured in-depth interviews use a standardised interview schedule containing a set of questions in a predetermined order adhered to in each interview (Hall & Hall, 1996). The interview schedule was designed to investigate the processes respondents go through when answering ABI/1 questions, in order to identify possible misreporting and consequent measurement error. Measurement error, or observational error, is a component of Groves's (1989) model of Total Survey Error. Measurement error is the result of inaccurate responses that stem from poor question wording and questionnaire design, poor interviewing, survey mode effects and/or some aspect of respondent behaviour. Tourangeau's (1984) basic cognitive model of survey response aims to reduce measurement error. This is achieved by evaluating and improving survey questions using the following four basic steps:
1. Comprehension: understanding the meaning of the question.
2. Retrieval: gathering relevant information.
3. Judgement: assessing the adequacy of retrieved information relative to the meaning of the question.
4. Communication: reporting the response to the question, e.g. selecting the response category, editing the response for desirability, etc.
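The four steps are commonly treated as sequential: a failure at an early stage propagates to the later ones. A minimal sketch of the model as an ordered structure; the checklist representation is illustrative only, not part of Tourangeau's formulation.

```python
# Tourangeau's four response stages, in order, with the short descriptions
# used in the text above.
RESPONSE_STAGES = [
    ("Comprehension", "understanding the meaning of the question"),
    ("Retrieval", "gathering relevant information"),
    ("Judgement", "assessing the adequacy of retrieved information"),
    ("Communication", "reporting the response to the question"),
]

def stages_before(stage_name):
    """Stages a respondent passes through before reaching the given one."""
    names = [name for name, _ in RESPONSE_STAGES]
    return names[: names.index(stage_name)]

print(stages_before("Judgement"))  # → ['Comprehension', 'Retrieval']
```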


Based on Tourangeau's model, a method of retrospective probing was used, in which respondents were probed after they had completed the survey questions. A mixture of comprehension/interpretation, paraphrase, confidence, recall, general and specific cognitive probes was used. This method of probing provides a valuable insight into the respondent's comprehension of a question, their retrieval of the relevant information from memory or data sources, and the decision process in answering a specific question. Respondents were also probed for their perception of the content and layout of the ABI/1 questionnaire. Below are some examples of probes:

Comprehension/Interpretation: "What do you understand by the term 'part-time employee'?"
Paraphrase: "Can you explain in your own words what the term 'other workers' means to you?"
Confidence: "How confident are you that the information provided is accurate?"
Recall: "Where would you get this information?"
General: "How would you answer this question?" "What is your overall impression of the questionnaire?" "Is this information easy or difficult to obtain?"
Specific: "Under what circumstances would you include or exclude 'working foremen'?"

The interviews were conducted by staff in Data Collection Methodology. Interviews lasted approximately one hour and were tape-recorded.

3.1 Sample for the qualitative research
Respondents were selected through purposive sampling. The sample was not selected with a view to being representative of ABI/1 respondents. Instead, it focused on types of businesses that were likely to experience problems responding, as detailed by the key issues raised by Sykes (1997) and ABI/1 RAP. These types of businesses were:
a) multi-site businesses with a high turnover of casual and temporary staff;
b) businesses likely to employ under-16s at weekends or on work experience;
c) agencies, and businesses likely to employ agency or contract staff; and
d) businesses which employ a high proportion of unpaid staff.
The rationale for the identification of these types of businesses is outlined below:


Multi-site businesses with a high turnover of casual and temporary staff
For multi-site businesses, questionnaires are completed at the head office. There may be a gap in the data available for temporary and casual staffing at individual sites, and at a more general level for new sites.

Businesses likely to employ under-16s at weekends or on work experience
Staff under the age of 16 should be excluded from the survey. Earlier research by Sykes (1997) found that some businesses were including under-16s in their employment count.

Agencies and businesses likely to employ agency or contract staff
Sykes (1997) found that businesses correctly excluded agency staff from their employment count. Given the growth in the use of agency staff, it would be useful to ensure that this is still the case. Furthermore, it may be useful to investigate the responses of agencies themselves, with particular attention to those employees registered with an agency but not working (or being paid) on the reference date.

Businesses which employ a high proportion of unpaid staff
Unpaid workers should be excluded from the survey, unless they receive some form of benefit from the business other than a salary or wage. This "benefit" differentiates them from voluntary workers, who in effect volunteer their services for no payment or benefit. Employees who receive a benefit of some kind instead of a salary or wage should be counted as "other workers" rather than employees.

The sampling frame was selected from previous ABI/1 respondents listed on ONS's Inter-Departmental Business Register (IDBR). The sample was stratified by the types of businesses (by SIC) likely to experience problematic response. Respondents were recruited by a process of cold calling using a standardised interview script, and were faxed a copy of the ABI/1 questionnaire several days prior to the interview.

3.2 Sample Composition
Of the twenty businesses that agreed to be interviewed, nine fell into one of the four types of businesses likely to experience problematic response, and five fell into more than one category. Six of the businesses did not fall into any of the categories; these were retained in the sample due to low participation rates. The table below shows the frequencies of businesses in each category by interview mode.


Table 1 Frequencies of types of businesses likely to experience problematic response by interview mode

4. Discussion
The following discussion summarises the qualitative data from the structured in-depth interviews, focusing on the key issues that were raised by the ABI/1 RAP.

4.1 Are respondents providing data for the reference date?
Only one respondent stated that they completed the questionnaire for the specified reference date (13 December 2002). The remaining respondents completed the questionnaire using the end date of a period (for example the end of a week or month) within which the reference date fell. The main reason for this was that the majority of respondents' payroll systems did not hold records for individual days, so this information would be difficult to access. Two respondents stated that they did not usually provide data for the reference date, but do have daily timesheets detailing who was working on a specific day. For example, one respondent stated:
"... to do it for the middle of the month, means that we would have to do a little bit more investigation..."
This statement implies that the data were obtainable, although the amount of data retrieval work would increase compared with the current method of retrieving data for the end of a period including the reference date. The majority of respondents stated that their data were unlikely to change between the reference date and the end of the period that includes the reference date.


When asked about their comprehension of the reference date, all respondents stated that they understood that the data were required for the specific reference date of 13 December 2002.

4.2 Are respondents excluding agency staff?
Generally, respondents were correctly excluding agency staff from their employment count. A common theme that emerged from the interviews was that agency staff would only be included in the employment count if they appeared on the business's payroll. Given that agency staff were not usually on the payroll of the business they were working at, they would be correctly excluded. There were, however, exceptions. For example, a school stated that supply teachers were included in its employment count despite being paid by an agency and not the school. An employment agency reported that all staff on its payroll, whether working within the agency or placed as agency staff with other businesses, would be included in its employment count. How employment agencies deal with staff temporarily absent on the reference date is discussed later in this paper. When respondents were asked about their comprehension of the guidance notes, two respondents stated that they would incorrectly include agency staff in their employment count. This suggests that the guidance notes may be open to misinterpretation.

4.3 What are respondents' interpretations of the term "self-employed"?
Respondents defined the term "self-employed" as someone who dealt with their own tax returns and National Insurance. There were, however, mixed interpretations as to whether the self-employed should be included in employment counts. The majority of respondents reported that they did not have any self-employed workers and would not include them in their employment count if they did. A large multi-site business stated that they did come into contact with contractors but:
"we (head office) wouldn't hold the information on contractors. That would be done locally by the store managers."
In contrast, two respondents included the self-employed in their employment count as:
"...they're (self-employed) on our workforce so I'd have to include them"

4.4 Are under-16s excluded from employment counts?
The majority of respondents had correctly excluded under-16s from their employment counts. However, one respondent reported that they did employ under-16s and that these were included in their employment count. Another respondent stated that if they had under-16s on work experience around the reference date they would have been included in


the employment counts. The other respondents rarely or never employ under-16s, but would include them if they appeared on the business's payroll. However, based on their interpretation of the actual question and guidance notes, six respondents stated that they would have included under-16s in their employment counts. Four of these respondents were from businesses in type B, "businesses likely to employ under-16s at weekends or on work experience".

4.5 What happens to employees who are temporarily absent?
On the assumption that they would appear on the business's payroll, all the respondents had correctly included employees who were temporarily absent on sick, maternity or annual leave. However, there were discrepancies as to whether to include employees who had been temporarily transferred within or outside the UK. For those who would include these employees, the judgement was again based on the assumption that the employee would still be on the business's payroll. For example, a football club stated that players who had been loaned to other clubs would be included in the employment count as the players "are paid by them (the football club)", and their wages were claimed back from the other club. In terms of agency staff, an employment agency stated that employees would only be included in the employment count if they had worked or been paid during the week including the reference date. Therefore, despite being registered with an agency, an employee would be excluded from the employment count if they were temporarily absent.

4.6 What are respondents' interpretations of unpaid and voluntary workers?
Respondents tended to define unpaid and voluntary workers similarly. For example, a common type of statement was:
"...who can be unpaid workers if they're not voluntary workers?"
Generally there were mixed interpretations as to which workers to include and exclude. For example, a school in business type D, "businesses which employ a high proportion of unpaid staff", incorrectly included employees on work experience in its employment counts, but correctly excluded a governor who runs the school shop on a voluntary basis. A large multi-site business did not have accurate records of unpaid or voluntary workers; these would have to be retrieved at a local level, which would be a time-consuming process. Respondents' opinions of the usefulness of the guidance notes were also divided. Some respondents considered the guidance notes to be "useful and quite clear", whilst others were confused by them and suggested the inclusion of some examples of unpaid and voluntary workers.


5. Conclusions
It appears that the ABI/1 questions and guidance notes produce some degree of measurement error. As we have seen, there is a considerable amount of misreporting in employment counts in relation to several factors, the most salient of which are:
• data not being provided for the reference date;
• inclusion of under-16s;
• exclusion of agency staff not working on or around the reference date;
• confusion in defining the term "self-employed"; and
• fuzziness in the definitions of "unpaid" and "voluntary" workers.
Rewording many of the ABI/1 questions and clarifying the guidance notes could reduce measurement error. The full benefits of this would be optimised with respondent pre-field testing¹ of the questions and guidance notes. It should also ease the burden placed on respondents when undertaking their response task (Tourangeau, 1984), for example by attempting to ensure that respondents can comprehend the questions, easily retrieve the information, judge the adequacy of the information and simply communicate the response to the question. Further work will now be undertaken to quantify the full scale of misreporting in the ABI/1 and to compare the ABI/1 questions and guidance notes with those included in the LFS.

References
Bryman, A. (1988) Quantity and Quality in Social Research. Unwin Hyman.
Dillman, D.A. (2000) Mail and Internet Surveys: The Tailored Design Method. New York: John Wiley and Sons Inc.
Groves, R.M. (1989) Survey Errors and Survey Costs. New York: John Wiley and Sons Inc.
Hall, D. & Hall, I. (1996) Practical Social Research. Macmillan Press Ltd.
Jenkins, C.R. & Dillman, D.A. (1995) Towards a Theory of Self-Administered Questionnaire Design. New York: John Wiley and Sons Inc.
Jones, J. (2003) A Framework for Reviewing Data Collection Instruments in Business Surveys, Survey Methodology Bulletin, 52, pp. 4-9, Office for National Statistics.
Marsh, S.A. (2003) Review of the Annual Business Inquiry (Part 1) Employment and Jobs Questionnaire using Qualitative Methods. Data Collection Methodology, Office for National Statistics.
Simmons, E. (2003) Young People and Social Capital: Summary Report of Cognitive Interviews. Data Collection Methodology, Office for National Statistics.

¹ Pre-field testing includes methods such as cognitive interviews, in-depth interviews and focus groups.


Sykes, W. (1997) Qualitative Assurance Study on ONS Employment Questions: Qualitative Research. Social Survey Division, Office for National Statistics.
Tourangeau, R. (1984), cited in Willimack, D.K. & Nichols, E. (2001) Building an Alternative Response Model for Business Surveys, paper presented at the AAPOR Annual Meeting, May 2001, Montreal.
Williams, S.J. (2002) Review of Questionnaires for the Financial and Accounting Surveys Division: Treasury Report. Data Collection Methodology, Office for National Statistics.


Forthcoming Conferences, Seminars and Courses

1. National Statistics Events

Integrating Analysis Seminar
To be held at the Museum of London (location details can be found at www.museumoflondon.org.uk) on Wednesday 25 February 2004, 13:00 to 15:00. This is the first in a series of seminars aimed at bringing together analysts from across government to consider how, as a group, key policy questions can be answered. The seminar is free of charge. If you would like to attend, please contact Susan Carrick-White via e-mail: [email protected]

Ninth GSS Methodology Conference
To be held at the Victoria Park Plaza, 239 Vauxhall Road, London, SW1V 1EQ, on Monday 28 June 2004. The conference will include parallel sessions, and again the submission of complete sessions is encouraged. Attendance is open to members of the GSS and other interested parties, and is free of charge. Numbers are limited, so if you wish to attend, or to submit an abstract, please e-mail [email protected] by Monday 31 March 2004.

For more information on the above events, please visit: www.statistics.gov.uk/events

2. Royal Statistical Society Conferences

Business Improvement Through Statistical Thinking
The conference is aimed at those using statistics in industry, and those with an interest in the business and industrial application of statistics. It will take place on 21-22 April 2004 at the Britannia Court Hotel, Keresley (near Coventry), UK. For further information please contact Paul Gentry at: [email protected]

International Conference of the Royal Statistical Society
This year's conference, 'Connecting Practice With Research', will be hosted by the University of Manchester from 7-10 September 2004. For information, registration and accommodation enquiries, please contact Lesley Gilchrist at: [email protected]

For more information on the above events, please visit: www.rss.org


3. The Cathie Marsh Centre for Census and Survey Research

CCSR External Short Courses

Course                                                  Date
Introduction to Stata/Data Management with Stata        4-5 February 2004
Logistic Regression                                     11 February 2004
Multilevel Modelling                                    25 February 2004
Questionnaire Design                                    8 March 2004
Surveys and Sampling                                    10 March 2004
Conceptualising Longitudinal Analysis                   24 March 2004
SPSS for Social Scientists                              7 April 2004
Introduction to Data Analysis, Part 1                   14 April 2004
Introduction to Longitudinal Analysis                   21 April 2004
Longitudinal Data Analysis                              28-30 April 2004
Data Reduction and Classification                       5 May 2004
Multiple Regression                                     12 May 2004
Logistic Regression                                     19 May 2004
Introduction to Data Analysis, Part 2                   26 May 2004

CCSR courses generally start at 09:45 and end by 16:30.

For further information, please visit: www.ccsr.ac.uk

4. Social Research Association

Automation in the Survey Process - Managing Change and Avoiding Disaster
An ASC Software Showcase, to be held on Thursday 22 April 2004. For further information, please visit: www.asc.org.uk


5. MSc in Official Statistics The MSc programme has been developed jointly by the University of Southampton with the UK Government Statistical Service (GSS) to cover the core skills and knowledge needed by professional government statisticians working in the UK and in other countries. Individual units of the MSc programme are available as short courses for those who have no need for another qualification but do wish to update their professional knowledge.

STAT6036 Generalised Linear Models (Data Analysis II), Southampton, 23-27 February 2004
STAT6051 Survey Error and Data Quality, London, 1-5 March 2004
STAT6039 Analysis of Longitudinal Data, Southampton, 15-19 March 2004
STAT6034 Demography II, Southampton, 19-23 April 2004
STAT6032 Compensating for Non-Response, Southampton, 26-30 April 2004
STAT6056 Small Area Estimation, Southampton, 10-14 May 2004

For further information please contact: [email protected]

6. European Conference on Quality and Methodology in Official Statistics

Mainz, Germany, 24-26 May 2004. The first conference in a series of biennial scientific gatherings covering important methodological and quality-related topics of relevance to the European Statistical System (ESS). The conference is supported by Eurostat. For further information, please visit: http://q2004.destatis

7. RC33 Sixth International Conference on Social Science Methodology

Amsterdam, The Netherlands, 16-20 August 2004. This conference will concentrate on recent developments and applications in Social Research Methodology. For further information, please visit: http://www.siswo.uva.nl/rc33


The National Statistics Methodology Series

This series, aimed at disseminating National Statistics methodology quickly and easily, comprises monographs of substantial methodological interest produced by members of the Government Statistical Service. Currently available:

1. Software to Weight and Gross Survey Data, Dave Elliot.
2. Report of Task Force on Seasonal Adjustment.
3. Report of the Task Force on Imputation.
4. Report of the Task Force on Disclosure.
5. Gross Domestic Product: Output Methodological Guide, Peter Sharp.
6. Interpolating Annual Data to Monthly or Quarterly Data, Michael Baxter.
7. Sample Design Options for an Integrated Household Survey, Dave Elliot and Jeremy Barton.
8. Evaluating Non-Response on Household Surveys, Kate Foster.
9. Reducing Statistical Burdens on Business, Andrew Machin.
10. Statistics on Trade in Goods, David Ruffles.
11. The 1997 UK Pilot of the Eurostat Time Use Survey, Patrick Sturgis and Peter Lynn.
12. Monthly Statistics on Public Sector Finances, Jeff Golland, David Savage, Tim Pike, Stephen Knight.
13. A Review of Sample Attrition and Representativeness in Three Longitudinal Surveys, Gad Nathan.
14. Measuring and Improving Data Quality, Vera Ruddock.
15. Gross Domestic Product: Output Approach, Peter Sharp.
16. Report of the Task Force on Weighting and Estimation, Dave Elliot.
17. Methodological Issues in the Production and Analysis of Longitudinal Data from the Labour Force Survey, P. S. Clarke and P. F. Tate (Winter 1999).
18. Comparisons of Income Data between the Family Expenditure Survey and the Family Resources Survey, Margaret Frosztega and the Households Below Average Income Team (February 2000).
19. European Community Household Panel: Robustness Assessment Report for United Kingdom Income Data, Waves 1 and 2 (March 2000).
20. Producer Price Indices: Principles and Procedures, Ian Richardson (March 2000).


21. Variance Estimation for Labour Force Survey Estimates of Level and Change, D. J. Holmes and C. J. Skinner (May 2000).
22. Longitudinal Data for Policy Analysis, Michael White, Joan Payne and Jane Lakey (May 2000).
23. Report on GSS Survey Activity in 1997, Office for National Statistics, Survey Control Unit (December 2000).
24. Obtaining Information about Drinking Through Surveys of the General Population, Eileen Goddard (January 2001).
25. Methods for Automatic Record Matching and Linkage and their Use in National Statistics, Leicester Gill (July 2001).
26. Designing Surveys Using Variances Calculated from Census Data, Sharon Bruce, Charles Lound, Dave Elliot (March 2001).
27. Report on GSS Survey Activity in 1998, ONS Survey Control Unit (September 2001).
28. Evaluation Criteria for Statistical Editing and Imputation, Ray Chambers (September 2001).
29. Report on GSS Survey Activity in 1999, ONS Survey Control Unit (October 2001).
30. The Value and Feasibility of a National Survey of Drug Use Among Adults in the United Kingdom, Eileen Goddard (April 2002).
31. Report on GSS Survey Activity in 2000, ONS Quality Centre (April 2002).
32. Gross Domestic Product: Output Approach (Gross Value Added), Revised, Peter Sharp.
33. Life Expectancy at Birth: Methodological Options for Small Populations, Barbara Toson, Allan Baker.

Reports No. 5 and 15 are £20 each; the remaining reports are each priced at £5. All reports are available to GSS members free of charge. The above publications are available by calling +44 (0)1329 813060.


The National Statistics Quality Review Programme

The National Statistics Quality Review Programme has been introduced to help ensure that National Statistics and other official statistical outputs are fit for purpose, and that there is a process to support the continuing improvement in the quality and value of these outputs. It is a key component of the quality assurance of National Statistics as set out in the Government White Paper, 'Building Trust in Statistics'. The following National Statistics Quality Reviews are currently available:

1. Review of Short Term Output Indicators, 4 October 2000 (Economy)
2. Review of the Inter-Departmental Business Register, 18 April 2001 (Commerce, Energy and Industry)
3. Review of the National Travel Survey, 3 May 2001 (Travel, Transport and Tourism)
4. Review of Defence Personnel Statistics, 30 August 2001 (Public Sector and Other)
5. Review of Income Support Statistics, 30 November 2001 (Social and Welfare)
6. Review of Jobseeker's Allowance Statistics, 30 November 2001 (Social and Welfare)
7. Review of Child Support Agency Statistics, 30 November 2001 (Social and Welfare)
8. National Population Projections: Review of Methodology for Projecting Mortality, 14 December 2001 (Population and Migration)
9. Review of Construction Statistics, 19 December 2001 (Natural and Built Environment)
10. Review of Forecasting the Prison and Probation Populations, 10 April 2002 (Crime and Justice)
11. Review of the Framework for Labour Market Statistics, 5 August 2002 (Labour Market)
12. Review of the Labour Force Survey, 4 September 2002 (Labour Market)
13. Review of Government Accounts and Indicators, 2 October 2002 (Economy)
14. Review of Distribution of Earnings Statistics, 10 October 2002 (Labour Market)
15. Review of Higher Education Student Statistics, 4 November 2002 (Education and Training)
16. Review of DFID's Statistical Information Systems, 14 November 2002 (Public Sector and Other)
17. Review of United Kingdom Defence Statistics Annual Publication, 20 November 2002 (Public Sector and Other)
18. Review of Armed Forces Medical Statistics, 20 November 2002 (Public Sector and Other)


19. Review of Forestry Statistics, 13 December 2002 (Agriculture, Fishing and Forestry)
20. Review of Crime Statistics, 30 July 2003 (Crime and Justice)
21. Review of Efficacy of Sentencing, 30 July 2003 (Crime and Justice)
22. Review of Bus, Coach and Light Rail Statistics, 6 August 2003 (Transport, Travel and Tourism)
23. Review of International Migration Statistics, 2 September 2003 (Population and Migration)
24. Review of the Initial Entry Rate into Higher Education, 17 November 2003 (Education and Training)
25. Review of Homicide Statistics, 3 December 2003 (Crime and Justice)
26. Review of Motoring Statistics, 3 December 2003 (Crime and Justice)
27. Review of Administration of Justice Statistics, 3 December 2003 (Crime and Justice)

All of the above reviews are available free of charge from the ONS website: www.statistics.gov.uk


Subscriptions and Enquiries

The Survey Methodology Bulletin is published twice yearly, in January and July, priced £5 in the UK and Eire or £6 elsewhere. Copies of the current and previous editions are available from:

William Davies
Survey Methodology Bulletin Orders
Data Collection Methodology
Office for National Statistics
Room D136
Cardiff Road
Newport NP10 8XG

[email protected]
Telephone: (01633) 813152
www.statistics.gov.uk

Data Collection Methodology
Methodology Directorate
Office for National Statistics
Cardiff Road
Newport NP10 8XG
www.statistics.gov.uk

ISBN 1 85774 567 1
ISSN 0263-158X
Price £5