Proceedings of Q2006 - Office for National Statistics

Proceedings of Q2006 European Conference on Quality in Survey Statistics

The Response Process in Recurring Business Surveys
Mojca Bavdaz 1

Key words: business surveys, response process

1. Introduction and Background

1.1 The Role of Recurring Business Surveys

In an increasingly complex and competitive economic environment, the decision-making of enterprises, governments and other institutions must be based on relevant, accurate and timely economic and business data. Statistical organisations therefore face constantly growing expectations for high-quality data. An important aspect of economic data is their comparability over time, which makes it possible to track changes. Consequently, many business surveys are repeated at regular intervals (e.g. monthly, quarterly, annually). In fact, the demand for the compilation of time series often determines the design of business surveys. Business surveys typically employ panel or longitudinal sampling designs, which give them a recurring character (Willimack et al., 2004). For bigger businesses this usually implies permanent inclusion in the sample; for other sampled units it means a certain number of repetitions. The perspective of a statistical organisation and that of a business do not necessarily coincide, since a recurring business survey may not be administered to the same units all the time.

This paper sheds light on the business perspective. It studies the response process in businesses that respond to a recurring survey request, focusing on the repeated response process as compared to the first survey response. Understanding this process provides an excellent opportunity to identify possible sources of measurement error and to derive implications for measurement error; it represents the first step in its detection, reduction and prevention. The paper first reviews the treatment of recurrence in existing business survey response models. It then presents a study of the response process in a quarterly business survey and discusses its findings. It concludes with implications for business survey response models and recommendations for reducing or eliminating the sources of measurement error in recurring business surveys.

1 Mojca Bavdaz, University of Ljubljana, Faculty of Economics, Kardeljeva pl. 17, SI-1000 Ljubljana, Slovenia ([email protected])

1.2 Models of the Response Process

At present, there are few models of the response process in business surveys compared to the abundance of such models in household survey research. These models have been presented by Edwards and Cantor (1991), Biemer and Fecso (1995), Sudman et al. (2000) and Willimack and Nichols (2001). They are all based on the cognitive response model by Tourangeau (1984), but they incorporate typical characteristics of business surveys such as the use of records for data retrieval, the importance of respondent selection and the self-administered mode of data collection. Although the recurring character of business surveys is commonly recognised, it has not been treated separately or comprehensively. A review is given in the following section.

1.3 Recurrence in Existing Models

Edwards and Cantor (1991) are the authors of the first response model that could be applied directly to business surveys. It encompasses encoding or record formation, comprehension, the decision about using memory or records as a source, retrieval or record look-up, judgement and communication of the response. In their study, they distinguish between units that are regularly required to complete a certain questionnaire and others selected only haphazardly in sample selection. They raise the issue of recurrence again in connection with the record look-up process, emphasising the "rehearsal of record look-up process" (p. 227) as an important factor in information retrieval. Frequent and continuous survey requests should therefore yield more accurate responses (Alba and Hasher (1983) and Knight and McDaniel (1979) in Edwards and Cantor, 1991). In their pilot study, they found that respondents who faced a higher incidence of an event (i.e. injuries and illnesses at the workplace) produced fewer reporting errors on that event, and that those who routinely completed the forms performed better on judgement than those who seldom did.
Biemer and Fecso (1995) do not pay attention to recurrence as such; they focus on ways to obtain a repeated measurement of the same variables. A major upgrade of the response model for business surveys was proposed as a "hybrid model" by Sudman et al. (2000). Two of its co-authors, Willimack and Nichols (2001), refined it into the "complete model" with the following steps:

• Encoding in memory / record formation
• Selection & identification of the respondent(s)
• Assessment of priorities
• Comprehension of the data request
• Retrieval of relevant information from memory and/or records
• Judgement of the adequacy of the response
• Communication of the response
• Release of the data

Both models are based on the same exploratory study of the survey response process in large multi-unit companies. Since the latest model does not introduce any new ideas on the matter of recurrence, this review discusses only the paper by Sudman et al. Sudman et al. explicitly refer to "repeated periodic surveys" only when describing the retrieval of information. They report that respondents keep documentation of retrieval procedures and calculations and that this documentation assists them (and new respondents) in completing subsequent questionnaires in recurring surveys, which makes new survey requests more burdensome than recurring ones. Such an approach brings the advantage of tracing real changes instead of confounding them with methodological changes, and the disadvantage of perpetuating errors. Recurrence can also be inferred from other passages. Sudman et al. report that:

• the documentation supports estimation procedures for updating previously reported values in the judgement step;
• the respondents provide consistent data about the business "from cycle to cycle";
• the previous respondent should be indicated on the mailing label;
• improvements may not produce the desired effect due to the disruption of the routine response process.

Sometimes the recurrence is implicitly assumed even if no obvious reference is given. For instance, changes in business record formation to ease data retrieval, mentioned by Willimack and Nichols (2001), would be beneficial in a recurring survey but probably inconceivable for a one-time survey. To summarise, the models of the response process in business surveys concede the recurrence of the process but do not address it systematically. They explicitly treat it only in relation to the retrieval step. This paper discusses the entire response process in business surveys in the light of the special features arising from the recurrence of reporting and proposes an adaptation of existing models.

2. Research Methods and Procedures

The findings are based on a study of enterprises included in the quarterly survey on trade conducted by the Statistical Office of the Republic of Slovenia (SORS). This business survey uses an 8-page questionnaire accompanied by a 6-page instruction booklet and a booklet with an excerpt of the standard activity classification. On-site visits were organised with 28 enterprises of different sizes; only one of them completed the questionnaire for the first time. In all cases, in-depth interviews were carried out with the persons who fill out the questionnaire. In 8 cases, the filling-out process was also observed, at least partially; the extent of observation depended mainly on the amount of data retrieved before the visit. The aim of these visits was to observe the process of questionnaire completion or to debrief the respondents as soon as possible after the completion of the questionnaire. The purpose of the study was to understand the respondents' approaches to questionnaire completion and to identify (potentially) problematic patterns.

On-site visits were arranged around two consecutive deadlines for questionnaire completion. A single researcher organised and carried out the fieldwork. Initial contacts were established with the persons indicated as respondents on earlier questionnaires. In a telephone conversation this information was verified (or the respondent-to-be was identified). A few respondents refused co-operation, mainly due to work overload during the period when they intended to complete the questionnaire. In some cases the on-site visits were not organised on the same day or the day after questionnaire completion because of the respondents' other business engagements, sick leave, holidays or weekends, etc. A short time lag, though, does not seem to be very damaging for remembering a frequently repeated and well-documented process.

3. Results and Discussion

The results are presented and discussed according to the steps of the "complete model" by Willimack and Nichols (2001).

3.1 Encoding in Memory / Record Formation

As argued by Eisenhower, Mathiowetz and Morganstein (1991), retrieval of information can only occur after its encoding; the information first has to enter a person's memory or business records. Survey requests for the first answering rarely come far enough in advance to allow changes in record formation. Theoretically, there are plenty of possibilities for adapting record formation in a recurring survey, since the respondents are aware of the recurrence. However, in our study only one unit made some sort of upgrade for statistical reporting, i.e. coding of merchandise in order to ease its assignment to merchandise groups. Even in this case the primary goal of the upgrade was to satisfy another (legal) request, but the statistical request was taken into account. No other respondent deemed it necessary to change the record formation despite evident deficiencies with regard to this reporting. The respondents provided several explanations for such reasoning: statistics do not have a standard character, estimation is acceptable, adaptation of record formation results in better estimates but not completely accurate data, lack of motivation to organise the adaptation, uncertainty about a new activity and the necessity of a reporting system, etc. It seems that for businesses it is not rational to change record formation in order to increase the accuracy of statistical reporting. They may not consider statistical reporting continuous, frequent, sanctionable and/or otherwise important enough to take such changes into consideration. The more limited the extent to which business records support the survey response, the greater the importance of the judgement step in avoiding or reducing measurement errors.

3.2 Selection & Identification of the Respondent(s)

Respondent selection rightly occupies a significant part of the discussion on the response process in business surveys. It is the only variable factor that could be influenced within the business in the short term, the organisational context usually being given and fixed (the business information system, hierarchies, policies and priorities, human resources, etc.). But even this variability is limited because, eventually, respondent selection is a business decision. In a recurring survey this step is usually skipped, since the previously designated respondent performs the task. The higher the survey frequency, the more likely it is to find the same respondent completing the questionnaire. The respondents in our study were asked about the recurrence and the reasons for their selection. It turned out that most of them had already been completing the questionnaire for quite some time (up to approximately ten years). Most commonly, the task was "inherited from the previous respondent", it was "part of the job", or "no one else was more appropriate with regard to the data request". The study showed that respondent selection becomes an issue only if the respondent's workplace undergoes a change. Such a change may be due to a reorganisation of work and reassignment of tasks among the employees. The respondent may simply be substituted in the same workplace with the same job description: he or she may leave a particular workplace because of retirement, promotion, resignation or dismissal, or may be absent for a longer period of time. The most promising finding related to the respondent's change is that not all respondents accepted previous procedures of questionnaire completion. Some reported that they set up new procedures when they took on this task.
One even complained about the negligence of the previous respondent. Of course, a new respondent may also choose to downgrade the statistical reporting.

3.3 Assessment of Priorities

Businesses face competing data requests, and survey requests are not among the highly rated ones: provision of data to the management, shareholders or tax authorities will always be prioritised. The initial assessment of priorities is an organisational matter. It is probably carried out jointly with respondent selection, considering all tasks necessary to run the business. The first filling out of a survey questionnaire may also bear some uncertainty about the time required, data availability or its retrieval, while the repetitions are much more predictable. Our study indicates that in recurring surveys the assessment of priorities may have a narrower scope because it only takes into account the respondent's priorities. The scheduling of the survey task appears to be relatively stable, which means that early and late respondents both tend to remain such. Three factors are most prominent in this regard: data availability, competing tasks, and the respondent's motivation. When data were not available at the deadline, some respondents gave priority to accuracy and waited for data, while others gave it to timeliness and resorted to estimation.

3.4 Comprehension of the Data Request

In the comprehension phase the respondent has to interpret the survey request for information. In business surveys s/he has to relate the words and their meaning to the business reality. The interpretation of questions (or, better said, labels) is often supported by detailed instructions. In recurring business surveys this step may be strikingly different from the first encounter with the questionnaire. In our study many respondents admitted that they do not read the whole questionnaire, let alone the instructions. After giving the questionnaire a swift scan for changes, they plunged into the retrieval processes based on the previously completed questionnaire or other working papers and supporting notes. Comprehension is thus granted a very limited time and seems to be focused exclusively or predominantly on previously reported items. Several observations point in this direction:

• the respondents produced some answers off the top of their head, especially about employment in small enterprises;
• they immediately associated labels consisting of accounting expressions with their data source, i.e. with a particular group of accounts in business records;
• they instantly started with or referred to computation when they came across questions for which the working papers and supporting notes indicated an estimation procedure, e.g. a break-down of total retail turnover by customer type;
• they were asked about the "business reality" behind the reported figures, e.g. what business activities generate the reported revenues, who the customers in a reported category are, who the employees included in a certain category are. Few respondents answered all such questions with confidence. Others were puzzled by the questions, uncertain about their answers, and exhibited a lot of effort responding to this inquiry; a few even had no clue and had to ask colleagues for help.

These observations may also have other explanations and implications: an immediate response may indicate that there is nothing to deliberate because the answer is so obvious; the perplexity of the respondents may be due to badly constructed or simply unusual questions; associating the words in the questionnaire with accounting terms may result in accurate reporting even if the respondent is not familiar with the captured phenomenon. However, when the respondents tried to explain and argue their case, it sometimes happened that they realised their arguments were not valid, which surprised and embarrassed them.

It could be concluded that this step – as understood in Tourangeau's tradition – is often omitted or at least performed very superficially, replaced by retrieval from memory of the previous retrieval and judgement procedures. This is particularly likely to happen in monthly and quarterly surveys. As a result, measurement errors may occur when the questionnaire is subject to changes or when the initial understanding was not appropriate, as already suggested by Sudman et al. (2000). The same may happen if changes occur in the business reality.

3.5 Retrieval from Memory and/or Records

This step usually consists of combined retrieval from memory and records, but it may also consist of retrieval from memory alone. The respondents may need to contact other persons to collect the necessary data. The recurrence avoids the trial-and-error procedure that may be present at the first retrieval and thus saves time. Our study cannot compare the time spent across repetitions, but it revealed that the retrieval is a straightforward operation carried out without hesitation and supported by notes from previous rounds. In order to ease the compilation of the quarterly data, some respondents in our study retrieved monthly data as they became available and kept them in manual or electronic documents. An interesting point is that they invented their own uses for such intermediate calculations, e.g. to provide this piece of information to their superiors or to have an overview of the monthly changes. This observation is particularly important because the use of data is the best method of their verification. It could be said that "the rehearsal of the look-up" as reported by Edwards and Cantor results in time savings, but it may also produce practical side-products for the respondent's further use.
3.6 Judgement of the Adequacy of the Response

In this step the respondent has to pool all the available information, form an answer and judge its adequacy. The respondents in our study reported high levels of uncertainty about many items at the time of their first completion of the questionnaire. No such conditions were observed during our fieldwork, apart from two exceptions: the respondent in the newly selected enterprise, and a respondent who had just been introduced to the task. The respondents expected feedback from the statistical organisation after the first completion, but generally it did not occur. This lack of reaction made them confident of their approach, and they judged it as acceptable. The problem is that many respondents reported at least one piece of data which was not completely accurate (or not as accurate as they would expect the data to be), and they perceived the lack of complaints as satisfaction with bad data, which is probably the most detrimental conclusion, with a significant effect on the perception of statistics.

The respondents were generally convinced that SORS knew their situation. For that reason they rarely provided a textual description of seasonal or other oscillations if they perceived them as an inherent part of their business activity. The respondents preserved their initial approach to questionnaire completion because previous reporting was satisfactory and for continuity reasons. The latter means that they would not change the routine during the year even if an error was discovered or a better practice thought of. The question is whether they really change their strategy in the following period or perpetuate the errors. Another variation of the judgement problem in our study was the continuous use of the same percentage break-down in consecutive rounds. It was usually justified by enormous effort and time consumption on the one hand and the contentment of the statistical organisation on the other. One respondent recalculated the necessary break-downs because of the on-site visit, and the results were significantly different. Most noteworthy were not the discovered differences but his statement that the data would be much more accurate from that time on because he would use the new percentages. Judgement may be one of the most unexplored steps in the response process of business surveys despite its huge and potentially negative impact on measurement error in recurring surveys. The completion of the questionnaire becomes a routine which removes any uncertainty and hesitation, so very few respondents keep an open mind and still question their approach. This also represents a huge challenge for editing procedures.

3.7 Communication of the Response

This step refers to the reporting of the response to the question in the required format. Our study did not reveal any special issues with this step. Some respondents mentioned that inconsistencies due to rounding were permitted.
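The reuse of a fixed percentage break-down described in the judgement step above can be made concrete with a small numeric sketch (all figures are invented for illustration, not taken from the study): when a respondent keeps applying a split computed once while the true split drifts from quarter to quarter, the reported category values accumulate error even though the reported total remains correct.

```python
# Illustrative sketch (invented numbers): error introduced when a respondent
# reuses a percentage break-down computed once, while the true shares drift.

fixed_share = 0.70  # hypothetical households' share of turnover, computed in Q1 and reused

# Hypothetical true household shares and reported total turnover per quarter
true_shares = [0.70, 0.66, 0.61, 0.55]
totals = [1000.0, 1050.0, 1100.0, 1200.0]

for quarter, (total, true_share) in enumerate(zip(totals, true_shares), start=1):
    reported = fixed_share * total   # value the respondent writes on the form
    actual = true_share * total      # value a fresh recalculation would give
    print(f"Q{quarter}: reported {reported:7.1f}  actual {actual:7.1f}  "
          f"error {reported - actual:+7.1f}")
```

With these invented figures the error grows every quarter while the questionnaire still passes any edit check on the total, which is why the routine can persist unnoticed until a recalculation, such as the one prompted by the on-site visit, reveals it.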
The only remarkable finding is that some respondents expressed the wish to have electronic forms instead of paper questionnaires, because this meant less paper, easier copying, later submission and immediate calculation checks.

3.8 Release of the Data

The last step refers to the ultimate internal inspection of the reported data. Although it may only focus on key figures and the overall picture of the enterprise, it is beneficial as a double-check and for increasing the respondent's motivation to do the job well. Our study showed that authorities may not have a major impact on questionnaire completion. Many respondents signed the form themselves; many others obtained the signature without verification. Such behaviour may be attributed to enterprise size (e.g. double-checks may be less typical in small and medium enterprises because of time consumption and an unfavourable attitude toward statistical reporting) and recurrence (e.g. previous verifications may prove the respondent's reliability).

4. Conclusions

The recurring character of a business survey affects the response process significantly. In repeated administrations of the survey to the same business, the first three steps of the response process (record formation, respondent selection, priorities assessment) are less relevant or not relevant at all. The cognitive processes (comprehension, retrieval, judgement, communication) are characterised by routine. The comprehension step is often performed superficially and refers more to understanding the previous questionnaire completion than to understanding survey requests. The retrieval procedures follow the previously established course and exhibit learning-curve effects. The respondents' judgement clings to the initial approach and is unlikely to change. The only welcome change may be an electronic form of the questionnaire to facilitate the communication of the response. Finally, the recurrence may also affect the data release by loosening the respondents' supervision. Several strategies may be recommended based on this knowledge of the recurring response process and possible sources of measurement error:

• The first business participation in a recurring business survey calls for intensive and extensive support and control by the statistical organisation, because it sets the standards and thus determines the quality of subsequent questionnaire completions. It is also advisable for the statistical organisation to trace the questionnaire respondents and to consider every respondent's change a second chance for an efficient entry into the response process. The respondent's first filling in of a questionnaire raises the most questions and doubts despite supporting working papers or initiation by the previous respondent.
• The routine of repeated questionnaire compilations requires occasional interventions or follow-ups by the survey organisation to trigger the initial cognitive processes and to incorporate unnoticed changes or neglected errors.

The strategies listed above focus on the respondent and her/his motivation and capacity to report accurately. They primarily aim at reducing or preventing measurement errors committed during the cognitive processes. Other strategies should target the remaining steps of the response process, which all suffer from the lack of business resources assigned to statistical reporting. A challenge in this regard is to achieve a higher ranking among other tasks by providing statistical products that would be more usable from a business perspective. Bringing more, and more personalised, products of official statistics into the business environment should be a path to follow. Last but not least, these conclusions are based on a small-scale research study. Future research should demonstrate whether these strategies are efficient enough in terms of improving accuracy at acceptable cost, and which should be implemented widely.

Acknowledgement I thank the SORS staff for co-operation, and Lars Lyberg and Lea Bregar for their guidance and support in this research.

References

Biemer, P. P., and Fecso, R. S. (1995), "Evaluating and Controlling Measurement Error in Business Surveys", in B. G. Cox et al. (eds.) Business Survey Methods, New York: Wiley-Interscience, pp. 257-281.

Edwards, W. S., and Cantor, D. (1991), "Toward a Response Model in Establishment Surveys", in P. P. Biemer et al. (eds.) Measurement Errors in Surveys, New York: Wiley-Interscience, pp. 211-233.

Eisenhower, D., Mathiowetz, N. A., and Morganstein, D. (1991), "Recall Error: Sources and Bias Reduction Techniques", in P. P. Biemer et al. (eds.) Measurement Errors in Surveys, New York: Wiley-Interscience, pp. 127-144.

Sudman, S., Willimack, D. K., Nichols, E., and Mesenbourg, T. L. (2000), "Exploratory Research at the U.S. Census Bureau on the Survey Response Process in Large Companies", ICES-II Proceedings of the Second International Conference on Establishment Surveys, 327-337.

Tourangeau, R. (1984), "Cognitive Science and Survey Methods", in T. B. Jabine (ed.) Cognitive Aspects of Survey Design: Building a Bridge between Disciplines, Washington, DC: National Academy Press, pp. 73-100.

Willimack, D. K., Lyberg, L. E., Martin, J., Japec, L., and Whitridge, P. (2004), "Evolution and Adaptation of Questionnaire Development, Evaluation, and Testing Methods for Establishment Surveys", in S. Presser et al. (eds.) Methods for Testing and Evaluating Survey Questionnaires, Hoboken, NJ: Wiley-Interscience, pp. 385-407.

Willimack, D. K., and Nichols, E. (2001), "Building an Alternative Response Process Model for Business Surveys", Proceedings of the Annual Meeting of the American Statistical Association.

The Response Process in Recurring Business Surveys Mojca Bavdaz

Outline
• Introduction and background
• Research methods and procedures
• Results and discussion
• Conclusions

Introduction and Background • Recurring business surveys • Models of the response process in business surveys: – Edwards & Cantor (1991) – Biemer & Fecso (1995) – Sudman et al. (2000) – Willimack & Nichols (2001)

• Recurrence in existing models

Introduction and Background • The Complete Model (Willimack & Nichols, 2001): – Encoding in memory / record formation – Selection & identification of respondent(s) – Assessment of priorities – Comprehension of data request – Retrieval from memory and/or records – Judgement of response adequacy – Communication of response – Release of data

Research Methods and Procedures • Quarterly Business Survey on Trade (SORS) • Research methods: – in-depth interview – observation

• Sample size and respondent selection • Implementation

Results and Discussion
• Initial steps of survey response
• Comprehension of survey request
• Retrieval from memory and records
• Judgement
• Organisational context
• Survey design

Conclusions • Adaptation of existing models: – presence/absence of particular steps – modification of substance

• Implications

The Response Process in Recurring Business Surveys Mojca Bavdaz

Survey Research in the UK’s Ministry of Defence

Aaron Beck & Ian Gouldbourne, The Defence Analytical Services Agency

Survey Research in the UK’s Ministry of Defence

1. Background
2. The Past
3. The Present
4. The Future

Survey Research in the UK’s Ministry of Defence

1. Background

Background

The Ministry of Defence’s “Population”: 310,000 staff
[Pie chart: staff by location (United Kingdom, Mainland European States, Mediterranean, Other); segment values 86%, 11%, 2%, 1%]

Background

The Ministry of Defence’s “Population”
[Bar chart: age distribution by age group (16-19, 20-24, 25-29, 30-34, 35-39, 40-44, 45-49, 50+); vertical axis: percentage, 0-20%]

Background

The UK Ministry of Defence Vision Statement: “to produce battle winning people and equipment that are”:
# Fit for the challenge of today
# Ready for the tasks of tomorrow
# Capable of building for the future

Background

Defence Analytical Services Agency
Our Roles:
~ contribute to the Department’s accountability for its activities
~ provide economic, statistical and related management information, advice and research to underpin decision making, policy development and evaluation within MoD

Survey Research in the UK’s Ministry of Defence

2. The Past

The Past

Question: As a result of one/some of your team being in the development scheme, how strongly do you agree or disagree that their development has had an influence on the following?

The Past

The full range of business processes and tools are evaluated for your projects or programmes and the benefits fully understood. They are used to maximum effect to deliver your projects or programmes to meet their key performance indicators including time, cost and performance and their integration within the Defence environment

The Past

Based on your responses to the above question, do you feel that overall the development scheme has contributed to your team's actual delivery of outputs?

• Faster
• Cheaper
• Based on a through life approach
• With more effective integration

Survey Research in the UK’s Ministry of Defence

3. The Present

The Present

Ethical Standards for Research
General Principles
The research project must:
• contribute significantly to an official MoD programme
• have reasonable prospects of yielding worth to the MoD, generally through delivering important, new and statistically valid results

The Present

Survey Charter
Commitment to people who respond to our surveys
• Value their time and ask for information only when their involvement is important
• Give them the opportunity to understand the background to a survey before they share their information with us
• Explain their obligations and describe our obligations to them
• Treat the information we receive with the strictest confidence and ensure that it is only used for statistical purposes

The Present

Survey Charter
Commitment to the people and organisations who use our survey statistics
• Listen to their views, and consider their needs when we design our surveys and survey statistics
• Produce survey results that are objective, fit for purpose and of high quality
• Produce information about our surveys and survey results
• Work towards presenting information in a standard format and to a consistent level of detail

The Present

Survey Charter
Commitment to individuals, the department and the government
• Conduct our survey work efficiently and cost-effectively and strive to obtain maximum value from survey data

Core Questions for staff surveys in the UK Civil Service
Question scale: "To what extent would you agree with the following statements …"
Strongly agree / Agree / Neither agree nor disagree / Disagree / Strongly disagree

The Questions: Pride

I am proud to work for this Department

The Questions: Empowerment

I understand how my work contributes to the objectives of the Department
I think it is safe to speak up and challenge the way things are done in the Department

The Questions: Feedback and performance management

I receive regular and constructive feedback on my performance
Poor performance is dealt with effectively where I work

The Questions: Values
I am treated with fairness and respect

The Questions: Skills and professionalism
My performance has improved as a result of skills I have developed over the past year
The people I manage have the skills they need to deliver their objectives [ask of managers only]

The Questions: Management and leadership
My team is well managed
The Department as a whole is well managed
Overall, I have confidence in the senior managers within my Department

The Questions: Change and communications
The Department does a good job of keeping me informed about matters affecting me

Plus 4 "Discretionary" Questions
- I would recommend the Department as a good place to work
- I am satisfied with my job
- My team regularly looks for ways to serve our customers better
- I feel that change is managed well in this Department

[Diagram: MOD's AIM, delivered by the Deployable Force with Supply and Support (People, Equipment), Direction and Policy, and Central Services]

[Diagram: pyramid from DATA through INFORMATION to KNOWLEDGE]

ACQUIRING DATA • COMPARING & LINKING DATA • ANALYSIS • DISSEMINATING DATA

Background Information
We need to know background details about you so we can group people together when we analyse the data. We guarantee that only the Pearn Kandola and DASA researchers will have access to your answers under the conditions set out in the Instructions. If you are interrupted or wish to stop part way through the survey, then please SAVE the spreadsheet. You will be able to re-open it later. In order that it may pass through the Department's firewalls, the workbook has not been "protected". It is there

1 Gender (please also see question 24 on gender reassignment): Male / Female
2 Do you consider yourself to have a disability? Yes / No
3 Age last birthday: ____ years
4 Ethnic Background (these categories match those used in the 2001 UK National Census): White:

The Psychological Contract “An individual's belief regarding the terms and conditions of a reciprocal exchange agreement between that focal person and another party … a belief that some form of promise has been made and that the terms and conditions of the contract have been accepted by both parties”

Contextual and Background Factors
Individual: Age, Gender, Education, Level in organisation, Type of work, Hours worked, Employment contract, Ethnicity, Tenure, Income
Organisational: Sector, Size, Ownership, Business strategy, Union recognition

[Model diagram: Policy and Practice (HR policy and practices; leadership / climate; employment relations; quality of workplace) shapes the Psychological Contract (reciprocal promises and obligations; delivery of the deal); the State of the Psychological Contract (trust, fairness) leads to Outcomes]

Attitudinal Consequences: Organisational commitment, Work satisfaction, Work-life balance, Job security, Motivation, Stress
Behavioural Consequences: Attendance, Intention to stay / quit, Job performance, OCB

4. The Future

Annual staff census

Working patterns

Pulse surveys

The great big “I”

Administrative data

Exit survey

Harassment survey

Intake survey
Health data

Any Questions?

Quantifying the response burden of enterprises in Germany caused by official statistics
Reiner Stäglin and Ingo Pfeiffer (German Institute for Economic Research, DIW Berlin)
Roland Sturm, assisted by Birgit Korth (Federal Statistical Office, Wiesbaden)

1. Background and aim of the response burden study

- Problems of available bureaucracy studies
- No distinction between official and non-official statistical surveys
- Measuring objectively the response burden of enterprises

DIW Berlin 2006

2. Concept and first results of the study
2.1 Use of the business register of the statistical offices
2.2 Analysis of data from the response burden survey carried out by statistical offices
2.3 Evaluation of additional information from the supplementary DIW response burden survey among selected enterprises

Response frequency of enterprises (per cent of enterprises respond to):
- no statistic: 84.7
- one statistic: 10.8
- two statistics: 2.2
- three statistics: 0.6
- four statistics: 0.5
- five and more statistics: 1.2

Source: Official business register.
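The six register-based categories above are exhaustive and mutually exclusive, so their shares account for every enterprise. A minimal sketch checking that arithmetic (the English category labels are translated from the German chart):

```python
# Shares of enterprises by the number of official statistics they respond to,
# in per cent, taken from the business-register chart above.
shares = {
    "no statistic": 84.7,
    "one statistic": 10.8,
    "two statistics": 2.2,
    "three statistics": 0.6,
    "four statistics": 0.5,
    "five and more statistics": 1.2,
}

# The categories partition all enterprises, so the shares sum to 100 per cent.
total = round(sum(shares.values()), 1)
print(total)  # -> 100.0
```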

2.1 Use of the business register

- The business register offers no information about the volume of response burden
- For that reason one must look for other approaches to gain reliable information on the response burden caused by official statistics

2.2 Data from the response burden survey (I)

- Components of the burden survey approach
- Structure of the burden questionnaire
- Results from the quantitative part of the response burden survey for Germany in 2004

Responding Time in 2004

No.    Description of survey                      Answers   Usable    Average responding time (minutes)
                                                            answers   per announcement   per year
290    Intrastat                                   2 533     2 507          287             3 448
297    Survey on rail transport                       87        85          149             1 790
046    Monthly report in building industry           283       280           93             1 118
007    Monthly production in manufacturing           249       247           85             1 014
001    Monthly report in manufacturing             1 198     1 185           77               929
016    Survey on iron and steel                      118       115           74               886
066    Monthly report on electricity supply          114       113           70               835
065    Monthly report on energy, water supply        174       173           58               695
401    Survey on prices of construction              323       316           57               689
395    Survey on prices of agricultural inputs       137       136           52               624
398E   Survey on import prices                     1 583     1 509           46               546
489    EU labour cost survey                         712       695          534               534
398A   Survey on export prices                     1 424     1 340           44               526
026    Survey on wood                                 57        55          123               491
301    Survey on road passenger transport            135       133          122               487
469    Monthly survey in tourism                   1 320     1 277           40               485
081    Cost structure survey on energy               237       233          481               481
466    Monthly survey in hotels, restaurants         445       428           39               462
459d   Monthly survey in trade                     1 180     1 159           36               428
010    Quarterly production in manufacturing         499       493           99               397

Source: Response burden survey of statistical offices.
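The two time columns are linked by each survey's periodicity: minutes per year are roughly minutes per announcement times the number of reports per year, with small differences due to rounding of the per-announcement averages. A minimal sketch of that arithmetic; the frequency assignments are assumptions inferred from the survey names, not stated in the source:

```python
# Annual burden = time per announcement (minutes) x number of reports per year.
REPORTS_PER_YEAR = {"monthly": 12, "quarterly": 4, "yearly": 1}

def annual_burden(minutes_per_report: int, frequency: str) -> int:
    """Minutes per year a respondent spends on one recurring survey."""
    return minutes_per_report * REPORTS_PER_YEAR[frequency]

# Survey on prices of agricultural inputs (monthly): 52 min per announcement
print(annual_burden(52, "monthly"))    # -> 624 (matches the table)
# Quarterly production in manufacturing: 99 min per announcement
print(annual_burden(99, "quarterly"))  # -> 396 (table: 397, from unrounded inputs)
# EU labour cost survey (yearly): 534 min per announcement
print(annual_burden(534, "yearly"))    # -> 534 (matches the table)
```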

[Chart: amount of average responding time caused by 74 official statistical surveys in 2004, according to frequency (monthly, quarterly, yearly, several years); y-axis: number of surveys]

DIW Berlin 2006