Data Collection: Planning for and Collecting All Types of Data
ISBN: 978-0-7879-8718-3
Paperback
192 pages
February 2008, Pfeiffer
1. Using Questionnaires and Surveys.
Types of Questions.
Questionnaire Design Steps.
Determine the Specific Information Needed.
Involve Stakeholders in the Process.
Select the Types of Questions.
Develop the Questions.
Check the Reading Level.
Test the Questions.
Address the Anonymity Issue.
Design for Ease of Tabulation and Analysis.
Develop the Completed Questionnaire and Prepare a Data Summary.
Improving the Response Rate for Questionnaires and Surveys.
Provide Advance Communication.
Communicate the Purpose.
Describe the Data Integration Process.
Keep the Questionnaire as Simple as Possible.
Simplify the Response Process.
Use Local Manager Support.
Let the Participants Know That They Are Part of a Sample.
Consider Incentives.
Have an Executive Sign the Introductory Letter.
Use Follow-Up Reminders.
Send a Copy of the Results to the Participants.
Review the Questionnaire with Participants.
Consider a Captive Audience.
Communicate the Timing of Data Flow.
Select the Appropriate Media.
Consider Anonymous or Confidential Input.
Pilot Test the Questionnaire.
Explain How Long Completing the Questionnaire Will Take.
Personalize the Process.
Provide an Update.
Final Thoughts.
2. Using Tests.
Types of Tests.
Norm-Referenced Tests.
Criterion-Referenced Tests.
Performance Tests.
Simulations.
Electromechanical Simulation.
Task Simulation.
Business Games.
In-Basket Simulation.
Case Study.
Role-Playing.
Informal Tests.
Exercises, Problems, or Activities.
Self-Assessment.
Facilitator Assessment.
Final Thoughts.
3. Using Interviews, Focus Groups, and Observation.
Interviews.
Types of Interviews.
Interview Guidelines.
Develop the Questions to Be Asked.
Test the Interview.
Prepare the Interviewers.
Provide Clear Instructions to the Participants.
Schedule the Interviews.
Focus Groups.
Applications of Focus Groups.
Guidelines.
Plan Topics, Questions, and Strategy Carefully.
Keep the Group Size Small.
Use a Representative Sample.
Use Experienced Facilitators.
Observations.
Guidelines for Effective Observation.
Observations Should Be Systematic.
Observers Should Be Knowledgeable.
The Observer’s Influence Should Be Minimized.
Observers Should Be Selected Carefully.
Observers Must Be Fully Prepared.
Observation Methods.
Behavior Checklist.
Delayed Report.
Video Recording.
Audio Monitoring.
Computer Monitoring.
Final Thoughts.
4. Using Other Data Collection Methods.
Business Performance Monitoring.
Using Current Measures.
Identify Appropriate Measures.
Convert Current Measures to Usable Ones.
Developing New Measures.
Action Planning.
Developing an Action Plan.
Using Action Plans Successfully.
Communicate the Action Plan Requirement Early.
Describe the Action Planning Process at the Beginning of the Program.
Teach the Action Planning Process.
Allow Time to Develop the Plan.
Have the Facilitator Approve Action Plans.
Require Participants to Assign a Monetary Value to Each Improvement.
Ask Participants to Isolate the Effects of the Program.
Ask Participants to Provide a Confidence Level for Estimates.
Require That Action Plans Be Presented to the Group.
Explain the Follow-Up Process.
Collect Action Plans at the Stated Follow-Up Time.
Summarize the Data and Calculate the ROI.
Applying Action Plans.
Identifying Advantages and Disadvantages of Action Plans.
Performance Contracts.
Final Thoughts.
5. Measuring Reaction and Planned Action.
Why Measure Reaction and Planned Action?
Customer Satisfaction.
Immediate Adjustments.
Team Evaluation.
Predictive Capability.
Importance of Other Levels of Evaluation.
Areas of Feedback.
Data Collection Issues.
Timing.
Methods.
Administrative Guidelines.
Uses of Reaction Data.
Final Thoughts.
6. Measuring Learning and Confidence.
Why Measure Learning and Confidence?
The Learning Organization.
Compliance Issues.
Development of Competencies.
Certification.
Consequences of an Unprepared Workforce.
The Role of Learning in Programs.
Measurement Issues.
Challenges.
Program Objectives.
Typical Measures.
Timing.
Data Collection Methods.
Administrative Issues.
Validity and Reliability.
Consistency.
Pilot Testing.
Scoring and Reporting.
Confronting Failure.
Uses of Learning Data.
Final Thoughts.
7. Measuring Application and Implementation.
Why Measure Application and Implementation?
Obtain Essential Information.
Track Program Focus.
Discover Problems and Opportunities.
Reward Effectiveness.
Challenges.
Linking Application with Learning.
Building Data Collection into the Program.
Ensuring a Sufficient Amount of Data.
Addressing Application Needs at the Outset.
Measurement Issues.
Methods.
Objectives.
Areas of Coverage.
Data Sources.
Timing.
Responsibilities.
Data Collection Methods.
Questionnaires.
Progress with Objectives.
Use of Program Materials and Handouts.
Application of Knowledge and Skills.
Changes in Work Activities.
Improvements or Accomplishments.
Definition of the Measure.
Amount of Change.
Unit Value.
Basis for Value.
Total Annual Impact.
Other Factors.
Improvements Linked with the Program.
Confidence Level.
Perception of Investment in the Program.
Link with Output Measures.
Other Benefits.
Barriers.
Enablers.
Management Support.
Other Solutions.
Target Audience Recommendations.
Suggestions for Improvement.
Interviews, Focus Groups, and Observation.
Action Plans.
Barriers to Application.
Uses of Application Data.
Final Thoughts.
8. Measuring Impact and Consequences.
Why Measure Business Impact?
Impact Data Provide Higher-Level Information on Performance.
Impact Data Represent the Business Driver of a Program.
Impact Data Provide Value for Sponsors.
Impact Data Are Easy to Measure.
Effective Impact Measures.
Hard Data Measures.
Soft Data Measures.
Tangible Versus Intangible Measures.
Impact Objectives.
Linking Specific Measures to Programs.
Sources of Impact Data.
Data Collection Methods.
Monitoring Business Performance Data.
Identify Appropriate Measures.
Convert Current Measures to Usable Ones.
Develop New Measures.
Action Plans.
Set Goals and Targets.
Define the Unit of Measure.
Place a Monetary Value on Each Improvement.
Implement the Action Plan.
Document Specific Improvements.
Isolate the Effects of the Program.
Provide a Confidence Level for Estimates.
Collect Action Plans at Specified Time Intervals.
Summarize the Data and Calculate the ROI.
Performance Contracts.
Questionnaires.
Final Thoughts.
9. Selecting the Proper Data Collection Method.
Matching Exercise.
Selecting the Appropriate Method for Each Level.
Type of Data.
Investment of Participants' Time.
Investment of Managers' Time.
Cost.
Disruption of Normal Work Activities.
Accuracy.
Built-In Design Possibility.
Utility of an Additional Method.
Cultural Bias of Data Collection Method.
Final Thoughts.
Index.
About the Authors.