Publication Detail

Title: Strategies for evaluating the effectiveness of training programs.

Authors: Gotsch, A R; Weidner, B L

Published In: Occup Med (1994 Apr-Jun)

Abstract: This chapter has provided examples of practical and theoretical considerations that should be made when developing evaluation activities concomitant to training. Evaluation choices have been described based on considerations and experiences from others in the training field, a sound rationale for program and policy research, and the realistic constraints and demands of training. The discussion has presented some of the basic technical issues associated with the collection and analysis of trainee, program, and test data. It also has presented some basic considerations in optimizing procedures and in interfacing within and beyond programs for data collection. Finally, it has presented options available for data analysis drawn from other models and methods in the training field. Whenever possible, select standard variables should be used to facilitate the advancement of training and reporting. Many variables and procedures will differ between programs based on training objectives; others should be part of basic activities. The authors conclude that the use of demographic, program, and test and nontest effectiveness measures is important in assessing both the quality of training efforts and the instruments themselves. When possible, such efforts should expand to include behavior assessment at the worksite, as occurs with behavior-based training developed and provided at the worksite. Most issues relating to evaluation in general are amenable to modification. Indeed, by responding to the issues cited, both in support of and in opposition to various methods, it is possible to address valid technical concerns and maximize data strength. The value of program evaluation will ultimately rest on the selection of critical variables and the use of data collection and analysis activities that maximize their potential. The value of program evaluation to policy development will largely depend on the quality of program evaluation and the extent to which similar programs collaborate and report their outcomes.

PubMed ID: 8085200

MeSH Terms: Curriculum; Evaluation Studies as Topic*; Follow-Up Studies; Hazardous Waste/adverse effects*; Health Knowledge, Attitudes, Practice; Humans; Inservice Training*; New Jersey; New York; Occupational Exposure/prevention & control; Occupational Health*; Protective Clothing
