Fairhaven, MA (PRWEB) October 26, 2011
The American Evaluation Association will honor four individuals and three groups for outstanding work at its annual awards luncheon to be held on Friday, Nov. 4, in conjunction with its Evaluation 2011 conference in Anaheim, CA. AEA is an international professional association that comprises more than 7,000 members worldwide. Honored this year will be recipients in six categories who have been involved with cutting-edge evaluation/research initiatives that have impacted citizens around the world.
“It is an honor to lead an association with the caliber of dedicated professionals like our award-winners,” says AEA President Jennifer Greene. “Their work demonstrates the substantial value of evaluation to diverse policy and program arenas in our society and around the globe.”
The recipients of AEA’s 2011 awards include:
Leonard Bickman, Psychology Professor, Peabody College, Vanderbilt University
2011 Alva and Gunnar Myrdal Evaluation Practice Award
Bickman has a distinguished 40-plus year career as a social psychologist and is recognized as a pioneer in applied research and evaluation. He spearheaded a comprehensive study of children’s mental health services more than a decade ago that involved 1,000 children and their families over a five-year period – one of the largest mental health services demonstration projects ever conducted on children and adolescents. He and his colleagues have since developed the Contextualized Feedback Systems (CFS) model in an effort to improve mental health services and educational leadership. “Driven by the lack of positive findings for children’s mental health services, Len has dedicated this stage of his career to trying to improve this area of intervention,” notes Deb Rog, AEA’s 2009 President who nominated Bickman for the award.
Margaret Hargreaves, Senior Health Researcher, Mathematica Policy Research
2011 Marcia Guttentag Promising New Evaluator Award
Hargreaves, herself the mother of a college student, brings more than 20 years' experience with state and local government in Minnesota, first as an EEOC investigator for a state human rights agency, then as a management analyst, and later as a public health planning supervisor. An experienced professional trainer in systems evaluation – in both face-to-face seminars and online webinars – Hargreaves in 2010 wrote Evaluating System Change: A Planning Guide, which is used in graduate courses at Harvard and by the University of Chicago Medical and Social Service Administration Schools, as well as by other institutions including the National Institutes of Health and the Living Cities consortium of foundations.
David Jenkins, UK-based Independent Consultant
2011 Outstanding Evaluation Award
Co-director of PLEY (Proactive Learning from Early Years), a collective of artists, teachers and researchers, Jenkins is being honored for A TALE Unfolded, his thought-provoking evaluation of a European teaching program for mid-career professionals working with youths. Notes nominator Bob Stake, a professor at the University of Illinois, “The final report, A TALE Unfolded, is a substantial document arising out of an eclectic evaluation methodology that in context was far from risk free but was conducted in an exemplary fashion that has led to an analysis of outstanding quality and usefulness to the sponsors…It deploys such literary devices as narrative vignettes, irony, metaphor and wit, seeing humour as a legitimate way of addressing ambivalence…Given the setting, this is a report that does not pull any punches, pointing up serious unresolved tensions and ambiguities both in policy and practice that demand attention, but doing so with considerable grace and elan.”
Robin Lin Miller, Psychology Professor, Michigan State University
2011 Robert Ingle Service Award
Miller is being recognized for her more than 15 years of active service to AEA. She helped transform AEA's annual conference from a small, intimate gathering of professionals and colleagues into an international event that today attracts more than 2,500 attendees worldwide. She also oversaw the conversion of the association's print-only journal, the American Journal of Evaluation, into an electronic publication that has seen increased editorial submissions, a greater diversity of content reviewers, and impressive gains in subscribership worldwide. She served as Program Chair from 2000-2003 and as associate editor of the American Journal of Evaluation (AJE) from 2001-2004, and was appointed Editor-in-Chief for two consecutive terms, serving from 2005-2009.
“AJE held its position among interdisciplinary social science journals in the Thomson Reuters Journal Citations Report and continued to outrank other evaluation journals in its category,” says MSU colleague Rebecca Campbell, who nominated Miller for the award. “AJE saw steady growth in submissions (a 55% increase over the five years), and journal circulation rose from 4,307 subscriptions in 2005 to 10,528 in 2009. In 2005, AJE had no institutional subscriptions in the Caribbean, Southern Asia, Africa, or South America; by 2009, AJE had 162 institutional subscribers in these parts of the world.”
The Changing At-Risk Behavior Team
Joyce Ranney, Michael Zuschlag, and Michael Harnar of the Boston-based Volpe National Transportation Systems Center, and Michael Coplen of the U.S. Department of Transportation’s Federal Railroad Administration (FRA)
2011 Outstanding Evaluation Award
Changing At-Risk Behavior (CAB) is a peer-to-peer safety intervention pilot project sponsored by the U.S. Department of Transportation’s Federal Railroad Administration that incorporated (1) peer-to-peer observation and feedback, (2) safety leadership development, and (3) continuous process improvement. The CAB program evaluation team is recognized for evaluating this comprehensive new initiative, which resulted in significant day-to-day safety improvements for Union Pacific Railroad (UPRR) and influenced a broader shift in safety culture across the railroad industry. Long recognized as a leader in innovative safety programs, UPRR debuted the new initiatives in its San Antonio, TX, service unit, which spanned 800 miles and included more than 1,000 locomotive engineers and conductors. The CAB evaluation documented an 85% reduction in at-risk behaviors, a 72% drop in locomotive engineer decertification rates, and a 69% drop in the rate of human factor-caused derailments.
Since CAB, similar projects have been initiated at other railroads, including Amtrak, Toronto Transit and Burlington Northern Santa Fe.
“The data generated by the evaluation, and the strategically valuable findings, convinced me to try this on a larger scale,” says Joe Boardman, former FRA Administrator and current President/CEO of Amtrak. “When I came to Amtrak, I initiated the ‘Safe-2-Safer’ program, which is modeled on the CAB approach. Safe-2-Safer is a $14 million multi-year effort intended to improve safety and safety culture in every department across the entire company.”
Idaho Legislature’s Office of Performance Evaluations
2011 Alva and Gunnar Myrdal Government Award
The Idaho Legislature’s Office of Performance Evaluations was recognized for work that led to changes in policy and legislation as well as favorable media coverage. OPE’s work led to (1) more streamlined efforts and new initiatives—including the creation of a drug czar position—to effectively deal with Idaho’s substance abuse issues, (2) substantive organizational changes at the Idaho Department of Health and Welfare, and (3) better ways of managing Idaho’s transportation infrastructure assets and saving millions of dollars in highway construction, maintenance, and preservation.
Laura Leviton, Laura Kettel Khan, Nicola Dawkins
2011 Outstanding Publication Award
The Systematic Screening and Assessment Method: Finding Innovations Worth Evaluating
Countless programs are launched annually. Which ones merit outcome evaluation, and how do we know? The Systematic Screening and Assessment Method: Finding Innovations Worth Evaluating (SSA) describes a cost-effective way to assist program funders, practitioners, and researchers in selecting the most promising innovations already in use and then preparing them for more rigorous evaluation. SSA combines existing evaluation methods into a six-step process, which has been adopted or adapted by the Centers for Disease Control and Prevention’s National Center for Chronic Disease Prevention and Health Promotion, Division of Nutrition, Physical Activity, and Obesity; Division of Heart Disease and Stroke Prevention; Division of Cancer Prevention and Control; and the National Center for Injury Prevention, Division of Violence Prevention. In addition, it is recommended by a consensus panel of the Institute of Medicine and cited by the Government Accountability Office.
The American Evaluation Association is an international professional association and the largest in its field. Evaluation involves assessing the strengths and weaknesses of programs, policies, personnel, products, and organizations in order to improve their effectiveness. AEA’s mission is to improve evaluation practices and methods worldwide, to increase evaluation use, to promote evaluation as a profession, and to support the contribution of evaluation to the generation of theory and knowledge about effective human action. For more information about AEA, visit http://www.eval.org.