New York, NY (PRWEB) February 20, 2008
A newly released research study concludes that Merit Software's (http://www.meritsoftware.com) reading and language arts programs delivered higher test scores for students who used the software than for those who did not. The study, which was conducted on 6th and 7th graders at the Calhoun Middle School in Mount Zion, West Virginia, showed that students performed better on the state's standardized test when Merit was used in conjunction with regular classroom instruction for 24 weeks. Year-end test scores for Reading/Language Arts averaged 30 points higher than test scores for students in the control group.
According to the researchers, 37% more Merit 6th graders and 19% more Merit 7th graders achieved Mastery level, or higher, on the state's performance rubric than control students. Helping lower-level students raise their test scores to mastery levels is a subject of great interest to U.S. policy makers. The Department of Education aggregates ranges of test scores into performance levels when examining schools' annual academic progress.
Despite the need for scientifically based research on educational interventions, very few published evaluations have shown effective uses of educational software in schools. The U.S. Department of Education conducted a study of the effectiveness of educational software in schools and published the results in 2007. The study, which did not include Merit, concluded that many widely used educational software programs have no impact on student achievement.
Merit began commissioning evaluations of its products in 2003. The company asked faculty at the Marshall University Graduate College in South Charleston, West Virginia, to conduct several scientifically based, quantitative research studies on the impact of its educational software in West Virginia schools.
Prior to this study, four evaluations were conducted. The earlier studies examined the impact of Merit reading, writing, and math software on students in grades 3 through 8. These studies had some shortcomings, including the lack of random assignment of pupils and a short implementation time frame. The longest evaluation lasted 9 weeks, less than a typical school semester. The researchers, however, were able to make valuable observations. The studies showed that using Merit improved student achievement and raised standardized test scores. The studies also indicated that lengthier use of the software might show statistically significant gains for lower-quartile students.
The purpose of the present study was to document results obtained with a more rigorous design. This study offered the opportunity for an analysis with random assignment of students and pairings based on previous levels of achievement. It also allowed researchers to evaluate the use of Merit over an extended 24-week period.
Based on the analysis of the researchers:
- Year-end scores for Merit Reading/Language Arts averaged 30 points higher on West Virginia's state standardized test (WestTest) than scores for students in the control group.
- Thirty-seven percent more Merit sixth graders, and nineteen percent more seventh graders, achieved Mastery level, or higher, on the state's performance rubric than control group students.
- Test scores were also significantly higher for Merit students among Title I and female participants, with an 18 to 22 point advantage.
- An effect size of .94 was calculated for 6th graders' test scores, and .70 for 7th graders.
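The effect sizes reported above are consistent with a standardized mean difference such as Cohen's d, which divides the difference between the treatment and control group means by their pooled standard deviation. The study summary does not specify the exact formula or the underlying score data, so the sketch below is an illustration of how such a figure is typically computed, using made-up example scores rather than the study's data:

```python
from math import sqrt

def cohens_d(treatment, control):
    """Cohen's d: difference in group means divided by the pooled
    (sample) standard deviation. Inputs are lists of scores."""
    n1, n2 = len(treatment), len(control)
    m1 = sum(treatment) / n1
    m2 = sum(control) / n2
    # Sample variances (denominator n - 1)
    v1 = sum((x - m1) ** 2 for x in treatment) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in control) / (n2 - 1)
    # Pooled standard deviation across both groups
    pooled_sd = sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical example scores (not the study's data)
d = cohens_d([2, 4, 6], [1, 2, 3])
```

By the conventional benchmarks (0.2 small, 0.5 medium, 0.8 large), the reported values of .94 and .70 would correspond to large and medium-to-large effects, respectively.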
Merit provides individualized, context-sensitive help throughout its software. Help is available to students in many forms while they use the software, whenever they want it, and as often as they need it.
Using Merit, teachers are interrupted less frequently and have more class time to teach prepared lessons. In addition, the built-in tracking features help teachers discover just when they need to provide additional assistance to individual students.
A study summary is available on the Merit Software web site: http://research.meritsoftware.com