Silicon-driven Training and Education: What it Means for Lean Six Sigma – Part 1

One area that is ripe for innovation and improvement is education. Whether it is basic education for children, high school, undergraduate or graduate studies, or the vast field of organizational training and education, few people are satisfied with the current state of affairs.

For some time, many have looked to web-delivered courses (both real-time with a live instructor and self-paced), as well as various stand-alone software programs, computer simulation games, and other computer-enabled approaches, as ways to improve both the efficacy and efficiency of education and training.

Not only are education and training a potential area for deploying Lean Six Sigma-type improvement methods but, given how much money organizations spend training employees in continuous improvement techniques, practitioners of these methodologies also need to examine how they conduct Lean Six Sigma and related training. For example, far too many Lean Six Sigma training sessions depend on presenting PowerPoint slides and page-flipping through binders rather than on methods that might be more effective for certain participants. But to what extent is software-driven and web-delivered training effective? Given the sums of money involved, it is natural to suspect more than a little self-interest when software companies, educational consultants, and corporate learning and development staff promote the new gadgets, if only because it may help justify budgets or provide an interesting topic of research for their next conference paper or PhD thesis.

In a recent New York Times piece, “Inflating the Software Report Card” (October 8, 2011), authors Trip Gabriel and Matt Richtel wrote:

The Web site of Carnegie Learning, a company started by scientists at Carnegie Mellon University that sells classroom software, trumpets this promise: “Revolutionary Math Curricula. Revolutionary Results.”

The pitch has sounded seductive to thousands of schools across the country for more than a decade. But a review by the United States Department of Education last year would suggest a much less alluring come-on: Undistinguished math curricula. Unproven results.

The federal review of Carnegie Learning’s flagship software, Cognitive Tutor, said the program had “no discernible effects” on the standardized test scores of high school students. A separate 2009 federal look at 10 major software products for teaching algebra as well as elementary and middle school math and reading found that nine of them, including Cognitive Tutor, “did not have statistically significant effects on test scores.”

Amid a classroom-based software boom estimated at $2.2 billion a year, debate continues to rage over the effectiveness of technology on learning and how best to measure it. But it is hard to tell that from technology companies’ promotional materials.

Many companies ignore well-regarded independent studies that test their products’ effectiveness. Carnegie’s Web site, for example, makes no mention of the 2010 review, by the Education Department’s What Works Clearinghouse, which analyzed 24 studies of Cognitive Tutor’s effectiveness but found that only four of those met high research standards. Some firms misrepresent research by cherry-picking results and promote surveys or limited case studies that lack the scientific rigor required by the clearinghouse and other authorities.

“The advertising from the companies is tremendous oversell compared to what they can actually demonstrate,” said Grover J. Whitehurst, a former director of the Institute of Education Sciences, the federal agency that includes What Works.

School officials, confronted with a morass of complicated and sometimes conflicting research, often buy products based on personal impressions, marketing hype or faith in technology for its own sake.

“They want the shiny new one,” said Peter Cohen, chief executive of Pearson School, a leading publisher of classroom texts and software. “They always want the latest, when other things have been proven the longest and demonstrated to get results.”

Carnegie, one of the most respected of the educational software firms, is hardly alone in overpromising or misleading. The Web site of Houghton Mifflin Harcourt says that “based on scientific research, Destination Reading is a powerful early literacy and adolescent literacy program,” but it fails to mention that it was one of the products the Department of Education found in 2009 not to have statistically significant effects on test scores.

Similarly, Pearson’s Web site cites several studies of its own to support its claim that Waterford Early Learning improves literacy, without acknowledging the same 2009 study’s conclusion that it had little impact.

And Intel, in a Web document urging schools to buy computers for every student, acknowledges that “there are no longitudinal, randomized trials linking eLearning to positive learning outcomes.” Yet it nonetheless argues that research shows that technology can lead to more engaged and economically successful students, happier teachers and more involved parents.

“To compare this public relations analysis to a carefully constructed research study is laughable,” said Alex Molnar, professor of education at the National Education Policy Center at the University of Colorado. “They are selling their wares.”

In another piece, Richtel noted that, for better or worse, schools are spending big money on the bet that technology will address the issues facing education. But in this article, as in others, there is still a lack of convincing, objective evidence that the investment is actually paying off. Here is a vignette from Richtel's piece, set in a school in Chandler, Arizona:

Amy Furman, a seventh-grade English teacher here, roams among 31 students sitting at their desks or in clumps on the floor. They’re studying Shakespeare’s “As You Like It” — but not in any traditional way.

In this technology-centric classroom, students are bent over laptops, some blogging or building Facebook pages from the perspective of Shakespeare’s characters. One student compiles a song list from the Internet, picking a tune by the rapper Kanye West to express the emotions of Shakespeare’s lovelorn Silvius.

The class, and the Kyrene School District as a whole, offer what some see as a utopian vision of education’s future. Classrooms are decked out with laptops, big interactive screens and software that drills students on every basic subject. Under a ballot initiative approved in 2005, the district has invested roughly $33 million in such technologies.

The digital push here aims to go far beyond gadgets to transform the very nature of the classroom, turning the teacher into a guide instead of a lecturer, wandering among students who learn at their own pace on Internet-connected devices.

“This is such a dynamic class,” Ms. Furman says of her 21st-century classroom. “I really hope it works.”

Hope and enthusiasm are soaring here. But not test scores.

Since 2005, scores in reading and math have stagnated in Kyrene, even as statewide scores have risen.

To be sure, test scores can go up or down for many reasons. But to many education experts, something is not adding up — here and across the country. In a nutshell: schools are spending billions on technology, even as they cut budgets and lay off teachers, with little proof that this approach is improving basic learning.

This conundrum calls into question one of the most significant contemporary educational movements. Advocates for giving schools a major technological upgrade — which include powerful educators, Silicon Valley titans and White House appointees — say digital devices let students learn at their own pace, teach skills needed in a modern economy and hold the attention of a generation weaned on gadgets.

Some backers of this idea say standardized tests, the most widely used measure of student performance, don’t capture the breadth of skills that computers can help develop. But they also concede that for now there is no better way to gauge the educational value of expensive technology investments.

“The data is pretty weak. It’s very difficult when we’re pressed to come up with convincing data,” said Tom Vander Ark, the former executive director for education at the Bill and Melinda Gates Foundation and an investor in educational technology companies. When it comes to showing results, he said, “We better put up or shut up.”

And yet, in virtually the same breath, he said change of a historic magnitude is inevitably coming to classrooms this decade: “It’s one of the three or four biggest things happening in the world today.”

Critics counter that, absent clear proof, schools are being motivated by a blind faith in technology and an overemphasis on digital skills — like using PowerPoint and multimedia tools — at the expense of math, reading and writing fundamentals. They say the technology advocates have it backward when they press to upgrade first and ask questions later.

The spending push comes as schools face tough financial choices. In Kyrene, for example, even as technology spending has grown, the rest of the district’s budget has shrunk, leading to bigger classes and fewer periods of music, art and physical education.

As a layperson in the world of education, I am neither for nor against the use of technology. But as a performance improvement professional, devoted to the art and science of using facts and logical argument, I find that the lack of strong evidence, or perhaps more accurately the sense that facts and inferences are used in the service of a multitude of agendas, motives, egos, and doctrines, makes the whole arena seem overdue for objective, scientific study. But that is easier said than done in the gray world of learning. As Richtel notes:

Many studies have found that technology has helped individual classrooms, schools or districts. For instance, researchers found that writing scores improved for eighth-graders in Maine after they were all issued laptops in 2002. The same researchers, from the University of Southern Maine, found that math performance picked up among seventh and eighth-graders after teachers in the state were trained in using the laptops to teach.

A question plaguing many education researchers is how to draw broader inferences from such case studies, which can have serious limitations. For instance, in the Maine math study, it is hard to separate the effect of the laptops from the effect of the teacher training.

Educators would like to see major trials years in length that clearly demonstrate technology’s effect. But such trials are extraordinarily difficult to conduct when classes and schools can be so different, and technology is changing so quickly.

And often the smaller studies produce conflicting results. Some classroom studies show that math scores rise among students using instructional software, while others show that scores actually fall. The high-level analyses that sum up these various studies, not surprisingly, give researchers pause about whether big investments in technology make sense.

One broad analysis of laptop programs like the one in Maine, for example, found that such programs are not a major factor in student performance.

“Rather than being a cure-all or silver bullet, one-to-one laptop programs may simply amplify what’s already occurring — for better or worse,” wrote Bryan Goodwin, spokesman for Mid-continent Research for Education and Learning, a nonpartisan group that did the study, in an essay. Good teachers, he said, can make good use of computers, while bad teachers won’t, and they and their students could wind up becoming distracted by the technology.

A review by the Education Department in 2009 of research on online courses — which more than one million K-12 students are taking — found that few rigorous studies had been done and that policy makers “lack scientific evidence” of their effectiveness. A division of the Education Department that rates classroom curriculums has found that much educational software is not an improvement over textbooks.


