Learn How to Think
Posted: July 22, 2014
Whether called “Black Belts,” “Performance Improvement Experts,” “Lean Sensei,” or any other term for a person whose role is to help transform how things are done, some form of process analysis is part of the job. For xray-delta, one of the traps for people in these roles is to focus on the mechanics: calculating sample sizes and confidence intervals, or running any number of process or financial analyses through tools like Excel, Minitab, JMP, SAS, or SigmaXL. What gets neglected is learning how to reason from first principles, the skill of knowing not only what to calculate but also how to filter the answers software might spit out through mental models that identify nonsensical results.
This balancing act, between understanding the underlying principles and the ability to apply the rote algorithm or equation, is also at the root of many educational debates over what constitutes an effective way of imparting mathematical thinking to young students.
In “Time for a Ceasefire,” an article examining the recently published PISA results for maths (we looked at PISA in a previous post, “Doing a DMAIC on Education”), The Economist reported:
If the world’s education systems have a common focus, it is to turn out school-leavers who are proficient in mathematics. Governments are impressed by evidence from the World Bank and others that better maths results raises GDP and incomes. That, together with the soul-searching provoked by the cross-country PISA comparisons of 15-year-olds’ mathematical attainment produced by the OECD, a club of mostly rich countries, is prompting educators in many places to look afresh at what maths to teach, and how to teach it.
Maths education has been a battlefield before: the American “math wars” of the 1980s pitted traditionalists, who emphasised fluency in pen-and-paper calculations, against reformers led by the country’s biggest teaching lobby, who put real-world problem-solving, often with the help of calculators, at the centre of the curriculum. A backlash followed as parents and academics worried that the “new math” left pupils ill-prepared for university courses in mathematics and the sciences. But as many countries have since found, training pupils to ace exams is not the same as equipping them to use their hard-won knowledge in work and life.
Today’s reformers think new technology renders this old argument redundant. They include Conrad Wolfram, who worked on Mathematica, a program which allows users to solve equations, visualise mathematical functions and much more. He argues that computers make rote procedures, such as long division, obsolete. “If it is high-level problem-solving and critical thinking we’re after, there’s not much in evidence in a lot of curriculums,” he says.
Estonia’s government has commissioned Mr Wolfram’s consultancy in Oxfordshire to modernise maths courses for secondary-school pupils. Starting this month, it will pilot lessons built around open-ended problems which have no single solution. One example: “What’s the best algorithm for picking a romantic date?” (Possible answer: go on more dates with a lower quality threshold to maximise the chance of success.) Another: “Am I drunk?”, which leads into quantitative analysis involving body masses, rates of alcohol absorption and other variables.
Some PISA stars are also seeking a new approach. Singapore’s government commissioned David Hogan, an Australian maths educator, to assess its syllabus. It wants pupils to be able to explain what they know to their classmates, and to apply it in unfamiliar situations.
Israel is also experimenting, aware of the imminent end of the windfall provided by the arrival of many mathematicians from the former Soviet Union. Some Israeli maths lessons from primary school onwards will soon be taught using inexpensive tablets, an approach inspired by Shimon Schocken. An academic at Herzliya University, from 2005 he has created a large online group of “self-learners” who build computers and programs from scratch. Pupils are encouraged to visualise their calculations, not just to get the right answers.
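The Estonian “romantic date” question quoted above is, in essence, the classic secretary problem: skip an initial fraction of candidates to establish a benchmark, then commit to the first candidate who beats it. Lowering the skip threshold trades selectivity for a better chance of ending up with someone. The sketch below is a minimal Monte Carlo illustration of that idea; the function name, candidate counts, and trial counts are illustrative assumptions, not anything from the pilot curriculum.

```python
import random

def best_pick_rate(n_dates, skip_fraction, trials=20000):
    """Monte Carlo estimate of how often a 'look then leap' rule
    picks the single best candidate out of n_dates: observe the
    first skip_fraction of candidates without committing, then
    take the first later candidate who beats all of them."""
    skip = int(n_dates * skip_fraction)
    wins = 0
    for _ in range(trials):
        scores = [random.random() for _ in range(n_dates)]
        benchmark = max(scores[:skip], default=float("-inf"))
        # Leap at the first candidate better than the benchmark,
        # or settle for the last one if nobody clears it.
        pick = next((s for s in scores[skip:] if s > benchmark), scores[-1])
        wins += pick == max(scores)
    return wins / trials

random.seed(1)
# Skipping roughly 37% (1/e) of candidates is the classic optimum;
# the success rate hovers near 37-38% regardless of n_dates.
print(best_pick_rate(20, 0.37))
```

The point of an exercise like this is not the simulation itself but the reasoning it forces: students have to define “best,” choose a decision rule, and then test whether the rule actually performs as intuition suggests.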
In this author’s experience, gleaned from training and coaching many hundreds of Black Belts, process improvement professionals must learn not only the mechanics of calculating things with software, but more importantly why they are calculating those things, what the numbers mean, and how to apply them to actual problems. To a certain degree, it is also very helpful for a process professional to understand how to calculate certain core quantities, such as the control limits on a control chart, with pen, paper, and a calculator, and to understand why the method works.
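As one example of a pen-and-paper calculation worth knowing, the limits of an individuals (I) control chart are the process mean plus or minus 2.66 times the average moving range (the 2.66 constant is 3/d2, with d2 = 1.128 for moving ranges of size two). A minimal sketch of that arithmetic, using made-up measurements for illustration:

```python
# Individuals (I) chart limits by hand: mean +/- 2.66 * average
# moving range. The data values here are invented for illustration.
data = [9.8, 10.2, 10.1, 9.9, 10.4, 10.0, 9.7, 10.3]

mean = sum(data) / len(data)

# Moving ranges: absolute difference between consecutive points.
moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

ucl = mean + 2.66 * mr_bar  # upper control limit
lcl = mean - 2.66 * mr_bar  # lower control limit

print(round(mean, 3), round(lcl, 3), round(ucl, 3))
# -> 10.05 9.1 11.0
```

Working through even one small data set like this makes it obvious what the software is doing when it draws those limit lines, and why a point outside them signals something worth investigating.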