By Diana d’Ambra
In 1998, my daughter’s fourth-grade teacher assigned a project that required each student to reference at least four sources and read at least twenty pages. But she had one non-negotiable rule – no Internet research! To search the web then, you used Yahoo, Excite, and my favorite at the time, AltaVista.
When questioned, the teacher replied that if the kids used the web, she couldn’t see how many pages they read unless she printed them out, while if they used books or articles, it would be easier for her to track. She also strongly recommended using the local town or school library. But I countered: limiting research to books may make it easier for you to count pages, but if you use the Internet, your library has no walls. All that extra information! All those additional sources!
Well, she was obsessed with counting pages and, needless to say, didn’t budge on this limitation. She told me that people would always use books, never the Internet, for research, and that I was doing a disservice to my daughter by letting her even think of doing research online. Of course, she is now a school principal.
We still use books, both eBooks and the old paper kind. But research is now dominated in many ways by the Internet, the latest evidence being Encyclopedia Britannica’s recent decision to cease print publication and continue online only.
But while it is easy to look back and chuckle at my daughter’s teacher’s predictions, many experts who are paid to make such predictions for a living have hardly done any better.
And if we can’t predict overall trends, to say nothing of specifics, then what choices do we make when determining how to educate the next generation? A related issue: who are we educating, and for what? All students will use technology, but not all will become technologists, any more than all students studying English will become professional writers or all students studying music will become musicians. Yet we all understand the merits and necessity of studying English. (I realize, to my personal dismay, that the case for studying music these days may be more problematic.)
The rapidity of change, and its often unexpected directions, has surprised many observers, including professionals. The shift of reading from paper to eBooks, and the resulting decline of newspapers and magazines, has been faster and more dramatic than originally expected. The New Orleans Times-Picayune now publishes a print edition only three times a week. Magazines, not only weeklies such as Newsweek but nearly all of them, are struggling to find a viable financial model.
Reading articles published when Smartphones first came on the market, the diminished use of established, very “old” technological products such as watches and alarm clocks was intuitively expected. But the decline in “point and shoot” cameras and, later, standalone GPS devices and even low-end Flip camcorders was not. And yet innovation is rarely uniform: high-end SLR cameras are still selling well, and cable channels seem to sell watches nearly every night of the week.
And this is just one product, the Smartphone, cutting a swath across new and old products alike while changing behavior. The Internet has changed everything from how we shop to how we bank and file taxes. Now one out of six couples in a committed relationship met online.
All this change, often unforeseen and hard to predict, especially over the longer term, creates real problems when trying to figure out how technology will affect people, to say nothing of what skills, education and job training will be required.
So, how do we train technologists, and how do we train the future users of technology? Some even question whether we should try to train people to use technology at all, or just assume that they will learn it as they need it. While there are tools, ranging from books to YouTube videos, for learning how to do things like online banking, they are of little use if you do not have access to a computer. What is sometimes now assumed – that we no longer work on typewriters, electric or manual, and that word processing is ubiquitous – is not necessarily the case even in wealthy countries, due to the cost of access and the hardware required.
Three approaches have arisen to answer these underlying questions:
The first approach is to double down on technology. Recently, President Obama called for greater emphasis on STEM – science, technology, engineering and mathematics. Given the low interest and correspondingly low graduation rates in these fields in the US, along with the increasing demand for such skills, it seems a safe approach. And certainly having more technologists is good, assuming the job market for them holds, but it doesn’t truly address the overall educational focus. Again, not everyone will be a technologist, so this is a good answer, but not a complete one.
The second approach is adding technology to the curriculum as an enabler, to let all students access and benefit from the Internet while evening out the disparities faced by lower-income homes, the “digital divide.” There’s much research here, with mixed results. Several nations — including Brazil, Uruguay, Peru, and Colombia — have used subsidized programs to get personal computers into poor households. Governments have promulgated such programs despite little credible evidence that the technology improves children’s academic performance or their behavior (Home Computers and Human Capital, National Bureau of Economic Research, http://www.nber.org/digest/jun10/w15814.html). In fact, one country, Romania, showed a decrease in academic performance. However, research is still being done to determine the success factors here, among which are the subsidizing of Internet connectivity as well as the physical location of the computer in the home.
The third approach is “back to basics”: teach students to think, analyze and write well. Sometimes this means no access to iPads and laptops; other times the tools are embraced, albeit somewhat reluctantly. Somewhat ironically, this seems to be the preferred approach of some private schools where students’ parents work in technology (http://www.nytimes.com/2011/10/23/technology/at-waldorf-school-in-silicon-valley-technology-can-wait.html?pagewanted=all). However, this approach seems largely limited to the elementary grades. I once witnessed a college administrator complaining to a firm’s hiring managers: while you state that you want students who can read, write and analyze, and that this matters more than any specific IT skill, you reject our philosophy and history majors out of hand.
So, other than offering more technological tools to all, the choice is between learning specific technology skills and learning to think, analyze and write. Now, these should not be exclusive options; indeed, they should go together. You can major in mathematics and learn to analyze not only math problems but literature as well. You can learn to write well on either path, although that does seem to be less common than all of us would like. Which will provide more long-term value to people? Which approach will prepare people for a world rattled by all aspects of the technology it so embraces? And, to bridge the digital divide, do we need to train people to use technology, or will most adapt through ease of use and social and school pressure? Who are we training to do what? □
Diana d’Ambra is a consultant at Cortelyou Consulting, LLC, as well as the Managing Director of New Ventures, Outsourcing Institute. She has more than thirty years of business experience, with a focus on resource management and all aspects of outsourcing, including nearshoring, offshoring and onshoring, as well as vendor management. She may be contacted at firstname.lastname@example.org.