Updated 8/7/07

Dave Moursund

Dave Moursund's Blog for the discussion of his current and past writing projects.

You are currently viewing the Home Page for Dave Moursund's book:

Planning, Forecasting, and Inventing Your Computers-in-Education Future

Dave Moursund's Websites

Planning, Forecasting, and Inventing Your
Computers-in-Education Future
Creative Commons License
This work is licensed under
Creative Commons Attribution-Noncommercial 3.0 License.

Cite this book as:

Moursund, D.G. (2005). Planning, forecasting, and inventing your computers-in-education future. Access at http://darkwing.uoregon.edu/~moursund/Books/InventingFutures/index.htm. (Second edition, June 2005.)

Second Edition (PDF)

Second Edition (Microsoft Word)

Second Edition Table of Contents & Preface (HTML)

Additional "Futures" Resources. These are new materials that seem relevant to me, but that have not yet been integrated into the most recent edition of the book.

HTML Version of the Table of Contents and Preface.

Send Email to Dave Moursund.

Other free educational materials developed by Dave Moursund are listed at http://darkwing.uoregon.edu/~moursund/dave

Recent News Items

Alter, Allan E. (2/14/07). Alan Kay: The PC Must Be Revamped—Now. CIO Insight. Retrieved 2/16/07 from http://www.cioinsight.com/article2/0,1540,2089567,00.asp. Quoting from the article:

While most people regard the personal computer as a modern miracle, Kay sees the PC as a chronic underachiever. To him it's an invention that, like television, has fallen far short of the potential foreseen by its early proponents. Today, at age 66, Kay runs the Viewpoints Research Institute, his own nonprofit research organization in Glendale, Calif. He is busy with several projects involving education and technology, including the "One Laptop per Child" project overseen by MIT's Nicholas Negroponte, which Kay hopes will one day transform the PC into a machine that not only changes the way we work, communicate and entertain ourselves, but improves how people—especially children—learn and think.

Mullins, Robert (2/1/07). IBM Labs sees five innovations in five years. IDG News Service. Retrieved 2/14/07: http://www.networkworld.com/news/2007/

Additional "Futures" Resources

Abney, Barb (10/11/06). UCF Research Team Achieves Milestone Toward More Powerful Computer Chips. Retrieved 10/17/06: http://news.ucf.edu/UCFnews/index?page=article&id=

This article discusses breakthrough technology that may help extend the trend of putting ever more transistors on silicon chips.

Adler, Robert (2 July 2005). Entering a dark age of innovation. NewScientist.com news service. Accessed 7/9/05: http://www.newscientist.com/article.ns?id=dn7616. Quoting from this article:

"But according to a new analysis, this view couldn't be more wrong: far from being in technological nirvana, we are fast approaching a new dark age. That, at least, is the conclusion of Jonathan Huebner, a physicist working at the Pentagon's Naval Air Warfare Center in China Lake, California. He says the rate of technological innovation reached a peak a century ago and has been declining ever since. And like the lookout on the Titanic who spotted the fateful iceberg, Huebner sees the end of innovation looming dead ahead. His study will be published in Technological Forecasting and Social Change."

The article also contains counter arguments, such as:

"At the Acceleration Studies Foundation, a non-profit think tank in San Pedro, California, John Smart examines why technological change is progressing so fast. Looking at the growth of nanotechnology and artificial intelligence, Smart agrees with Kurzweil that we are rocketing toward a technological "singularity" - a point sometime between 2040 and 2080 where change is so blindingly fast that we just can't predict where it will go."

See also: http://www.accelerating.org/. This organization, and other people, talk about a continuing rapid growth in invention and an approaching singularity in which the rate of growth becomes completely overwhelming.

Anthes, Gary (April 3, 2006). Bits to atoms (and atoms to bits). Computerworld. Accessed 4/5/06: http://www.computerworld.com/softwaretopics/. Quoting from the Website:

Neil Gershenfeld says we're on the threshold of the third digital revolution, one in which matter and information merge. He has kicked off the revolution in a fabrication laboratory, dubbed the Fab Lab, at MIT, where he's the director of the Center for Bits and Atoms (CBA). In the Fab Lab, students design and manufacture their own products using inexpensive fabrication and electronics tools driven by open-source software and programs written by MIT researchers. Other "fab labs" based on these principles are spreading around the world, especially in less-developed countries. Gershenfeld, who has laid out the precepts of personal fabrication in his book Fab: The Coming Revolution on Your Desktop—From Personal Computers to Personal Fabrication (The Perseus Books Group, 2005), recently explained the mission of the CBA to Computerworld's Gary Anthes.

Apple, Inc. Knowledge Navigator. Retrieved 12/6/06: http://www.digibarn.com/collections/movies/knowledge-navigator.html.

De Guzman, Mari-Len (9/27/06). Intel vows to triple chip performance by 2010. ComputerWorld Canada. Retrieved 10/4/06: http://www.itworldcanada.com/a/Enterprise-. Quoting from the article:

Moving along a path of energy-efficient computing, Intel Corp. has pledged technology evolution in microprocessors that will improve performance per watt by 300 per cent by the end of the decade.

Speaking to attendees at the Intel Developer Forum in San Francisco, Intel president and CEO Paul Otellini said this goal would be achieved through continued evolution of Intel’s semiconductor manufacturing technology.

Today, the company uses 65nm process for microprocessor design and by 2008, it plans to reduce that to a 45nm design process. By the end of 2010, Otellini said, Intel aims to achieve 32nm microprocessor design.

"We’re not going to slow down on Moore’s Law; we have the design and capability to make it happen," the Intel president said. He was referring to Gordon Moore, co-founder of Intel, who made a popular observation, or forecast, that the number of transistors on integrated circuits doubles every year.
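The arithmetic behind these process-shrink numbers is easy to sketch. Assuming (a rough rule of thumb, not an Intel figure) that transistor area scales with the square of the minimum feature size, a few lines of Python give the approximate density gain from each step Otellini described:

```python
def density_gain(old_nm: float, new_nm: float) -> float:
    """Approximate transistor-density gain from a process shrink,
    assuming area scales with the square of the feature size."""
    return (old_nm / new_nm) ** 2

print(round(density_gain(65, 45), 2))  # 65 nm -> 45 nm: about 2.09x
print(round(density_gain(65, 32), 2))  # 65 nm -> 32 nm: about 4.13x
```

Roughly a doubling per process generation, which is why a roadmap of 45 nm by 2008 and 32 nm by 2010 tracks the Moore's Law doubling cadence.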

Desmond, Michael (December, 2004). Tech 2005: What's new and what's next. PC World. Accessed 1/7/05: http://www.pcworld.com/news/article/0,aid,118242,00.asp. Quoting from the article:

The products you use are about to get smarter, faster, smaller, cheaper, and more colorful. Here's your guide to what's ahead in PCs, software, mobile gadgets, home electronics, and more.

Upgrading your future PC could become child's play. You'll plug in building-block-like modules that pack anything from twin 64-bit processors to dual graphics boards. You'll gaze at stunning flat-panel monitors that display a wider range of colors than today's best LCDs, and you'll carry cell phones with 10GB hard drives.

Eventually, smarter technologies will turn most every car, wall, and appliance into an intelligent resource. The foundations for a truly connected future are being laid right now. Take a look in the following pages to see what the next two years will bring us.

Dongarra, Jack (2005). A not-so-simple matter of software. HPC Wire. (This article originally appeared in the Summer 2005 issue of NCSA's Access magazine.) Accessed 11/03/05: http://news.taborcommunications.com/msgget.jsp?. Quoting from the article:

When we try to assess how much progress we have made in computational modeling and simulation, recalling some history about the related approaches of experiment and theory can help keep things in perspective. For example, we can trace the systematic use of experiment back to Galileo in the early seventeenth century. Yet for all the incredible successes it enjoyed over its first three centuries, the experimental method arguably did not fully mature until the elements of good design and practice were finally analyzed and described in detail by R. A. Fisher and others in the first half of the twentieth century. In that light, it seems clear that while computational science has had many remarkable youthful successes, it is still at a very early stage in its growth.

The reason for including this article here is that modeling and simulation underlies much of the supercomputing effort. From my point of view, modeling and simulation is a core area in ICT. Both teachers and their students should have some insight into this area.

Donoghue, Andrew (May 17, 2006). 2010: The year of the techie. Retrieved 5/22/06: http://news.zdnet.co.uk/business/0,39020645,39269493,00.htm. Quoting from the article:

Being labelled a techie used to be an impediment to advancement, but by 2010 technical skills will be more highly prized than ever, according to a study by management consultants Deloitte.

According to the report, "Eye to the Future — How Technology, Media and Telecommunications Advances Could Change the Way We Live in 2010", technology will continue to change the workplace and give an edge to those employees who are able to master and manage it.

Earls, Sali (September 25, 2006). Ian Pearson, Futurologist: The ITWales Interview. Retrieved 9/28/06: http://www.itwales.com/997789.htm. Quoting from the excellent article:

In terms of the 2015 timeframe, I think that's quite realistic. By 2015 we'll be starting to get prototypes of the first computers which are roughly as intelligent as people. In terms of raw processing power we're not far off that right now, but of course it's not just a matter of taking an ordinary chip and making it run faster. It's a matter of figuring out how you actually do the thinking inside your head, because it's not just a simple computer program, it's much more sophisticated. We're starting to get an awful lot of insights into neuroscience and what's happening to the individual synapses, so neuroscientists are going to get a heck of a lot of information about how the brain does things, and we'll be able to use that to stimulate ideas for new computer science.

Gettler, Leon (10/22/07). The future is here right now, if you can read the signs. Business Day. Retrieved 10/26/07: http://www.theage.com.au/news/business/the-future-is-here-right-now-if-you-can-read-the-signs/2007/10/21/1192940901417.html. Quoting from the article:

It's the year 2040 and Ray Hammond is getting advice from his software assistant Maria. Located in an implant just behind his left ear, Maria has a direct connection with Mr Hammond's brain. Linked with Google and other search engines, Maria is able to filter, search and speak softly to Mr Hammond, as quietly and as transparently as if she were his own thoughts.

Another key driver of change, he says, is medical technology, including stem cell research. Within the next 25 years, he says, humans in the developed world, or at least the very rich, will have the opportunity to extend their lives considerably.

The other four drivers of change? Globalisation, the world population explosion, the climate crisis and the looming energy crisis.

KnowledgeWorks Foundation (n.d.). Education map of the decade. Retrieved 12/22/06: http://www.kwfdn.org/map/organization/index.asp. Quoting from the Website:

The map is structured as a grid that presents the intersection between six key drivers of change (trends) and five critical impact areas. Taken together, their multiple intersections suggest an emerging future landscape of issues, concepts, and phenomena that will shape the broader context for education and its stakeholders. Consider the map a guide to a future landscape rather than a comprehensive representation of a definitive future.

2006 - 2016 Core Drivers of Change

Six driving forces of change form the core of the map forecast. These major trends form the vertical column on the left side of the map:

Grassroots Economics
Smart Networking
Strong Opinions, Strongly Held
Sick Herd
Urban Wilderness
The End of Cyberspace

Evans, Mark and Restivo, Kevin (December 31, 2005). The top 10 tech trends. National Post. Accessed 1/5/06: http://www.canada.com/nationalpost/financialpost/

Gorbis, Marina and Pescovitz, David (September, 2006). Bursting tech bubbles before they balloon. IEEE Spectrum. Retrieved 9/16/06: http://www.spectrum.ieee.org/sep06/4435. Quoting from the 10-page article:

As our population ages and needs more care, there will be fewer young people to provide it. But don’t expect to fill the personnel gap with humanoid robotic nurses, say a majority of the more than 700 IEEE Fellows surveyed in a joint study by the Institute for the Future (IFTF) and IEEE Spectrum.

The survey was conducted earlier this year to learn what developments IEEE Fellows expect in science and technology in the next 10 to 50 years. They ought to foresee such things better than most, because they have so much to do with bringing them about.

Goo, Sara Kehaulani (May 19, 2006). Computing, connecting, competing: Microsoft looks ahead. TechNews.com. Retrieved 5/19/06: http://www.washingtonpost.com/wp-dyn/content/article/. Quoting:

People often ask me what's the next big thing or what's the world going to be like in 10 years. It's relatively easy for me to tell you how fast processors will be, how much disk storage there will likely be, how fast networks will be. What's really hard to predict is what is the social impact of technology. It even varies by culture. In Europe, they just latched onto the texting. In Korea, people love playing games on their cellphones. It's a huge, multimillion-dollar business there . . . I think that's what's very, very hard to predict.

Hapgood, Fred (December, 2004). More than human. CIO: Essential Technology. Accessed 1/8/05: http://www.cio.com/archive/121504/et_article.html. Quoting from the article:

THINKING AHEAD | This fall, the editors of a leading public policy magazine, Foreign Policy, asked eight prominent intellectuals to identify the single idea they felt was currently posing the greatest threat to humanity. Most of the suggestions were merely old demons: various economic myths, the idea that you can fight "a war on evil," Americaphobia and so on. Only Francis Fukuyama, a member of the President's Council on Bioethics, came up with a new candidate: transhumanism.

Transhumanism might be described as the technology of advanced individual enhancement. While it includes physical modifications (diamondoid teeth, self-styling hair, autocleaning ears, nanotube bones, lipid metabolizers, polymer muscles), most of the interest in the technology focuses on the integration of brains and computers—especially brains and networks. Sample transhumanist apps could include cell phone implants (which would allow virtual telepathy), memory backups and augmenters, thought recorders, reflex accelerators, collaborative consciousness (whiteboarding in the brain), and a very long list of thought-controlled actuators. Ultimately, the technology could extend to the uploading and downloading of entire minds in and out of host bodies, providing a self-consciousness that, theoretically, would have no definitive nor necessary end. That is, immortality, of a sort.

Horizon Report (2007). The Horizon Report. Retrieved 2/13/07: http://www.nmc.org/horizon/.

The six trends identified in this year's report are:

• User-created content;

• Social networking;

• Mobile phones;

• Virtual worlds;

• New scholarship and emerging forms of publication; and

• Massively multiplayer educational gaming.

IEEE Predictions for 2005. Accessed 1/8/05: http://www.prnewswire.com/cgi-bin/stories.pl?ACCT=109&STORY=/www/

Infoworld Predictions for 2005. Accessed 1/8/05: http://www.infoworld.com/article/

Internet2 (April 28, 2006). Internet2 aims to boost capacity. eSchool News Online. Retrieved 4/28/06: http://www.eschoolnews.com/news/showStoryts.cfm?ArticleID=6271. Quoting the article:

By fall 2007, researchers, academics, and even K-12 users of the ultra high-speed Internet2 network will have the ability to conduct even more eye-popping applications for education and research, thanks to an upgrade in the works that reportedly could boost capacity of the network by as much as 80 times its current bandwidth.

Joy, Bill (August, 2005). The Dream of a Lifetime. Technology Review.com. Accessed 10/16/05: http://www.technologyreview.com/articles/. A nice review of some of the early history. Quoting from the article:

You've likely heard stories about the birth of the PC: of Xerox PARC as the Mecca of computing; of its creation of the Alto, Ethernet, and the laser printer; of the Homebrew Computer Club, the MITS Altair, Bill Gates and the theft of his Micro-soft Basic; of Steve Jobs and Stephen Wozniak, the founding of Apple, and the Jobs visit to PARC that inspired the Macintosh.

But what you may not know about is the really early history. The stories of Doug Engelbart and John McCarthy, of the Augmentation Research Center, and of the early days of the Stanford University AI Lab (SAIL) are not well known. Yes, you may have heard that Engelbart invented the mouse, and that SAIL and Stanford led to companies like Sun and Cisco. But there are better stories, great and old ones from the early days of computing, about the events that led to personal computing as we know it.

Kotala, Zenaida Gonzalez (12/4/06). UCF Researcher’s 3-D Digital Storage System Could Hold a Library on One Disc. University of Central Florida News & Information. Retrieved 12/07/06: http://news.ucf.edu/UCFnews/index?page=article&id=. Quoting from the article:

University of Central Florida Chemistry Professor Kevin D. Belfield and his team have cracked a puzzle that stumped scientists for more than a dozen years. They have developed a new technology that will allow users to record and store massive amounts of data -- the museum’s entire collection or as many as 500 movies, for example -- onto a single disc or, perhaps, a small cube.

Belfield’s Two-Photon 3-D Optical Data Storage system makes this possible.

“For a while, the community has been able to record data in photochromic materials in several layers,” Belfield said. “The problem was that no one could figure out how to read out the data without destroying it. But we cracked it.”

Belfield’s technique allows for storing on multiple layers with the capacity of at least 1,000 GB and high-definition quality.

Markoff, John (October 27, 2005). Engineers make leap in optical networks. NYTimes.com. Accessed 10/29/05: http://www.nytimes.com/2005/10/27/technology/27chip.html. Quoting from the article:

SAN FRANCISCO, Oct. 26 - A team of Stanford electrical engineers has discovered how to switch a beam of laser light on and off up to 100 billion times a second with materials that are widely used in the semiconductor industry.

The group used a standard chip-making process to design a central component of optical networking gear that is potentially more than 10 times as fast as the highest-performance commercial products available today.

Markoff, John (December 29, 2005). Chip industry sets a plan for life after silicon. The New York Times. Accessed 12/20/05: http://www.nytimes.com/2005/12/29/technology/29nano.html?. (Free registration is required.) Quoting:

What has changed in the industry's road map is the growing confidence in new technologies that make electronic switches from single molecules or even single electrons. The development of nanoswitches has reached a point where it will be possible to manufacture them reliably at low cost, according to several researchers who have been involved in the preparation of the report.

Intel, based in Santa Clara, Calif., is now preparing to make the shift from chips made using a process where the smallest dimensions are 65 nanometers (one nanometer is a billionth of a meter) to 10 nanometers or less. Today's microprocessors already have more than one billion transistors. But it is almost certain that new types of switches and new materials will be needed to build chips that have 1,000 times the capacity of current chips, Mr. Gargini said. The goal over the next decade, he added, is to build chips that can hold more than one trillion switches. Intel's new chips will be used first in low-cost laptop computers and in home media devices, further evidence that the semiconductor industry is driven by consumer electronics. Those low-cost products with their vast markets are now pushing technology forward rather than supercomputers and other highly specialized machines.

Mashey, John R. Languages, Levels, Libraries, and Longevity. ACM Queue vol. 2, no. 9, Dec/Jan 2004-2005. Accessed 5/30/05: http://www.acmqueue.com/modules.php?. Quoting from the article:

Any old Unix person would be amused to think that Unix’s January 1, 1970, date would be enshrined so long. We have begun a process in which many people’s lives are already dependent on the correct working of software, and likely to become even more so. Software once runnable only on large systems migrates downward onto larger numbers of smaller computers. Some current cellphones use 300-MHz CPUs, running at a rate higher than any CPU commercially produced by 1990. Some have 64 MB of memory, competitive with many expensive systems of the late 1980s. Vinge’s book extrapolates from current small “smart dust” computers to assume that 5,000 years from now, most computing will be done by their hyper-powerful, barely visible descendants, containing layers of software (and more than a few trapdoors). In the United States, we already have approximately 100 CPUs per person, and this number has traditionally increased tenfold each decade. As wireless sensor networks proliferate, we face a future in which most objects have CPUs and are linked together via radio.
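Mashey's figure of roughly 100 CPUs per person, growing tenfold per decade, extrapolates with trivial arithmetic. The sketch below is purely illustrative (the starting count and growth factor are his ballpark numbers, and nothing guarantees the historical trend continues):

```python
def cpus_per_person(start: int, decades: int, factor: int = 10) -> int:
    """Extrapolate a per-person CPU count that grows tenfold per decade."""
    return start * factor ** decades

print(cpus_per_person(100, 1))  # one decade out: 1,000
print(cpus_per_person(100, 3))  # three decades out: 100,000
```

At that rate, within a generation every person would be surrounded by tens of thousands of processors, which is exactly the "smart dust" and sensor-network future the article describes.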

MIT Media Laboratory. Future of Learning Group. Accessed 7/16/05: http://learning.media.mit.edu/. I found the list of Projects/Research at http://learning.media.mit.edu/projects.html to be especially interesting. Quoting the Website:

The Future of Learning Group explores how new technologies can enable new ways of thinking, learning, and designing. The group creates new "tools to think with" and explores how these tools can help bring about change in real-world settings, such as schools, museums, and under-served communities.

Moore's Law

Poletti, Therese (December 8, 2003). IBM researchers move toward smaller chips. Mercury News. Accessed 3/28/04: http://www.siliconvalley.com/mld/siliconvalley/7441257.htm. Discusses progress in nanotechnology that may lead to continued progress as projected by Moore's Law.

Kanellos, Michael (December 1, 2003). Intel scientists find wall for Moore's Law. CNET News.com. Accessed 3/28/04: http://zdnet.com.com/2100-1103_2-5112061.html. The article indicates that the fundamental laws of physics will prevent continuation of Moore's Law "progress" beyond 2018.

Moull, Chandra and Carriker, Wayne (March 2007). Future lab: How software is helping Intel go nano—and beyond. IEEE Spectrum. Retrieved 3/30/07: http://www.spectrum.ieee.org/mar07/4941. Quoting from the article:

In a single day, a state-of-the-art fab can make nearly 100 trillion transistors, roughly 250 times the number of stars in the Milky Way galaxy. Such facilities are by any standard the most complex, and, at an average cost of US $3 billion to build and equip, the most costly factories ever built by humankind.

Today’s state-of-the-art fabs produce chips on 300-millimeter-diameter wafers [see “Special Delivery”]. By our count, there are 43 commercial 300-mm fabs in production as this issue goes to press, with perhaps a dozen more slated to come online by year-end.

This close coupling of our development operations and manufacturing allows us to stay on the approximate two-year cycle defined by Moore’s Law.
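As a sanity check on the stars comparison above (my arithmetic, not the authors'): 100 trillion transistors a day at roughly 250 times the number of stars implies about 4 x 10^11 stars in the Milky Way, which is within the range astronomers commonly cite:

```python
transistors_per_day = 100e12  # "nearly 100 trillion transistors" per day
ratio_to_stars = 250          # "roughly 250 times the number of stars"

implied_stars = transistors_per_day / ratio_to_stars
print(f"{implied_stars:.0e}")  # 4e+11
```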

NSF 05-625 (9/28/05). High Performance Computing System Acquisition: Towards a Petascale Computing Environment for Science and Engineering. Accessed 9/29/05: http://www.nsf.gov/pubs/2005/nsf05625/nsf05625.htm. Quoting from the RFP:

NSF’s five-year goal for high performance computing (HPC) is to enable petascale science and engineering through the deployment and support of a world-class HPC environment comprising the most capable combination of HPC assets available to the academic community. By the year 2010, the petascale HPC environment will enable investigations of computationally challenging problems that require computing systems capable of delivering sustained performance approaching 10^15 floating point operations per second (petaflops) on real applications, that consume large amounts of memory, and/or that work with very large data sets. Among other things, researchers will be able to perform simulations that are intrinsically multi-scale or that involve the simultaneous interaction of multiple processes.
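To get a feel for what sustained petaflops performance means in practice, here is a rough sketch; the workload sizes are hypothetical examples, not figures from the RFP:

```python
PETAFLOP_PER_SEC = 1e15  # 10^15 floating point operations per second

def run_time_seconds(total_ops: float, sustained: float = PETAFLOP_PER_SEC) -> float:
    """Time to finish a job of total_ops operations at a sustained rate."""
    return total_ops / sustained

print(run_time_seconds(1e18))                    # 10^18 ops: 1000.0 seconds
print(round(run_time_seconds(1e21) / 86400, 1))  # 10^21 ops: about 11.6 days
```

The point of "sustained on real applications" in the RFP is that peak hardware speed overstates this: real simulations typically achieve only a fraction of peak, so the usable rate is what matters.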

Nikkei Electronics Asia (October 2005). Optical Disks Used for Long-Term Storage by 2010. Accessed 10/19/05: http://neasia.nikkeibp.com/neasia/002272. Quoting from the article:

From 2010 there will be fewer opportunities for optical disks to be used as a distribution medium. Instead, they are likely to develop new applications in long-term data storage.

The role of optical disks as a medium for content distribution will end with Blu-ray Disc and High-Definition Digital Video Disk (HD DVD), many believe. So what will happen with storage systems from 2010? A number of engineers in the field responded with a common forecast: the future looks bleak for optical disks. Optical disks have achieved widespread use as packaged media for content distribution, with compact disks (CD) used for audio and digital video disks (DVD) for standard-definition television (SDTV) imagery. At the end of 2005, the next-generation disk market for media like Blu-ray Discs and HD DVD will take off, providing the distribution medium for high-definition TV (HDTV).

Many engineers believe it will be impossible, however, for any future generation to achieve lift-off as a distribution medium. There will be fewer and fewer opportunities for the disks to be used as packaged media, and they are expected to gradually vanish from view, for most consumers. It is even possible that today's next-generation optical disks may be ensnared in this trend. Optical disks will instead develop new applications in long-term data storage for networked systems.

Rupley, Sebastian (08/03/05). The Net's Next 10 Years. PC Magazine. Accessed 8/28/05: http://www.pcmag.com/article2/0,1759,1843287,00.asp. Quoting from the Website:

Just ten years after Netscape went public, the Internet has woven its way into the social fabric and the rhythms of our personal and working lives. In fact, most of us find it impossible to imagine life without the Web. For one thing, it would be less efficient, in both easily measurable and intangible ways. We would be less connected with family and friends, and less tuned in to the world at work. Look for much more to come over the next ten years, both positive transformations and potentially devastating changes.

Rogers, Michael (12/13/06). What will replace the laptop? Tiny handheld computers could be the next big thing. Retrieved 12/14/06: http://www.msnbc.msn.com/id/16042808/. Quoting from the article:

Laptop computers now outsell desktops in stores; for the majority of consumers, the smaller devices serve perfectly well as their main computer. That would have been hard to imagine back in the mid-Eighties, when IBM and Apple introduced their first primitive laptops, each weighing in at about twelve pounds. And for long after that, laptops remained either much less functional or much more expensive than the dominant desktops. But now, twenty years later, the laptop has truly become the “desktop replacement.”

A decade or two from now, will there be a laptop replacement? Will an even smaller, more mobile device — perhaps something the size of today’s smartphone — replace the laptop in the lives of consumers? While it’s once again hard to imagine, in fact much of the technology already exists to make it so.

Roth, Mark (April 23, 2006). Experts see computers getting bigger and smaller at the same time. Celebrating 50 years of computing at CMU. Pittsburgh Post-Gazette. Accessed 4/26/06: http://www.post-gazette.com/pg/. Quoting from the article:

GETTING BIGGER. Rick Rashid, the head of Microsoft Research and a former Carnegie Mellon professor, said it's now possible to buy a terabyte of computer memory for about $700.

A terabyte, 1,000 gigabytes, is enough memory to "store every conversation you ever have from the time you're born until you die," Dr. Rashid said, or a full year's worth of full-time video.
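Rashid's claim is easy to check with a back-of-the-envelope calculation. The bitrate and hours-per-day figures below are my own assumptions, not his:

```python
BITS_PER_SEC = 8_000    # assumed telephone-quality speech codec bitrate
HOURS_PER_DAY = 2       # assumed time spent in conversation each day
LIFETIME_YEARS = 80

seconds = HOURS_PER_DAY * 3600 * 365 * LIFETIME_YEARS
total_gb = BITS_PER_SEC * seconds / 8 / 1e9  # bits -> bytes -> gigabytes
print(round(total_gb))  # about 210 GB, comfortably inside a terabyte
```

Even doubling both assumptions stays under a terabyte, so the claim holds for compressed audio (though not for video, which is why Rashid quotes only a year of full-time video).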

Sayer, Peter (January 25, 2005). Computer scientists identify future IT challenges. Infoworld. Accessed 6/26/06: http://www.infoworld.com/article/05/01/25/HNfuturechallenges_1.html. Quoting from the article:

Goals for IT include harnessing the power of quantum physics, building systems that can't go wrong.

These seven challenges are, in some sense, a forecast as well as targets for inventing the future. The idea of a grand challenge seems closely related to forecasting and inventing the future.

SEOUL (ThomsonFinancial) (*/7/07). SKorea draws up code of ethics—for robots. Retrieved 8/8/07: http://www.abcmoney.co.uk/news/072007114733.htm. Quoting from the article:

South Korea, at the forefront of the drive to develop robots which can do anything from guarding the border to caring for the elderly, is now drawing up a code of ethics for them.

The nation, which has set an ambitious goal of a robot in every home by 2013, has launched a project to write what it believes will be the world's first Robot Ethics Charter. It will be released by year end.

'We are setting rules on how far robotic technology can go and how humans live together with robots,' said Kim Dae-Won, a professor at Myongji University who heads a team of 12 scientists, doctors, psychologists and robot developers.

'A society in which robots and humans live together may come faster than we think, probably within 10 years.'

Stein, Rob (July 6, 2005). Video Robots Redefine 'TV Doctor.' Machines Let Physicians Make Rounds From a Distance. Washington Post. Accessed 7/9/05: http://www.washingtonpost.com/wp-dyn/content/article/2005/06/27/AR2005062701842.html. Quoting from the article:

Robots are turning up in more medical roles. Some help surgeons perform procedures, especially those requiring extreme precision. Others ferry supplies and equipment around hospitals and even dispense medication. Pittsburgh researchers are testing the Nursebot to lead nursing home residents to physical therapy sessions and remind them to take their medicine. GeckoSystems Inc. of Conyers, Ga., plans to soon begin marketing its CareBot to help nurses, doctors and relatives monitor and care for the elderly at home.

Face-to-face encounters between doctors and patients are increasingly giving way to technology in other ways, with the goal of avoiding frustrating telephone tag, long drives to the office and time wasted sitting in waiting rooms.

Physicians are turning to e-mail to reach and respond to patients. Hospitals, clinics and doctors groups are setting up secure Internet portals allowing patients and doctors to consult electronically.

Stitt, Jason (4/19/06). Next generation simulations? Not on the Internet you're used to. Technology Network. Accessed 4/26/06: http://wistechnology.com/article.php?id=2877. Quoting from the article:

Madison, Wis. — A crowd wearing paper goggles gasped frequently as they watched a human cadaver being dissected – in a 3D image sent live from La Crosse to Madison.

It wasn't just a display of graphics technology that could form the foundation for educational software on anatomy. It was also a demonstration of WiscNet, the high-bandwidth alternative to the common Internet that allowed an audience in the Monona Terrace in downtown Madison to watch a real-time demonstration of software running at the University of Wisconsin-La Crosse, which developed the 3D visualization technology in a partnership with Stanford's National Library of Medicine.

Sun Microsystems (n.d.). Starfire: A Vision of Future Computing. Retrieved 12/06/06: http://www.asktog.com/starfire/starfireHome.html.

Tummala, Rao R. (June 2006). Moore's Law meets its match. Spectrum Online. Retrieved 7/8/06: http://www.spectrum.ieee.org/jun06/3649. Quoting from the article:

By 2010, the "More Than Moore's Law" movement—which focuses on system integration rather than transistor density—will lead to revolutionary megafunction electronics

Remember when combining a camera with a cellphone seemed daring? Or adding a cellphone to a PDA? Such technical tricks relied on Moore's Law, which holds that the number of transistors on an IC doubles every 18 months. In the computing world, having more transistors on a chip means more speed and possibly more functions.

But in many cases, those Moore's Law ICs deal with only 10 percent of the system. The other 90 percent is still there, showing up as an array of bulky discrete passive components—such as resistors, capacitors, inductors, antennas, filters, and switches—interconnected over a printed-circuit board or two. Real miniaturization requires something more, and we have it in the system-on-package (SOP) approach we're pursuing at the Microsystems Packaging Research Center at the Georgia Institute of Technology, in Atlanta. SOP leapfrogs well beyond Moore's Law. It combines ICs with micrometer-scale thin-film versions of discrete components, and it embeds everything in a new type of package so small that eventually handhelds will become anything from multi- to megafunction devices.

SOP products will be developed not just for wireless communications, computing, and entertainment. Outfitted with sensors, SOPs could be used to detect all manner of substances, toxic and benign, including chemicals in the environment, in food, and in the human body.

This last application will see the convergence of biology, chemistry, and digital technology to produce capsules small enough to be introduced into the human body to monitor personal health daily. A capsule could be used, for example, to check vital signs and monitor parameters such as glucose levels, blood pressure, and even signs of cancer. The capsule would then wirelessly communicate the person's health status to a Web terminal outside the body or, via the Internet, to a physician (or to anyone, anywhere). Fitted with a reservoir, the capsule could also deliver drugs at programmed intervals to selected places within the body.

Whiting, Rick (May 29, 2006). Businesses mine data to predict what happens next. Information Week. Retrieved 6/2/06: http://www.informationweek.com/showArticle.jhtml;jsessionid=. Quoting from the article:

So what's next? What's next is what's next--the ability to forecast where events are heading, then make informed decisions based on that assessment. Predictive analytics, the scientific name for using a data warehouse as a crystal ball, is where business intelligence is going. It involves running historical data through mathematical algorithms--neural networks, decision trees, Bayesian networks--to identify trends and patterns and predict future outcomes. Will product demand surge? Will a patient relapse? Will a customer take his business elsewhere? Our ability to make such educated guesses is key to improving service, cutting costs, and exploiting new market opportunities.

Blue Cross Blue Shield of Tennessee now predicts the health care resources postoperative patients will need years down the road. The Federal Aviation Administration is identifying links between pilot health conditions and aviation accidents, with an eye toward avoiding them. FedEx anticipates which customers are most likely to respond to a new service or defect to a competitor.
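The quoted passage describes predictive analytics as running historical data through algorithms such as decision trees to predict future outcomes. Here is a minimal, self-contained sketch of that idea: it fits a one-rule "decision stump" (the simplest relative of a decision tree) to hypothetical historical customer records, then uses the learned rule to predict whether a new customer will leave. The data, field names, and helper functions are all illustrative assumptions, not from the article.

```python
# Sketch of predictive analytics with a one-rule "decision stump".
# All data and names below are hypothetical, for illustration only.

def fit_stump(records, feature, label="churned"):
    """Find the threshold on `feature` that best separates the labels.

    Predicts True (e.g., "will churn") when the feature value is at or
    above the threshold. Returns (threshold, training accuracy).
    """
    best = (None, -1.0)
    for t in sorted(r[feature] for r in records):
        correct = sum((r[feature] >= t) == r[label] for r in records)
        acc = correct / len(records)
        if acc > best[1]:
            best = (t, acc)
    return best

def predict(threshold, feature, record):
    """Apply the learned rule to a new record."""
    return record[feature] >= threshold

# Historical data: support calls per month vs. whether the customer left.
history = [
    {"support_calls": 1, "churned": False},
    {"support_calls": 0, "churned": False},
    {"support_calls": 5, "churned": True},
    {"support_calls": 7, "churned": True},
    {"support_calls": 2, "churned": False},
]

threshold, accuracy = fit_stump(history, "support_calls")
will_churn = predict(threshold, "support_calls", {"support_calls": 6})
print(threshold, accuracy, will_churn)
```

Real systems use far richer models (neural networks, full decision trees, Bayesian networks, as the article notes) and far more data, but the workflow is the same: learn a rule from history, then apply it to new cases.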


Second Edition Table of Contents and Preface (HTML)

Moursund, D.G. (June 2005). Planning, Forecasting, and Inventing Your Computers-in-Education Future

Don't worry about what anybody else is going to do. …
The best way to predict the future is to invent it. (Alan Kay)

David Moursund
Second Edition, June 1, 2005.


Preface 2
Chapter 1: An Invitation 5
Chapter 2: Inventing the Future 10
Chapter 3: Some General Background Information 17
Chapter 4: The Art and Science of Planning 31
Chapter 5: Art and Science of Forecasting 40
Chapter 6: The Future, Writ Large 56
Chapter 7: Forecasts for ICT as Content in Non-ICT Disciplines 71
Chapter 8: Forecasts for Computer-Assisted Learning and Distance Learning 86
Chapter 9: Inventing the Future for an Individual Classroom Teacher 94
Chapter 10: Summary and Concluding Remarks 102
Appendix A: Technology 108
Appendix B: Goals of Education in the United States 111
Appendix C: Goals for ICT in Education 114
Appendix D: Miscellaneous Unused Quotations 120
References 122
Index 126

These materials are Copyright (c) 2005 by David Moursund. Permission is granted to make use of these materials for non-commercial, non-profit educational purposes by schools, school districts, colleges, universities, and other non-profit and for-profit preservice and inservice teacher education organizations and activities. Additional free materials written by David Moursund are available at http://darkwing.uoregon.edu/~moursund/dave.
The first edition of this book was published on January 31, 2004. In this second edition, all of the references have been brought up to date and a number of references have been added. Many small corrections have been made. A small amount of material has been deleted and/or replaced by more recent material, and about eight pages of other new materials have been added.


Great deeds are usually wrought at great risks. (Herodotus, fifth century BCE)

When you are up to your neck in alligators, it's hard to remember
the original objective was to drain the swamp. (Adage, unattributed)

It would be a “great deed” to substantially improve our educational system. I strongly believe that our education system can be a lot better than it currently is. Indeed, I predict that during the next two decades we will substantially improve our educational system. In this book, I enlist your help in making this prediction come true.

The focus in this book is on two aspects of improving our educational system:

  1. Improving the quality of education that K-12 students are receiving.
  2. Improving the professional lives of teachers and other educators.

This book is mainly designed for preservice and inservice teachers and other educators. If you fall into this category, you will find that this book focuses on the possible futures of Information and Communication Technology (ICT) in education and what they mean for you. It will do this by:

  • Helping you make and implement some ICT-related decisions that will likely prove very important to you during your professional career in education.
  • Helping you to increase your productivity and effectiveness as you work to improve the quality of education being received by your students.

A second audience for this book is individuals and stakeholder groups that represent schools, school districts, and other educational organizations. This book is designed to help such audiences carry out long-range strategic planning for ICT in their organizations. The goal is to help improve the productivity and effectiveness of our education system as it works to improve the quality of education of the students it serves.

Formal school-based educational systems have existed for more than 5,000 years (Vajda, 2001). During this time, the goals of education have changed to meet the changing needs of our societies. Many of the changes have been driven by changes in technology and science. Some have been developed by educational practitioners, while others have been developed by educational researchers.

Here are two models of educational change:

  1. Continuous improvement model of small, incremental changes. All teachers are familiar with this, as they continually learn on the job and try new things to better meet the needs of their students.
  2. Paradigm shift model of discontinuous jumps. The invention of reading and writing, and later the mass production of books, were each paradigm shifts that greatly changed education (Printing Press, n.d.). In this book, we will examine some possible ICT-based paradigm shifts in education.

Science and technology are currently progressing at an unprecedented pace, and Information and Communication Technology is one of the most rapidly changing areas of technology. Over the past several decades, the capabilities of ICT hardware systems (computer speed, primary memory size, storage capacity, bandwidth) have been doubling every 1.5 to 2 years. Current estimates are that this rapid pace of change may well continue for another 10 to 15 years or so. (Remember, a doubling in two years means as much additional change in those two years as in all previous years put together, and five doublings is a factor of 32.)

Notice how we have “slipped in” a forecast or prediction for the future. Suppose that this forecast proves to be accurate. Then today's toddlers will reach adulthood in a world where ICT systems are perhaps 100 to 1,000 times as powerful as they are today. What might this suggest we should be doing during the years of formal education these students will be receiving? Will schools be the same 15 to 20 years from now as they are now?
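The arithmetic behind this forecast can be checked with a short calculation. The 1.5-to-2-year doubling periods and the 15-to-20-year horizon come from the paragraphs above; the sketch below simply compounds them (an illustrative calculation, not part of the book's text).

```python
# Compound the doubling rate quoted above: capability doubling every
# 1.5 to 2 years, projected 15 to 20 years into the future.

def growth_factor(years, doubling_period):
    """Total growth after `years` of doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Slowest case (15 years, 2-year doublings) gives roughly 180x;
# ten doublings (20 years at 2, or 15 years at 1.5) give 1,024x.
for years, period in [(15, 2.0), (20, 2.0), (15, 1.5)]:
    print(years, period, round(growth_factor(years, period)))
```

The results span roughly 180x to about 1,000x, which is where the "100 to 1,000 times as powerful" estimate comes from.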

It is easy to make forecasts or predictions about the future. However, it is not so easy to make predictions that are backed by careful analysis of current situations, trends, an understanding of change processes, and so on.

Here is a 1997 quote from Peter Drucker, one of the leading gurus of business management during the past half century:

Thirty years from now the big university campuses will be relics. Universities won't survive. It's as large a change as when we first got the printed book. Do you realize that the cost of higher education has risen as fast as the cost of health care? ... Such totally uncontrollable expenditures, without any visible improvement in either the content or the quality of education, means that the system is rapidly becoming untenable. Higher education is in deep crisis... Already we are beginning to deliver more lectures and classes off campus via satellite or two-way video at a fraction of the cost. The college won't survive as a residential institution (Forbes 10 Mar 97).

Notice the 1997 date on this prediction. If you have been paying attention to higher education over the past eight years, you will have seen a number of developments consistent with this forecast. The most obvious is the growth of Distance Learning, with more and more higher education and precollege learning opportunities being made available in this environment. But other important changes are under way as well, such as higher education students now making more use of the Web than of "traditional" libraries as an information source, and most college students both owning and regularly using a microcomputer. In addition, most institutions of higher education face steadily growing financial problems and steadily growing competition for students and grant funding.

What do you think might happen in precollege education during the next couple of decades? ICT has proven to be an aid to solving problems in every academic discipline. It is obvious that ICT is a powerful aid to helping to accomplish a wide range of educational goals. Moreover, ICT has created new challenges or goals in our educational system, such as that of providing students with appropriate education in this new field.

Whether you like it or not, your professional career in education is being affected by ICT, and the effect will grow over time. You can view the rapid growth in the education-related capabilities of ICT as presenting you with both major challenges and major opportunities. In either case, you can begin planning for what now exists and for what will exist.

This book is about forecasting and inventing your personal future in the field of Information and Communication Technology (ICT) in PreK-12 education and in teacher education. In this book, the word "your" covers both the reader and any organization the reader is involved with. This book will help you plan for some of the ICT aspects of your future as a professional educator.

David Moursund
June 2005