Welcome to the World of the External Mind, where the amount of information we attempt to process every day far exceeds the limitations of internal memory, both computer and human. The quote above, from Lars Arge, describes the "outsourcing" of memory in computer-based applications, but it could just as easily be referring to the outsourcing of human memory to external systems such as BlackBerrys, Google, Global Positioning System devices and any of the many other devices we use to navigate through work and life.
In this article we will look at outsourcing both memory (stores of data, information and knowledge) and thinking (calculating or analyzing). Humans have used various tools throughout history to off-load these processes, but modern technology gives us an unprecedented opportunity to avoid both remembering things and thinking about them.
I wonder: Is this a good thing?
The human brain is the most complex thinking machine on the planet. Humans invented languages, writing, metallurgy, architecture, algebra, calculus, chemistry and atomic theory — all without the aid of modern computers. However, while human beings are the ultimate thinking machines, how many of these advances did we make because we could store long-term memory in external media like books?
Today, with the explosion of external memory media and calculating tools, will these technologies free our brainpower to achieve the next great leaps in human achievement, or just make us lazy?
We have had external memories for a long time: cave paintings, hieroglyphics, stone tablets, scrolls and books. But, as Professor Arge points out, the restrictions of the input/output interface (which applies to humans as much as it does to computers) limit how much information we can access, remember and retrieve at one time. For example, to read six books at once, you need copies of all six books and a big table to put them on. Creating a book was a difficult task before the advent of the printing press, word processor and digital memory. Typewriters improved the process a little, but an author was still constrained by the inability to store information.
Outsourcing thinking began with calculators. Humans have off-loaded math to the abacus, slide rule and digital calculators. The first two devices took a certain amount of skill and training; you still had to know something about math to use them.
The calculator, however, is so simple that preschoolers can punch in 3 + 2 and the equal sign and get 5. They may not understand the mechanics of addition, but they can get the correct answer. If students use a calculator throughout elementary school, they will never need to memorize multiplication tables or work through more complex forms of mathematics by hand. The calculator will provide the answer, but will students learn to think quantitatively as a result?
In a more complex example of outsourcing calculations, World War II submarines carried a device called the Torpedo Data Computer (TDC) Mark III, an electromechanical analog computer used to solve torpedo targeting problems. The TDC Mark III consisted of two sections. The first was a position keeper that kept track of the position of the target. The second was an "angle solver" that adjusted the gyro of the torpedo to focus on the predicted position of the target based on information from the position keeper.
Even with the TDC, some World War II submarine skippers apparently insisted on calculating their own firing solutions, trusting their ability to do the math over the Navy's Bureau of Ordnance, which also provided their problematic Mark 14 torpedoes. It likely took a competent submariner, with a slide rule (which was invented around 1620), a straight-edge, and pencil and paper, about the same time to calculate a firing solution as it did to load all the target data into the TDC and get a result.
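In its simplest form, the kind of intercept problem both the skipper's slide rule and the TDC's angle solver worked out reduces to the law of sines: the torpedo and the target must reach the same point at the same time. The sketch below is my own simplification, not the TDC's actual mechanism (the real device continuously integrated bearing, range, own-ship motion and more); the function name and numbers are illustrative only.

```python
import math

def lead_angle(target_speed, torpedo_speed, angle_on_bow_deg):
    """Deflection (lead) angle for a straight-running torpedo.

    Torpedo and target must arrive at the intercept point at the
    same time, so by the law of sines:
        sin(lead) / target_speed = sin(angle_on_bow) / torpedo_speed
    Speeds may be in any consistent unit (e.g., knots).
    """
    ratio = (target_speed / torpedo_speed) * math.sin(math.radians(angle_on_bow_deg))
    return math.degrees(math.asin(ratio))

# A 10-knot target seen broadside (90 degrees on the bow) by a
# 46-knot torpedo calls for roughly a 12.6-degree lead.
```

A few lines of trigonometry a practiced navigator could run on a slide rule, which helps explain why some skippers trusted their own pencil work over the machine.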
Maybe it was a control issue, maybe the sub commanders just wanted to be able to confirm the computer’s results themselves — or maybe they just felt that if the machine failed they should be able to do their own calculating.
Today, the input/output barriers to accessing and duplicating information have almost vanished. We can open six documents in a Web browser and cut and paste their entire contents in a few minutes. Calculators are not just used for balancing checkbooks. Digital computers using sophisticated algorithms control complex systems, like nuclear power plants, aircraft in flight and investment strategies, with only very high-level inputs from human operators. Now weapons locate and track targets without human intervention. It is a different world and has been since the digital revolution.
The Memory Effect
As little as 20 years ago, we looked upon people with specialized knowledge as gurus from whom we sought enlightenment on a multitude of topics. There was always someone we knew who could provide exhaustive, exclusive data on baseball statistics, do-it-yourself home projects, geography, travel, wine or health care. They ranged from hobbyists to professionals, and all had obtained their knowledge through years of study and experience. They were “walking books,” much like the book people in Ray Bradbury’s classic science fiction novel, Fahrenheit 451.
Now, there is little need to remember much of anything ourselves as long as we have access to digitized information. We can find what we need by going to any number of Internet search sites that will bring up dozens, hundreds or even thousands of references containing information on any subject. We can have a GPS device in our car that will not only show us a map of where we are going, but will tell us when to make the turns.
In 1986, I bought my first digital external memory device, a Casio Databank watch. Like most digital watches of the day, it told the time and had a stopwatch. However, it also had a calculator and an internal memory that could hold 50 “pages” of information such as telephone numbers and calendar dates.
For the first six months, I loved it. I spent several hours playing with the databank function, entering phone numbers and appointments. The watch reminded me when to go to appointments and gave me a handy (pardon the pun) reference if someone asked me if I was free at a particular time. I like to think that people were in awe of my technological superiority, though it is more likely that they were internally shaking their heads at my hopeless geekiness.
The problems were subtle at first. I started having trouble remembering phone numbers. Then I started forgetting events that were not entered in the databank. It finally occurred to me that memory, like muscle, must get periodic exercise to remain healthy. But I could not quite bring myself to give up my gizmo until the battery died. At that point I put the watch in a drawer and set about recapturing my ability to remember things.
Since then I have flirted off and on with various external memories, including a Day-Timer, Palm Pilots, and lately with a BlackBerry. I have had to accept that resistance is futile. My brain’s limited long-term memory simply does not have enough room. Much like the characters in the classic Japanese anime series, Ghost in the Shell, I have outsourced useful information that I need infrequently to “external memories.”
In 1986, a television sitcom called ALF appeared. A.L.F. referred to “alien life form,” the show’s title character, who fled the destruction of his planet and crash landed his spaceship in suburbia. Does he share the fruits of superior alien technology with the human race? Does he bring philosophical enlightenment to humanity? Does he become a beacon of justice and liberty fighting crime with incredible superpowers or technology?
Nope. He could drive his spacecraft but had no idea how it worked, much like most people can drive a car but have no idea what goes on inside an internal combustion engine. ALF is an example of what can happen when technology becomes so advanced that you don’t need to be strong, fast, nimble, alert or smart to survive and prosper.
A much-repeated concern among scientists, academics and industry leaders is the decline in the number of college students pursuing engineering and higher math degrees. In contrast, we are seeing an increase in degree programs devoted to using existing computing skills, like video game design or graphic arts. There will still be people working on the next generation of tools, but if college course registration is any indication, we appear to be shifting from innovators of computing technology to mere users.
Forty years ago, if you wanted to use a computer, you needed to know how to program it yourself. There were no operating systems or standard office software suites. Today, you pull a PC out of a box and plug it in.
Today, we have become as dependent on an uninterrupted flow of free information as we are on clean water and electricity. However, how much does the average person turning a tap, flipping a light switch or entering search terms into Google know about water systems, electrical grids or page ranking? We trust that our water is potable, that electrical appliances are safe, and that the results we get from Google are accurate, without always knowing what it takes to deliver those commodities.
Companies like Microsoft, Apple, Google and their competitors work hard to insulate users from the complexity of the systems we use, on the theory that consumers want easy-to-use computing solutions. The theory has proved true. Classic examples are MS-DOS, the first relatively universal operating system for PCs, which made desktop computing easier for a generation of new users, and America Online, the company that opened the Internet to the average user. This freedom from repetitive and time-consuming computing steps allows us to spend more time working or finding information.
I’ll repeat my earlier question: Is this a good thing?
In the sense that we are now free to spend more time working with tools and less time developing them, the answer is yes. Advances in technology free us to be more productive, give us access to more information than ever before and allow us to communicate instantly with people around the world.
However, the real answer lies in what we do with this potential for productivity. Extra time can be squandered playing endless “Halo 3” death matches instead of designing the next great game. Reading and parroting someone else’s work are not the same as doing your own research and analyzing the results. Likewise, seeing the world through a computer monitor is no replacement for experiencing life firsthand.
In short, are we taking advantage of technology to be productive or merely busy? We may well be on the slippery slope to ALFdom where advanced technologies are concerned.
Closing the Skill Gap
British philosopher Alfred North Whitehead said, “Without adventure, civilization is in full decay.” In that vein, here are a few bits of advice to put some adventure back in computing.
Maintain control. A computer will only do what you tell it to do, or what someone else has programmed into the system. The reason so many people get their computers infected with various types of malicious software is that they either do not know enough about their computers to lock them down or do not recognize potential malware when it asks them to click on it.
You do not have to become a microchip engineer or be able to decompile and recode your operating system, but you should know enough to do basic things like set system performance preferences and use most functions without, for example, having to resort to the manual (or your teenage offspring) when you want to change the ringtone on your cell phone.
Smell the flowers, but check the roots. When you use a Web search engine, you should have at least a basic understanding of how that engine returns results. Not all searches are equal; some engines will favor certain results over others. If you want to test this, pick five search engines at random, enter the same search terms (at least five words) and see if they give you the same results. Hint: Avoid any engine that refers you to more shopping sites than informational sites.
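If you want to make that comparison more than an eyeball test, one rough way to score it, once you have copied down the top URLs from each engine, is simple set overlap. The function name and the choice of metric here are mine, a sketch rather than any standard tool:

```python
def overlap(results_a, results_b):
    """Fraction of URLs two result lists share (Jaccard similarity).

    1.0 means the engines returned identical result sets;
    0.0 means they had nothing in common.
    """
    a, b = set(results_a), set(results_b)
    if not (a | b):  # both lists empty: nothing to compare
        return 0.0
    return len(a & b) / len(a | b)

# Two engines agreeing on only two of four distinct results:
# overlap(["a.com", "b.com", "c.com"], ["b.com", "c.com", "d.com"])
# scores 0.5.
```

Low overlap across engines on the same query is a quick signal that each engine's ranking choices, not the query, are shaping what you see.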
Look behind the curtain. There was at one time a theory that stated that a million monkeys banging randomly on keyboards would randomly recreate the works of Shakespeare. The Internet has pretty much disproved this theory, with many more than a million monkeys and without the random banging. Someone, somewhere, wrote every page on the Internet. Some of these keyboard bangers are more competent than others. Some just copy (monkey see, monkey cut and paste) existing information.
Unless we can look behind the information presented and judge the credibility of the source, we may not recognize the difference between valuable information and junk.
Behave like someone is always watching you. Why? Because they probably are. A recent poll by the Samuelson Clinic at the University of California, Berkeley, and the Annenberg Public Policy Center at the University of Pennsylvania came to the following two conclusions:
Every single move you make online can be, and often is, tracked by online marketers and advertising networks that gather and use the information to serve up targeted advertisements.
The average American consumer is largely unaware that such tracking goes on, of the extent to which it happens, or of how exactly the information is used.
And that does not count your own organization’s network security team or any hostile groups whose sole goal in life is to hack into a U.S. government system. Thinking about security may seem a bit surreal while sitting in a nice, safe, climate-controlled building, but once you turn on your browser you have entered the wilderness. Try not to behave like prey.
Don’t lose your mind. Or your laptop, flash drive, address book or that CD with the Social Security numbers of everyone in your organization — having external memories means keeping track of them!
Do your own thinking. A computer is really a big, dumb electronic calculator. A Web search engine knows where lots of things are, but not what is of value in any particular situation. Ultimately, the human mind is the power source that drives the Information Age, not the technology.
The most egregious example of non-thought: relying on a spell checker to proofread what you write. Just because a word is spelled correctly does not mean it is the correct word (as the CHIPS editors gently remind me periodically).
In 1967, Arthur Koestler wrote The Ghost in the Machine. One of Koestler's central themes was that as the human brain evolved, it built upon earlier, more primitive brain structures, and that these primitive components are the "ghost in the machine" that affects our higher cognitive functions.
The development of computing technology has similarly added another, technology-based layer to the human brain. We have computers calculating numbers so large they defy human comprehension; storage media that can hold all the books ever written many times over; radar, sonar, satellites and telescopes that extend our senses; worldwide communications systems that connect us with anyone, anywhere, instantly. We are the ghosts in this marvelous machine, driven by our brainpower.
But thinking, like any skill, takes practice. Keep learning new skills: play a musical instrument, take up astronomy, study Machiavelli, do your income taxes without a tax preparation application, anything that requires more than just knowing how to punch in numbers and read a result. No matter how advanced technology becomes, our systems can only do what we tell them to. Our limits become theirs.
Raise the bar.
Until next time, Happy Networking!
Long is a retired Air Force communications officer who has written regularly for CHIPS since 1993. He holds a Master of Science degree in Information Resource Management from the Air Force Institute of Technology. He is currently serving as a telecommunications manager in the U.S. Department of Homeland Security.
The views expressed here are solely those of the author, and do not necessarily reflect those of the Department of the Navy, Department of Defense or the United States government.