2008-12-06

Life Imitating Computers: The Evolution of Human Thinking

I have been working on this short essay for a long time, sorting out my thoughts on the issue and trying to convey what I'm thinking in a clear and concise manner. I sincerely hope that you enjoy it, find it insightful, and do not think of me as a ranting lunatic after reading it.

For thousands of years, mankind has relied on oral history to pass along anecdotes, stories of our past, lessons learned, and any other bit of collective knowledge that societies felt necessary to preserve in order to facilitate the survival of the species - explicitly or otherwise. Recognition of this benefit is largely what has enabled humans to thrive in societies that chose wisely which knowledge to pass along, and it has given rise to such constructs as "conventional wisdom," old wives' tales, fables, stories, and even religion. Though it was initially feared as a challenge to this status quo of knowledge transfer, Gutenberg's invention of the printing press around 1439 turned out to be an amplification of these constructs - an argument reinforced by the first major book to come off the press, the Bible, and borne out over time. It was the mother of all evolutionary inventions in man's history up to that point.

While the pairing of the printing press and widespread literacy opened the door of knowledge to many more of our species, the spread of and access to this information was still spotty and slow. It had been, and still was, necessary for us to keep much of the knowledge needed to process information and analyze our own lives, surroundings, and society in our collective heads for daily use. This was the driving need behind the continuity of our legacy constructs: while we could gain and share knowledge far more easily, to leverage it in a practical sense we still had to be able to hold it in our heads. Natural selection had equipped us to store knowledge easily in terms of these constructs, and so our conventional mechanisms for passing knowledge between generations survived, and even thrived, under this new regime of record-keeping.

Computer systems also have a problem of information access, and various components have been developed to address it. ENIAC, and early computers like it, had to store the information being processed in the "processor" itself. There was only one type of memory - essentially, banks of flip-flops - and this single mechanism had to serve all types of data, limiting computation to whatever could be crammed into that expensive memory. Later, slower "core" memory was developed: data that had to be operated on immediately still had to live in registers on the processor itself, while data that did not could be swapped out to the slower, larger core memory. Modern computers have many levels of memory, from registers that operate at the speed of the processor, to multi-level on-chip caches from which the registers are populated, to RAM which holds necessary but less immediately accessed data, to disk which holds infrequently accessed data. Along with these evolutions in how data is stored have come evolutions in how to leverage the hierarchy most effectively, including predictive algorithms for caching and swapping data from slower to faster storage to minimize execution delays due to memory access.
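To make the caching and swapping idea concrete, here is a minimal sketch in Python of a two-level memory with least-recently-used eviction: a small, fast tier backed by a larger, slow tier, where data is pulled into the fast tier on demand and the coldest entry is pushed back out when space runs out. It is not a model of any particular machine, and the class and names are my own illustration, not real hardware terminology.

    from collections import OrderedDict

    class TwoLevelMemory:
        """Toy two-level memory: a small fast cache in front of a large slow store."""

        def __init__(self, cache_size):
            self.cache_size = cache_size
            self.cache = OrderedDict()   # fast tier (think registers / on-chip cache)
            self.slow_store = {}         # slow tier (think core memory / disk)

        def read(self, address):
            if address in self.cache:
                # Cache hit: mark this entry as most recently used.
                self.cache.move_to_end(address)
                return self.cache[address]
            # Cache miss: fetch from the slow store (the expensive step on real hardware).
            # Unknown addresses simply come back as None in this sketch.
            value = self.slow_store.get(address)
            self._install(address, value)
            return value

        def write(self, address, value):
            self._install(address, value)

        def _install(self, address, value):
            self.cache[address] = value
            self.cache.move_to_end(address)
            if len(self.cache) > self.cache_size:
                # Evict the least-recently-used entry back to the slow store.
                old_address, old_value = self.cache.popitem(last=False)
                self.slow_store[old_address] = old_value

Real hardware and operating systems use far more sophisticated policies than this (prefetching, set associativity, working-set estimation), but the shape of the problem is the same: keep what you are about to need in the fast tier, and let the rest fall back to the slow one.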

Just as slower, larger memory was developed to support computation in our modern computers, we have collectively invented a revolutionary tool known as the internet. As the ready availability of data to mankind increases, we are forced to rely less and less on our conventional (and less accurate) mental constructs, just as computers came to keep only a small portion of their data and instructions within ready access of the CPU. In both cases - computers and mankind - the set of available information has increased exponentially. When performing tasks, we now have a wealth of information that does not have to be at our fingertips, or at the front of our minds, to be put to use in a reasonable amount of time. We read things on the internet, do our research in a few minutes, and - if necessary - remember it so we can perform the task more quickly the next time. We may "swap out", or forget, something that we previously needed on a regular basis, confident that if we need it again later we will be able to find it. This is a rudimentary memory management algorithm, adapted to human nature.
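This "forget it, look it up again when you actually need it" pattern maps loosely onto bounded memoization. As a toy illustration - the look_up function and its simulated research delay are hypothetical stand-ins, not a real search API - Python's standard functools.lru_cache does exactly this bookkeeping: it remembers recent answers, forgets the least recently used ones when it runs out of room, and quietly re-fetches anything it has forgotten.

    from functools import lru_cache
    import time

    @lru_cache(maxsize=128)          # keep only the 128 most recently needed answers
    def look_up(question: str) -> str:
        # Stand-in for "go search the internet": slow the first time,
        # instant on repeat requests while the answer stays cached.
        time.sleep(0.1)              # simulate the cost of re-researching
        return f"answer to {question!r}"

    look_up("when was the printing press invented")   # slow: a cache miss
    look_up("when was the printing press invented")   # fast: remembered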

All of this raises important questions that mankind will need to reckon with in the not-too-distant future. How might this revolution in the very essence of our thinking change our constructs? How will fictional literature be affected? Will we still tell our children stories? How will religion survive? Can computer memory management techniques be adapted by psychologists to train humans to leverage new tools like the internet more effectively? Does this evolution leave us vulnerable should we somehow "lose" the tool through war, or through a regression in civilization like the one that followed the fall of the Western Roman Empire? These questions will be answered, implicitly or explicitly, in the coming generations. How we answer them, and how we resolve the inevitable conflict between question and answer, will shape nothing less than the future of our species. It is essential that we recognize the existence and significance of these questions now if we hope to answer them as a civilized society, rather than through war or the deterioration of our hard-won civilization.

Research recognizing that technology is fundamentally changing ourselves and our society is now being highlighted by mainstream media outlets. USA Today recently published an article discussing technology's impact on our social interactions. Closer to the point I make above is this article discussing how surfing the internet alters how one thinks. The latter seems to imply that this model of cognition will be more efficient than our legacy constructs, suggesting that those who are able to leverage it will be ahead of others intellectually and socially in future generations.
