Is Anybody Home?

Monday, July 9, 2012

By Stan Flink
 
It was suggested that I provide some kind of follow-up to an alumni lecture I gave at Yale University in the spring of 2010. One idea in particular seemed to engage considerable interest—the difference between “the mind” and “the brain.” The plausible explanations interest me because they directly affect how we look at education and journalism, the two fields in which I have worked most of my life. What is more, the issue that has been transformational in each field is the inescapable influence of digital technology. Put another way—will the Internet change the way we think? Will it diminish our capacity for critical reasoning? Will it alter the cornerstone of our identity, the mind, even as it expands our capacity to process more and more information?
 
To adequately examine those questions in a brief article is impossible, but let me try to open the door.
 
According to Plato, his teacher—a chunky, heroic war veteran named Socrates—was opposed to handwriting. Socrates, who earned his living as a carver of tombstones, had developed a singularly dedicated curiosity about knowledge—demonstrable and logical understanding of the world around him. He met with groups of young Athenian men in discussions of philosophy and politics, employing a method called dialectics. Dialectics consisted of cross-examination. He would offer a question to his young followers, listen to their answers, and then ask more questions that might deflate some arguments and build on others. He was seeking truth and logic in the process.
 
"Knowledge,” he said, "is virtue.”
 
"The knowers,” Plato extrapolated, "should be the rulers.” They would need no laws or legislatures, because their virtuousness assured wise and just policies.
 
In that context, Socrates resisted writing. He never wrote anything himself; what little we know of him, we learn primarily from Plato’s writing. In Plato’s dialogue Phaedrus, presumably a conversation between Socrates and a student, Socrates observes that writing will weaken memory and produce a “semblance” of wisdom, “but no real judgment.” Socrates feared that written words would become a substitute for what we used to carry around in our heads. The oral tradition was then at its peak—in the fifth century B.C.
 
Memory is the issue, and in examining the function of brain and mind now in the digital age, memory is still an issue. Do we need much memory if we have the massive data banks available on computer?
 
Aristotle, Plato’s most eminent student, like his master placed great value on education, but he believed no one could know it all. There were no “Philosopher Kings” in Aristotle’s canon. Law was the one “Just King,” and laws would be written by educated men in the best interest of all the citizens of any state. A good education, in Aristotle’s view, would give each citizen at least an “acquaintanceship” with every branch of knowledge. If government in a viable democracy requires the consent of the governed—and Aristotle believed it must—then education is essential. It feeds the mind.
 
Mind over matter has been a central interest of intellectuals since the birth of scientific inquiry. It begins with written language, which was developed some five thousand years ago but confined for centuries to a small number of scribes whose primary duty was to keep track of the economic and administrative concerns of their rulers.
 
Writing beyond record keeping evolved slowly. The printing press as a device that could be replicated and widely used was not invented until around 1450, by a German named Johannes Gutenberg. He died, destitute and uncelebrated, in 1468, less than two decades after his invention became public. He had also invented the molds for movable metal type, and inks of many colors, which he used beautifully in his first books—elaborate editions of the Bible. But it took more than 150 years after that for the publication of what came to be called novels. Miguel de Cervantes’ Don Quixote, published in 1605, was the first known among them. Newspapers appeared for the first time in 1609.
 
Reaction to the printing press was not entirely favorable. Echoing the apprehensions of Socrates and his students about handwriting, many scholars in the 17th century believed printing would make error permanent. Fear of blasphemy, inaccuracy, or falsehood also animated opposition to the printing press. Any new way of changing traditional practices met resistance, but those that endured could be seminal. The philosopher Francis Bacon, in 1620, declared that printing, the magnet, and gunpowder were the three inventions that had “changed the whole face and state of things throughout the world.” In the 18th century, the poet Alexander Pope called the printing press “a scourge for the sins of the learned,” a sentiment shared by many other writers of the period. In the 19th century, Tolstoy declared in his monumental War and Peace that the “dissemination of printed matter” was “the most powerful of ignorance’s weapons.”
 
For most of human history, large changes in technology and behavior moved incrementally, but two communication developments in modern times have made remarkably swift inroads—television and the Internet. Television entered half of all American homes within its first eight years, during the 1950s. The computer has not achieved that rate of penetration in the roughly thirty years since the debut of the IBM PC, but its influence has grown very rapidly.
 
Meanwhile, it is estimated that half the planet’s population—three billion people, give or take—spends more time watching television than on any other leisure pursuit, and much of that medium can, and will, be streamed onto websites one way or another, as has already begun. The Internet is the most significant of all the new technologies because it can alter how we think. A five-year research project recently completed in England, combining the facilities of the British Library and a consortium of English universities, concluded that:
 
"It is clear that users are not reading in the traditional sense:
Indeed, there are signs that new forms of ‘reading’ are emerging
as users ‘power browse’ horizontally through titles, contents pages,
and abstracts, going for quick wins. It also seems they go on line
to avoid reading in the traditional sense.”
 
Bruce Friedman, a pathologist at the University of Michigan Medical School, who participated in one of the American research projects, reported, “I can’t read War and Peace anymore. Even a blog of more than three or four paragraphs is too long.” The most common finding in a dozen of these research reports by scholars and technicians is the loss of “deep reading.” Digital technology invites jumping to the next site, the next link. Brain studies reveal new configurations, nerve cells making new connections, in the brains of those who are regularly immersed in on-line work. What these new patterns of connectivity mean is a matter of some dispute.
 
The scientists and academics who have conducted these research projects here and abroad believe that the diminution of deep reading erodes deep thinking, and therefore the ability to reason critically—which requires reflection, contemplation, and creativity—all functions of the human mind. At the very least, individual consumers of the Internet must consider the necessity for "due diligence” in their choice of data. When Google offers dozens of citations in regard to a single question, the burden of examining the full range of answers and making choices falls on the individual consumer.
 
A global Web, now virtually in place, is seen by some as a provider of unlimited information services, a gift to minds and brains alike. It is seen by others as an inexorable engine of fragmentation. What it does to independent deep thinking, and how enduring its effects are, has not yet been fully determined, but the evidence clearly points to a growing inability to read extensively and attentively. The digital world seems to exist in a state of perpetual partial attention.
 
I can attest to similar findings anecdotally. The students I have taught for over twenty years are reading books and newspapers less and using digital instruments more. The quality of writing, as seen in their essays, has not deteriorated greatly, but how much content is “lifted” from the computer is difficult to measure. The convenient access to information and analysis provided on-line makes plagiarism, as our generation understood it, seem a minor offense. David Pritchard, a physics professor at M.I.T., observed recently: “The big sleeping dog here is not the moral issue. The problem is that kids don’t learn if they don’t do the work.”
 
Young people who have used computers since kindergarten regard what appears on the screen as being in the public domain. Attribution needs much more attention. Creative thinking, on the other hand, may be changing in form and process. Acceleration is the hallmark. The question not yet answered is, “At what speed does wisdom come to us?”
 
Recently, I divided my class in half. Each week, one group would discuss the assigned reading on a blog created for that purpose, while the other half would conduct the discussion, Socratically, in the classroom. The following week, the two groups would reverse the procedure.
 
My participation, I thought, would be that of a moderator, and I expected the classroom discussion to be the preferred method. As it turned out, that was not the case. All but one student reported that they were far more comfortable working on the computer. They were very frank about two points: 1) On the computer, they were not being observed and could revisit the text, or change their “postings” in response to digital comments from members of their team. 2) They had very little experience over the years with classroom debate or declamation and felt intimidated by “other students” and/or “the professor.” They also confirmed that a computer had been central to their education since childhood.
 
The postings to the student blog were forwarded to me each week and I read them all. These postings were almost always more eloquent and polished than the ad hoc comments made in class. This was understandable in one sense, but disturbing in another.
 
Working in isolation, with the ability to check facts and rewrite commentary, is obviously less challenging than delivering opinions or evaluations in the clamor of classroom debate. But to be intimidated by the prospect of drawing on memory and judgment after reading a text, or of responding spontaneously to the views of classmates, seems astonishing to those of us who grew up without Google. Surprise, however, is not the most important reaction. The deeper concern is for cognition—for the learning process. That concern may be modified over time by evidence of learning in a new and different manner, which is the point of these ruminations.
 
Richard Foreman, a playwright who participated in research on the effects of digital education, wrote: “As we are emptied of our inner repertory of dense cultural inheritance, we seem to be turning into ‘pancake people,’ spread wide and thin as we connect with the vast network of information accessed by the mere touch of a button.” The world of print—the books, magazines, and newspapers that were our primary sources of information for five centuries after the arrival of the printing press—was based, in the words of Professor Neil Postman, on an “emphasis on logic, sequence, history, exposition, objectivity, detachment, and discipline.” The Internet, now overtaking print, puts its emphasis, in the words of Nicholas Carr, a writer and critic of the digital world, on “immediacy, simultaneity, contingency, subjectivity, disposability, and, above all, speed.”
 
Therein, I think, lies a major dilemma for the future of education, journalism, and a healthy democracy. Computer scientists, in the early days of their explorations, viewed the Web as a tool that would enhance the power of the brain. In theory, the human mind would shape that enlarged intelligence through deep thinking and critical reasoning. If, however, the mind is not dominant, the ultimate question is, Do we become tools of our tools?
 
At stake is the quality of education, journalism, and what is called “the national conversation.” Either we master the World Wide Web without losing our ability and appetite to interpret texts and make connections propelled by memory and thought—the core of a liberal arts education—or we slip into dependency on the “instantly available” data of the net.
 
What may become unavailable is the educated, articulate personality who can write with irony and nuance, employ evocative literary allusions, and sustain the concentration that creates imaginative new ideas. We are what we are because we make choices that determine our destiny. Nobel Laureate Roger Sperry once wrote, “Mind controls matter. It is superior to brain in its capacity to will, intend, command and direct.”

The mind lives on memory. If we let memory wither because we can click on Google and let it remember for us, what will we become?
 
To paraphrase one answer—all technological change is generational. The full effect of a new technology is felt only when those who grow up with it become adults.
 
Stay tuned.
 
©2015 Stanley E. Flink. All Rights Reserved. Posted with permission of the author.