After taking a few months away from writing, I decided to jump back into reading and reviewing books with a recommendation from a coworker. It was maybe not my best idea to get back into reading with such a technical book. The book in question, “The Cult of Information: A Neo-Luddite Treatise on High-Tech, Artificial Intelligence, and the True Art of Thinking,” by Theodore Roszak and published in 1994, is an updated edition of his 1986 book, “The Cult of Information: The Folklore of Computers and the True Art of Thinking.” While the edition I read came out about a decade after the original, its view of computer technology is still more than 20 years out of date.
Though it’s a technology-laden philosophical treatise and as such a bit of a slog to get all the way through, it still had interesting points to make about technology, information, knowledge, and the physical act of human thought compared to “thinking” by machines. Some of his points were eerily accurate descriptions of our current technological moment and managed to make me laugh and cringe at the same time.
For instance, in the second chapter, he talks about so-called futurologists in the '80s and the unbelievable hype they preached about the computer, predicting, “All company records will be on-line, data bases for every purpose will be instantly accessible through vastly integrated, all-in-one organizing-accounting-managing programs. Word-processed documents will be distributed far and wide from terminal to terminal and simultaneously filed with abundant cross-indexing. Electronic mail will be the rule. ... When it is necessary to hold a meeting, it will be done by teleconferencing among colleagues and contacts at all points in the building or around the world. As for the home... It will become an ‘information center’ organized around a computer that is linked by its busy modem to a worldwide array of data bases. The new electronic family will read its mail and the news of the hour off a video screen; it will bank, shop, invest, learn, and play at its interactive terminal.”
While no one uses “terminal” to describe the personal computer these days, most of the above has come to fruition and then some. A few pages later, he says, "Even friendship and personal warmth will be electronically mediated: the home terminal will be linked with numerous computer bulletin boards that will supply conversation, advice, gossip, humor, dating services -- all the social commerce for which people once had to go in search of human beings in clubs, cafes, pubs, parks, and bars." At this point in the book, I actually had to stop reading because I started laughing; in my mind I was seeing the cheesy commercials for specialized online dating services.
So I guess maybe these "futurologists" were on to something? I would love to be able to read an updated edition of THIS updated edition to see what the author has to say about computers in 2018. Especially since he then states that the "worst casualty of such megahype may be the computer itself." Not only do most people in America have a personal computer or laptop, but we also have smartphones, smart TVs, and smart watches, and I'd love to hear his take on our internet fad of electronic personal assistants such as Alexa and Siri. I think it’s safe to say he was wrong on this point.
The main thrust of his book, though, is that computer supporters have spent decades convincing us that computers would be able to "think" and eventually surpass human brains, and his belief that this can never come to pass. Frequently, he states that the mind thinks with ideas, not information, but computers are only able to mimic human thinking by quickly parsing large amounts of data rather than producing actual ideas. To this point, he notes, "Ideas create information, not the other way around. Every fact grows from an idea; it is the answer to a question we could not ask in the first place if an idea had not been invented which isolated some portion of the world, made it important, focused our attention, and stimulated inquiry."
As advanced as technology has already become, especially in the fields of robotics and artificial intelligence, it's easy to think he was just off base again. However, his arguments are quite sound. Does our new AI technology actually have the capability to think? Or does it just appear to do so because we can't program AI to have imagination or creativity of its own? His argument boils down to this: We don't understand the truly creative function of the brain well enough to replicate it in AI or any other technology.
One particular caution included in the book was a quote from Dan Kennedy in the July/August 1993 issue of The Media Culture Review where he wondered if "people will travel the information highway with blinders on, reinforcing their prejudices, closed to new ideas." Essentially, the technology expected to bring us together would in fact overwhelm us with so much information that we would become selective in choosing only to acknowledge that which already fit our beliefs. We need only look at the political discourse and atmosphere over the past few years to know his fears were well-founded.
In short, while I often found my mind wandering and having difficulty focusing on such a technical exploration, it was still a fairly interesting read and I'm glad I took a chance on it. But for now, it's time to go back to the distraction of scrolling through Facebook.