In the two years that I've penned The Crimson's Tech Talk column, I've jumped on many bandwagons, and I hope I've been right more often than wrong. Sure, I goofed in thinking that big-screen TV-computers, push media and Java would be watersheds in technology, and I'm still looking for a buyer for my beloved eMate now that Apple has dumped the whole Newton line.
It's almost too easy for self-appointed pundits like me to serve as the yes-men of technology, praising each and every advance in computing or the Internet as the best thing since, well, the last one.
But in this final column, indulge me as I turn to a bigger frame than any particular innovation or trend. The question is how we view technological change in our society; more specifically, whether we even bother to view the Information Age in a critical light at all.
Computers dazzle and inspire their followers, creating so many eunuchs before the temple of silicon and fiber optics. They can be magical tools that deliver a radio broadcast from Omaha to your desktop, run a complex multivariable regression analysis or provide up-to-the-minute stock quotes.
The problem is that most people who write on, work in and think about technology are in love with the subject matter. Pick up a copy of Wired or PC World and you expect to see glowing articles about the latest release from Compaq or Microsoft. But a cheerleader's enthusiasm, not a critic's curiosity, pervades society's views on technology.
Time and Newsweek fall all over each other to add the best "tech pages" to each issue. Local news stations plaster human-interest stories about the latest "can't-miss" Web site on the five o'clock news.
Politicians who couldn't even find Silicon Valley on a map two years ago now praise the importance of universal 'Net access and computers in education. The economy's current robustness is often laid at the feet of productivity gains from the intelligent application of computers.
Of course, the "dark side" of technology peeks through the panegyric every now and then. But stories of Internet addiction and chat-room sexual misrepresentation tend to titillate rather than criticize. The advent of the Information Age is, in many ways, taken as an unqualified good for America and the world.
So, initially, was the Industrial Revolution. Like the wiring of society in recent decades, humanity's move from agriculture to industry, from farm to city, brought with it many new boons.
Standards of living, per capita income and education levels slowly began to rise. Ricardo's long-promised windfalls from comparative advantage appeared as countries traded manufactured goods across nation-state lines. Millions of new jobs were created, providing new outlets for the skills of the world's citizenry.
But critics like Upton Sinclair were quick to remind America that industrialization did not create a rosy, costless modernity. Children exploited in factories, unsafe working conditions, shoddy products and questionable business practices: all these accusations, and more, fostered debate and critical examination of the new cities thriving on mechanization.
In time, society learned to balance industry and community, regulating the functions and practices of the new economy in the interests of all. It simply took the sardonic gaze of the critic to get there.
Where is today's Upton Sinclair? He's not writing articles in Time about the dangers of repetitive stress injuries, which threaten debilitating pain for an entire generation raised with keyboards from practically the moment of birth.
He's not addressing the dangers of the new assembly-line mentality, the dehumanization of "information workers" who do nothing but feed raw data into The Machine: store cashiers, data processing agents, telemarketers...
He's failing to link the rise of the Information Age to the trend of corporate downsizing. He's missing the question of what happens to the social fabric in an age when once-valued workers become unnecessary, when profitable corporations can use technology to deliver the same product with less human input.
We need to remember where we stand in the history of information technology. We're just at the threshold of sea changes in the way society works, lives and plays, all inspired by the microchip. The Internet and the PC are just the first steps in a whole series of developments we can scarcely begin to anticipate.
These changes can't occur in a vacuum; public debate and scrutiny of the effects of technology are sorely missing in our rah-rah culture. It's not merely possible to question the Information Age without throwing the proverbial loom in the river; it's one of the civic responsibilities we bear in a sometimes brave, sometimes frightening, new world.
Kevin S. Davis '98 has written Tech Talk since February 1996. This is his final column. He will join American Management Systems in Charlotte, N.C., this fall as a business analyst in its finance industry group. He can be reached at kevin_davis@post.harvard.edu.