About the Readings
What is intelligence? How do you know if someone or something is intelligent? A dog may seem intelligent if it responds correctly to commands. A human is deemed intelligent if she or he performs well on an exam or in solving a complex problem. In short, we can only really identify intelligence by observing behavior and comparing that behavior to a mental model of what we think intelligence looks like.
What, then, is artificial intelligence (AI)? A broad definition would be that it is any behavior that appears intelligent but is not. Like magic, it is an illusion created by someone. AI has been associated with the field of computer science, where a person or persons program a computer or robot to demonstrate intelligence.
What happens when the computer seems more intelligent than the human? Is it really an illusion? IBM’s Watson famously defeated all-time Jeopardy winners Brad Rutter and Ken Jennings in 2011. Is Watson actually more intelligent than these two humans? Does this make Watson more human than humans?
Brian Christian, author of The Most Human Human, was intrigued by the question of what it means to be human when human behavior is compared with that of computers. The following article tells the story of his participation in an annual competition in which judges attempt to distinguish computers from humans in a blind test (the Turing Test).
—Richard Orwig, Associate Professor of Information Systems
Do machines that perform complex tasks have the ability to think? Can they retaliate against their creators? Popular culture has been warning us for over a century that technology, particularly robotic machines, will one day play a more prominent role in society.
The short story “Moxon’s Master,” published in 1909, was one of the first descriptions of a robot in literature. Though written almost 50 years before the first robot was invented, this story raises the question of whether we, as humans, have control over technology or whether technology ultimately has control over us.
“Moxon’s Master” is written from the perspective of a nameless narrator. The narrator is conversing with his friend Moxon—an inventor who has created a “thinking,” chess-playing robot. The two men are engaged in a philosophical discussion of the nature of consciousness. The narrator and Moxon are debating the qualities a “thing” must possess for it to be considered a “living” being. Through the eyes of the narrator, we are led to consider these questions more deeply as we are told of the events that transpire between Moxon and his creation.
As you read this story, consider the questions raised by Moxon and our narrator. What do you think? Can technology reach the point of no longer needing us? And who is ultimately in control, us or the machines we have created?
—Mikaela Klimovitz, Class of 2016
Has our obsession with technology eroded our ability to focus attentively on a task from start to finish? Does concentrating intensely on what we are doing—whether it be reading a book or writing an essay—seem impossible to do without constantly succumbing to distractions? Many of us likely answered yes to the above questions. A lack of focus of this type is often assumed to be a character flaw or a problem in need of a remedy. But is that really a valid assumption?
In this 2009 article, titled “In Defense of Distraction,” Sam Anderson argues that perhaps there are some positive consequences of the “distracted” state of mind. According to Anderson, true focus is not without distraction; true focus leads to numerous distractions within the realm of the task at hand. The mind tends to wander effortlessly, recognizing and collecting various fragments of information, and in that way a distracted mind offers exploratory freedom. In fact, distraction in today’s world might be an adaptation that improves and strengthens the mind’s ability to survive the constant bombardment of information.
As you read this article, consider where you stand: have the abundant distractions of technology truly left society handicapped? How devoted would you be to your job, studies, or assignments without technological temptations? Is pointed focus the ideal? What would you miss out on if you could prohibit your mind from its journey of wonder and curiosity?
—Victoria Warren, Class of 2016
What if you didn’t have a phone? What if every single communication in your life were face to face? Your circle would be small, wouldn’t it? Your world would be your family, schoolmates, co-religionists, tradespeople, and neighbors. Your communications would be purposeful, because they would involve planning to see people and laboring to get to them.
This article by Kevin D. Miller that appeared in a journal of Christian ethics proposes that we might benefit if we considered why the Plain sects—locally, the Amish and Mennonites—debate the use of each new wave of telephone technology. He writes from his own background; one branch of his family is in a technology-using Conservative Mennonite church, while another branch of the family is Old Order Amish.
Members of Plain sects accept that having a life rich in interpersonal relationships requires that they give up their own free will and submit to the rules imposed by church leaders. It does not seem like a sacrifice to them, because it preserves the separation from “the world” that they believe essential to a life of faith.
What would be richer about your friendships if they were all “in real life”? Are the friendships with people you’ve known your entire life different from more recently forged friendships? How would you change what you communicate if you knew that your entire web of family and friends were likely to overhear every word?
—Kate Hastings, Associate Professor of Communications and Coordinator of the Film Institute
What the Luddites Fought Against, by Richard Conniff
Richard Conniff surveys two histories in his 2011 article from Smithsonian magazine. First, he takes us to early nineteenth-century England to understand why the machine-smashing Luddites wrecked the textile looms with which they worked. As Conniff explains, the skilled weavers who adopted the Luddite name did not resent the mere introduction of machinery into their working lives, but rather the way in which the machines allowed owners to break customs that determined who received jobs, who trained them, and how much they earned. Conniff’s second focus is the popular use of the concept of Luddism over time. Although some measure of the original meaning remains, the term has become a catch-all for criticism of or inexperience with the latest and greatest machines. Calling someone a Luddite in the early twenty-first century highlights the awkwardness we might experience with the advent of the next big thing.
As you read, think about whether the Luddite label has relevance today. Do you embrace new technologies before you reflect on their downsides? Are there any social rewards for living—or faking—a “screen-free” life? Does Luddism today have anything to do with generational divides? Finally, consider what it means to take Conniff’s advice and actually “stand up” against technologies that sacrifice human values for money or convenience.
—Ed Slavishak, Associate Professor of History
Technological Revolutions and the Gutenberg Myth, by S. D. Noam Cook
Try to imagine your daily routine without the aid of technology. For how many days could you keep life “under control” without your phone, your computer, your iPad? As consumers we are led to believe that our lives will unravel unless we acquire the latest gadget. After all, who wants to be the one left behind? But what if the media has inflated our perception of this technological revolution? Has such a revolution been inflated before?
In this article by S.D. Noam Cook, you will read a thought-provoking account of the way modern Western society has chosen a particular technological development, namely the invention of the printing press, and developed a convincing but mistaken narrative in which the printing press is the sine qua non of Western civilization’s development.
No educated person questions that the printing press played an important role in the transformation of Renaissance society, but how fast did these changes take place across all social strata? Did the printing press change the lives of everyone in Europe or was the ability to read, for instance, an obstacle to reaping the benefits of the printing press? Did the printing press change society or did society have to change before the printing press could begin to make its mark? Ultimately, can we expect technology to remove obstacles, or must obstacles be removed before technology can be effective?
—Marcos Krieger, Associate Professor of Music
Breaking Down the Walls of Sound, by David Talbot
No one disputes that new technologies have transformed life both utterly and rapidly, and that the making, storing, distributing and experiencing of music have been radically altered. But imagine for a moment that you are a student at Susquehanna during its first decade in the 1850s. Perhaps the loudest sound that you have ever heard is thunder, or a church bell, or a steam engine. When you think of music, you think of picking up an instrument and playing it yourself, or perhaps hearing the small choir at your church. And now imagine walking all day to hear a full orchestra play a Beethoven symphony in a large, resonant hall. The fullness, complexity, volume and beauty of the music must have been overwhelming. People would later describe concerts like this as being among the most profound hours of their lives.
Today the availability of music has never been greater, but has the experience of music been diminished? Could it be that technology has taken as much as it has given? Author David Talbot says that the invention of recording technology, with its ability for multiple takes and editing, “offered full control over how a piece of music would be experienced.” It is certainly clear that it offered full control over what sounds made it to tape, but did it really offer full control of a listener’s experience of the music? What aspects of the experience of listening to music remain beyond the recording artist’s control?
—Pat Long, Associate Professor of Music
How We Get Our Daily Bread, or the History of Domestic Technology Revealed is a historical article that traces the connection between gender and technology. Gender refers to how a society defines masculinity and femininity. The article’s author, Ruth Schwartz Cowan, shows that, despite the tendency to see technology as masculine because tools are associated with men and with work outside the home, women have for centuries used technology. Specifically, Cowan studies the evolution of technology in relation to bread production and women’s domestic work in the United States. In the colonial period, both men and women labored to provide their families with bread. Men grew, ground, and transported grain, while women prepared and baked bread. With the advent of industrialization and the use of cast-iron stoves and store-bought flour in the nineteenth century, men’s contributions to domestic bread production decreased, while women’s work continued. Over the course of the twentieth century, store-bought bread transformed women’s domestic work, which now stressed purchasing bread products at local supermarkets.
According to Cowan, how did the development of household technologies influence the development of gender roles from the colonial period to the twentieth century? What are the limitations of Cowan’s argument?
—Karol Weaver, Associate Professor of History and Director of the Women’s Studies Minor
Bioethics: The beginning and end of life, by Lori Andrews
After Dolly the sheep’s historic birth in 1996, genetic technologies continued to advance at a staggering pace. In addition to cloning, better tests for genetic conditions, as well as in-vitro fertilization and other reproductive technologies progressed rapidly. Though such research holds the promise of increasing our quality of life, it also comes with great risks: unintentional eugenics, discrimination based on genes, and a host of privacy issues.
Who should be granted access to an individual’s genetic data? While physicians can offer better treatment when informed by a patient’s complete history, should employers and insurance companies also be able to make decisions based on these data? Should parents be allowed to build “designer babies” and select for traits they deem desirable? In terms of class structure, would a gulf form between those who can afford to give their babies pre-birth advantages and those who cannot? Or is parents’ genetic enhancement of their children akin to reading to them every night and feeding them healthy meals to ensure the best possible outcome? Moral gray areas quickly develop alongside the advances, and politicians face the difficult task of developing protective measures that keep the technology’s potential for damage to a minimum while not ignoring its capacity for good. There’s a fine line between preserving freedom and preventing misuse. As you read, I encourage you to consider both sides and to decide whether, and where, you think a line should be drawn concerning biotechnological research.
—Lauren Beck, Class of 2016
The You Loop, by Eli Pariser
Thanks to the internet, information is everywhere. Resources that were once limited to the largest libraries are now available to us any time, anywhere. We have access to the truth in just a few key strokes. Or do we?
While the internet was once touted as the “great democratizer,” delivering information and transparency to all, former MoveOn.org director Eli Pariser argues in this chapter that it has failed to live up to that promise. In fact, he suggests that rather than opening our world and expanding our possible choices, it threatens to make our experiences even more insular.
As third parties like Google, Facebook, and Acxiom monitor our viewing and track our purchases, they are also making our web experiences increasingly individualized. Even basic web searches are now filtered according to dozens of personal “markers” and customized to fit our assumed interests. As a result, the search results for an undergraduate in Pennsylvania on an iPhone may differ substantially from those of a middle-aged father in Oklahoma on a PC. In other words, information from the internet may often serve to reflect our existing knowledge rather than increase it, and ultimately reinforce rather than challenge our beliefs, in an endless—and invisible—cycle.
As you read this chapter, consider what you would be willing to hand over for this personalized web experience. Similarly, how much convenience would you give up in exchange for more accurate, unfiltered information?
—Jennifer Asmuth, Assistant Professor of Psychology
You For Sale: Mapping, and sharing, the consumer genome, by Natasha Singer
Is it marketing, profiling, or stalking? A new breed of marketer, one that analyzes customer buying patterns (and more) to “profile” you, the consumer, has emerged thanks to advanced technologies such as data aggregation, data mining, and customer analytics. Acxiom Corporation, and other firms like it, maintain and “refine” data on millions of individuals and households across the US and sell “elements” to retailers to supplement their targeted marketing efforts. Acxiom is a new breed of database-marketing firm that operates largely out of the public eye, with little to no regulation on what data it can collect, package, and sell.
As you read this article, think about whether this is good marketing or an invasion of privacy. As a consumer, would you find it convenient or frightening? If you could, would you periodically check to see whether your information was correct? What kinds of data would you like to keep out of the hands of these marketers? What safeguards would you like to see implemented to keep your data truly private? Marketing is becoming a brave new “technology-driven” world; the question is, are you ready for it? Do you feel safe?
—James J. Pomykalski, Associate Professor of Information Systems
Plagiarism Lines Blur for Students, by Trip Gabriel
Susquehanna University’s Academic Honesty Code states that “Plagiarism results when students neglect to acknowledge in footnotes, endnotes or other forms of documentation their use of the words and ideas of others. The failure to acknowledge and properly document your use of sources and materials, even if unintentional or innocent, amounts to representing as your own the work of someone else.”
In an era when much of the world’s knowledge can be accessed instantly through smart phones and other devices, how should we think about concepts like originality and fair use?
In this New York Times article, columnist Trip Gabriel, along with the students, instructors, and researchers he spoke with, offers a number of explanations for why members of your generation of learners may find this definition of plagiarism difficult to understand or accept.
As you read it, consider the following: Do you agree “that many students simply do not grasp that using words they did not write is a serious misdeed”? Or is this an excuse for being unprepared or unwilling to engage in the “intellectual rigors of college writing”? Does the ease with which information is available change how we should think about its “ownership”? In what ways can the unprecedented access to information that your generation enjoys help or hinder your learning? How would you feel if another student benefited from work you created?
—Phil Winger, Vice President for Student Life & Dean of Students
Prior to 2005, Chinese dissident artist Ai Weiwei had little to no contact with the internet, but he had always considered himself an activist. After being invited to be a celebrity blogger by Sina.com, a Chinese media platform, Ai found a new means of expressing his ideas. In the wake of the 2008 earthquake in Sichuan, China, the artist used his blog to unite like-minded Chinese as they pressured the government for the truth regarding the deaths of schoolchildren in shoddy, state-constructed buildings. And, when met with government resistance, Ai used his blog to compile a list of the victims, circumventing the government and effectively protesting its treatment of the event and those involved.
Even with continued threats and detention by the Chinese government, Ai will not be silenced. His tweets now reach more than 200,000 followers and stand as a continued protest against Chinese censorship. This is not to say that these tweets are without humor. In fall 2012 the artist tweeted and released his own “Gangnam Style” video, in which he dances wearing handcuffs—a sly reference to his recent arrest. While Ai continues to work as an artist, he has found in the internet and Twitter a new mode of artistic and activist expression.
As you read this selection, consider how social media has changed the way we think about activism and protest. Have you ever used social media in a similar context? How have the internet and social media changed the role of the artist? And would you consider Ai Weiwei’s online activities to be art?
—Ashley Busby, Assistant Professor of Art History
The Role of Digital Media, by Philip N. Howard and Muzammil M. Hussain
The political uprising known as the “Arab Spring,” which swept the Middle East and North Africa in late 2010 and early 2011, reminded the world that lawless autocrats are not immune to mass mobilization. In this article, Howard and Hussain discuss a new frontier in the politics of mass movements: the role of digital media. More specifically, they explore how Egyptians and Tunisians used the internet to organize against their respective regimes. The authors also discuss governmental efforts to control digital communications in an attempt to quell mass uprisings.
Though it is difficult to conclude definitively that Arab dictators such as Hosni Mubarak and Ben Ali lost power as a result of digital media like Facebook, Twitter, or YouTube, the role such technologies played in mass mobilization must be recognized and understood. As you read this article, ask yourself: would Egypt’s Hosni Mubarak have stayed in power had it not been for the Facebook memorial page “We are all Khaled Said”? Would Tunisia’s Ben Ali still be in power today if news of Mohammed Bouazizi’s self-immolation had not reached the masses?
The “Arab Spring” arguably offers a natural laboratory to observe the effect of digital media on the politics of protest. How do you think these relatively new technologies will shape future political movements? Have you ever followed or participated in a political movement through social media sites?
—Baris Kesgin, Assistant Professor of Political Science
36 Ways to Learn a Video Game, by James Paul Gee
In his introductory chapter to What Video Games Have to Teach Us about Learning and Literacy (2007), James Paul Gee outlines the argument that video games owe their popularity in part to their learnability. Games, through their design, entice players to spend significant time and effort engaged in activities (learning to play the games) that are cognitively demanding and frequently frustrating. Gee uses video games to critique much of school-based learning, which is too frequently devoid of meaningful context or motivation, disconnected from applications, and carried out without collaboration. Many young people work to avoid school learning experiences even as they gravitate to the cognitive challenges of video games.
The chapter concludes with a discussion of the consequences of playing video games on individuals’ behavior in the real world. Gee disputes the arguments that video games lead to social isolation and violence, presenting evidence that games are no more dangerous than other technologies: television, movies, or books. Finally, he addresses the increasing popularity of video games among girls and women and the portrayal of females in games and popular culture.
In what ways do Gee’s arguments resonate or conflict with your own experiences of playing or observing others who play video games? What, if any, “lessons” should educators learn from the gaming industry?
Games for Science, by The Scientist Staff
“It’s all fun and games until someone gets . . . helped!” These three brief articles from The Scientist magazine turn your mother’s old saying on its head. Gamers solve actual scientific riddles—like the structure of an enzyme from an AIDS-like retrovirus—by harnessing the power of competitive multi-player online games. Games like World of Warcraft engage struggling learners in the science classroom and help them succeed. Medicine harnesses Wii Balance Boards, Dance Dance Revolution (DDR), and other video games to fight disease, or at least to empower and educate its victims.
Fifteen years ago, the question of the computer’s ability to compete with humans was settled for good when a computer program called Deep Blue finally defeated chess grandmaster Garry Kasparov. But if thousands of gamers can outperform the best available RNA-sequencing algorithms, it is pretty clear that the ultimate battle will not be fought between a computer and a single human, but rather between a computer and a connected group of gamers—who really just might end up saving the universe.
Each of these three articles explores the competition and the synergy that exists between computers and humans. Do games really have an impact in the science classroom? Can DDR really fight disease? Or are these claims of the efficacy of technology overhyped?
—Michael Nailor, Director of the Teacher Intern Program
Rethinking Education in a Technological World, by Allan Collins and Richard Halverson
Playing video games. Spending time on Facebook. Participating in online fantasy sports leagues. All of these activities, your teachers and parents have probably told you, are distractions that get in the way of learning. But that might not be entirely true, according to Allan Collins and Richard Halverson, who argue in this article that video games and social networks may become important components in a new approach to education.
As the authors see it, we’re entering a technologically driven revolution that will change the way we learn—and the way schools teach. Emphasis in schools, they say, will shift toward guiding communities of online learners who interact with each other to build knowledge collectively. On an individual level, technology will allow more and more learners to seek out information of special interest to them, rather than sitting passively in a walled classroom where they are fed information. Collins and Halverson predict that such changes will ultimately give students greater control over their learning. And with that, they say, will come a greater love for learning in general.
Can you foresee a day when your teachers will actually encourage you to play video games or log on to Facebook as a way of learning? How have you seen technology change the face of education as you moved from grade school, to middle school, to high school?
—Dave Kaszuba, Associate Professor of Communications and Director of the Center for Teaching and Learning
SU Learning Goals, by SU faculty, staff, and students
One of Susquehanna’s distinctive hallmarks is the Central Curriculum, which represents the faculty’s commitment to provide the broadest opportunities for students to achieve the Susquehanna University Learning Goals. The Learning Goals emerged from a campus-wide dialogue about what kinds of learning, both knowledge and skills, should distinguish Susquehanna graduates, and the Central Curriculum was designed to provide you a pathway to attain the Learning Goals.
The Central Curriculum provides the breadth of your Susquehanna education, complementing the depth of learning provided by your major. Your learning doesn’t stop at the classroom door, however. You will find the influence of the Learning Goals in your residential life experience, your participation in athletics and student organizations, and your work experiences in academic or administrative offices.
The common reading is the first of many invitations you will receive in your time at Susquehanna to learn in a mix of formal and informal settings. Just as the Central Curriculum is a common framework for the education of all Susquehanna students, the common reading is a shared experience that enriches learning for all first-year students and strengthens your connections within our intellectual community. I hope you will find rewarding your opportunities to discuss the readings in class, participate in campus events built around the theme of the readings, and share thoughts about the readings with your fellow students.
—Carl O. Moses, Associate Professor of Earth and Environmental Science