
An iPhone in Hand... Worth Two in the Bush?

In the know. Savvy. Informed. Tuned in. Apprised. Knowing what’s what. With it. Au courant. Plugged in. All of these describe the state of the average human being today. Technology has stretched its web over and around the world. The strands are pulling people from all corners of the globe into close quarters; they cross paths in the same online markets, the same news servers, the same online forums, game rooms, articles, chat messages and FaceTimes. People have access to information and answers beyond what could be found in a stack of encyclopedias. They observe, or even participate in, technological innovation happening in science, health and society every day. They have the power to order things to their doorstep, finish errands with the flick of a finger and get directions to anywhere and everywhere they would like.

Technology has given us all this, and then some. And people seem to be making good use of these privileges. They seem to be enjoying it all, quite a lot, quite often, almost constantly. Very rarely are people of the developed world unplugged. Laptops and iPhones are attached to the lap and the hand; lines and waiting rooms are no longer places to stand around but times for (almost obligatory) catching up on Instagram. Consumerism is at a high as Amazon packages veer left and right along the roads. Workspaces, fueled by startup culture, are dominated by virtual interaction and online platforms.

The 21st century might be afflicted with a slight addiction. 

But what kind of addiction is this? Is it serving us well? Technology is advancing at a stunning pace, and the global stage it is setting is unlike anything the world has experienced before. How has it changed the way the human mind works? Is it truly a benefit?

Take the classroom here at Dartmouth, for instance. More and more curricula revolve around online resources. Lesson plans integrate media, assignments are completed and submitted online and pages of online shopping and social media outlets slide up and down laptop screens with alarming frequency. The moment class is dismissed, iPhones emerge from pockets, earphones go in and eyes are cast downward to the screen for a recalibration with the social world.

Humans are a very social species. According to Facebook, in 2014 each of its 1.23 billion users signed onto the site for an average of 17 minutes a day. This might not seem like much, but those minutes tally up to some 39,757 collective years of time spent on the site in a single day. Humans are also a very narcissistic species. This is not just a value statement. There exists, in the brain, a neural circuit called the default network, a set of regions that fires when the brain is not occupied by any task. This is considered a state of “restful wake,” or daydream. What is interesting is this: one of those active default regions, the medial prefrontal cortex, also engages when the mind is thinking about oneself.
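That figure is easy to verify with a quick back-of-the-envelope calculation. The short sketch below uses only the numbers cited above, 1.23 billion users at 17 minutes apiece per day, and assumes a 365.25-day year for the conversion.

```python
# Back-of-the-envelope check of the "39,757 years" figure.
# Assumptions (from the article): 1.23 billion users, 17 minutes each per day.
users = 1.23e9
minutes_per_user_per_day = 17

total_minutes = users * minutes_per_user_per_day  # collective minutes per day
minutes_per_year = 60 * 24 * 365.25               # minutes in a 365.25-day year

collective_years = total_minutes / minutes_per_year
print(f"{collective_years:,.0f} collective years spent on the site per day")
# Prints: 39,757 collective years spent on the site per day
```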

This overlap in neural engagement is what psychology professor Meghan Meyer has been investigating in her research on human social cognitive neuroscience.

“The tendency [of the human brain] to activate those regions at rest basically nudges you to think about yourself,” Meyer explained. 

This could very well indicate why we, in this age of technology, are so smitten with social media. It is a medium where we may spend all the time we like pondering and playing with our virtual image and its relation to those around it.

“When we disclose information about ourselves, we are activating reward circuitry in the brain,” Meyer said. 

However, Meyer pointed out that it is unclear how these inherent neural patterns will play out on the evolving social stage.

“We have all this new technology, but we have evolutionarily old systems that are dealing with it; it might be what’s leading to some of these odd behaviors,” she said.

Developmental psychologist Howard Gardner proposed what he termed the “naturalist intelligence.” This, according to his theory, is the human ability to make critical decisions in the natural world. Our naturalist intelligence is our reference book of instinct for navigating our world: it is the Charles Darwin intelligence, a vestige of our ancestry, the tailbone of the brain, one might say. We may no longer be picking between the poisonous and the edible mushroom. We may not be out hunting. We may not be fashioning a dagger from a rock and stick. But we are choosing which winter jacket to buy or which haircut to get. We are choosing which photo to post on social media or when best to buy that plane ticket. We are still making decisions with the aim of optimizing our safety and our status. The cognitive processes remain fundamentally the same. We have simply reapplied them to a different setting.

Technology presents us with a wealth of choice. It takes time and attention to filter through all of the possibilities put before us. For this very reason, education professor Holly Taylor discourages her students from having laptops out in class.

“There are lots of things that are happening cognitively within the classroom,” she said. 

Taylor does use multimedia resources to supplement her lesson plans, which she believes bolster engagement with the material and encourage active, real-time investigation of class topics. The variety of modalities with which Taylor can share her curriculum can round out the learning experience for her students. But therein lies the irony: the same variety of modalities can also become overwhelming. If students have access to the internet during class, the possibilities are endless. They can check the news, scroll Facebook, read their emails and messages, check that online sale and then check Facebook again. The student’s attention becomes divided. Why is this technological juggling act so addictive? Is it an earnest desire to know everything about everything? Most of the time, Taylor argues, it is not.

“People are cognitively lazy,” she said.

Humans are categorizers, taking incoming information and quickly, reflexively sorting it. In this way we are able to manage the rich, complex and fast-paced world around us. Too much information could be paralyzing. When students perform this juggling act, they skim the surface for information while putting off the critical thinking. Taylor cited Gardner, saying that if we went back out into the woods to be with our naturalist intelligences, we would have time to think again. But today, this may not be possible.

“Ever-present technology is taking over the time to think,” Taylor said. 

Matt Magann ’21 is an anomaly — he does not have a Facebook. He deleted it a year ago. And now?

“I think I have a pretty healthy relationship [with my phone] — I’ll use it like I use any other practical thing,” he said.

I found him and two other students, Francis Sapienza ’21 and David Vonderheide ’21, coming back to campus Sunday afternoon from a day in the mountains of Coos County, New Hampshire. When questioned about the alleged “addiction” of the 21st century, the three offered a chorus of ardent agreement, yet they had managed to escape it for a day. Their phones snapped a couple of photos of the view from the snow-covered summit of the 4,000-footer in 20-degree weather, but otherwise the phones remained in their pockets.

Magann is a member of The Dartmouth.