Common Ground hosts a variety of events to raise awareness of critical issues and spark discussion among students. Photo Courtesy of The University of Central Florida
On Thursday, Oct. 9, Common Ground held a campus-wide dialogue on the benefits, risks and unknown consequences of artificial intelligence (A.I.). While last semester's campus-wide event had students, faculty and staff debating whether environmental protections are more important than economic growth, this event served as a channel for members of the college to share their thoughts on A.I. Like last year, the event began with two speakers who gave the audience a starting point for discussing the impact of A.I. After both speakers had shared their perspectives, they opened the floor to the crowd.
Throughout the discussion, speakers critical of A.I. focused on its environmental costs and the erosion of human critical thinking skills. One speaker, for example, observed that their siblings were outsourcing their schoolwork to ChatGPT. Many were concerned with A.I.'s developmental impact, believing it is not only corroding the development of younger children but also leaving its mark on older generations, like college students. One speaker expressed her concern that “some of [her] peers can’t even form independent thought without A.I.”
One speaker mentioned a recent MIT study that found cognitive activity decreases as the use of external tools, like ChatGPT, increases. Many students said that reliance on A.I. for schoolwork is rooted in its convenience and in how effortlessly it can produce well-written work. Much of the early discussion centered on the educational implications of A.I.
Beyond A.I.’s impact on cognitive development, other speakers drew attention to its environmental and health effects. One speaker pointed to the health effects of A.I. data centers on a low-income area of Memphis, Tennessee. These data centers “have been reported to be emitting [nitrogen] dioxide gas, which is known to be a detriment to a lot of respiratory illnesses such as asthma,” the speaker said. Democracy Now reported that the data centers increase local smog by 30-60%.
Another speaker highlighted the use of natural, finite resources to build and operate these data centers. “Not only is the amount of water an issue, it’s also the type of water. It’s clean drinking water that [is] used to cool down the generators” in the centers. Smaller data centers consume approximately 18,000 gallons of water per day, and larger centers can consume up to 550,000 gallons per day, according to Dgtl Infra.
Although most of the discussion revolved around the costs and risks associated with A.I., some speakers believed it could be beneficial. One speaker said they had learned in one of their biology classes about an “A.I. technology that can predict folding patterns of amino acids such that it can help us create different proteins which can then create new medicines and vaccines that can benefit humanity greatly.” Another speaker noted that A.I. is being used to predict wildfires and help with resource allocation.
Students delved into potential regulation of A.I. in the latter half of the discussion. Some speakers called for environmental regulation of A.I. facilities and for law enforcement protection of intellectual property.
One speaker touched on an incident in Detroit where law enforcement used facial-recognition software that led to the arrest of an individual “at his home in front of his wife and kids because he was suspected of stealing a watch from a store. And his face was calculated to match another individual’s face, but it was a miscalculation made by the software.”
Another speaker pointed out that using A.I. in law enforcement can reinforce racial stereotypes because A.I. learns from biased data and tends to create its own “self-fulfilling prophecy.” According to the NAACP, A.I. recreates biases in policing data and leads to increased police activity in Black neighborhoods.
Near the end of the event, some speakers expressed concern about the threat A.I. poses to the “job market and the economy and the social hierarchy.” One speaker worried that A.I. is taking many of the entry-level jobs originally meant for recent college graduates. “A.I. compromises the human spirit and the human well-being,” one student said in objection to A.I.’s increasing prominence.
Amid these concerns about A.I.’s involvement in the job market and the economy, a faculty member offered a different opinion by sharing their personal experience. They believe that “a lot of those entry level jobs are going to loop back around to being available [because] they are the most public-facing positions in most companies, and something the public doesn’t like is talking to a bot.”