MSU researchers receive grant to build ‘algorithmic awareness’ as form of digital literacy
BOZEMAN — Individuals’ experiences online are increasingly shaped by algorithms, but there is little awareness of how those algorithms work, according to researchers at Montana State University.
“Most of us routinely engage in systems that predict, recommend and speculate about our interests based on the digital fingerprint we provide with our link clicks and ‘likes,’ but we all struggle to understand how and why those systems work as they do,” said Jason Clark, associate professor and head of Special Collections and Archival Informatics at Montana State University.
But Clark emphasized that understanding the rules that govern software and shape users’ digital experiences is a form of digital literacy important for everyone who uses the internet, just as reading and writing are foundational literacy skills.
To help increase awareness of algorithms, the MSU Library received a $50,000 grant for “Unpacking the Algorithms That Shape our User Experience.” The project includes three main parts, all aimed at introducing “algorithmic awareness” as a form of digital literacy: researching algorithms and writing a report for users, developing a teaching tool to make common algorithms more transparent, and creating a curriculum and pilot class. The grant is from the Institute of Museum and Library Services through a Laura Bush 21st Century Librarian Program planning grant.
Along with Clark, Julian Kaptanian, an undergraduate history student from Kalispell in MSU’s Science, the Environment, Technology, and Society program, is working on the project as a research assistant. Tyler Bass, who received a bachelor’s degree in computer science, also served earlier as a research assistant. An advisory council, which includes a representative from Mozilla, other professional leaders, and MSU Library faculty Jan Zauha and Scott Young, is guiding the work.
Algorithms – which Clark defines as the rules and formulas that govern our software – shape most of our digital experiences, he said. As part of their work, the team has focused on four common types of online experiences that involve algorithms: e-commerce, social networks, entertainment and search.
“In seeking to understand common systems, like the Facebook news feed or Google search engine results page, we view this grant research as an opportunity to discover the scope and reach of algorithms and how they might be taught to a general audience,” Clark said.
Because algorithms are constantly changing and many are proprietary, it can be difficult to drill down and understand each specific one, Clark noted. So, rather than teaching the specifics of particular algorithms, the grant team is seeking to share a set of first principles and common software rules that can lead to basic understanding.
Still, there are a number of challenges that come with teaching about and understanding algorithms, he noted.
Perhaps one of the biggest is a widespread misconception about how algorithms are formed, Clark said. Specifically, there is a sense that algorithms are impartial mathematical formulas and therefore cannot be changed or challenged.
“We often see news reporters talking about algorithms as though they’re an active, conscious agent,” he said. “One of the things we’ve realized is that algorithms are a set of rules that a programmer creates.” In other words, he said, a person makes choices to apply these formulas and sets these processes in motion.
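Clark’s point can be made concrete with a small sketch. The following Python snippet is an entirely hypothetical feed-ranking rule invented for illustration; the fields and weights are arbitrary choices and do not represent any real platform’s algorithm. It shows that the “algorithm” is nothing more than rules a programmer wrote down, each of which could have been written differently:

```python
def score_post(likes, hours_old, followed_by_user):
    """Toy ranking rule. Every weight here is a human choice,
    not an impartial mathematical fact."""
    score = likes * 1.0           # a programmer chose to reward popularity
    score -= hours_old * 0.5      # a programmer chose to penalize older posts
    if followed_by_user:
        score += 10.0             # a programmer chose to boost followed accounts
    return score

# Hypothetical posts, ranked by the rule above.
posts = [
    {"id": "a", "likes": 120, "hours_old": 30, "followed": False},
    {"id": "b", "likes": 15,  "hours_old": 1,  "followed": True},
    {"id": "c", "likes": 40,  "hours_old": 5,  "followed": False},
]

ranked = sorted(
    posts,
    key=lambda p: score_post(p["likes"], p["hours_old"], p["followed"]),
    reverse=True,
)
print([p["id"] for p in ranked])  # prints ['a', 'c', 'b']
```

Changing a single weight (say, the recency penalty) reorders the feed, which is exactly the sense in which such results can be questioned and challenged rather than treated as neutral.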
Many people avoid learning about algorithms because of their perceived complexity, Kaptanian added, or because they believe algorithms have no material consequences, which results in uninformed users.
“Without at least a general knowledge base, it is difficult for users to feel empowered to question seemingly objective results, whether that be personal queries or even algorithms that perpetuate digital redlining,” she said. Digital redlining refers to technology policies and practices that reinforce inequitable class and race boundaries.
Bethany Nowviskie, director of the Digital Library Federation and a digital humanities faculty member at the University of Virginia, said the skills and digital literacies that Clark and his team term “algorithmic awareness” are essential for today’s students and scholars. The federation is a community of practitioners who advance research, learning, social justice and the public good through the creative design and wise application of digital library technologies.
“Computer algorithms shape our lives — often invisibly, but with profound effect,” Nowviskie said. “The great contribution of this project is to provide concrete teaching tools that can ultimately help a new generation of students ensure that the computer systems they build and use serve people well — respecting privacy, mitigating bias and promoting individual welfare over corporate interests.”
The grant will end Nov. 1, Clark said, although he expects the work will continue as either an Institute of Museum and Library Services project grant or a research grant focused on curriculum implementation. Kaptanian – who is considering a capstone project on the history and consequences of social bias in software – said she hopes the grant team’s work leads to more active digital consumers. “It is easy to be a passive consumer, especially when elements of digital authority are not readily discussed,” she said. “By introducing algorithmic awareness as a form of digital literacy, people can be empowered to be active consumers. That is why I believe this work is so important.”
To date, Clark and Kaptanian have released a teaching module and taught several workshops at the MSU Library, for the Association of College & Research Libraries and for the Digital Library Federation. People who are interested in learning more about the project are invited to visit github.com/jasonclark/algorithmic-awareness to see the code for the teaching tool and educational resources, which are free and open to everyone.
- By Anne Cantrell, MSU News Service -