Monday, October 20, 2014

"The Relevance of Algorithms" by Tarleton Gillespie

     In "The Relevance of Algorithms," Tarleton Gillespie argues that algorithms have become an increasingly important part of everyday life because they filter and organize the immense amount of information found online, making it easier for users to sift through. Algorithms play a vital role in a huge range of online services we use every day, from search engines to social networking sites, allowing users to find what is relevant, popular, and useful in the mass of data on the web. Gillespie then examines algorithms from a more theoretical perspective, explaining that they are, in general, simply tools to transform "input data into a desired output" (Gillespie, 2014, p. 168). An algorithm is essentially worthless without a database to work from, and he notes that these databases are often built by collecting records of users' data and activity online. "Raw data" must also be prepared to an extent before algorithms can successfully be run on it. Just as important is what a database chooses to exclude, since exclusions are what differentiate otherwise similar databases from each other.

     Gillespie also discusses the claims algorithms make about what can be concluded from their data. Many algorithms make very broad claims about what their data shows, and it can often be argued that the data does not actually illustrate as strong a correlation or pattern as claimed. Many people treat algorithms as a highly credible source of information without ever asking why the results shown are accurate or relevant. Gillespie then spends considerable time examining how algorithms change over time, alongside changes in how society views information privacy and what counts as a credible source. As a computer science major, I found this article very interesting because I don't often think about how the algorithms I write could be used or why I feel they are accurate. I find it interesting to consider why an algorithm I am writing should be considered credible, and I think that consideration will lead me to write better algorithms in the future.
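     To make Gillespie's definition concrete, here is a small sketch of my own (not from the article) in Python, showing an algorithm as a tool that transforms input data into a desired output, including the data preparation and exclusion steps he describes. Every name, record, and scoring rule here is invented purely for illustration; real services' algorithms are far more complex.

# A toy "relevance" algorithm, illustrating Gillespie's definition:
# transform input data into a desired output. All data and rules here
# are hypothetical, for illustration only.

# A small "database" of collected items; Gillespie notes that real
# systems often build these from records of user data and activity.
database = [
    {"title": "Intro to Algorithms ", "views": 900,  "spam": False},
    {"title": "CHEAP PILLS!!!",       "views": 5000, "spam": True},
    {"title": "Search Engine Design", "views": 450,  "spam": False},
]

def prepare(records):
    # "Raw data" must be prepared before the algorithm can run on it:
    # here, trimming whitespace and normalizing titles to lowercase.
    return [dict(r, title=r["title"].strip().lower()) for r in records]

def exclude(records):
    # What a database chooses to exclude differentiates it from
    # otherwise similar databases: here, spam items are dropped.
    return [r for r in records if not r["spam"]]

def rank(records, query):
    # The "desired output": items matching the query, ordered by a
    # simple popularity score (view count).
    matches = [r for r in records if query.lower() in r["title"]]
    return sorted(matches, key=lambda r: r["views"], reverse=True)

results = rank(exclude(prepare(database)), "algorithms")
print([r["title"] for r in results])  # prints ['intro to algorithms']

     Even this toy example raises Gillespie's question of credibility: why should view count stand in for relevance? That choice belongs to the algorithm's designer, not to the data itself.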

     Tarleton Gillespie is an associate professor at Cornell University whose work focuses on the social implications of information technologies, including the laws, policies, and technological changes that surround them. He has written many other publications on various technology topics, from copyright to the digital divide, including several others on algorithms. His first book was published in June of 2007, and his first article in 2004. "The Relevance of Algorithms" was first circulated by Gillespie in 2013 and appears in the anthology "Media Technologies," published by MIT Press in 2014, which contains several articles on similar topics by various authors. The article is written with an audience somewhat versed in technology and the internet in mind, but in a way that lets readers with even a basic knowledge of the ideas understand it well. It has been received fairly well, has been cited in many other papers on the topic, and has been presented at several conferences and gatherings focused on information policy and its social implications; it is widely regarded as a work that makes a clear point and supports it well.

One question I have after reading this article is whether companies create their algorithms based on outside research or build them from their own internal research, and how they establish credibility for their algorithms' accuracy.

A similar article by other authors:

"Privacy-Preserving Data Mining" by Rakesh Agrawal and Ramakrishnan Srikant

