Data Privacy in the age of Big Data
During last week’s guest lecture, Carl Eschenbach mentioned that Sequoia Capital had turned down investing in certain companies in the past for ethical reasons. This led me to think more about the ethical implications of big data and predictive technologies. Just how much of my data is being collected, and what is being done with it? Some forms of data collection seem harmless: Netflix tracking my viewing habits to make better recommendations. But other examples are kind of creepy: I mentioned needing to change my skincare routine to a friend through Facebook Messenger, and later that night I received multiple advertisements from different skincare brands (shouldn’t private messages be private?). So, should I care that some companies seem to know more about me than I know about myself?
We can take a closer look at a few companies that have been tracking, analyzing, and experimenting on our personal data, both with and without our knowledge:
- Google: It is no secret that Google tracks your search history and browsing behavior for more targeted advertising. Through YouTube, companies are able to create many different versions of a video ad tailored to different audiences. Through Google Ads, companies can use not only Google search data but also Google Maps search and visitation data to target advertisements [1]. Would you be willing to have your search history tracked and recorded in order to see more relevant advertisements? You’re going to see ads anyway, so why not see ads for items you actually might want to buy?
- Palantir: Palantir has been working with a number of police departments to predict crime hot spots and high-risk individuals. In the process, Palantir has conducted extremely intrusive research into the lives of ordinary citizens (specifically, in LA and New Orleans) without their consent or knowledge [2]. Certain law enforcement agencies have claimed that such predictive technologies will be very helpful in reducing crime. But it is still extremely uncomfortable knowing that law enforcement is analyzing information on civilians and labeling them as high- or low-risk without their knowledge. Would you be willing to give up your personal information to feel safer, even if there has not been any solid research backing up the effectiveness of such techniques?
- Facebook: A few years ago, Facebook conducted a psychological experiment on over 500,000 users by “manipulating their news feeds to assess the effects on their emotions” [3]. The study eventually found that the emotions expressed by our friends on our timelines did influence our own emotions, which is not surprising at all. This study disclosed the extremely uncomfortable fact that Facebook can easily manipulate the user experience and experiment on its users. It could also lead to companies targeting advertisements at Facebook users based on their mood. Would you be willing to give up your personal information in the name of science and research, even if you don’t know what kind of research is being conducted on you?
So, as more of our personal data is collected, analyzed, and stored, it is important to consider just how much privacy we are willing to give up in exchange for increased safety, convenience, personalization, and all the other promises that companies have made.
References:
[1] https://techcrunch.com/2017/09/25/youtubes-new-ad-tech-automatically-personalizes-ads-can-now-target-using-google-maps-app-install-data/
[2] https://www.theverge.com/2018/2/27/17054740/palantir-predictive-policing-tool-new-orleans-nopd
[3] https://www.forbes.com/sites/gregorymcneal/2014/06/28/facebook-manipulated-user-news-feeds-to-create-emotional-contagion/#df1d1e339dc7
One comment on “Data Privacy in the age of Big Data”
In recent times, data privacy has quite obviously been in the public spotlight. It really makes you think, though: why are companies so underhanded when it comes to capturing your data in order to render better ads and more tailored content to you? If the end goal really is to use your data to provide better products, services, and ads, then why not just ask for your data up front and ensure it is stored and deleted in a timely manner? Why do a “behind-the-scenes-let’s-look-at-your-search-history” to find out who you are and what you’re looking for? I think there are a lot of people who would volunteer their data if they knew what it was used for, and when and for how long it would be stored. I think what’s happened is that an underground market of sorts has opened up where data is being traded, and sadly this is too lucrative an opportunity for big companies to pass up. More transparency in this area, along with tighter regulation of data capture and storage, would greatly increase the amount of positive things we can do with big data.