AG WEB Powered by Farm Journal

The Hueber Report

The Hueber Report is a grain marketing advisory service and brokerage firm that places the highest importance on risk management and profitable farming.

Weekend Commentary - Thought Police

Jul 03, 2014


If you have tuned into the news this past week, you have undoubtedly heard about the controversy concerning a "social" experiment that Facebook conducted on unsuspecting users of the social network. As it turns out, for a week back in 2012, all in the interest of "science," Facebook chose 689,000 lucky users. Part of the group was fed negative stories in their news feeds while the other part received positive stories. Guess what, and this is the shocking part: those who read negative stories tended to post negative comments, while those who received happy information responded in like kind. Now I suspect there is some would-be researcher out there who is very disappointed because he had just applied for a multimillion-dollar federal grant to conduct just such a study, but really, hasn't anyone who has raised a child or taken an introductory psychology class already learned this? That is why it is all the more preposterous for Facebook to maintain this was conducted as innocent research. This should be classified as a real-life example of Orwell's "thought police," except in this case they are not only monitoring what we read and say, but manipulating it for later commercial purposes.


By no means am I making light of what this company has done, but I am equally amused/annoyed by many of the responses. We hear indignation about the invasion of privacy, the violation of trust, the reduction of users to little more than lab rats and, of course, the abuse of power, all of which are legitimate complaints. But that raises the question: does this surprise you? It does not matter if it is Facebook, Google, Bing, Yahoo, Pinterest, et al.; they are all in the business of collecting data from every move we make on the internet, sorting it through a number of algorithms and directing back to us information or, better, a product that we can possibly use and/or ideally purchase. Facebook even has a specific term for it; they refer to it as "curating" information.


Political activist Eli Pariser developed the concept of the "filter bubble" and wrote a book of the same name, The Filter Bubble, that examines the ways and means by which search engines, mail servers and social sites gather our information and then make determinations about what we should receive. For example, did you know that when you search for something on Google, it will look at 57 individual data points about you before returning your answer? In reality, Google or Facebook or any of these other electronic sites are delivering to us information that "they" have determined we want. What Mr. Pariser found in his research is that, as you develop your "electronic profile," the algorithms in the search engine will ultimately deliver results that match only your interests and biases. (If you would like to watch a highly informative TED presentation by Mr. Pariser, click here.)


Part of the danger in all of this is that it will potentially make us even more myopic than we already are. Certainly we all like to read information and outlooks that appeal to our biases, but if we limit ourselves to only that, how will there ever be a chance of open and honest dialog in which points of view that differ from our own are represented? If everything we are presented already agrees with us, soon we come to believe that "my" opinion must be the majority view, because every time I search for information, that is what my search tells me. This is the World Wide Web, right, so it must be searching everywhere. Isn't it curious that as we as a culture have more and more information at our fingertips, we appear to become more linear and separated in our thinking? Now there would be a study worth funding: has the explosion in the accessibility of information on the Internet created a more narrowly focused society? The dysfunction in Washington would seem to back this up.

Again, by no means am I condoning the Facebook "social research" project, but I believe the real reason people should be up in arms about data gathering is that someone else has determined what we should and should not be reading. A computer program has decided, based on our past patterns, that we should be provided only a certain kind of information because it fits our profile; give us what we want to see and we will be happier, and of course if we are happier, we will be more likely to purchase something the computer says our demographic will want to buy. Unlike the "thought police" in George Orwell's 1984, we have not lost the privacy of our information so that someone can stamp out dissidents, but rather so that someone can sell us the latest fashion accessory for the next protest march. I am not sure which is more dangerous.