The Filter Bubble – Book Review and Notes

I just finished The Filter Bubble: What the Internet Is Hiding from You by Eli Pariser.

This is an excellent book covering the issues involved in the increasing move towards personalisation (filtering) by Google, Facebook and many other companies (plenty of iPad apps do this now too).  This is something I have frequently thought about myself – how an increasingly narrow view of the world will affect our perspective and our behaviour.

While reading, I took the following notes:

  • great overview of personalisation and the goal of relevance
  • comparison of Google and Facebook – nice summary of how both started, where they compete but also how they are different
  • discussion about ‘lock-in’ – personally I think FB has better lock-in than Google, as it doesn’t take much to switch search engines – assuming that there is a better alternative
  • scary info about the data that companies like Acxiom store, and the whole inner workings of remarketing
  • viewpoint on retargeting – users, not sites, are now the focus and capture the premium dollars
  • a review of how the news is created and delivered – its evolution (origins) and how it will end up in a world of personalisation (this section was comprehensive and fascinating – especially the effects of disintermediation, or the illusion of disintermediation)
  • we are quick to complain about bias in the traditional media (e.g. Fox News), but no one is complaining about the new curators – the intelligent agents filtering our news
  • the impact that ‘traffic chasing journalism’ has had on the news media and on journalists – bringing them down from ‘Olympian heights’ to the same level as everyone else
  • parental personalisation vs sycophantic personalisation – we’re heading for the latter according to Nicholas Negroponte
  • explains how personalisation filters can also upset the cognitive balance of our brains – surrounding ourselves with ideas that we already agree with makes us overconfident in our mental frameworks (through confirmation bias) and removes key prompts to make us want to learn
  • studies show that information that challenges our view of the world as we see it has a positive impact on cognitive behaviour (i.e. we perform better mentally) – without the surprise of unexpected events and associations, the filtered world would provoke less learning
  • growth in the use of neuroenhancing drugs like Adderall, which is usually prescribed for ADHD sufferers but is being used by others to compete intellectually and narrow their focus – to concentrate on one thing for longer periods of time (reports, studies, etc.). Amongst the side effects is a decrease in creativity (interestingly, I read James Patterson’s book Toys, set in a future world of ‘Augmented’ ‘Elites’ who, despite superior physical and mental powers, lacked any and all creativity). The author compares personalisation filters to Adderall – what he calls a push towards an ‘Adderall Society’
  • being around people and ideas different from one’s own can boost creativity – even 45 minutes of exposure to a different culture helps. US students shown a slideshow about China instead of America scored higher on creativity tests taken shortly after
  • the better Google gets at its core mission of transforming intention into action, the worse it will become at providing serendipity – the process of stumbling across the unintended
  • from a statistician’s perspective, you can’t tell how biased the sample is from looking at the sample alone – you need something to compare it to.  In a filter bubble you have no such point of reference.
  • targeting (personalisation) also limits choice – life choices. Not knowing that something is available (a career, a job, a contest) shapes your life; by presenting some possibilities to you and blocking out others, the filter bubble has a hand in the decisions you make in life, not just in what you learn/know
  • in identifying / defining ‘you’, there’s a big difference between ‘you are what you click’ (Google’s approach) and ‘you are what you share’ (Facebook’s approach)
  • personalisation doesn’t take into account the difference between your ‘work self’ and your ‘play self’, or between your aspirational self (who you would like to be) and your current self. He calls this the ‘one identity problem’
  • this makes me think of the ‘observer effect’, where just the act of observing something changes it. As we become more aware that everything we do online is being tracked, recorded, measured and reported, do we start to change our ‘online’ behaviour and, in doing so, change (or perhaps even break) personalisation?
  • your ‘persuasion profile’ is also being developed – what sort of marketing message appeals to you and triggers a purchase (reviews, discounts, etc.). Apparently what is most persuasive for you transcends product categories and is therefore very, very valuable to marketers
  • there is a ‘you loop’ – you click on something, which gets recorded as an interest, which means you’ll see similar topics – rinse and repeat. But the amplification this loop causes also makes it easy for your ‘identity’ to be misrepresented
  • targeting is also about putting you in groups as much as personalising – the problem here is programmatic stereotyping, especially when using the social graph. For example, if your friends are frequently late payers, then you must be too. You are being judged without your knowledge every day. This is discrimination
  • the problem of programmatic stereotyping also exists in predictive algorithms. If LinkedIn used its data to predict the success of your career later in life based on your friends, education, etc. from when you were 18 or 21 – how accurate would it have been? How fair would it be if this was then used by employers, possibly limiting your career choices in life? This is also discrimination
  • predictive algorithms should be constantly trying to prove their models wrong by introducing the opposite choice, rather than perpetuating a systematic confirmation bias (see the sketch after these notes)
  • ‘algorithmic induction can lead to a kind of information determinism, in which our past clickstreams entirely decide our future.’
  • the author compares personalisation to censorship. Instead of a government censoring the information that you are allowed to see (as in China), it is a few centralised companies making those sorts of decisions. So personalisation is censorship of a sort
  • in fact, this consolidation of personal data in the ‘hands’ of a small number of companies could result in more government control rather than less (less control being at the core of the Internet and its use). The US government, for example, does not need a warrant to access your personal details in the ‘cloud’ – it simply asks the company holding the information. And as these companies have a desire to reduce/avoid regulation, they usually comply
  • ‘friendly world syndrome’ becomes increasingly likely when living in a filter bubble and seeing the world through rose-tinted glasses – where ‘some of the biggest and most important problems fail to reach our view at all’
  • author discusses personalisation’s impact on politics and how filter bubbles ‘make it increasingly difficult to have a public argument’
  • when asked, most Internet entrepreneurs, technologists and programmers downplay the role/impact their software has, absolving themselves of responsibility – they simply say that people can choose not to use it, etc.
  • there could be worse personality types than programmers / Internet geeks to entrust with this sort of power – they generally have a reverence for rules, consider themselves principled, and stick to those principles
  • but the traits that get someone to the top and drive them to build an Internet superpower – aggression, empire building, etc. – can be a problem when their systems rule the world
  • Kranzberg’s first law: ‘Technology is neither good nor bad; nor is it neutral’
  • why isn’t Google’s slogan ‘do good’ instead of ‘don’t be evil’?
  • what are the possibilities when facial recognition of all photos online is more ‘publicly’ enabled? What could this mean for personalisation and targeting algorithms?
  • on the subject of the ‘Internet of things’, if in the future all your possessions are tagged (e.g. RFID) and indexed – ‘the items you own, where you put them, and what you do with them is a great signal about what kind of person you are and what kind of preferences you have’ (this is called ambient intelligence)
  • is there anything to stop our DNA being indexed and then used in personalisation and targeting activities? (This is from a chapter about the future, but it does seem a bit extreme)
  • the code that drives the algorithms that Google and similar companies use is so large and so complex that even the engineers no longer understand it in its entirety. Google engineers now make tweaks and test the results without fully understanding why they work. The author argues that the more complex the filter algorithms become, ‘the harder it’ll be to understand why or how it is making the decisions it makes’
  • personalisation doesn’t need to stop at the products / information we are offered; it could also change how we see / experience web sites. Technology is being developed that allows a website to ‘morph’ into a UI the visitor is predicted to be more comfortable with, which can increase ‘purchase intentions’ by 21 percent or more
  • rather than decentralising knowledge and control (as the origins of the Internet had hoped), ‘in practice it’s concentrating control over what we see and what opportunities we’re offered in the hands of fewer people than ever before’
  • we can avoid being boxed in / typecast by varying our online habits – if you read the same things and visit the same sites, it is easy to create a profile of you. But vary your path online and you have a greater chance of encountering new ideas and people
  • the companies leading the way with filtering / personalisation (Facebook, Google, etc.) need to realise their responsibilities – they need to be more transparent about what is being stored, how, and why. In particular, we need to understand who these sites think we are and how companies are using this information
  • the engineers at these companies need to also design for ‘serendipity’ to expose people to topics outside their filter bubble.
  • as users / citizens, we need to demand ‘Fair Information Practices’ and start thinking of personal data as personal property and do what is necessary to protect it.
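
A couple of these notes – the ‘you loop’ and the ‘opposite choice’ remedy – are easy to make concrete. Here is a minimal Python sketch (my own illustration, not something from the book) of a toy recommender whose topic weights grow with every click, so early random clicks snowball into a narrow profile; an epsilon parameter occasionally ignores the profile and shows a random topic instead. All of the names (TOPICS, weights, epsilon) are invented for the example.

    import random

    random.seed(42)

    TOPICS = ["politics", "sport", "science", "art", "travel"]

    def simulate(epsilon, steps=2000):
        # weights[t] is the recommender's belief in the reader's interest in
        # topic t. Each step it shows one topic; a click bumps that topic's
        # weight, so past clicks feed future recommendations (the 'you loop').
        # With probability epsilon it ignores the profile and shows a random
        # topic instead (the 'opposite choice'), constantly testing whether
        # the learned profile is actually wrong.
        weights = {t: 1.0 for t in TOPICS}
        click_rate = 0.5  # the simulated reader likes every topic equally
        shown = {t: 0 for t in TOPICS}

        for _ in range(steps):
            if random.random() < epsilon:
                topic = random.choice(TOPICS)  # exploration: challenge the model
            else:
                # exploitation: sample in proportion to the learned profile
                topic = random.choices(TOPICS, weights=[weights[t] for t in TOPICS])[0]
            shown[topic] += 1
            if random.random() < click_rate:  # the reader clicks
                weights[topic] += 1.0  # clicks amplify themselves

        return shown

    for eps in (0.0, 0.2):
        print(f"epsilon={eps}:", simulate(eps))

With epsilon at zero, one or two topics usually end up dominating purely through early luck, even though the simulated reader is equally interested in everything – exactly the identity misrepresentation the ‘you loop’ note describes. Even a modest epsilon keeps the profile much closer to reality.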

In reading this, I kept thinking that, first, my friends are not me.  Just because I am friends with people (accessed via a social graph) does not mean they have the same tastes or preferences as I do.  I also think that, as someone with eclectic tastes and interests, my best ideas have come at the intersection of seemingly unrelated topics... and I wonder how that would happen in a filter bubble world.

This is an excellent book and I highly recommend that you read it now... or at least soon.  It is very topical and very “now”, and so will have less impact if read five years from now (or perhaps even two years from now, for that matter).
