S’s Review

– Human vs. Technology:

o The author, Cathy O’Neil, writes that humanity is undergoing a new technological transformation.

o Throughout the book, the term “Big Data” appears repeatedly. Big Data is the practice of using extremely large datasets to make industries more profitable and efficient, and it is rapidly changing how companies and society itself function.

o The author recognizes that computer algorithms and data collection are helpful in certain contexts. However, she observes that much of modern life is now dictated by machines rather than people.

o O’Neil suggests that this machine-driven control will continue unless humans actively engage in interpreting the data themselves.

o According to the book, to rein in technology’s powerful influence over humans, industries must implement internal regulation of how data is collected and hold developers accountable for the results of the algorithms they create.

o An example O’Neil uses in her book is trucking companies, which closely track and surveil their drivers’ rigs by installing cameras and GPS devices to monitor how truckers drive. These installations help the companies gather data about when drivers might fall asleep or run into trouble, and such monitoring can prevent fatal or tragic accidents.

o O’Neil articulates that computer systems and algorithms make life more efficient by erasing human bias and error, but they might also remove the things that only human beings can do: self-correct, invent, and imagine.

– Discrimination in Algorithms:

o Cathy O’Neil shares her experience of working at a hedge fund during the 2007-2008 global financial crisis.

o O’Neil realized that human biases are written into the algorithms used to determine crucial outcomes such as creditworthiness, employability, and insurability.

o These damaging models, which O’Neil calls “weapons of math destruction” (WMDs), unfairly discriminate against racial and ethnic minorities and women (O’Neil, 2016, p. 13).

o The author argues that data scientists must work to purge racism and sexism from their algorithms, or racial minorities, women, and financially unstable people will continue to be victimized.

o WMDs prevent women from having the same fair opportunities as men. In the 1970s, St. George’s Hospital Medical School in London used an algorithm to sort through its applications.

§ The algorithm had two functions: the first was to “boost efficiency, letting the machine handle much of the grunt work. It would automatically cull down the two thousand applications to five hundred, at which point humans would take over with a lengthy interviewing process” (O’Neil, 2016, p. 148). The second objective was fairness: “The computer would remain unswayed by administrators’ moods or prejudices, or by urgent entreaties from lords or cabinet ministers” (O’Neil, 2016, p. 148).

§ In actual practice, the algorithm systematically rejected applicants whose names indicated that they were immigrants, and it rejected applications from women on the implicit assumption that they might become mothers, which would reduce their perceived value as workers.
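The screening pipeline O’Neil describes can be sketched in a few lines. This is a hypothetical illustration, not code from the book: a screener ranks applications by a score learned from past admissions decisions and keeps only the top slice for human interviews. If the historical decisions were biased, the learned score reproduces that bias automatically and at scale.

```python
def cull(applications, score, keep=500):
    """Rank applications by the learned score and keep only the top `keep`.

    If `score` encodes biased historical judgments (e.g., penalizing
    immigrant-sounding names), the cull inherits that bias wholesale.
    """
    return sorted(applications, key=score, reverse=True)[:keep]

# Synthetic example: 2,000 applications scored by a toy grade field.
apps = [{"id": i, "grades": (i * 37) % 100} for i in range(2000)]
shortlist = cull(apps, score=lambda a: a["grades"], keep=500)
print(len(shortlist))  # 500 applications go forward to interviews
```

The point of the sketch is that the machine adds no judgment of its own: whatever the scoring function rewards, fair or not, determines who reaches a human interviewer.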

o O’Neil argues that models reflect their developers’ implicit biases, and that if developers train their machines to disregard data about gender and race, they can help ensure that the algorithms are not discriminatory.
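The “disregard protected data” idea above can be sketched minimally. This is an illustrative example, not from the book, and the field names are hypothetical. Note the caveat that fairness researchers often raise: proxy attributes such as postcode or name can still leak the removed information, so dropping columns alone does not guarantee a non-discriminatory model.

```python
# Protected attributes the model should never see (hypothetical field names).
PROTECTED = {"gender", "race"}

def strip_protected(record):
    """Return a copy of the record without protected attributes.

    Caveat: remaining fields may act as proxies for the removed ones,
    so this step is necessary but not sufficient for a fair model.
    """
    return {k: v for k, v in record.items() if k not in PROTECTED}

applicant = {"name": "A. Smith", "gender": "F", "race": "white", "gpa": 3.7}
print(strip_protected(applicant))  # {'name': 'A. Smith', 'gpa': 3.7}
```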

– Fairness vs. Efficiency:

o O’Neil implies that the U.S. and the wider world are burdened with systems that prioritize profitability and speed over equity and fairness.

o To reform WMDs, O’Neil recommends that tech companies begin sacrificing some profit and efficiency for the sake of justice and morality.

o O’Neil says that “the world is dominated by automatic systems” (O’Neil, 2016, p. 196) and questions whether these algorithms are truly successful. She argues that efficiency and profit should not be the metrics of a successful model, and she points to several models that are beginning to seek to do good in the world.

§ For instance, a Harvard Ph.D. in mathematics, Mira Bernstein, created “a model to scan vast industrial supply chains, like the ones that put together cell phones, sneakers, or SUVs, to find signs of forced labor” (O’Neil, 2016, p. 268). O’Neil thinks that it is time to put humans back into the frame rather than leaving the issue to the marketplace, which will always prefer “efficiency, growth, and cash flow” over fairness (O’Neil, 2016, p. 196).