Disguising the Ways They Kill Us: Big Data, Behaviorism, and Mindset Marketing

Recent stuff in opposition to behaviorism and big data.

The irony of turning schools into therapeutic institutions when they generate so much stress and anxiety seems lost on policy-makers who express concern about children's mental health.

Source: ClassDojo app takes mindfulness to scale in public education | code acts in education

Contents:

  • Big Data and Mindset Marketing
  • Behaviorism and Autistic People
  • Big Data and Personalization in Healthcare
  • Design Is Tested at the Edges
  • Closing the Digital Divide by Joining the Edges
  • Bonus: Indie Ed-tech After School

Big Data and Mindset Marketing

Here's a rule of thumb for you: An individual's enthusiasm about the employment of "data" in education is directly proportional to his or her distance from actual students. Policy makers and economists commonly refer to children in the aggregate, apparently viewing them mostly as a source of numbers to be crunched. They do this even more than consultants and superintendents, who do it more than principals, who do it more than teachers. The best teachers, in fact, tend to recoil from earnest talk about the benefits of "data-driven instruction," the use of "data coaches," "data walls," and the like.

Making matters worse, the data in question typically are just standardized test scores - even though, as I've explained elsewhere, that's not the only reason to be disturbed by this datamongering. And it doesn't help when the process of quantifying kids (and learning) is festooned with adjectives such as "personalized" or "customized."

But here's today's question: If collecting and sorting through data about students makes us uneasy, how should we feel about the growing role of Big Data?

Part of the problem is that we end up ignoring or minimizing the significance of whatever doesn't lend itself to data analytics. It's rather like the old joke about the guy searching for his lost keys at night near a street light even though that's not where he'd dropped them. ("But the light is so much better here!") No wonder education research - increasingly undertaken by economists - increasingly relies on huge data sets consisting of standardized test results. Those scores may be lousy representations of learning - and, indeed, egregiously misleading. But, by gum, they sure are readily available.

"What's left out?", then, is one critical question to ask. Another is: "Who benefits from it?" Noam Scheiber, a reporter who covers workplace issues, recently observed that big data is "massively increasing the power asymmetry between exploiters and exploitees." (For more on this, check out Cathy O'Neil's book Weapons of Math Destruction.)

Anyone who has observed the enthusiasm for training students to show more "grit" or develop a "growth mindset" should know what it means to focus on fixing the kid so he or she can better adapt to the system rather than asking inconvenient questions about the system itself. Big data basically gives us more information, based on grades, about which kids need fixing (and how and when), making it even less likely that anyone would think to challenge the destructive effects of - and explore alternatives to - the practice of grading students.

Predictive analytics allows administrators to believe they're keeping a watchful eye on their charges when in fact they're learning nothing about each student's experience of college, his or her needs, fears, hopes, beliefs, and state of mind. Creating a "personalized" data set underscores just how impersonal the interaction with students is, and it may even compound that problem. At the same time that this approach reduces human beings to a pile of academic performance data, it also discourages critical thought about how the system, including teaching and evaluation, affects those human beings.

Source: When "Big Data" Goes to School - Alfie Kohn

Our public school policymakers want us to do the latter. In fact, they have a whole pedagogical justification for ignoring the needs of children.

It's called "academic tenacity," a "growth mindset" or "grit."

And it goes something like this:

That child isn't learning? If she just worked harder, she would. 

It's the political equivalent of "pull yourself up by your own bootstraps" applied to the classroom.

And it's super helpful for politicians reluctant to allocate tax dollars to actually help kids succeed.

But what no one wants to admit is that grit is… well… shit.

It's just an excuse for a society that refuses to help those most in need.

Yet when anyone suggests offering help to even the playing field - to make things more fair - a plethora of policy wonks wag their fingers and say, "No way! They did it to themselves."

It's typical "blame the victim" pathology to say that some kids get all the love, time and resources they need while others can do without - they just need more "grit" and a "growth mindset."

Source: Grit is Sh!t – It’s Just an Excuse to do Nothing for Struggling Students – gadflyonthewallblog

As with the corporate flavor, ed-tech mindfulness, like other mindset marketing, disguises the ways they kill us.

Source: Mindfulness in Education – rnbn

And so we can firmly put the insistence on data-driven instruction in the trash bin of bad ideas.

It is unscientific, unproven, harmful, reductive, dehumanizing and contradictory.

The next time you hear an administrator or principal pull out this chestnut, take out one of these counterarguments and roast it on an open fire.

No more data-driven instruction.

Focus instead on student-driven learning.

Source: The Six Biggest Problems with Data-Driven Instruction | gadflyonthewallblog

Alfie Kohn on Twitter: "Studies w/adults (https://t.co/MlZhxzNFZv) & kids (https://t.co/d0xTpE8U00) from 40-odd yrs ago show that when we're monitored, esp. if the surveillance is perceived as controlling, we tend to lose interest in whatever we're doing. Implics for today's high-tech classrms/wkplaces?"

Alfie Kohn on Twitter: "From the archives: When educators reduce students to data, they miss an awful lot. When they rely on "big data," they may be making things even worse: https://t.co/bMnE024qhR"

I’ve seen the pathologies of big data from the inside. I’ve seen how it affected our company and community culture. I agree with Kohn’s assessments.

chill innovation on Twitter: "These young people have better judgement about their education than the folks who are supposed to be in charge. https://t.co/eLpKdBUZZB"

Brooklyn students hold walkout in protest of Facebook-designed online program

Indeed they do have better judgment.

Behaviorism and Autistic People

Thinking Person's Guide To Autism on Twitter: ""If a child was not autistic, would we use their hobby as a means to try to get them to do certain things? If not, we need to stop thinking it is okay to use an #autistic person’s hobby or interest […to] manipulate/fix/distinguish 'autistic behaviors'." https://t.co/aTYwhP0tqQ"

Some community data for my “autistic people reject behaviorism” assertions. This extends beyond ABA. ABA and PBS supporters are not allies to us. Listen to the people who know behaviorism best.

Fiona Clarke on Twitter: "I am not sure where you get your "tiny minority" of autistics against ABA from. I think it is the other way around. This survey 98% of 5000+ autistic responders do not support ABA: https://t.co/wrxabd0y41 Here are 10,000s of others who do not: https://t.co/cxdIiE1d96… https://t.co/CDmxl43EO9"

11,521 people answered this autism survey. Warning: the results may challenge you. - Autistic Not Weird

The Autistic Community does not support Applied Behaviour Analysis (ABA) – ABA Controversy Autism Discussion

jon adams on Twitter: "@AspieHuman I hate the word ‘resilience’ In bullying or abusive situations it’s often used as gaslighting victim blaming derailing & I feel It’s often used to neglect responsibility to stop a situation happening or continuing It’s evil" / Twitter

Big Data and Personalization in Healthcare

The “personalized” pushback is happening in healthcare too. Again, disabled and neurodivergent people are the realest experts here. We live this stuff.

Zackary Berger, MD, PhD on Twitter: "What if we had a different system, which incorporated technology but had it play a different role, centering the patient's experience (not "patient-centered care", as the interviewed executive was saying, but REAL patient experience)? Now THAT article would be truly epic."

This is the tack Albemarle took with its ed-tech.

Compare LMS burnout to the Epic burnout doctors are feeling. Atul Gawande writes on the problems of big data and top-down systems in medicine:

As patients, we want the caring and the ingenuity of clinicians to be augmented by systems, not defeated by them. In an era of professional Taylorization - of the stay-in-your-lane ethos - that does not seem to be what we are getting.

Putting the system first is not inevitable. Postwar Japan and West Germany eschewed Taylor's method of industrial management, and implemented more collaborative approaches than was typical in the U.S. In their factories, front-line workers were expected to get involved when production problems arose, instead of being elbowed aside by top-down management. By the late twentieth century, American manufacturers were scrambling to match the higher quality and lower costs that these methods delivered. If our machines are pushing medicine in the wrong direction, it's our fault, not the machines' fault.

Some people are pushing back. Neil R. Malhotra is a boyish, energetic, forty-three-year-old neurosurgeon who has made his mark at the University of Pennsylvania as something of a tinkerer. He has a knack for tackling difficult medical problems. In the past year alone, he has published papers on rebuilding spinal disks using tissue engineering, on a better way to teach residents how to repair cerebral aneurysms, and on which spinal-surgery techniques have the lowest level of blood loss. When his hospital's new electronic-medical-record system arrived, he immediately decided to see if he could hack the system.

He wasn't a programmer, however, and wasn't interested in becoming one. So he sought out Judy Thornton, a software analyst from the hospital's I.T. department. Together, they convened an open weekly meeting, currently on Thursday mornings, where everyone in the neurosurgery department - from the desk clerks to the medical staff to the bosses - could come not just to complain about the system but also to reimagine it. Department members feared that Malhotra's pet project would be a time sink. Epic heard about his plans to fiddle around with its system and reacted with alarm. The hospital lawyers resisted, too. "They didn't want us to build something that potentially had a lot of intellectual property in someone else's system," Malhotra said.

But he managed to keep the skeptics from saying no outright. Soon, he and his fellow-tinkerers were removing useless functions and adding useful ones. Before long, they had built a faster, more intuitive interface, designed specifically for neurosurgery office visits. It would capture much more information that really mattered in the care of patients with brain tumors, cerebral aneurysms, or spinal problems.

But our systems are forever generating alerts about possible connections - to the point of signal fatigue. Just ordering medications and lab tests triggers dozens of alerts each day, most of them irrelevant, and all in need of human reviewing and sorting. There are more surprises, not fewer. The volume of knowledge and capability increases faster than any individual can manage - and faster than our technologies can make manageable for us. We ultimately need systems that make the right care simpler for both patients and professionals, not more complicated. And they must do so in ways that strengthen our human connections, instead of weakening them.

Source: Why Doctors Hate Their Computers | The New Yorker

Techno-sociologist and author of Twitter and Tear Gas, Zeynep Tufekci, writes:

But as medicine moves from the kind of clinical practice that has informed centuries of treatment to the data-driven practices that have already transformed commerce, finance and the media, it will also find itself facing some of the same social challenges. In particular, big-data technology might seem like a social neutralizer or even a leveling force, but it can have a way of increasing divisions.

One hint at why this is comes from what communications theorists describe as a knowledge gap. Basically, people who already have better information are also better at getting more information, even if that information is in theory universal and available to all. We see this again and again in different fields. In my own research on schools and computers, for example, I often encounter students doing advanced and creative “technology” activities on the computers in well-off schools, and students doing rote learning and typing on the computers in poorer ones. That division means that later on, when the kids face a putatively even playing field, some will know better than others how to get ahead. Privileged kids get more resources not simply because they (or the schools) can afford to pay for them but because their parents are better equipped to advocate for their acceptance into gifted and talented programs, or to academically support them better through tutoring, attention and encouragement — harder tasks for a poor or single parent. There is also the effect of expectations and a lifetime of socialization: If you experience life as unfair, you are probably less likely to demand better when you encounter more injustice.

There is a great lesson here as we anticipate the rise of data-driven diagnostic and intervention techniques in health care. It’s not that new methods won’t help people; it’s that they will increase health inequality — not just among those who can afford it and those who cannot, but among those who can undertake the research and take advantage of the new techniques and those who cannot.

Further, these new data-driven medical techniques could lead to more discrimination.

Source: Data-Driven Medicine Will Help People — But Can It Do So Equally? - The New York Times

Aside: Read everything Gawande writes. His prose is beautiful and enjoyable, making medicine accessible. He carries on the tradition of Oliver Sacks. The Checklist Manifesto is a big influence on me.

Speaking of Sacks, Steve Silberman, author of NeuroTribes, was close to Sacks. Silberman helped Sacks in his self-discovery as a gay man. Silberman helped me in my self-discovery as an autistic. Sacks’ work introduced me to the phrase “An Anthropologist on Mars” in the early 90s, giving me vocabulary for something I’ve felt my entire life.

Have y’all read NeuroTribes yet? It’s important reading for our times. The pattern repeats.

Design Is Tested at the Edges

Thread: jutta treviranus on Twitter: "Data-driven decisions & policies are about numbers. Big numbers rule. If you are a small number you will be ignored or overpowered, no matter how impactful the decision is to you. We need to think beyond numbers. We need a better data strategy. #inclusion"

jutta treviranus on Twitter: "All data is biased. The more data, the greater the bias. BIG Data & AI are biased toward the average & against at the margins. It's time to level the playing field & stop favoring the norm. Diversification makes for better intelligence and understanding than large numbers."

jutta treviranus on Twitter: "Data is about the past, the successes of the past won't help us create the change we want in our future. We need to learn from our mistakes more than from our successes & imagine untried approaches. #BigData #inclusion #SmartCity #AI"

jutta treviranus on Twitter: "Data analytics finds the average middle. The vulnerable edges foretell the challenges of the future. Addressing the needs of the margins prepares us for the challenges to come. #BigData #AI #inclusion"

Which is why I advocate this: Design is Tested at the Edges: Intersectionality, The Social Model of Disability, and Design for Real Life – Ryan Boren. These are lessons learned the hard way.

If one really wishes to know how justice is administered in a country, one does not question the policemen, the lawyers, the judges, or the protected members of the middle class. One goes to the unprotected - those, precisely, who need the law's protection most! - and listens to their testimony.

— James Baldwin

DSISD does not know better than the edges. The canaries and Cassandras, the people from the future, inhabit the edges. Listen and amplify.

Closing the Digital Divide by Joining the Edges

Realizing and acknowledging how big data fogged our vision, Automattic’s design team has been going into communities and talking directly to small business owners, non-profits, and activists. We’ve been designing sites with mom and pop businesses owned by marginalized people. We’ve been listening to the structural problems they face in their communities and in our software. We’ve been iterating based on that feedback.

City Advocates | Rebrand Cities is helping close the Digital Divide

Rebrand Cities is part of our effort to do better. Big data isolates and disconnects. It does the opposite of personalization. It commodifies. Websites run 40-something trackers on a page and instrument every line of code, gobbling up data that overwhelms and obscures the needs of human beings. I've been in so many discussions where people used data to justify something totally out of touch with the needs of the humans navigating our often frustrating software. I invite them to go exercise the flows that traverse their data and experience the actuality.

I mine perspectives and curate flow from the field, sharing with my coworkers so that we are informed by more than just big data style collection. We should be Alan Lomaxes of flow and experience, curating in the field.

Come curate in my fields. I’ll show you around.

Bonus: Indie Ed-tech After School

Anil Dash 🥭 on Twitter: "Every day, in each timezone, we know when it’s the end of the school day because we see a flood of kids come to @Glitch and start building @discordapp bots. :)… https://t.co/uViFiaH3ir"

Glitch is the kind of tech we should be supporting in school. It has the ethics and the community and the culture. It recaptures some of what we lost to Facebook and big tech. Align with the tech workers building ethical, humane tech.



last updated november 2018