The Web Almost Killed Me

For a decade and a half, I’d been a web obsessive, publishing blog posts multiple times a day, seven days a week, and ultimately corralling a team that curated the web every 20 minutes during peak hours. Each morning began with a full immersion in the stream of internet consciousness and news, jumping from site to site, tweet to tweet, breaking news story to hottest take, scanning countless images and videos, catching up with multiple memes. Throughout the day, I’d cough up an insight or an argument or a joke about what had just occurred or what was happening right now. My brain had never been so occupied so insistently by so many different subjects and in so public a way for so long. If you had to reinvent yourself as a writer in the internet age, I reassured myself, then I was ahead of the curve. The problem was that I hadn’t been able to reinvent myself as a human being.

I realized I had been engaging—like most addicts—in a form of denial. I’d long treated my online life as a supplement to my real life. But then I began to realize, as my health and happiness deteriorated, that this was not a both-and kind of situation. It was either-or. Every hour I spent online was not spent in the physical world.

Andrew Sullivan, I Used to Be a Human Being

The Bitter Lesson

There’s a famous essay in the field of machine learning known as “The Bitter Lesson,” which notes that decades of research prove that the best way to improve AI systems is not by trying to engineer intelligence but by simply throwing more computer power and data at the problem. The lesson is bitter because it shows that machine scale beats human curation. And the same might be true of the web.

James Vincent writing in The Verge

The Information Riot

The Internet is an interruption system. It seizes our attention only to scramble it. There’s the problem of hypertext and the many different kinds of media coming at us simultaneously. Every time we shift our attention, the brain has to reorient itself, further taxing our mental resources. Many studies have shown that switching between just two tasks can add substantially to our cognitive load, impeding our thinking and increasing the likelihood that we’ll overlook or misinterpret important information.

On the Internet, where we generally juggle several tasks, the switching costs pile ever higher. We willingly accept the loss of concentration and focus, the fragmentation of our attention, and the thinning of our thoughts in return for the wealth of compelling, or at least diverting, information we receive.

Nicholas Carr, The Shallows


The New Web Struggles to Be Born

The changes AI is currently causing are just the latest in a long struggle in the web’s history. Essentially, this is a battle over information — over who makes it, how you access it, and who gets paid. But just because the fight is familiar doesn’t mean it doesn’t matter, nor does it guarantee the system that follows will be better than what we have now. The new web is struggling to be born, and the decisions we make now will shape how it grows.

James Vincent writing in The Verge

The Dictatorship of Data

The dictatorship of data ensnares even the best of them. Google runs everything according to data. That strategy has led to much of its success. But it also trips up the company from time to time. Its cofounders, Larry Page and Sergey Brin, long insisted on knowing all job candidates’ SAT scores and their grade point averages when they graduated from college. In their thinking, the first number measured potential and the second measured achievement. Accomplished managers in their 40s were hounded for the scores, to their outright bafflement. The company even continued to demand the numbers long after its internal studies showed no correlation between the scores and job performance.

Google ought to know better, to resist being seduced by data’s false charms. The measure leaves little room for change in a person’s life. It counts book smarts at the expense of knowledge. And it may not reflect the qualifications of people from the humanities, where know-how may be less quantifiable than in science and engineering. Google’s obsession with such data for HR purposes is especially queer considering that the company’s founders are products of Montessori schools, which emphasize learning, not grades. By Google’s standards, neither Bill Gates nor Mark Zuckerberg nor Steve Jobs would have been hired, since they lack college degrees.

Kenneth Cukier and Viktor Mayer-Schönberger, writing in MIT Technology Review