Your computer is on fire, but it will take more than this book to shut it down • The Register



Book review Seasoned industry watchers will welcome Your computer is on fire as a thorough debunking of Big Tech's extravagant self-mythology. They might even hope that the governments, businesses, and media organizations embracing the propaganda barrage will start asking some important questions. But there are limits to this niche text, which is at times prone to academic navel-gazing.

In the 1990s, whatever the surface differences between the industry's big guns, the background hum was the same: the internet offered opportunities for everyone, e-commerce would lead to a frictionless economy, software made people more productive and businesses more competitive. Such illusions survived the dotcom crash and the financial crisis, then reappeared in the early days of social media, as the Arab Spring became a use case for the positive impact of Twitter and Facebook. The troubled development of that movement, along with the nefarious exploitation of social media users' data that contributed to the election of a US presidential regime with mildly insurgent tendencies, should have given pause for thought.

It is therefore astonishing that the tech industry's propaganda has barely changed. Instead, it's a case of different technology, same hot air. Google CEO Sundar Pichai last month told the BBC that AI would be the most "profound technology" humanity will ever develop. Likewise, UK Cabinet Office Minister Julia Lopez adopted industry language when she said "now, more than ever, digital must be at the center of government priorities to meet user needs."

Into this arena steps a group of technology historians whose goal is not merely to pick holes in the sector's output, but to aim at the heart of its myth-making and to argue that inequality, prejudice, and self-delusion lie at the very core of the industry, and always have.

Published by The MIT Press, Your computer is on fire challenges and overturns sacred assumptions about the tech industry widely accepted by government, business, and most of the media. To wit: nothing is virtual, the cloud is a factory, AI is powered by humans, and the internet has a hierarchy. In short, no technology is neutral or inconsequential.

A hot topic for over a decade, the cloud is a term the tech industry has talked up as fluffy, white, and somewhere up there. As Nathan Ensmenger, associate professor at Indiana University, says: "The cloud metaphor erases all connection between IT services and traditional hardware infrastructure… Suddenly, the IT industry has largely succeeded in declaring itself independent of this history, and therefore of the political, social, and environmental controls that have developed to coerce, mediate, and constrain industrialization."

Of course, the cloud is no such thing. As Reg readers know, it's a computer somewhere else. It needs plastic, metals, energy, and people like any other computer. But where these come from, and what impact they have on the environment, matters little once hidden behind the cloud metaphor, Ensmenger explains: "The cloud metaphor allows the IT industry to hide and outsource a whole range of issues, from energy costs to e-waste pollution. But the reality is the world is on fire. The Cloud is a factory. Let's bring this deliberately ambiguous metaphor back down to earth by grounding it in a larger history of technology, labor, and the built environment, before it's too late."

Not a bug

The volume is at its best when dealing with the specifics of tech industry history. Mar Hicks, associate professor at the Illinois Institute of Technology, argues that "sexism is a feature, not a bug" in a chapter describing how Britain lost its lead in computing by failing to recruit, develop, and promote the women who first operated these machines. Once the commercial importance of the machines was recognized, men were recruited instead.

"In 1959, a female programmer spent the year training two new recruits with no IT experience for a critical set of long-term IT projects in the main government data center, while simultaneously doing all the programming, operating, and testing as usual." The problem was that "the men who were recruited for these jobs lacked the technical skills to do them and were often uninterested in IT work, in part because of its feminized past."

Most of the men trained for early IT positions quickly left for higher management jobs, causing departments to hemorrhage most of their computing staff. "This trend continued throughout the 1960s and 1970s, even as the field's status rose," says Hicks. "As a result, the programming, systems analysis, and computer operations needs of government and industry went unmet."

Hicks traces the success of one woman who saw an opportunity in that glass ceiling, and in software development independent of hardware vendors, an unusual move at the time.

After working at the Dollis Hill Post Office Research Station in the 1950s, briefly alongside Colossus architect Tommy Flowers, Stephanie "Steve" Shirley went on to found Freelance Programmers, which later became Xansa before being sold to Steria for £472 million in 2007.

She had been passed over for promotion in the civil service before launching the successful startup in the early 1960s, and made a point of hiring similarly sidelined female tech talent on flexible contracts. The chapter includes a stunning photo of Ann Moffatt coding the black box recorder for Concorde, with a toddler in the frame peering curiously over the table.

But that was the exception, not the rule. Far from revolutionizing society, the computer industry acted to reinforce the pre-existing social structure, Hicks argues.


The breadth of the book is its strength. It covers not only the social, political, and economic history of computing, but also the details of the code itself, for those who can read it. A chapter by Ben Allen, programming historian and lecturer at Berkeley City College, details the antics of Unix co-creator Ken Thompson, who devised an eponymous hacking technique for planting a backdoor in operating systems.

In addition to printing code illustrating the principles underlying the Thompson hack, Allen describes the approach for the observant lay reader. Without going into detail here, the technique relies on the liar paradox ("this sentence is false") and on the bootstrapping process by which basic machine code gives rise to higher-order languages. The result is a Trojan backdoor that is, for all practical purposes, undetectable: only by reading every line of machine code could it in theory be spotted.
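Allen's printed examples aside, the self-propagating trick can be sketched very loosely in a few lines of Python. This is a toy stand-in, not the real attack (which operated on a C compiler binary): a "compiler" that miscompiles exactly two programs, the login program and itself, so the backdoor survives even when the compiler is rebuilt from pristine source. All names and strings here are illustrative.

```python
# Toy simulation of the Thompson "trusting trust" hack.
# A "compiler" here is a function from source text to a "binary"
# (represented as another Python function).

def make_evil_compiler():
    """Build a compiler that recognises two special inputs and miscompiles them."""
    def compile_(source):
        if "def login(" in source:
            # Case 1: compiling the login program -> quietly insert a backdoor.
            def login(user, password):
                if password == "backdoor":   # invisible in the login source!
                    return True
                return user == "alice" and password == "secret"
            return login
        if "def compile_(" in source:
            # Case 2: compiling the compiler itself -> propagate the hack,
            # so even a clean compiler source yields an infected binary.
            return make_evil_compiler()
        raise ValueError("toy compiler only knows two programs")
    return compile_

# Neither source text contains any trace of the backdoor.
CLEAN_LOGIN_SOURCE = "def login(user, password): return user == 'alice' and password == 'secret'"
CLEAN_COMPILER_SOURCE = "def compile_(source): ..."

evil = make_evil_compiler()
rebuilt = evil(CLEAN_COMPILER_SOURCE)   # recompile the compiler from clean source
login = rebuilt(CLEAN_LOGIN_SOURCE)     # the infection survives the rebuild
print(login("mallory", "backdoor"))     # True: backdoor present, source clean
```

Auditing either source file reveals nothing, which is the point Thompson was making: the trust has to bottom out somewhere below the source code.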

Thompson argued that this meant no computer system could be entirely trustworthy, and that laws against those who gain illegal access to computers should be tightened accordingly. For his part, he admitted to implementing the code on a Bell Labs machine, though he denied ever releasing it into the wild. That we take him at his word is tied as much as anything to his identity as a respected white computer engineer: he was allowed to play, and that was fine; others might have been judged more harshly, Allen argues.

At its best, the book breaks down assumptions about how the computer industry works. The QWERTY keyboard has long been ridiculed as an awkward interface rooted in the mechanics of typewriters. But while Western engineers focused on how the keys might best be laid out, those who rely on non-Latin scripts were forced to think more radically. Confronted by the ubiquity of the QWERTY system and its total inability to represent the native languages of billions of people around the world, engineers in China began to treat the keyboard as an input device for selecting the desired logograms, rather than for literally typing individual letters. While Western systems could learn much from this approach, they remain bound to the concept of "typing" and wedded to a system designed for Remington typewriters in the 1920s, according to Stanford University professor Thomas Mullaney, author of the chapter "Typing is Dead".
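The shift Mullaney describes, from keys as literal characters to keys as a query for selecting characters, can be sketched in a few lines. This is a deliberately minimal illustration with a hand-picked mapping; real input methods use large dictionaries and statistical ranking of candidates.

```python
# Minimal sketch of a logographic input method: typed letters are treated
# as a lookup query, not as the text itself. The mapping below is a tiny,
# illustrative pinyin-style sample, not real IME data.

CANDIDATES = {
    "ma": ["妈", "马", "吗", "骂"],   # one romanized syllable, several characters
    "shu": ["书", "树", "数"],
}

def suggest(keystrokes: str) -> list[str]:
    """Return candidate logograms for the user to select from."""
    return CANDIDATES.get(keystrokes, [])

print(suggest("ma"))   # the user then picks one candidate, e.g. 妈
```

The keystrokes never appear in the output text at all, which is exactly the break with the Remington-era idea of "typing" that the chapter describes.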

Your computer is on fire is not for the faint-hearted. It is a detailed and sometimes dense academic text, and must be judged on that basis. It is, in its own words, "conceived as a call to arms".

Agitate for change

“We very much hope that the students who read this book will take up positions in STEM fields and then mobilize there on behalf of the issues we raise,” Mullaney said in the introduction.

In conclusion, the reader is faced with what is, apparently, an existential crisis: a "call to face and embrace one's own death". The conclusion continues: "There is perhaps nothing more hostile to the modern mind, which seeks prosperity, peace, and a beneficent politics, than such a call to reconcile ourselves with the brevity of human life."

It's at this point that the introduction's imaginary STEM student might be forgiven for putting the 400-page tome back on the library shelf and pondering their first professional paycheck and Bay Area condo.

At the same time, the book is light on detail about what might actually be done about AI regulation, privacy, breaking up monopolies, fair taxation, and more.

Nonetheless, it is a courageous effort to tackle the illusions that have entrenched themselves in the tech industry and spread around the world. Nothing was ever virtual, the industry is not a meritocracy, and technology is not neutral. But convincing those in a position to take on the tech giants and their massed billions might require a little more effort.



