The Time I Ruined Programming


Part I: The Personal

I remember the time I ruined programming. It was long enough ago (and I recovered) that I’m no longer embarrassed by the fact.

Nearly all programmers have slowdowns and periods where they just can’t get anything done. It’s why we hate two-week “sprint” nonsense and stack-ranking; no one wants to be watched by a guy in a guard tower with a gun, just waiting to pop the programmer who slows down the chain.

When people learn how to program, they do so enticed by what the job once was: an R&D job of pure creation, free of nonsensical deadlines and business-driven engineering. A computer science curriculum doesn’t prepare one to be a bored grunt, but for a life of work on projects like compilers, interactive games, and machine learning algorithms. Of course, only a lucky few get paid to do that kind of stuff. Most programmers end up writing and maintaining boring business programs, solving problems that are only challenging because of tight deadlines and the sheer weight of bad decisions made before.

It’s easy to ruin a creative activity: do it as a subordinate, and you’ll often grow to hate it. This is why, although I’m a decent enough writer, I’ve never considered writing jobs; in fact, I’m wary of traditional publishing and its increasing tendency to push writers into an unfavorable employee/employer relationship. Authors pay a lot for that “Not Entirely Awful” stamp of approval they get from being “published”, thinking it means more than it does. Publishers then drive terms that turn authors into subordinates, and agents aren’t going to push back on behalf of a midlist or unknown author. Getting dumped by a publisher– or worse, an agent– can be worse than being fired. You don’t just lose your relationships, but your reputation.

I don’t intend to imply that working for someone else invariably crushes passion. Everyone works for someone else. In the abstract, we’re all subordinates. The alternative is solipsistic madness, idiocy in the original Greek sense, in which the word idiot denoted not someone lacking inborn intelligence, but someone who non-virtuously chose to ignore public life. An idiot was a man only into himself. The figure diametrically opposite the idiot, held up as the pinnacle of human character, is the hero. She’s not a subordinate in an organizational sense, but she still adheres to a set of ethical principles and works toward benefits she wishes to deliver to the world. She serves, but she takes agency over whom and how she serves.

Of course, workplace subordination is not heroic. To subordinate oneself to the private benefit of someone else, who is under no obligation to return the loyalty (and probably won’t), is demoralizing and toxic.

The truth is that I have no use for people who are constitutionally insubordinate. To blindly disobey orders, because they are orders, is even more idiotic than blindly following orders. Civilization requires what I call operational subordination. An example would be stopping at red lights while driving. We do this not because we consider ourselves inferior to these robotic lights, but because driving wouldn’t be safe if we didn’t obey their directives. We don’t think of it as subordination; it’s just good sense.

Workplaces, of the Theory X variety that has become the norm since the downsizing epidemic of the past few decades, don’t settle for operational subordination. They want personal subordination. The good of the company (read: the careers and reputations of executives) must take a higher priority than the career goals and personal needs of the worker, and the worker is expected not simply to obey stated commands, but to internalize this sense of moral inferiority. If he has no orders, he must ask for more work. If he seems to value his own advancement over that of his superiors– a successful out-of-work side project suffices to create this impression– he’ll be terminated even if he does nothing wrong.

Programmers, by and large, don’t mind operational subordination. In fact, we have an affinity for it. We like to solve complex problems with simple rules that make sense. We operationally subordinate, every day, to the syntactical demands of a compiler that simply won’t do anything with code it cannot parse. When rules are sane, and the benefit in their existence is obvious, we eagerly follow them. It’s the personal subordination that burns us out. We’re smart enough to spot a system that demands personal loyalty from us, while refusing to reciprocate, and it disgusts us. We recognize that our rules-based, overly rational way of thinking is under attack; someone is trying to hack us and take advantage.

The nature of employment has changed for programmers, and for the worse. Software development used to be a highly-compensated job with an R&D flavor, where programmers were trusted professionals rather than overworked grunts subjected to the likes of Scrum. The open-allocation environment of a company like Valve used to be the norm. What changed? Employers realized that line-of-business code could be written by mediocre people, and replaced the high-talent curmudgeons with this continuing churn of fresh-faced halfwits who have never written a program from scratch, have no mathematical or theoretical insight, and don’t know what “buffer overflow” or “O(n^2)” mean.

By and large, it has worked. Employers have successfully dumbed programming down. Large companies may need a few excellent programmers, but line-of-business software can be done by people of mediocre talent who’ll accept bad wages and worse working conditions. Scrum is not going away. Does it produce excellent software? No, not even close. It doesn’t even produce secure or safe or maintainable software. It works just well enough that deliverables get deliverated: just barely, but at a close enough approximation to working software that executives get promoted away from their messes before anything starts falling apart at a macroscopically visible level.

For my part, I got through that code slowdown, long enough ago that I don’t mind writing about it.

These days, I’m more interested in small programs that exist to solve problems than in large ones written to justify budgets or “prove” technical choices. (I don’t care to convince anyone that Haskell is “ready for production”. If we can’t agree on a language, that’s fine; I’ll write C. It’s old and a bit wonky, but it works.) One of my current projects is to build an AI for the card game Ambition (which I came up with in 2003, and have been refining since then) because I want to test out certain design changes; I use Ambition for a pivotal card game scene in Farisa’s Crossing, and I want to fix certain flaws before giving the game that added visibility. In order to test these changes, I need to run simulations with believable players; random-move players give some statistical insight, but they don’t explore any interesting strategy spaces. The “AI” doesn’t need to be AlphaGo, and it won’t be: if a simple Q-learner, backed by a basic backpropagation network, suffices, that’s what I’ll use. If I need something more, I’ll write something more complex. These days, I’m rarely (if ever) impressed by code. Lines of code are spent, not acquired.
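
For the curious, here is roughly what I mean by “a simple Q-learner”: a minimal tabular sketch, written in C since that’s the fallback language above. The state encoding, the constants, and the dummy environment in main are illustrative placeholders, not the actual Ambition simulator.

#include <stdio.h>
#include <stdlib.h>

#define N_STATES  1024   /* placeholder: a coarse hash of the visible game state */
#define N_ACTIONS 16     /* placeholder: index of the card to play */

static double Q[N_STATES][N_ACTIONS];

/* Classic Q-learning update:
 *   Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a)) */
static void q_update(int s, int a, double r, int s_next,
                     double alpha, double gamma)
{
    double best_next = Q[s_next][0];
    for (int i = 1; i < N_ACTIONS; i++)
        if (Q[s_next][i] > best_next)
            best_next = Q[s_next][i];
    Q[s][a] += alpha * (r + gamma * best_next - Q[s][a]);
}

/* Epsilon-greedy selection: explore with probability epsilon, otherwise
 * play the action with the highest learned value. */
static int choose_action(int s, double epsilon)
{
    if ((double)rand() / RAND_MAX < epsilon)
        return rand() % N_ACTIONS;
    int best = 0;
    for (int i = 1; i < N_ACTIONS; i++)
        if (Q[s][i] > Q[s][best])
            best = i;
    return best;
}

int main(void)
{
    srand(2003);
    int s = 0;
    /* Dummy environment: random rewards and transitions stand in for
     * self-play hands of Ambition, just to show the shape of the loop. */
    for (int step = 0; step < 100000; step++) {
        int a = choose_action(s, 0.1);
        double r = (double)(rand() % 3) - 1.0;   /* fake reward in {-1, 0, 1} */
        int s_next = rand() % N_STATES;
        q_update(s, a, r, s_next, 0.1, 0.95);    /* alpha = 0.1, gamma = 0.95 */
        s = s_next;
    }
    printf("sample learned value Q[0][0] = %f\n", Q[0][0]);
    return 0;
}

The backpropagation-network version replaces the Q table with a small function approximator for Q(s, a), which matters once the state space is too large to enumerate; the update rule stays the same in spirit.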

Now that I’m older, I have almost no passion for programming as an end in itself– an average corporate codebase is far more complex than anything I would intentionally write, and yet that complexity is wasteful and ugly– but, still, certain problems that can be solved with software interest me.

On the other hand: code for code’s sake, in million-line piles of corporate cruft and the dead shells of Jira tickets? Nah, I’m done with that. That really is a young idiot’s game.

Part II: The Fourth Turning

I reached a dark night of the soul with regard to software and technology. There were moments when I looked around and realized that my total contribution to humanity, by working for an increasingly maleficent industry, might be negative. The 21st century’s American theatre has featured the dismantling of the middle class, and I can’t say I had nothing to do with it.

In the 1990s, we had to hear about those “South Park Republicans”. In 2018, I find that I’m a “Black Mirror Liberal”. I loathe and fear Silicon Valley, with its complete lack of morality, more than Donald Trump; the latter is, to be frank, too stupid and too transparent in his self-indulgence to pull fascism off. If we don’t learn our lesson this time, a 39-year-old startup founder, more reserved and competent than the orange idiot, could do a lot more damage.

I don’t view technology as evil; however, I fear what humans will do with it. Like the Black Mirror series, I have the conservative’s skepticism toward human nature. Technological progress without moral improvement will lead us straight to hell. That, to me, is the important lesson of Black Mirror, arguably the most important television series of our time. Yes, science and technology are good, and liberal economics is largely correct; but, without cultural and moral improvements as well, other forms of progress can only do so much.

To me, the quintessential Black Mirror episode is “Fifteen Million Merits”. I shan’t do it justice here, but let me sum it up briefly. It’s set in (probably) a far-future dystopia. The middle classes ride stationary bikes to generate power, and outside of work they’re doomed to a virtualized, entertainment-driven life in which one must pay not to see ads (many of which are offensive, even pornographic). Those unfit to ride the bikes fall into the lower classes (“lemons”) and are treated as subhuman. There seems to be no “outdoors” in this world; food is grown in a Petri dish using energy that comes from… the bikes. Or, at least, that’s what the bike slaves are told. I’ll get to the plausibility of that, in a second.

The main characters try to escape their position. There’s a televised talent show called Hot Shots, modeled on shows like American Idol, through which talented people vie for a chance to join the celebrity upper class. Most of them fail, and some are forced into neoliberal prostitution. One character becomes a pornographic actress and her humiliation is broadcast to the entirety of her society; another, after a display of authenticity and rage, is enticed to make a performance out of his anger (thus commoditizing and killing his own authenticity).

There’s a scientific easter egg in “Fifteen Million Merits”. Examine the premise: humans must ride bikes to generate power for a society that, so it says, has been reduced to growing food artificially. This is physically impossible. Human efficiency is about 25 percent: if we eat 2000 kilocalories’ worth of food, we can generate 500 kilocalories of mechanical energy. That efficiency could conceivably increase, but it would never reach (much less exceed) 100 percent. No life form is a net power source; autotrophs like plants and algae capture solar energy, and the rest of us eat them. Without an energy source like the sun (a giant, faraway fusion reactor), we could not survive.
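
To make the thermodynamic objection explicit (a back-of-the-envelope balance using only the figures above: 25 percent efficiency on a 2000-kilocalorie diet):

\[
E_{\text{out}} = \eta \, E_{\text{in}} \approx 0.25 \times 2000\ \text{kcal} = 500\ \text{kcal} < E_{\text{in}}
\]

Each pass through a human-powered loop loses at least E_in − E_out ≈ 1500 kilocalories per rider per day, before counting the further losses of growing food artificially; a society fed solely by its own pedaling runs down rather than running.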

Low-tech societies ran on human power (stored solar power; faraway nuclear power) because they had no alternatives. If, however, a society were forced to use human power to grow food to feed humans, it would die. Therefore, we conclude that the society in “Fifteen Million Merits” isn’t energy poor, but energy rich. It can afford to waste human effort and physical energy on pointless drudgery. The bikers think they’re keeping the lights on and the food growing, but anyone scientifically literate would recognize that as a lie.

I don’t think this is a mistake in the show’s writing. Someone would have pointed it out, I imagine. (Maybe not; The Matrix got this wrong.) For my part, I think it’s intentional. This means that the society has eliminated the need for work, but it has kept scarcity and work around. Why? Sadism.

That’s the horrifying truth that sneaks out of “Fifteen Million Merits”. It’s not a post-apocalyptic society that needs every erg of human effort. Rather, I suspect that it’s a rich society that has kept scarcity around to dominate and humiliate people, as entertainment both for the put-upon middle classes (who must be given seemingly useful work, lest they rebel) and the vicious upper classes.

To me, that’s quite a plausible future. Technology will eliminate the need for human work. It will eliminate most forms of scarcity. Is this going to lead us to a wealthy, egalitarian utopia? It might, but there are few guarantees. The elites could keep scarcity in place, valuing dominance over others too much to let the rest share in the wealth. There’s a lot that we don’t know about so-called “human nature”, and we’ve never had that kind of rich world before.

At any rate, let’s get back to the 21st century. One disturbing trend is that work is becoming more subordinate. I know, because I’ve been involved in making it so.

A few years ago, I worked on a “performance management” system, imposed on truckers, that would track which drivers were too fast, which were too slow, and even which drivers were eating lunch off-route to save money or have time with their children. It doesn’t save much to prohibit a driver from eating off-route: how much does five miles’ worth of gas cost? But, thanks to technology, this surveillance costs even less.

One could argue that Agile and Jira are programmers’ own professional karma. What we’ve allowed to be done to every other class of worker is now being done to us. It shocks us more than it should. Many of us worked toward bad ends– some examples would be so-called “performance management” systems, misuses of data to corporate benefit and human detriment, systems that allowed health insurers to deny care– and it’s only fitting that the surveillance capitalism we created would, at some point, be turned on us.

Technology has, I’ll note, also damaged the world for professional writers. (It may fix it; self-publishing is becoming a viable alternative, but that’s another topic.) Mastering writing and mastering sales tend to happen at different times. In the 1980s, a strong novelist whose first four books had mediocre sales could easily get a fifth book deal. These days, it’s a lot harder, because surveillance capitalism has no qualms about using data toward bad, career-denying ends. It’s not publishers who are to blame, in fact; the chain bookstores did this. Publishers, in the old days, would offer a fifth deal, along with an advance and some promotion, to a talented author who hadn’t yet built a following or mastered the golden touch. In the late 1990s, though, bookstores started pushing back and gutting authors who stayed on the midlist. Why risk space for someone who has “failed” four times, when an unknown might be a breakout bestseller? The long-term effect has been detrimental. Publishers have lost money and many have been put out of business or forced into conglomeration; writers’ careers have been wrecked by this; and the literary world has become more commercial and overall worse in terms of quality. The true culprit in this clusterfuck is, yet again, technology used for worker (in this case, author) surveillance. In the 1980s, an author with mediocre sales got more chances; in 2018, the bookstores run her numbers and say, “No thanks”, even if she’s quite talented.

Technology used to be in alliance with human progress, and now it seems to be moving against us. What’s going on? I think we’re headed for a Fourth Turning, or a historical crisis. The benefits of technology and surveillance accrue to the rich overseers, and they’re not going to give their ill-gotten wealth or power up.

We can assess the Fourth Turning, toward which we careen, by looking at the crises of the 20th century: the World Wars and the Great Depression.

As we moved from the 19th century into the 20th, we got very good at making food. It’s hard to imagine this being a bad thing; yet, it led to a Great Depression in North America and long-running, total wars (of a kind that prior agricultural capabilities and supply chains couldn’t support) in Europe. Ill-managed prosperity is more dangerous than true scarcity, it seems. Scarcity’s bad, but at least it slows things down.

In North America, food prices started dropping in the 1920s. Farms couldn’t survive. Our later corrections persist and, in some ways, are pernicious; due to our perennial corn surplus, we spike soft drinks with high-fructose Franken-sugars. However, that came after the Depression. In the last Gilded Age, consensus was that it was best to let these farmers fail. So, rural small towns that served the farmers died along with them. Heavy industry got hit around 1925-27, and the stock market grew more volatile in turn. Then it crashed, notably, in October 1929, but that didn’t “cause” the Depression; we’ve had worse stock market crashes since then (e.g., 1987) to minimal effect. Rather, the Great Depression actually started in the early 1920s; it just wasn’t until about 1930 that it started hitting rich people in the cities. We learned that poverty wasn’t some “moral medicine” to shock people back into adhering to the so-called protestant work ethic; rather, it was a cancer that, left to its own devices, would keep spreading until it destroyed a society.

What killed the American economy in the 1930s? Conservative mismanagement of agricultural prosperity.

What’s killing the American middle class in 2018? Conservative mismanagement of technological prosperity.

What happened to farmers in the 1920s is happening to… all human labor. Let that sink in. The vast majority of Americans produce things of less usefulness than what farmers produce. If farmers weren’t safe in the ’20s, public relations managers and computer programmers aren’t safe today. We’ll die without food; we’ll be fine without TPS reports.

Globalization is inevitable and desirable, but we’re letting it go off in a way that benefits the rich and eviscerates the middle class. Technological automation is wiping out jobs, and surveillance is turning high-autonomy, fulfilling jobs (like what programming used to be, when it had that R&D flavor) into drudge work. The high-surveillance culture doesn’t merely make the workplace unpleasant, but also lowers the bar for who can contribute– it becomes profitable to employ unskilled scabs, if surveillance becomes cheap enough– depressing wages further. Though Agile Scrum reduces the effectiveness of competent programmers, it turns incompetent ones into marginally employable code-cutters. So what happens? Well, employers replace the high-talent curmudgeonly experts with scrummy rent-a-coders, and wages nosedive. Furthermore, as jobs are wiped out in one industry, people who worked in it become “refugees” and flow into another, making that industry more competitive, and driving wages down further.

The wage slide is going to be far more severe than most people predict, due to economic inelasticity. When people have a desperate need for something– like gas to go to work, or water in the desert, or illegal drugs to which they’re physically addicted– they will pay nearly any price, and therefore small drops in availability will drive massive price increases. During the oil shocks of the 1970s, the supply only dropped by about 5 percent, but prices quadrupled. That’s demand inelasticity, but supply (of labor) experiences the same phenomenon. If people desperately need jobs to survive, then we should expect that either a small increase in worker availability or a slight decrease in demand for labor will tank wages, disproportionate to the actual amount of change.
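
To put rough numbers on the oil-shock example (elasticity here in the usual textbook sense: the percentage change in quantity per percentage change in price):

\[
\varepsilon = \frac{\%\Delta Q}{\%\Delta P} \approx \frac{-5\%}{+300\%} \approx -0.017
\]

An elasticity that close to zero means quantity barely responds to price; equivalently, a mere 5 percent shortfall had to be rationed by a 300 percent price increase. Run the same algebra on an inelastic labor market and a small surplus of workers becomes a large collapse in wages.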

For example, self-driving trucks will put millions of drivers out of business in the next twenty years. There are millions more who serve those drivers: hotel owners in the Midwest, logistics coordinators, and so on. There are a lot of smug programmers who think that this won’t affect their wages. Wrong. Again, no one is safe.

Let’s say that half the truck drivers lose their jobs. We won’t even account for all that supporting labor: the hotels and diners. That’s 1.75 million workers, fresh on the market. Let’s be conservative and say that only 5% of them learn how to write code and become programmers; most will go into other industries. That’s 87,500 new programmers on the U.S. market. That’s about a 2.5 percent increase in our numbers. Seems survivable, right? If we expected wages to drop only by 2.5 percent, yes; but keeping in mind inelasticity, they could drop by 20 percent, just due to that.
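
Spelling that arithmetic out (the baseline of roughly 3.5 million U.S. programmers is what the 2.5 percent figure implies, and the elasticity below is back-solved from the 20 percent figure rather than measured):

\[
0.05 \times 1{,}750{,}000 = 87{,}500, \qquad \frac{87{,}500}{3{,}500{,}000} \approx 2.5\%
\]
\[
\%\Delta W \approx \frac{\%\Delta L}{|\varepsilon_D|} = \frac{2.5\%}{0.125} = 20\%
\]

where \(\varepsilon_D\) is the elasticity of demand for programming labor. The exact value of \(\varepsilon_D\) matters less than the shape of the relationship: in any inelastic market, the wage drop is a multiple of the supply shock.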

Now, add in the workers who support trucking, the hotel owners who see the writing on the wall. They won’t all go into programming. Some will go into other industries (and push out other workers, perhaps). Multiple industries will have refugee-driven inelasticity crises and wage collapses. It won’t be just us; but no one will be safe. Look at what has happened to attorneys (once considered a safe, protected profession) over the past thirty years; many work long hours for mediocre salaries and are paying off student debt into their 40s. If it could happen to them, it can happen to us.

Is the world of 2050, even if we leave corporate capitalism to its own perverse devices, going to be jobless? No. Society will invent things for people to do. If nothing else, the rich will spend more time in school and the poor will spend more time in prison. Is it possible that continual downward wage shocks and industrial refugee crises will demolish the middle class? Yes. If we don’t recognize the danger that technological unemployment represents, and come up with proactive solutions, it will happen.

I used to hold the somewhat pleasant belief that technological progress would, if it hurt the economy enough, blow out its own flames. It ought to be self-limiting, right? If things get bad, people will stop creating (a sort of unprogrammed strike) and technical progress will slow… and we can find a way to clean up our messes. That was what I hoped for.

I’m no longer so sure of this. Technical progress, from a humanistic standpoint, has slowed down. Basic research funding has collapsed and we’re no longer putting people on the Moon; we’re tweeting about “covfefe” instead. Antibiotics did more to improve human life than a blog post about technological immortality written by some 24-year-old who optimizes ad placements; that much is true. Finally, the great technological marvel of the 21st century is a cloud. Not “the Cloud”; that’s just jargon for “the Internet”. I’m talking about the cloud of prehistoric algal carbon pumped out by the literal septillions of useless computations, performed to mine so-called “BitCoins”. Yes, that cloud, for anyone planning a tropical vacation to the North Pole. Ah, BitCoin and its ilk; this high-pitched electric whine, inaudibly belch-screaming carbon dioxide into our atmosphere, might be the siren song of a desperate middle class, seeing its impending demise at the hands of authoritarian capitalism, and not knowing what else to do but invest in libertarian wank tulips. Technology is becoming less useful and less friendly, from a humanist’s standpoint, but it does not seem to be self-limiting. Its direction leaves much to be desired, but its speed remains high.

Technology and globalization shall continue. There’s no other way. If we let our society and morality implode, this whole picture will take an undesirable direction. Without innovations that are currently considered untenable, such as a universal basic income and the eventual eradication of global poverty– in my view, in the next 50 years we ought to take a preemptive Marshall Plan attitude toward the elimination of third-world misery– Black Mirror is our future.

Technical achievement and beneficence have stalled. Sadly, the technological process will continue even if there’s a complete lack of human progress. There will, alas, always be money in helping rich people unemploy those pesky, expensive worker-serfs… and, also, helping the powerful surveil those whom they haven’t yet figured out how to unemploy.

