The Sturgeon Filter: the cognitive mismatch between technologists and executives

There’s a rather negative saying, originally applied to science fiction, known as Sturgeon’s Law: “ninety percent of everything is crap”. Quantified so generally, the claim doesn’t strike me as true or valuable. There are plenty of places where reliability can be achieved and things “just work”. If ninety percent of computers malfunctioned, the manufacturer would be out of business, so I don’t intend to apply the statement to everything. Still, there’s enough truth in the saying that people keep using it, even applying it far beyond what Theodore Sturgeon actually intended. How far is it true? And what does it mean for us in our working lives?

Let’s agree to take “ninety percent” to be a colloquial representation of “most, and it’s not close”; typically, between 75 and 99 percent. What about “is crap”? Is it fair to say that most creative works are crap? I wouldn’t even know where to begin on that one. Certainly, I see fit to publish only about a quarter of the blog posts that I write, and I think that’s a typical ratio for a writer, because I know far too well how often an appealing idea fails when taken into the real world. I think that most of the blog posts I actually release are good, but a fair chunk of my writing is crap that, so long as I’m good at self-criticism, will never see daylight.

I can quote a Sturgeon-like principle with more confidence, in such a way that preserves its essence but is hard to debate: the vast majority (90 percent? more?) of mutations are of negative value and, if implemented, will be harmful. This concept of “mutation” covers new creative work as well as maintenance and refinement. To refine something is to mutate it, while new creation is still a mutation of the world in which it lives. And I think that my observation is true: a few mutations are great, but most are harmful or, at least, add complexity and disorder (entropy). In any novel or essay, changing a word at random will probably degrade its quality. Most “house variants” of popular games are not as playable as the original game, or are not justified by the increased complexity load. To mutate is, usually, to inflict damage. Two things save us and allow progress. One is that the beneficial mutations often pay for the failures, allowing macroscopic (if uneven) progress. The second is that we can often audit mutations and reverse a good number of those that turn out to be bad. Version control, for programmers, enables us to roll back mutations that are proven to be undesirable.
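
A toy illustration may help here. The following Python sketch (mine, not anything from the essay; the sample function, the mutation scheme, and the trial count are all arbitrary choices) flips a single character of a small, working program at random and checks whether the result still parses. Parsing is the weakest possible bar, and most mutants fail even that:

    import ast
    import random
    import string

    # A small, valid program: the "organism" we are going to mutate.
    SOURCE = (
        "def total(xs):\n"
        "    s = 0\n"
        "    for x in xs:\n"
        "        s += x\n"
        "    return s\n"
    )

    def mutate_once(src):
        # Replace one randomly chosen character with a random printable one.
        i = random.randrange(len(src))
        return src[:i] + random.choice(string.printable) + src[i + 1:]

    def survival_rate(trials=1000):
        # Fraction of single-character mutants that still parse as Python.
        # Parsing is only a necessary condition; a mutant that parses may
        # still compute the wrong thing.
        ok = 0
        for _ in range(trials):
            try:
                ast.parse(mutate_once(SOURCE))
                ok += 1
            except (SyntaxError, ValueError):
                pass
        return ok / trials

    print("mutants that still parse: {:.1%}".format(survival_rate()))

The exact percentage varies from run to run; the point is only that a blind mutation is far more likely to break something than to improve it, which is why the ability to detect and revert bad changes matters so much.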

The Sturgeon Mismatch

Programmers experience the negative effects of random mutations all the time. We call them “bugs”, and they range from mild embarrassments to catastrophic failures, but very rarely is a discrepancy between what the programmer expects of the program, and what it actually does, desirable. Of course, intended mutations have a better success rate than truly “random” ones would, but even in those, there is a level of ambition at which the likelihood of degradation is high. I know very little about the Linux kernel and if I tried to hack it, my first commits would probably be rejected, and that’s a good thing. It’s only the ability to self-audit that allows the individual programmer, on average, to improve the world while mutating it. It can also help to have unit tests and, if available for the language, a compiler and a strong type system; those are a way to automate at least some of this self-censoring.

I’m a reasonably experienced programmer at this point, and I’m a good one, and I still generate phenomenally stupid bugs. Who doesn’t? Almost all bugs are stupid– tiny, random injections of entropy emerging from human error– which is why the claim (for example) that “static typing only catches ‘stupid’ bugs” is infuriating. What makes me a good programmer is that I know what tools and processes to use in order to catch them, and this allows me to take on ambitious projects with a high degree of confidence in the code I’ll be able to write. I still generate bugs and, occasionally, I’ll even come up with a bad idea. I’m also very good at catching myself and fixing mistakes quickly. I’m going to call this selective self-censoring that prevents 90 percent of one’s output from being crap the Sturgeon Filter.
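
To make “stupid” concrete, here is a minimal sketch of the kind of bug I mean. The function and the call are invented for illustration, and mypy stands in for any static checker or compiler that refuses to let entropy through:

    def average(values: list[float]) -> float:
        # Arithmetic mean of a non-empty list of numbers.
        return sum(values) / len(values)

    print(average([3.0, 4.0, 5.0]))   # fine: prints 4.0

    # A tiny injection of entropy: a bare string where a list belongs.
    # At runtime this dies with a TypeError inside sum(); a static checker
    # such as mypy rejects the call before the program ever runs.
    print(average("3.0, 4.0, 5.0"))

The checker isn’t clever; it simply never gets tired of saying, immediately and impersonally, “you are wrong here”, which is exactly the feedback that builds a Sturgeon Filter.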

With a strong Sturgeon Filter, you can export the good mutations and bury the bad ones. This is how reliability (either in an artistic masterpiece, or in a correct, efficient program) can be achieved by unreliable creatures such as humans. I’d further argue that to be a competent programmer requires a strong Sturgeon Filter. The good news is that this filter is built up fairly quickly by tools that give objective feedback: compilers and computers that follow instructions literally, and malfunction at the slightest mistake. As programmers, we’re used to having our subordinates (compilers) tell us, “Fix your shit or I’m not doing anything.”

It’s no secret that most programmers dislike management, and have a generally negative view of the executives and “product managers” running most of the companies that employ them. This is because programmers pride themselves on having almost impermeable Sturgeon Filters, while lifelong managers have nonexistent Sturgeon Filters. They simply don’t get the direct, immediate feedback that would train them to recognize and reject their own bad ideas. That’s not because they’re stupider than we are. I don’t actually think that they are. I think that their jobs never build up the sense of fallibility that programmers know well.

Our subordinates, when given nonsensical instructions, give blunt, tactless feedback– and half the time they’re just pointing out spelling errors that any human would just ignore! Managers’ subordinates, however, are constantly telling them that they’re awesome, and will often silently clean up their mistakes. Carry this difference in experience out over 20 years or more, and you get different cultures and different attitudes. You get 45-year-old programmers who, while extraordinarily skillful, are often deeply convinced of their own fallibility; and you get 45-year-old executives who’ve never really failed or suffered at work, because even when they were bad at their jobs, they had armies of people ready to manage their images and ensure that, even in the worst case scenario where they lost jobs, they’d “fail up” into a senior position in another company.

Both sides now

Programmers and managers both mutate things; it’s the job. Programmers extend and alter the functionality of machines, while managers change the way people work. In both cases, the effects of a random mutation, or even an unwise intended one, are negative. Mutation for its own sake is undesirable.

For example, scheduling a meeting without a purpose is going to waste time and hurt morale. Hiring bad people and firing good ones will have massive repercussions. To manage at random (i.e. without a Sturgeon Filter) is almost as bad as to program at random. Only a small percentage of the changes that managers propose to the way people work are actually beneficial. Most status pings or meetings serve no purpose except to allay the creeping sense of the manager that he isn’t “doing enough”, most processes that exist for executive benefit or “visibility” are harmful, and a good 90 to 99 percent of the time, the people doing the work have better ideas about how they should do it than the executives shouting orders. Managers, in most companies, interrupt and meddle on a daily basis, and it’s usually to the detriment of the work being produced. Jason Fried covers this in his talk, “Why work doesn’t happen at work”. As he says, “the real problems are … the M&Ms: the managers and the meetings”. Managers are often the last people to recognize the virtue of laziness: that constantly working (i.e. telling people what to do) is a sign of distress, while having little to do generally means that they’re doing their jobs well.

In the past, there was a Sturgeon Filter imposed by time and benign noncompliance. Managers gave bad orders just as often as they do now, but there was a garbage-collection mechanism in place. People followed the important orders, which were usually already congruent with common sense and basic safety, but when they were given useless orders or pointless rules to follow, they’d make a show of following the new rules for a month or two, then discard them when they failed to show any benefit or improvement. Many managers, I would imagine, preferred this, because it allowed them to have the failed change silently rejected without having any attention drawn to their mistake. In fact, a common mode of sub-strike resistance used by organized labor is “the rule-follow”, a variety of slowdown in which rules are followed to the letter, resulting in low productivity. Discarding senseless rules (while following the important, safety-critical ones) is a necessary behavior of everyone who works in an organization; a person who interprets all orders literally is likely to perform at an unacceptably low level.

In the past, the passage of time lent plausible deniability to a person choosing to toss out silly policies that would quite possibly be forgotten or regretted by the person who issued them. An employee could defensibly say that he followed the rule for three months, realized that it wasn’t helping anything and that no one seemed to care, and eventually just forgot about it or, better yet, interpreted a new order to supersede the old one. This also imposed a check on managers, who’d embarrass themselves by enforcing a stupid rule. Since no one has precise recall of a months-old conversation of low general importance, the mists of time imposed a Sturgeon Filter on errant management. Stupid rules faded and important ones (like, “Don’t take a nap in the baler”) remained.

One negative side effect of technology is that it has removed that Sturgeon Filter from existence. Too much is put in writing, and persists forever, and the plausible deniability of a worker who (in the interest of getting more done, not in slacking) disobeys it has been reduced substantially. In the past, an intrepid worker could protest a status meeting by “forgetting” to attend it on occasion, or by claiming he’d heard “a murmur” that it was discontinued, or even (if he really wanted to make a point) by taking colleagues out for lunch at a spot not known for speedy service and, thus, letting an impersonal force just happen to make half the team late for it. While few workers actually did such things on a regular basis (to make it obvious would get a person just as fired then as today), the fact that they might do so imposed a certain back-pressure on runaway management that doesn’t exist anymore. In 2015, there’s no excuse for missing a meeting when “it’s on your fucking Outlook calendar!”

Technology and persistence have evolved, but management hasn’t. Programmers have looked at their job of “messing with” (or, to use the word above, mutating) computers and software systems and spent 70 years coming up with new ways to compensate for the unreliable nature that comes from our being humans. Consequently, we can build systems that are extremely robust in spite of having been fueled by an unreliable input (human effort). We’ve changed the computers, the types of code that we can write, and the tools we use to do it. Management, on the other hand, is still the same game that it always has been. Many scenes from Mad Men could be set in a 2010s tech company, and the scripts would still fit. The only major change would be in the costumes.

To see the effects of runaway management, combined with the persistence allowed by technology, look no further than the Augean pile of shit that has been given the name of “Agile” or “Scrum”. These are neo-Taylorist ideas that most of industry has rejected, repackaged using macho athletic terminology (“Scrum” is a rugby term). Somehow, these discarded, awful ideas find homes in software engineering. This is a recurring theme. Welch-style stack ranking turned out to be a disaster, as finally proven by its thorough destruction of Enron, but it lives on in the technology industry: Microsoft used it until recently, while Google and Amazon still do. Why is this? What has made technology such an elephant graveyard for disproven management theories and bad ideas in general?

A squandered surplus

The answer is, first, a bit of good news: technology is very powerful. It’s so powerful that it generates a massive surplus, and the work is often engaging enough that the people doing it fail to capture most of the value they produce, because they’re more interested in doing the work than getting the largest possible share of the reward. Because so much value is generated, they’re able to have an upper-middle-class income– and upper-working-class social status– in spite of their shockingly low value-capture ratio.

There used to be an honorable, progressive reason why programmers and scientists had “only” upper-middle-class incomes: the surplus was being reinvested into further research. Unfortunately, that’s no longer the case: short-term thinking, a culture of aggressive self-interest, and mindless cost-cutting have been the norm since the Reagan Era. At this point, the surplus accrues to a tiny set of well-connected people, mostly in the Bay Area: venture capitalists and serial tech executives paying themselves massive salaries that come out of other people’s hard work. However, a great deal of this surplus is spent not on executive-level (and investor-level) pay but on another, related sink: executive slack. Simply put, the industry tolerates a great deal of mismanagement simply because it can do so and still be profitable. That’s where “Agile” and Scrum come from. Technology companies don’t succeed because of that heaping pile of horseshit, but in spite of it. It takes about five years for Scrum to kill a tech company, whereas in a low-margin business it would kill the thing almost instantly.

Where this all goes

Programmers and executives are fundamentally different in how they see the world, and the difference in Sturgeon Filters is key to understanding why that is so. People who are never told that they are wrong will begin to believe that they’re never wrong. People who are constantly told that they’re wrong (because they made objective errors in a difficult formal language) and forced to keep working until they get it right, on the other hand, gain an appreciation for their own fallibility. This results in a cultural clash between two sets of people who could not be more different.

To be a programmer in business is painful because of this mismatch: your subordinates live in a world of formal logic and deterministic computation, and your superiors live in the human world, which is one of psychological manipulation, emotion, and social-proof arbitrage. I’ve often noted that programming interviews are tough not because of the technical questions, but because there is often a mix of technical questions and “fit” questions in them, and while neither category is terribly hard on its own, the combination can be deadly. Technical questions are often about getting the right answer: the objective truth. By contrast, “fit” questions like “What would you do if given a deadline that you found unreasonable?” demand a plausible and socially attractive lie. (“I would be a team player.”) Getting the right answer is an important skill, and telling a socially convenient lie is also an important skill… but having to context-switch between them at a moment’s notice is, for anyone, going to be difficult.

In the long term, however, this cultural divergence seems almost destined to subordinate software engineers, inappropriately, to business people. A good software engineer is aware of all the ways in which he might be wrong, whereas being an executive is all about being so thoroughly convinced that one is right that others cannot even conceive of disagreement– the “reality distortion field”. The former job requires building an airtight Sturgeon Filter so that crap almost never gets through; the latter mandates tearing down one’s Sturgeon Filter and proclaiming loudly that one’s own crap doesn’t stink.


