
What Silicon Valley’s ageism means


Computer programming shouldn’t be ageist. After all, it’s a deep discipline with a lot to learn. Peter Norvig says it takes 10 years, but I’d consider that number a minimum for most people. Ten years of high-quality, dedicated practice, to the tune of 5-6 hours per day, 250 days per year, might be enough. For most people, it will take longer, because few people can work only on the interesting problems that constitute dedicated practice. The fundamentals (computer science) alone take a few thousand hours of study, and then there’s the experience of programming itself, which one must do in order to learn how to do it well. Getting code to work is easy. Making it efficient, robust, and legible is hard. Then there’s a panoply of languages, frameworks, and paradigms to learn, absorb and, for many, reject. As an obstacle, there’s the day-to-day misery of a typical software day job, in which so much time is wasted on politics, meetings, and pointless projects that an average engineer is lucky to have 5 hours per week for learning and growth. Ten years might be the ideal; I’d bet that 20 years is typical for the people who actually become great engineers and, sadly, the vast majority of professional programmers never get anywhere close.
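For a rough sense of scale, here’s the back-of-the-envelope arithmetic behind those figures (a sketch using only the estimates above; the hour and day counts are this essay’s guesses, not hard data):

```python
# Back-of-the-envelope: total hours of dedicated practice over ten years,
# using the essay's own estimates (5-6 hours/day, 250 days/year).
hours_per_day_low, hours_per_day_high = 5, 6
days_per_year = 250
years = 10

low = hours_per_day_low * days_per_year * years    # 12,500 hours
high = hours_per_day_high * days_per_year * years  # 15,000 hours
print(f"{low:,} to {high:,} hours of dedicated practice")
```

That comes out to 12,500-15,000 hours, comfortably above the famous 10,000-hour threshold, which is why I treat Norvig’s ten years as a floor rather than a typical figure.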

It takes a long time to become genuinely good at software engineering. The precocious are outliers. More typically, people seem to peak after 40, as in other high-skill disciplines. It seems, then, that most of the age-related decline in this field is externally enforced. Age discrimination is not an artifact of declining ability but of changing perceptions.

It doesn’t make sense, but there it is.

Age discrimination has absolutely no place in technology. Yet it exists. After age 40, engineers find it increasingly difficult to get appropriate jobs. Startups are, in theory, supposed to “trade against” the inefficiencies and moral failures of other companies, but on this issue, venture-capital (VC) funded startups are the biggest source of the problem. Youth and inexperience have become virtues, while older people who push back against dysfunction (and outright exploitation) are branded “resistant to change”.

There’s another issue that isn’t derived from explicit ageism, but might as well be. Because our colonizers (mainstream business culture) are superficial, they’ve turned programming into a celebrity economy. A programmer has two jobs. In addition to the work itself, which is highly technical and requires continual investment and learning, there’s a full-time reputation-management workload. If a machine learning engineer works at a startup and spends most of his time in operations, he’s at risk of being branded “an ops guy”, and may struggle to get high-quality projects in his specialty from that point on. He hasn’t actually lost anything; in fact, he’s become far more valuable. But the superficial, nontechnical idiots who evaluate us will view him as “rusty” in his specialty and, at the least, exploit his lack of leverage. All because he spent two years doing operations, because it needed to be done!

As we get older and more specialized, the employment minefield only becomes more complicated. We are more highly paid at that point, but not by enough of a margin to offset the increasing professional difficulties. Executives cite the complexity of high-end job searches when demanding high salaries and years-long severances. Programmers who are any good face the same complexity, but get none of those protections. I would, in fact, say that any programmer who is at all good needs a private agent, just as actors do. The reputation-management component of this career (a career that is supposed to be about technology and making the world better, but is actually about appeasing the nontechnical, drooling patron class) constitutes a full-time job that requires a specialist. Either we need unions, or we need an agent model like Hollywood’s, or perhaps both. That’s another essay, though.

The hypocrisy of the technology employer

Forty years ago, smart people left finance and the mainstream corporate ladder for technology, moving into the emerging R&D-driven guild culture that computing had at the time. Companies like Hewlett-Packard were legitimately progressive in how they treated their talent, and were rewarded for it with their employees’ commitment to making great products. In that era, Silicon Valley represented, for the most technically adept people in the middle class, a genuine middle path. The middle path will require its own essay, but what I’m talking about here is a moderate alternative between the extremes of subordination and revolution. Back then, Silicon Valley was the middle path that, four decades later, it is eagerly closing off.

Technology is no longer managed by “geeks” who love the work and what it can do, but by the worst kinds of business people who’ve come in to take advantage of said geeks. Upper management in the software industry is, in most cases, far more unethical and brazen than anywhere else. To them, a concentration of talented people who don’t have the inclination or cultural memory that would lead them to fight for themselves (labor unions, agent models, ruthlessness of their own) is an immense resource. Consequently, some of the most disgusting HR practices (e.g. stack ranking, blatant sexism) can be found in the technology industry.

There’s one really bad, specifically technical trait of software employers that, I think, has damaged the industry immensely. Technology employers demand specialties when vetting people for jobs. General intelligence and a proven ability to code aren’t enough; one must have “production experience” in a wide array of technologies invented in the past five years. For all their faults, the previous regime of long-lasting corporations was not so bigoted about past experience, trusting people to learn on the job as needed. The new regime has no time for training or long-term investment, because all of these companies have been built to be flipped to a greater fool.

In spite of their bigoted insistence on pre-existing specialties in hiring, these companies refuse to respect specialties once people are hired. Individual programmers who attempt to protect their specialties (and, thus, their careers) by refusing assignment to out-of-band or inferior grunt work are quickly fired. This is fundamentally hypocritical. In hiring, software companies refuse to look twice at someone without a yellow brick road of in-specialty accomplishments of increasing scope; yet, once employees are inside and fairly captive (due to the pernicious stigma against changing jobs quickly, even with good reason), they will gladly disregard that specialty, for any reason or none. This is usually framed as a business need (“we need you to work on this”), but it’s more often political and sometimes personal. Moving talent out of its specialty is a great way for insecure middle managers to neutralize overperformance threats.

In a way, employers are like the pervert who chases far-too-young sexual partners (if “partner” is the right word here) for their innocence, simply to experience the thrill of destroying it. They want people unspoiled by the mediocrity and negativity of the corporate world, because they want to inflict the spoilage themselves. The virginity of a fresh, not-yet-cynical graduate from a prestigious university is something they want all for themselves.

The depression factor

I’m not going to get personal here, but I’m bipolar, so when I use words like “depression” and “hypomania” and “anxiety”, I do, in fact, know what the fuck I’m talking about.

One side effect of corporate capitalism, as I see it, is that it has created a silent epidemic of middle-aged depression. The going assumption in technology, that mental ability declines after age 25, is not well supported; in fact, it’s contrary to what most cultures believe about intelligence and age. (In truth, various aspects of cognition peak at different ages, from language acquisition around age 5 to writing ability around 65, and there’s so much individual variation that there’s no clear “peak” age.) For general, holistic intelligence, there’s no evidence of an age-bound peak in healthy people. While this risks sounding like a “No True Scotsman” claim, what I mean is that every meaningful age-related decline in cognition can be traced to a physical health problem rather than to aging itself. Cardiovascular problems, physical pain, and the side effects of medication can all impair cognition. I’m going to talk about the Big One, though, and that’s depression. Depression can cause cognitive decline. Most of that loss is reversible, but only if the person recovers from the depression and, in many cases, they never do.

Here, I’m not talking about severe depression, the kind that would have a person considering electroconvulsive therapy or on suicide watch. I’m talking about mild depression that, depending on the time of diagnosis, might be considered subclinical. People experiencing it in middle age are, one presumes, liable to attribute it to getting older rather than to a real health problem. Given that middle-aged “invisibility” in youth-obsessed careers is, in truth, unjust and depressing, it seems likely that more than a few people would experience depression and fail to perceive it as a health issue. That’s one danger of depression that those who’ve never experienced it might not realize exists: when you’re depressed, you suffer from the delusion that (a) you’ve always been depressed and (b) no other outlook or mood makes sense. Depression is delusional, and it is a genuine illness, but it’s also dangerously self-consistent.

Contrary to the stereotype, people with depression aren’t always unhappy. In fact, people with mild depression can be happy quite often. It’s just easier to make them unhappy. Things that don’t faze normal people, like traffic jams or gloomy weather or long queues at the grocery store, are more likely to bother them. For some, there’s a constant but low-grade gloom and a tendency to avoid making decisions. Others might experience 23 hours and 50 minutes per day of normal mood and 10 minutes of intense, debilitating sadness: the kind that would force them to pull over to the side of the road and cry. There isn’t a template and, just as a variety of disparate diseases (some viral, some bacterial, some behavioral) were once called “fevers”, I suspect that “depression” is a cluster of about 20 different diseases that we just don’t have the tools to separate. Some depressions come without external cause. Others are clearly induced by environmental stresses. Some depressions impair cognition and probably amount to a (temporary) 30-IQ-point loss. Others (more commonly seen in artists than in technology workers) seem to induce no intellectual impairment at all; the person is miserable, but as sharp as ever.

Corporate workers do become less sharp, on average, with age. You don’t see that effect, at least not involuntarily, in most intellectually intense fields. A 45-year-old artist or author or chess master has his best work ahead of him. True entrepreneurs (not dipshits who raise VC based on connections) also seem to peak in their 50s and, for some, even later. Most leaders hit their prime around 60. However, it’s observable that something happens in Corporate America that makes people more bitter, more passive, and slower to act over time, and that it starts around 40. Perhaps it’s an inverse survivorship bias, with the more talented people escaping the corporate racket (by becoming consultants or entrepreneurs) before middle age. I don’t think so, though. There are plenty of highly talented people in their forties and fifties who’ve been in private-sector programming for a long time and just seem out of gas. I don’t blame them for this. With better jobs, I think they’d recover their power surprisingly quickly. I think they have a situationally induced case of mild depression that, while it may not be the life-threatening illness we associate with major depression, takes the edge off their abilities. It doesn’t make them unemployable. It makes them slow and bitter but, unlike aging itself, it’s very easily reversible: change the context.

Most of these slowed-down, middle-aged, private-sector programmers wouldn’t qualify for a diagnosis of major depressive disorder. They’re not suicidal, don’t have debilitating panic attacks, and can attribute their losses of ability (however incorrectly) to age. Rather, I think that most of them are mildly but chronically depressed. To an individual, this is more of a deflation than a disability; to society, the costs are enormous, simply because such a large number of people are affected, and because it disproportionately affects the most experienced people at a time when, in a healthier economic environment, they’d be in their prime.

The tournament of idiots

No one comes out of university wanting to be a private-sector social climber. There’s no “Office Politics” major. People see themselves as poets, economists, mathematicians, or entrepreneurs. They want to make, build, and do things. To their chagrin, most college graduates find that between them and any real work, there’s at least a decade of political positioning, jockeying for permissions and status, and associated nonsense that’s necessary if one intends to navigate the artificial scarcity of the corporate world.

The truth is that most of the nation’s most prized and powerful institutions (private-sector companies) have lost all purpose for existing. Ideals and missions are for slogans; the organization’s true purpose is to line the pockets of those ranking high within it. There’s also no role or use for real leadership. Corporate executives are about as far as one can get from true leaders. Most are entrenched rent-seekers. With extreme economic inequality and a culture that worships consumption, it should surprise no one that our “leadership” class is a set of self-dealing parasites at a level that hasn’t been seen in an advanced economy since pre-Revolutionary France.

Leadership and talent have nothing to do with getting to the top. It’s the same game of backstabbing and political positioning that has been played in kings’ courts for millennia. The difference, in the modern corporation, is that there’s a pretense of meritocracy. People at least have to pretend to be working and leading in order to advance. The work most congruent with social advancement, however, isn’t the creative work that begets innovation. Instead, it’s success in superficial reliability. Before you get permission to be creative, you have to show that you can suffer, and once you’ve won the suffering contest, it’s neither necessary nor worthwhile to take creative risks. Companies, therefore, pick leaders by loading people with unnecessary busywork that often won’t go anywhere, and by putting intense but ultimately counterproductive demands on them. They generate superficial reliability contests. One person will outlast the others, who fall along the way to unexpected health problems, family emergencies, and other varieties of attrition (or, for the lucky few, to better jobs elsewhere). One of the more common failure modes by which people lose this tournament of idiots is mild depression: not enough to have them hospitalized, but enough to pull them out of contention.

The corporate worker’s depression, especially in midlife, isn’t an unexpected side effect of economic growth or displacement or some other agent that might allow Silicon Valley’s leadership to sweep it under the rug of “unintended consequences”. Rather, it’s a primary landscape feature of the senseless competition that organizations create for “leadership” (rent-seeking) positions once they’ve run out of reasons to exist. At that level of decay, there is no meaningful definition of “merit”, because the organization itself has become pointless, and the only sensible way to allocate highly paid positions is to create a tournament of idiots, in which people psychologically abuse each other (often subtly, in the form of microaggressions) until only a few remain healthy enough to function.

Here we arrive at a word I’ve come to dread: corporate culture. Every corporation has a culture, and 99% of them are utterly abortive. Generally, the more a company’s true believers talk about “our culture”, the more fucked-up the place actually is. See, culture is often ugly. Foot binding, infantile genital mutilation (“female circumcision”), war, and animal torture are all aspects of human culture. Previous societies used supernatural appeal to defend inhumane practices; modern corporations use “our culture” itself as a god.

“Culture fit” is often cited to justify the otherwise inconsistent and, sometimes, unjustifiable. Why wasn’t the 55-year-old woman, a better coder than anyone else on the team, hired? “It wouldn’t look right.” Can’t say that! “A seasoned coder who isn’t rich would shatter the illusion that everyone good gets rich.” Less illegal, but far too honest. “She didn’t fit with the culture.” Bingo! Culture can always be used this way by an organization, because it’s a black box of blame, diffusing moral culpability across the group. Blaming an adverse decision on “the team” or “the culture” avoids individual risk for the blamer, while the culture itself can never be attacked as bad. Most people in most organizations actually know that the “leadership team” (career office politicians, also known as executives) of their firm is toxic and incompetent. When the executives aren’t around, people attack them. But it’s rare that anyone ever attacks the culture, because “the culture” is everyone. To indict it is to insult the people. In this way, “the culture” is like an unassailable god.

Full circle

We’ve traveled through some dark territory. The tournament of idiots that organizations construct to select for leadership roles, once they’ve ceased to have a real purpose, causes depression. The ubiquity of such office cultures has created, I argue, a silent epidemic of mild, midlife depression. It has led venture capitalists (situated at its periphery, their wealth buying them some exit from the vortex of mediocrity in which they must still work, but do not live) and privileged young pseudo-entrepreneurs (terrified of what awaits them when their family connections cool and they must actually work) to conclude that a general cognitive mediocrity awaits in midlife, even though there is no evidence to support this belief, and plenty of evidence from outside of corporate purgatory to contradict it.

What does all of this say? Personally, I think that, to the extent that large groups of individuals and organizations can collectively “know” things, the contemporary corporate world devalues experience because it knows that the experience it provides is of low value. It refuses to eat its own dogfood, knowing that it’s poisoned.

For example, software is sufficiently technical and complex that great engineers are invariably experienced ones. The reverse isn’t true. Much corporate experience is of negative value, at least if one includes the emotional side effects that can lead to depression. Median-case private-sector technology work isn’t sufficiently valuable to overcome the disadvantages associated with age, which is another way of saying that the labor market considers average-case corporate experience to have negative value. I’m not sure that I disagree. Do I think it’s right to write people off because of their age? Absolutely not. Do I agree with the labor market’s assessment that most corporate work rots the brain? Well, the answer is mostly “yes”. The corporate world turns smart people, over the years, into stupid ones. If I’m right that the cause of this is midlife depression, there’s good news. Much of that “brain rot” is contextual and reversible.

How do we fix this?

Biologists and gerontologists seeking insight into longevity have studied the genetics and diets of long-living groups of people, such as the Sardinians and the people of Okinawa. Luckily for us, midlife cognitive decline isn’t a landscape feature of most technical or creative fields. (In fact, it’s probably not present in ours; it’s just perceived that way.) There are plenty of places to look for higher cognitive longevity, because few industries are as toxic as the contemporary software industry. When there is an R&D flavor to the work, and when people have basic autonomy, they tend to peak around 50, and sometimes later. Of course, there’s a lot of individual variation, and some people voluntarily slow down before that age, in order to attend to health, family, spiritual, or personal concerns. The key word there is voluntarily.

Modeling and professional athletics (in which there are physical reasons for decline) aside, a career in which people tend to peak early, or have an early peak forced upon them, is likely a toxic one that enervates them. That Silicon Valley is a young man’s game (and its current incarnation, focused on VC-funded light tech, is exactly that) simply indicates that it’s so destructive to the players that only hard-core psychopaths can survive in it for more than 10 years. It’s not a healthy place to spend a long period of time and develop an expertise. As discussed above, it will happily consume expertise but produces almost none of value (hence its attraction to those with pre-existing specialties, despite its failure to respect those specialties once the employee is captive). This means, as we already see, that technical excellence will fall by the wayside, and positive-sum technological ambition will give way to the zero-sum personal ambitions of the major players.

We can’t fix the current system, in which the leading venture capitalists are striving for “exits” (greater fools). That economy has evolved from a technology industry with some need for marketing into a marketing industry with some (declining) need for technology. We can’t bring it back, because its entrenched players are too comfortable with the way it is. While the current approach provides mediocre returns on investment, hence the underperformance of the VC asset class, the king-making and career-altering power it affords the venture capitalist allows him to capture all kinds of benefits on the side, ranging from financial upside (“2 and 20”) to executive positions for talentless drinking buddies to, during brief bubbly episodes such as this one, “coolness”. They like it the way it is, and it won’t change. Rather than incrementally fixing the current VC-funded Valley, we must replace it outright.

The first step, one might say, is to revive R&D within technology companies. That’s a step in the right direction, but it doesn’t go far enough. Technology should be R&D, full stop. To get there, we need to assert ourselves. Rather than answering to non-technical businessmen, we need to learn how to manage our own affairs. We need to begin valuing the experience on which most R&D progress actually relies, rather than shutting out the seasoned and cynical. And, as a first step in this direction, we need to stop selling each other out to nontechnical management for stupid reasons, chief among them “culture fit”.


