Sunday, May 24, 2020

Software will eat software in a remote-first world

Hi, Can here! Today, we talk about remote work.

[Image: a telepresence robot (source: 2-rent.com)]

As someone who works in software and lives in the world, the idea of software eating the world has been top of mind for quite some time. I mean, I don’t think we should remake the world in the image of software and definitely not in that of the software people, but I do think we are in a time of transformational change in our societies because of it. At the same time, if software is eating the world, I do think that it should take a breather before taking another bite.

Meanwhile, one of my related, if slightly contrarian, bets about software and its dietary habits has been that “software is eating software.” We are coming to a point where software is developing so fast, and its abstractions are getting so much better, that we will soon have more software written by fewer people. In other words, just as software made legions of people working in other industries obsolete, it will soon make its own creators less valuable too. In short, software will eat software. Or maybe, software will eat software people? I’m still working on it…

I’ve been thinking about this for a while now, but the recent announcements from various large tech companies that they are going full-remote have brought it into full focus for me. Twitter started it (bear with me), Coinbase joined, and now Facebook is joining too. Many tech firms, particularly those in Silicon Valley and Seattle, have been in Work from Home (WFH) mode for a while, but more are now saying they’ll “lean” further into it by making permanent the current setup, where people do not regularly come into an office.

Obviously, none of this is really new; many other firms, like Gitlab, have been fully distributed from day one. Others, like Stripe, have hybrid approaches where there is still an HQ but remote workers are considered part of an “office-in-the-cloud” of sorts. And even the most remote-unfriendly companies have a few people who manage to pull off working remotely, generally through some form of nepotism, organizational apathy, politics, and occasionally genuine need. But whatever the terms, a huge, very accelerated shift in how the next generation of tech companies will operate is in progress. And we are nowhere near figuring out what it will all mean.

I believe one of the more subtle impacts of this new way of working will be tech workers losing some of the leverage they hold over their employers. This will result in the further commodification of tech work, potentially less collective action by employees, and probably lower salaries in the long run. Put another way, the technology industry is about to get a taste of what has been going on in other industries.

No-Code means No-Coders

Start with the commodification of work. As a former software engineer, I’ve seen first-hand how fast this already happens. In my first job at a small start-up, we had tons of physical servers. Now, it’s hard to imagine any “webby” tech company ever interacting with hardware at all. What once took multiple ops people is now a few buttons and links on an AWS console. But the impact of increasing abstraction in technology is more subtle than that, and it is hard to appreciate unless you work in the weeds of tech day-to-day.
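To make that concrete, here is a minimal sketch of what “a few buttons on an AWS console” looks like as code, using AWS’s boto3 library. The AMI ID is a placeholder and this assumes you have credentials configured; the point is just that an entire ops job collapses into a dozen lines:

```python
# A minimal sketch: provisioning a server with boto3 instead of racking hardware.
# Assumes AWS credentials are already configured; the AMI ID is a placeholder.
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```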

Thanks to countless libraries and frameworks, and overall improving software, we have come to a place where what once required many developers building from scratch is now, more often than not, a bunch of people plumbing different things together. Software is creating software faster than we can use it. This is also why you are seeing so many “no-code” or “low-code” solutions pop up all over the place. There are increasingly fewer reasons to write code, and those who do write code should, and do, increasingly write less of it. The shift to remote work will only accelerate this, because of how it will change the way we decide what code to write.
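As a rough illustration of that plumbing, here is a working JSON endpoint in a handful of lines, assuming only that the Flask library is installed. Everything hard, from HTTP parsing to routing to the server loop, lives inside the library rather than in code anyone on the team wrote:

```python
# A minimal sketch of modern "plumbing": a working web endpoint in a few lines.
# Assumes Flask is installed (pip install flask).
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/status")
def status():
    # All the hard parts (HTTP, routing, serialization) are the library's job.
    return jsonify(ok=True)

if __name__ == "__main__":
    app.run()  # serves on http://127.0.0.1:5000 by default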

Again, I’ve seen this. When I was at Uber, we had to develop some novel technology because of our scale and unique problems, but a lot of what many people (including me) did was translating business requirements into code using off-the-shelf technologies. As workers become more and more removed from the business, companies will get better at identifying what can be “automated.”

Anyone who’s spent a few months at a sizable tech company can tell you that a lot of software seems to exist primarily because companies have hired people to write and maintain it. In some ways, the software serves not the business, but the people who wrote it, and then those who need to maintain it. This is stupid, but also very, very true.

When you don’t have to think about the people, but can simply reduce them to their contributions, it becomes a much easier mental task to figure out how to get rid of the human part. In some ways, this is a prosocial artifact of our wiring: when times are good, people are less inclined to think of ways to fire people by automating them out of a job. Tougher times with strained margins change that calculus. But it is also partly a cognitive-load issue. When you can actually remove the human from the equation, it becomes mentally easier to figure out how you could avoid writing that piece of code over and over again. This is a lesson Marx figured out when he mused on alienation, and a dark one that techies will also be learning quite soon.

I don’t mean to imply that we should jettison people from their jobs just because it’s good for the bottom line. In fact, the opposite might be true. When I give people job advice, I tell them that once you have an offer or two where your main requirements are met, you should decide primarily based on who your coworkers will be. For better or worse, work is how many people socialize for significant parts of the day. As someone who’s relatively extroverted, the lack of social interaction with my coworkers during this quarantine has already been quite detrimental to my mental well-being.

Borders Rule Everything Around Me

Let’s switch to salaries. I’ve talked about this before, but here’s a short recap of how salaries are set. Most people would like to believe salaries track the value you create, with you capturing just a bit less than the value you add to the company. In reality, however, they are determined by competition. Companies are forced to pay as much as it takes to keep talent from leaving. In a competitive labor market, this is often a good thing for employees.
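A toy sketch of the two models, with numbers made up entirely for illustration, makes the distinction concrete:

```python
# Toy contrast of the two pay models above; all numbers are illustrative only.

def value_based_pay(value_added, company_margin=0.10):
    # What many assume: you're paid a bit less than the value you create.
    return value_added * (1 - company_margin)

def competition_based_pay(competing_offers, bump=5_000):
    # Closer to reality: pay lands just above the best outside offer.
    return max(competing_offers) + bump

print(value_based_pay(300_000))                   # 270000.0
print(competition_based_pay([180_000, 210_000]))  # 215000
```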

Obviously, things get quite weird when you take this model to its logical end. In the Bay Area, where the companies are giant, the geography is tiny, and the housing policies are extremely questionable, this has resulted in salaries ballooning to insane levels. A six-figure salary straight out of college barely raises an eyebrow anymore at many big firms. Companies have gone to great lengths, including some illegal ones, to curb this competition and depress salaries.

The remote-first mentality will be a godsend simply because companies will no longer be restricted to a tiny piece of land with questionable housing policy when sourcing talent. Some people estimate that 40% of all VC funding ends up with landlords in the Bay, and I think that’s too conservative.

Again, this is a touchy topic. When Facebook announced it would be “localizing” salaries, there was decent pushback on Twitter. Blair Reeves, a friend and occasional Margins contributor (on remote teams), wrote a persuasive piece arguing that companies should pay people based on the value they add, not differently based on where they want to live. In some ways, I understand. People who live in more expensive neighborhoods in NYC or SF should not get paid more than those who decide to live farther away.

At the same time, I also think that for companies as big as Facebook, at some point it becomes untenable to pay someone living in Turkey (or a cheaper US city) the same amount as someone living in San Francisco. A huge American company paying Turkish employees American salaries would be good for those employees, but it would put an insane burden on other Turkish companies. Not localizing pay would be a centralizing force for larger companies with nationwide or global reach.
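For what it’s worth, the mechanics of “localizing” pay usually amount to scaling a base salary by some location index. A hypothetical sketch, with index values invented purely for illustration:

```python
# Hypothetical sketch of salary "localization": scale a base (SF) salary by a
# cost-of-living index. The index values below are made up for illustration.
COL_INDEX = {"San Francisco": 1.00, "Austin": 0.75, "Istanbul": 0.35}

def localized_salary(base_sf_salary, city):
    return base_sf_salary * COL_INDEX[city]

print(localized_salary(200_000, "Istanbul"))  # 70000.0
```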

Remote-First, Collective-Last

Lastly, let’s talk about the impact of remote-first on labor. Many months before the virus hit, a CEO friend of mine “jokingly” told me that he believed all the “remote” buzz was as much about reducing the collective power of employees as it was about saving money on salaries. He personally did not want his company to go full-remote but was under some pressure from investors to consider it. We were both hammered at the time, and I didn’t put much thought into it, but it feels more right the more I think about it.

Again, according to the seating-chart theory of organizational behavior, employees who spend more time with each other find it easier to connect, bond, find common ground, and form a cohesive group. Collective action in the tech industry has been a big sore spot for many tech companies, especially since Trump became president. And here, I actually do have some sympathy for management.

For a long time, most American companies could ignore the politics of their employees, which is a feature, not a bug, of stable liberal democracies. However, straddling the thin line between becoming an accomplice of an at-best incompetent, at-worst dangerous and vicious government while maintaining a happy-go-lucky workplace is no easy task. As tech talent has woken up to its power, which it previously used to get free breakfast and Herman Miller chairs, it has increasingly exercised that power to get bosses to make meaningful, painful-for-the-bottom-line changes, like dropping entire projects. Any company in its right mind would jump at a chance to break this stranglehold.

In a world where most employees are remote, such organizing can be harder to do. Not only will employees be in touch with each other less, and in less personal ways, they might not even have non-monitored places in which to do so. There will always be ways for employees to sneak around monitoring and surveillance, but it will be harder when everything is fully remote, and you will have less trust in those you bond (or conspire, depending on your POV) with.

In Conclusion

This topic is really too big to fit in a single newsletter, and we are already past a thousand words. On the one hand, I do welcome these changes. I fundamentally believe that I lucked into the technology industry, and many of the opportunities I’ve exploited were available to me only because I was in the right room at the right time, a few times quite literally so. If software eats the world and we shed some of our geographical inequalities along the way, that would be an improvement.

However, I do think that going head-first into a new way of working will only cement the centralizing forces already at play in software. We already know, for example, how software margins generally accrue to the larger companies that can erect big fixed-cost barriers, letting them earn disproportionately more. As software eats not just the margins but the societal frictions now, we may end up with a similar situation for workplaces too. We can, maybe, stop software from eating software before it’s too late.


PS: A quick note on the last issue’s fundraiser for Frontline Foods. I’m still going through the list of people who donated and reached out. You have not been forgotten, and I’ll hold up my end of the deal! And if you haven’t donated, you still have time!


