Are Software Developers Respected by “Business People”?

In the Hacker News thread for my last blog post, there were a couple of comments that really got me thinking about the role respect plays in bad office design.

Is it time to get depressed yet? This has been a topic for such a long time. It was in Peopleware in 1987. I thought I discovered the topic when Joel (On Software) wrote about it in 2000. While a few people seem to enjoy open offices, the overwhelming majority of developers I know, or who chime in on HN, value a quiet place to work and dislike open offices.

And yet, not only has nothing changed, it seems to be getting worse. It couldn't be more clear to me that developers, at least on this issue, simply have no clout as a profession. There may be a few individuals who can make demands, but on the balance, these are decisions imposed on us, as a group, and we are apparently unable to do anything about it.

-- geebee

In response to the above comment:

30 years ago programmers were highly respected. We were mysterious to others and we were able to influence things like office layouts and the like.

At some point over that time period, things shifted. Programmers became seen as "geeks" who didn't really understand business and "business guys" took over.

They have no concept that we might know what we're talking about because "office space is the realm of business."

We're not capable of decision making and we have no understanding beyond our weird obsession with those stupid computers. -- that's how they see us. They'll lie and say otherwise, but deep down, at a fundamental level, that's how non-technical people see us.

-- MCRed

Is this true? Are software developers “geeks” who lack clout and respect with the “business people”?

If so, why is that? Have we earned our lack of clout/respect? I have some thoughts on this that I’ll explore in a future post.

In the meantime, let me leave you with this question: If you believed that an employee of yours did extraordinarily challenging mental work, which required extreme concentration, would you put them here to do that work?

Classic open plan office

Just Wear Headphones

In the comments on my posts about open plan offices, a common theme has been headphones (or earbuds) and how they are oversold as a solution to office noise.

In this post, I want to elaborate on a few issues that hold headphones back from being a silver bullet.

Developers using headphones and earbuds

Physical Hearing Damage

There are permanent physical consequences from prolonged headphone use. The damage accrues gradually, so people don’t notice it happening.

From the American Osteopathic Association, Dr. James Foy explains:

I stress to my patients and the parents of my patients that if you can’t hear anything going on around you when listening to headphones, the decibel level is too high.

As a rule of thumb, you should only use [personal audio] devices at levels up to 60% of maximum volume for a total of 60 minutes a day. The louder the volume, the shorter your duration should be. At maximum volume, you should listen for only about five minutes a day.

ENT physician Dr. Michael Seidman continues in an article from the New York Times:

If you listen to music with earbuds or headphones at levels that block out normal discourse, you are in effect dealing lethal blows to the hair cells in your ears.

When you’re working in an environment so noisy that you have to pump music (or white noise) into your ear canal so loudly that it blocks out the other noise, you are doing permanent damage to your hearing.

Music Is Distracting

This is one of the trickier issues to discuss: people love music, and they have a hard time separating the pleasure they get from listening to it from their effectiveness while doing so.

If you ask software developers what they blast out of those ubiquitous headphones, you’ll get answers like this:

"It's not just something in the background to help me concentrate; it's a source of inspiration, a door to free my mind from our day-to-day routines, and, at the same time, it's a way to memorize an experience," says Ortali. "I play tracks in a loop, sometimes the exact same track all day long. It's a way to connect with the lyrics, and move the tempo beneath my skin."

Scientific minds get very un-scientific when it comes to their favorite music.

In a terrific article from The Atlantic, How Headphones Changed the World:

In survey after survey, we report with confidence that music makes us happier, better at concentrating, and more productive.

Science says we're full of it. Listening to music hurts our ability to recall other stimuli, and any pop song -- loud or soft -- reduces overall performance for both extraverts and introverts. A Taiwanese study linked music with lyrics to lower scores on concentration tests for college students, and other research has shown music with words scrambles our brains' verbal-processing skills. "As silence had the best overall performance it would still be advisable that people work in silence," one report dryly concluded.

If headphones are so bad for productivity, why do so many people at work wear them?

That brings us to a psychological answer: There is evidence that music relaxes our muscles, improves our mood, and can even moderately reduce blood pressure, heart rate, and anxiety. What music steals in acute concentration, it returns to us in the form of good vibes.

Headphones give us absolute control over our audio-environment, allowing us to privatize our public spaces.

People conflate the positive psychological effects of cocooning themselves in their favorite sounds, amid noise they can’t control, with positive effects on their productivity.

Feeling of Vulnerability

As I touched on in a previous post, seating people with their backs to a high-traffic area leads to a constant sense of unease and vulnerability.

Back to the action

People in this position can’t rely on sight to detect someone approaching. Add headphones to the equation, and they’ve lost their hearing, too.

Headphone use in a noisy open plan environment can be a catch-22. The noise is so oppressive that you want to block it out, but then you have to deal with the feeling of vulnerability and the frequent startle of someone appearing behind you unheard.

So What to Do?

Headphones are not the new walls. Give people a quiet place to work, or let them work from their own home.


Side Note: Noise-Cancelling Headphones

I want to quickly address “noise-cancelling” headphones in particular, as they are mentioned often as a quick fix for the problem of office noise. The technology in use was designed to cancel the low, constant rumble of aircraft engines. So while it may work to cancel the noise of your office air conditioner, it’s powerless against the voices of your co-workers (the real noise you’d want to cancel in an office environment). Read some of the reviews for the popular Bose QuietComfort noise-cancelling headphones, and you’ll get the picture.

The "Phone Room"

Here’s another office design 101 thing that I want to get out of the way: the importance of temporary private space for people stuck in open plans.

These private spaces sometimes get labeled as “phone rooms” or something similar, implying that they exist for a person to take their loud conversation away from the rest of the workers. Well, the exact opposite is at least as important, if not more so: for a person to get some quiet away from the loudness of the general office environment.

Phone room

We already know that some people need more quiet time than others. And headphones are not a substitute for quiet.

Let’s stop calling these “phone rooms,” and let’s make sure no one who matters judges people for using these rooms simply for quiet time.

How to Make Your Open Plan Office Suck Less

Open plan offices suck, but there are some easy ways to make them suck less. Here are three ways that simple desk positioning can make a big difference in the suckiness of your open plan.

1. Lower Density

Lower density means less noise. Put more space between desks.  

Sucks:

High density sucks

Sucks less:

Low density sucks less

2. Face Space

Don’t seat people where their line of sight goes through a nearby face. If you haven’t felt the awkward tension of working all day with someone’s face visible just past your monitor, then congrats, your open plan sucks a bit less.

Sucks:

Face-to-face bad

Sucks:

Face-to-side-of-face bad

Sucks less:

Facing in same direction better

3. Watch Your Back

Don’t seat people with their back to a high-traffic area. People in this position feel constantly vulnerable and cannot have one moment of screen privacy. Ask these folks to wear headphones, and they’ll feel even more vulnerable.

Sucks:

Back to the action bad

Sucks less:

Facing the action better


If you’re committed to an open plan office (shame on you), then at least get these things right.

Remote Work Denial Is a Bad Look

When out-of-state recruiters email me about their awesome tech company, my first response is always something along the lines of, “You support remote work, right? I live in Grand Rapids, Michigan.” When the response comes back as a flat denial of the possibility, suddenly that hot tech company seems very old-fashioned. And old-fashioned is not a good look in the tech industry. I always think to myself, “Really, you’re one of those? Well, that’s embarrassing.”

If “Agile” could be said to have traditional values, one of them might be colocation. I’ve come to view this emphasis as a bit naïve or idealistic in the present day.

As Keith Richards says in an article for InfoQ about distributed Agile:

Most of the agile body of knowledge that has been written is based on the utopian situation of one team, one ‘product owner’ and one location. Although this is still often the case, is it the exception or is it the rule? Having worked for well over a decade now on implementations of agile I find that multi-team, multi-location, multiple business area and even multi-time zone agile is more the norm.

If agile is to thrive over the next 10 years then it not only has to work in a distributed environment (i.e. an environment where we do not all work in the same place), but it has to work well in order to deliver the most value to an organization.

Mike Cohn, in his book Succeeding with Agile, expresses a similar thought:

A few years ago, collocated teams were the norm, and it was unusual for a team to be geographically distributed. By now, the reverse must be true. Personally, I’m now surprised when someone tells me that everyone on the team works in the same building.

Not a good look

You can think face-to-face, in-person communication is the most efficient, and I won't argue with you, but ultimately it doesn't matter: the remote work trend will not be stopped.

When I see tech companies taking a hard-line stance against remote work, I can't help but think, “Imagine how dumb you're going to look in a few years.” I truly do not mean to offend anyone’s sensibilities with what I’m about to say and hope my point gets across regardless of your particular political beliefs, but remote work denial feels very much to me like people vehemently opposing the legalization of marijuana or same-sex marriage at this point--do you really think you're going to stem this tide? With all due respect to your beliefs, this is happening anyway.

Get on the boat now before your company has been too badly embarrassed in front of the people it wants to hire. Remote work is still a differentiator at this moment in time. What are you waiting for...your competitors to do it first? You want to clean up in the talent wars? Figure out how you're going to make remote work effective for your company and then shout it from the rooftops.

From the closing sentiments of Remote:

Life on the other side of the traditional office paradigm is simply too good for too many people. Progress on fundamental freedoms, like where to work, is largely cumulative. There might be setbacks here and there from poorly designed programs or misguided attempts at nostalgia, but they’ll be mere blips in the long run.

Between now and the remote work–dominated future, the debate is likely to get more intense and the battle lines more sharply drawn. Remote work has already progressed through the first two stages of Gandhi’s model for change: “First they ignore you, then they laugh at you, then they fight you, then you win.” We are squarely in the fighting stage—the toughest one—but it’s also the last one before you win.

Remote work is here, and it’s here to stay. The only question is whether you’ll be part of the early adopters, the early majority, the late majority, or the laggards. The ship carrying the innovators has already sailed, but there are still plenty of vessels for the early adopters.

Continuous Delivery and Management Pathologies

I believe continuous delivery is a foundational practice. In fact, it’s mentioned right at the top in the “Principles” section of the Agile Manifesto:

Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.

There’s a lack of trust that develops amongst stakeholders when the gap between “developer done” and “I can see that it’s done” is too great. In a perfect world, everyone trusts everyone, but on real projects, a great deal of the time, that’s just not the case.

You want to know how to cultivate trust? Deliver consistently. Perhaps continuously.

I believe one of the greatest causes of managers behaving badly is the fear that the things that need to get done are not getting done. That fear is a major driver of micromanagement, which technical folks tend to hate (I know I do).

By reducing this fear, you can head off several management pathologies. You can prevent a hundred status meetings by sending a URL to someone and saying “see for yourself.”
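
What might that look like in practice? Here is a minimal sketch of the idea as a Rake task, hedged heavily since every project's pipeline differs; the registry, staging host, and restart script below are hypothetical stand-ins for whatever your project actually uses.

```ruby
# In your Rakefile. A minimal sketch of "see for yourself": every green
# build gets pushed somewhere stakeholders can click. All names below
# (registry, staging host, restart script) are hypothetical stand-ins.
task :deploy_preview do
  sha = `git rev-parse --short HEAD`.strip

  sh "docker build -t registry.example.com/myapp:#{sha} ."
  sh "docker push registry.example.com/myapp:#{sha}"
  sh "ssh deploy@staging.example.com 'bin/restart myapp #{sha}'"

  puts "See for yourself: https://staging.example.com (build #{sha})"
end
```

Hook a task like that onto the end of your CI run, and the “how’s it going?” meeting becomes a link.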

I’ve come to believe that on most software projects, sheer speed matters less than steady progress. Optimize for steady, externally visible progress.

CD is the ultimate progress indicator of your project. As Jenifer Tidwell says in Designing Interfaces:

Experiments show that if users see an indication that something is going on, they’re much more patient, even if they have to wait longer than they would without a Progress Indicator. Maybe it’s because they know that “the system is thinking,” and it isn’t just hung or waiting for them to do something.

Note that this is the same lack of trust that makes otherwise intelligent software organizations hesitant to embrace remote work. What if my daily work produced artifacts of working software that you could look at whenever you wanted? Would that assuage your concern that I’m sitting in my pajamas and watching cartoons all day?

I see CD as a sort of great equalizer. Working software in front of the people who paid for it levels all arguments.

Your Problem Is Not Unique

As I approach my 10th year as a software professional, one of the things that frustrates me the most about our industry is the way in which people chronically overestimate the uniqueness of the problems they’re facing, and the frequency with which wheels are reinvented.

Designers, developers, and managers all have a habit of approaching common problems as if they’re the first one to do so. There’s an expression about “standing on the shoulders of giants.” We’re not all giants, and not all of our problems require herculean efforts.

Begin rant…

Your UI problem is not unique

I think the most common and ultimately harmful way this manifests is in user interface design.

Reinvention at the UI level is particularly insidious, as it doesn’t just waste the time and money of the team developing the software; it also confuses users (and we’re ultimately doing this for them, right? Right?).

I’ve witnessed bored designers coming up with the most elaborate one-off UI components to show a list of items or tabular data as if that’s a unique thing to do. Your typical line-of-business web app is not going to succeed or fail based upon the clever, innovative concept you’ve invented for showing a list of items to a user.

The thing is, for the 99%, familiarity trumps cleverness. Every UI element that has to be explained--especially when you could have swapped in one that people already know and that accomplishes the same purpose--is one strike against you and your product.

For example, I can’t tell you how many hours I’ve spent on teams debating the look and feel of the primary navigation for a web app. Grab a copy of Don’t Make Me Think, read the chapter on navigation, do what it says, and then move on to more pressing issues. You’re doing a big disservice to your users if you don’t copy the familiar style of navigation that your users have gotten used to on popular websites for the last decade.

One more example: your web app is not the first one to need to notify users of events that have happened in the system. Roughly 1.4 billion people--including the people who use your web app--use Facebook and are intimately familiar with the way that Facebook handles notifications. Rip their design off as closely as you can, and move on to something that can differentiate you from your competitors.

Facebook notifications
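
To make that concrete, here is a sketch of what “copying the familiar pattern” might look like under the hood. This is a hypothetical Rails-style model (the names and columns are mine, not Facebook’s): unread items drive the badge count, and opening the menu marks them read.

```ruby
# A hypothetical Rails-style model for the familiar notification pattern.
# Assumes a "notifications" table with user_id and read_at columns.
class Notification < ApplicationRecord
  belongs_to :user

  scope :unread, -> { where(read_at: nil) }

  # Drives the little red badge everyone already understands.
  def self.badge_count_for(user)
    unread.where(user: user).count
  end

  # Called when the user opens the notification menu.
  def mark_read!
    update!(read_at: Time.current)
  end
end
```

Nothing clever, and that is exactly the point: your users have already been trained on this behavior.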

There are good catalogs of established UI patterns out there, such as the book Designing Interfaces. Before you have a long, long, like cruelly long series of meetings and email chains, whiteboard sessions, and the like, identify the pattern that relates to your requirement, find some well-known examples of the pattern, and copy them as closely as you can. If by some fluke your unique problem has not already been solved in a nice, familiar, recognizable, warm-fuzzy to your grandma kind of way, then by all means proceed to your brainstorming session(s).

Your development problem is not unique

For developers (and this includes me), I know that working day in and day out on that accounting app is not always particularly exciting, but avoid the temptation to write your own object-relational mapper from scratch to shove ledgers in and out of your SQL database. If you want to break the mold and roll your own thing, choose an area where you can really make a difference.
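
To put that in concrete terms, here is what leaning on the existing wheel looks like: a hedged sketch using ActiveRecord, with a hypothetical ledger schema of my own invention.

```ruby
# Instead of hand-rolling SQL plumbing, lean on the ORM you already
# have. The Ledger/Entry schema here is hypothetical.
class Ledger < ApplicationRecord
  has_many :entries
end

class Entry < ApplicationRecord
  belongs_to :ledger

  # The ORM already knows how to express the common queries.
  scope :posted_in, ->(month) { where(posted_on: month.all_month) }
end

# Decades of other people's wheel-polishing, free of charge:
Ledger.find_by!(name: "Accounts Receivable")
      .entries
      .posted_in(Date.current)
      .sum(:amount_cents)
```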

If you really want to differentiate yourself, be the person who knows the lessons of 40 years ago rather than the person obsessively implementing TodoMVC over and over in this afternoon’s hottest JavaScript framework. The former is much rarer than the latter.

Tweet from Giles Bowkett

I know a lot of devs are entranced by the technology for its own sake. And there’s nothing wrong with nerding out over programming languages (my personal favorite), or JavaScript frameworks, or NoSQL databases or whatever floats your boat, but it’s easy to lose sight of the bigger picture.

“Barb, I know this thing is a pain in the ass to use, because the navigation makes no sense, no thought was put into the information architecture, and we didn’t consult with one actual user before developing it, but you’ll be happy to know that we’re using Riak on the back end. You’re welcome.”

There’s an episode of The Changelog podcast that came out recently in which they interviewed DHH about the decade-long history of Ruby on Rails. With his trademark bluntness, he lets loose on “unique snowflakes” in a quote that I would guess was inspired by Tyler Durden’s speech from Fight Club:

They want to believe that every single application is a unique snowflake. That they're so brilliantly unique, too. That their value comes from their careful selection of which template language, which data mapper, which whatever the hell it is...

There are lots of applications out there trying to be needlessly novel to satisfy the egos of programmers who do not want to feel like they're working in cookie-cutter domains. That they somehow attach their self-worth to how novel their application is.

That’s a little harsh—what would you expect from DHH?—but I think the point is valid. It’s all too easy to focus on the minutiae of technology when in reality, we’re in a people business. And now I quote from Peopleware, as I often do:

…the High-Tech Illusion: the widely held conviction among people who deal with any aspect of new technology (as who of us does not?) that they are in an intrinsically high-tech business. They are indulging in the illusion whenever they find themselves explaining at a cocktail party, say, that they are “in computers,” or “in telecommunications,” or “in electronic funds transfer.” The implication is that they are part of the high-tech world. Just between us, they usually aren’t. The researchers who made fundamental breakthroughs in those areas are in a high-tech business. The rest of us are appliers of their work. We use computers and other new technology components to develop our products or to organize our affairs. Because we go about this work in teams and projects and other tightly knit working groups, we are mostly in the human communication business.

Managers, you’re not helping

Project managers, stakeholders, and other people who manage software efforts are not off the hook here. You also need to be honest about the product you’re making and accept that sometimes your unique take on logo placement is not what’s going to sell your software. You folks hold the keys to the backlog and how work gets prioritized. Spend those precious developer-hours on things that matter, not reinventing the same basic features and conventions that nearly every application has in common.

What’s worth spending time on?

The way your bread is truly buttered in 99% of applications is through insights into the business domain, which come from years of slowly accumulated expertise.

If you want to futz with some new technology and try something new, that’s great. But be honest about what you’re doing. Most of the time you’re doing it for you, not for your users.

I wish a standard role on every software team was the “That Thing’s Already Been Invented” Czar. Or “Code Historian” or “User Experience Historian”. Now that person would be worth their weight in gold.

Instead of Not Invented Here (NIH), how about Proudly Found Elsewhere (PFE)?

But what about innovation?

Innovation is necessary, and I would never suggest that people avoid trying anything new. If no one ever tried anything new, the industry could never move forward.

The question of when to innovate is highly context-specific. If you’re writing an early iPhone app during the wild west of that platform, by all means, try some new form of navigation. If you’re Facebook, with the (good) problem of 1 billion users hammering your database every day, by all means, invent your own database.

I think “20% time” or something similar may be a good solution. Spike some crazy idea you had or some bleeding edge tech on a pet project one day a week. If it happens to have practical implications for a product, then awesome. But it’s ok to just have some fun and learn something new. Scratch your itch--we know that’s important for its own sake.


Postscript: Why make Ruby on Rails?

It’s funny—when I started writing an outline for this post, I couldn’t stop thinking about DHH and Ruby on Rails, which I think is one of the few true game-changing innovations in the field of web development since I joined the industry. I kept thinking back to the early days of Rails’ creation and how difficult it would have been to justify such a project. It seems like such a strange case of yak shaving. Really? You need to take this weird Japanese programming language no one has heard of and write the umpteenth web framework in order to write yet another project management tool? How can you possibly justify that? What did your business partner, Jason Fried, think? You guys are the “do less” guys. What the hell?

In an interview from 2005, Fried says:

I had some natural hesitation about using Ruby at first ("What the #@!* is Ruby?" "Why don't we just use PHP--it served us well before?"), but David Heinemeier Hansson, the first engineer on the Basecamp project, cogently made the case and I bought it.

In the podcast episode I mentioned earlier, DHH also discusses what drove him while writing the code that would become Rails. It turns out that a major motivation was to remove all the wheel-reinvention that happens with each new web project. He wanted to make a “batteries included” full-stack framework that gave you sensible defaults and convention over configuration. It’s not necessary, for example, to invent your own naming conventions for every class and every property and how they map to the naming conventions of your database tables and columns--because what the hell does it matter? He’ll think carefully about the problem, pick one, make it the default, and then you can move on, because you have more important things to worry about.
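
To see what that bargain buys you, here is the canonical ActiveRecord illustration (the models are mine, purely for example): the class name alone implies the table and the mappings, and you only write configuration when you deviate from the defaults.

```ruby
# Convention over configuration: the name Product implies the
# "products" table, and belongs_to :category implies a category_id
# column. No mapping boilerplate to write, debate, or maintain.
class Product < ApplicationRecord
  belongs_to :category
end

# You only spell things out when you break convention,
# e.g. when stuck with a legacy schema:
class LegacyProduct < ApplicationRecord
  self.table_name  = "PRODUCT_MASTER"
  self.primary_key = "product_no"
end
```

One careful decision, made once by the framework, so thousands of teams never have to hold that meeting again.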