Thoughts on unlimited PTO

The idea of unlimited paid time off (PTO) has been around for a while. I wrote about it in 2015, shortly after my then-employer switched from a traditional PTO policy to an unlimited one. But unlike many practices in the tech sector, unlimited PTO has not become ubiquitous. It’s still a matter of debate, especially since it can result in pressure to take less time off because there’s no signal about the appropriate amount.

The joys of unlimited PTO

My old company was small, engineering-focused, and fully-remote. Everyone wore a lot of hats and there wasn’t a lot of redundancy to go around. But the unlimited PTO model worked for us. Employees that were there a full year before and after the switch used, on average, about half a day more of PTO in the unlimited model.

We had a minimum policy: you had to take at least two one-week periods off each year. This helped make sure people did take time off and established that “unlimited” wasn’t a way of saying “don’t actually take time off, we just don’t want to carry this liability on our balance sheets”.

For me individually, I didn’t take much more time off than I had previously (I’m pretty bad at taking PTO to begin with). What I really liked about it wasn’t the lack of a limit, but the lack of having to think about it. Need to take a day off? Just do it. There’s no “hm. Well, should I only take half a day so that I can make sure I have some at the end of the year if I need it?”

After Microsoft acquired our company in 2017, we were back to a traditional model. The allotment was about the same number of days as I had been taking under the unlimited policy. But now I had to think about it. At Red Hat, we also have something like the traditional model. And I’m bad at taking it. In part because I tend to flex my work time in order to attend off-hours community events. In part because I’m just bad at it. I long for the day when I can just take time off and not worry about whether or not there will be any left for me at the end of the year.

To track or not to track?

Over the weekend, Alyss asked about tracking unlimited PTO.

I can understand the hesitancy. If your manager can reject PTO requests arbitrarily, then you don’t actually have an unlimited PTO policy. But the request/approval process can be useful for coordination. You don’t want to show up one morning to find that a dozen people on your team have decided to take two weeks off. In that sense, what’s needed is acknowledgement more than approval.

Tracking can also be abused, but I think it’s good on the whole. As a manager, if you can see that someone isn’t taking PTO, you can kick them out of the office for a week. (Not really, of course. But you can encourage them to find some time to not be at work.)

Unlimited vacation policies, burnout, etc.

Recently, my company switched from a traditional vacation model to a minimum vacation model. If you’re unfamiliar with the term, it’s essentially the unlimited vacation model practiced by Netflix and others, with the additional requirement of taking a defined minimum of time off each year. It’s been a bit of an adjustment for me, since I’m used to the traditional model. Vacation was something to be carefully rationed (although at my previous employer, I tended to over-ration). Now it’s simply a matter of making sure my work is getting done and that there’s someone to cover for me when I’m out.

I’m writing this at 41,000 feet on my way to present at a conference [ed note: it is being published the day after it was written]. I’m secretly glad that the WiFi apparently does not work over the open ocean (I presume due to political/regulatory reasons). Now, don’t get me wrong, one of my favorite things to do when I fly is to watch myself on FlightAware, but in this case it’s a blessing to be disconnected. If a WiFi connection were available, it would be much harder to avoid checking my work email.

It took me a year and a half at my job before I convinced myself to turn off email sync after hours. Even though I rarely worked on emails that came in after hours, I felt like it was important that I know what was going on. After several weekends of work due to various projects, I’d had enough. The mental strain became too much. At first, I’d still manually check my mail a time or two, but now I don’t even do that much.

This is due in part to the fact that the main project that was keeping me busy has had most of the kinks worked out and is working pretty well. It also helps that there’s another vendor managing the operations, so I only get brought in when there’s an issue with software we support. Still, there are several customers where I’m the main point of contact, and the idea of being away for a week fills me with a sense of “oh god, what will I come back to on Monday?”

I’ve written before about burnout, but I thought it might be time to revisit the topic. When I wrote previously, I was outgrowing my first professional role. In the years since, burnout has taken a new form for me. Since I wrote the last post, two kids have come into my life. In addition, I’ve gone from a slow-paced academic environment to a small private sector company that claims several Fortune 100 companies as clients. Life is different now, and my perception of burnout has changed.

I don’t necessarily mind working long hours on interesting problems. There are still days when it’s hard to put my pencil down and go home (metaphorically, since I work from a spare bedroom in our house). But now that I have kids, I’ve come to realize that when I used to feel burnt out, I was really feeling bored. These days, burnout shows up more as an impact on my family life.

I know I need to take time off, even if it’s just to sit around the house with my family. It’s just hard to do knowing that I’m the first — and sometimes last — line of support. But I’m adjusting (slowly), and I’m part of a great team, so that helps. Maybe one of these days, I’ll be able to check my email at the beginning of the work day without bracing myself.

Job requirements: often counterproductive

My friend Rikki Endsley shared an article from Quartz entitled “job requirements are mostly fiction and you should ignore them”. Based on how quickly my friends re-shared the post, it seems to have resonated with many people. The article is targeted at job applicants, and the TL;DR is “apply for the job you want, even if you don’t think you’re qualified.” Job postings are written to describe ideal candidates, even if they’re not realistic, and most hiring managers would gladly take someone who meets some of the requirements. But when a characteristic is listed under “requirements” instead of “preferred”, potential applicants assume that they shouldn’t bother applying.

This isn’t true in all cases, of course. In some places, the requirements are well written and the hiring manager doesn’t consider any applications that don’t meet them. Other times, the initial evaluation is done by the human resources department, which applies the requirements strictly (as an anecdote, I know I’ve been rejected for more than one position because my degree was in the wrong field, despite my having experience in the role and the hiring manager specifically asking HR for my resume). In many cases, though, the “requirements” are simply a high bar. The Internet is full of (possibly apocryphal) stories of job postings wanting 7 years of experience in a 5-year-old programming language.

Hiring managers aren’t addressed directly in the article, but there’s a lesson here for you: be careful when writing job requirements. Apart from scaring away people you might otherwise have ended up hiring (especially women, who are more likely to pass on jobs where they don’t meet all of the qualifications), you’re robbing yourself of a good way to weed out the truly unqualified: requirements that nobody takes literally stop filtering anyone. Especially when someone else is pre-screening applicants, I prefer to craft job postings as broadly as possible. I would much rather spend extra time reviewing applicants than miss out on someone who would have been a great hire. It’s a low-risk, high-reward decision.

It’s not cheap to hire people. Especially in small organizations, you don’t want to risk hiring someone who you’ll have to get rid of in six months. But turnover isn’t cheap, either. I haven’t studied this, but speaking from my own experience, I’m much more likely to leave a position when I feel like I’ve stopped growing. By hiring someone who is 80-90% of the way there instead of 100%, you buy yourself more time with this person, reducing turnover. Sure, you get less productivity initially, but allowing an employee to grow is a cheap way to keep them interested in their work.

Likewise, I don’t want to apply for a job where I could step in on day one and do everything. If I wanted a job that I could do easily, I’d still be in my first job. I bet I’d be really good at it by now, but my skills wouldn’t have expanded. As a friend of a friend said, “I don’t think I’ve ever applied for a job that I was qualified for.” If employers can write job requirements aspirationally, then potential applicants should be aspirational in job applications.

Humans as investments, not resources

As I mentioned last week, two tweets on Sunday morning led me to a lengthy pontification about HR and how organizations treat employees. In part two, I focus on an article shared by Mrs. Y in which we find that the longer you stay in your position, the more money you’re robbing from your future self. I’m approaching the eight-year anniversary of the start of my first professional job and just passed the one-year anniversary of the start of my fourth. I just ran the numbers: assuming a 2% annual raise (which is probably generous), I’m making nearly 60% more than I would be making had I stayed in that first job.
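To make that back-of-the-envelope arithmetic concrete, here’s a quick sketch. The starting salary is a made-up figure purely for illustration; the 2% annual raise and the roughly 60% gap are the numbers from above.

```python
# Hypothetical comparison: staying put with 2% annual raises for 8 years
# versus the salary actually reached by changing jobs a few times.
# The $50,000 starting salary is an invented figure for illustration only.
starting_salary = 50_000
years = 8
annual_raise = 0.02

stayed = starting_salary * (1 + annual_raise) ** years  # ≈ $58,600
actual = stayed * 1.60                                   # "nearly 60% more" ≈ $93,700

print(f"Had I stayed: ${stayed:,.0f}")
print(f"After changing jobs: ${actual:,.0f}")
```

Even generous-sounding annual raises compound slowly; a couple of well-timed job changes easily outrun them.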

Money isn’t everything, of course. I’ve never left a job because I wanted more money (though I’ve never complained about getting more money). Every time I left a job, it was because I ran out of room to grow my skills and responsibilities or because I was dissatisfied with the organization. In those cases, throwing more money at me would have been at best a short-term inducement to stay. Still, it has been my experience that the best way to get what you want is to leave and go get it elsewhere. This is fundamentally broken. How much productivity does an organization lose when years of experience walk out the door? How much frustration does a person gain when they have to start over with a new job, a new employer, and often a new city?

The basic issue is that all too often, organizations treat employees as resources to extract value from. Viewed properly, employees are investments that will help the organization grow. I’ve heard managers say “why should I send you to that training? You’ll probably end up leaving.” Of course, the manager is sure he’s right when the employee does leave. But maybe they’d have stayed if the organization was willing to invest.

When an employer gives a big raise or other consideration to an employee, it tends to be in reaction to the employee receiving an offer from somewhere else. “If I give you more money, will you stay?” is a losing proposition. At that point, the employee is already heading for the door. It’s far better to keep the good employees sufficiently happy such that they don’t get those offers in the first place. Of course, some people will always leave, and there’s no stopping that. Prolonging the period of mutual benefit is the best possible outcome.

The notion of loyalty to an employer strikes me as being misplaced. When an employer won’t take proactive steps to aid the development of an employee, why does the employee owe any loyalty? There are good organizations out there that really do try to keep employees on an upward trajectory when it comes to pay and skills. Those that don’t aren’t necessarily bad, but they’re not doing themselves any favors.

Of course, the blame isn’t entirely on the shoulders of the employer. Employees have to understand that their bosses aren’t mind readers. My major regret from my last job is the fact that I wasn’t more vocal about what I wanted from the position along the way. Maybe something could have been done that would have made me want to say “no thanks” when I was offered my current job. Or maybe not. But at least then I would have had to choose.

Organizational silos in the 21st century

My friend Joanna shared an interesting Harvard Business Review article yesterday morning entitled “It’s Time to Split HR”. Together with another article shared by someone else a few minutes later, it set me off on a rather lengthy pontification about how organizations treat employees, particularly with respect to pay. Originally, I was going to focus on that, but the more I thought about it, the more I realized there are two separate-but-related posts here. So in this one, I’ll focus on the HBR article.

Ram Charan lays out an interesting argument for bifurcating the traditional Human Resources department into administration (e.g. compensation, hiring, etc.) and leadership (e.g. personnel development) roles. It’s certainly a proposal worth considering. As commenter Jonathan Magid pointed out, the two roles of HR are essentially to protect the organization from its employees and to develop those same employees. HR effectively has to serve two masters. In most cases where there’s a conflict of interest, HR will side with the organization. That makes sense, since the organization is the one writing the paychecks.

Charan, however, doesn’t really address this. His main concern seems to be the lack of strategic understanding within the ranks of HR leadership. Chief Human Resources Officers (CHROs), he writes, are too specialized in HR functional areas and lack understanding of the larger organization. This is true of all areas, though, especially those that are not core to the business. Too often, IT leadership is painted (appropriately) with the same brush. That doesn’t necessarily mean that the area needs to go away.

The best employees are familiar with other areas of the business, not just those under their direct purview. This is true of all functional areas and at all levels. In a previous job, we frequently lamented that HR wasn’t as helpful as they could be in vetting resumes because they didn’t understand technology. Go on any forum where technical people gather and eventually you’ll find disparaging remarks and advice to game keywords because that’s all HR will understand. In my experience, the issue isn’t that HR staff are terrible human beings, it’s that they’re just too siloed.

Everyone “knows” that silos are bad. Bureaucratic fiefdoms tend to be self-reinforcing and lead to reactionary, defensive tactics like information hoarding in order to keep justifying their existence. What I find so interesting is that organizations can’t seem to avoid forming silos. Siloization is like an organizational version of entropy: left alone, it will increase over time.

There are certainly benefits to silos. They make the organization easier to understand. They allow for deep specialization. Knowledge transfer is simplified. But they’re a losing prospect in the longer term.

A good friend of mine directs the IT support organization for a large academic unit within a large midwestern university. This group is notable because it provides excellent customer service (according to their internal measures, the opinions of the customers, and their reputation on campus) and actively suppresses silo formation. He told me that silo suppression is “extremely expensive, and very difficult to explain/justify to people.”

Silo suppression is expensive because it’s a long-term investment in the organization. Rotating senior personnel through phone duty is expensive compared to putting the newest, cheapest hire on that task all day. Requiring all knowledge to be shared with at least one other colleague requires a larger staff. Hiring and retaining people who buy into the organizational philosophy is a slower and more expensive process. All of this looks terrible in a short-term view.

Where silo suppression pays off is in the long term. Vacations, illnesses, and employee turnover are less disruptive because the knowledge is retained. Someone else in the organization can pick up dropped issues without a long spin-up time. Employees develop a broader view of their work and can make useful suggestions for areas outside of their main expertise.

There are undoubtedly psychological and sociological reasons that we gravitate toward developing silos. Organizational leaders must be aware of this tendency and actively work to counter it. The problem with HR isn’t that the people or the functions are bad. The problem is that HR is not out “among the people.” Embedding HR representatives within other functional areas of an organization is almost certain to improve HR’s understanding of the larger business. In addition, it may help make HR more approachable and better understood by the rest of the organization, allowing employees to make the best use of the HR resources available.

Organizational silos are interesting. We know that they’re bad, but we always end up reverting to them over time. Maybe we just need better silos. Silos built around projects that can easily be torn down and put up somewhere else may give us the best of both worlds.

Considering Bloom’s taxonomy in staffing decisions

A while back, an exam question introduced me to a taxonomy developed by educational psychologist Benjamin Bloom. In researching this work, I was immediately struck by how useful it could be when making decisions about technical staff. Bloom’s taxonomy is composed of three domains. The cognitive domain includes six hierarchical levels (from lowest to highest):

  • Knowledge
  • Comprehension
  • Application
  • Analysis
  • Synthesis
  • Evaluation

Applying these levels can help guide the interview process and provide a measure of a candidate’s abilities. With many technical jobs, though, it’s preferable to ignore the knowledge level. “Knowledge” in this context refers to memorized facts. Some interviews, especially phone screens, tend toward being entirely focused around the knowledge level. Even interviews that are based around programming exercises potentially overemphasize recitation over application. It is far too easy for a nervous interviewee to underperform on memorized facts. In real-world tasks, references are available for facts.

Once a person is hired, they need to be assigned work. If tasks are rated at the level they require, they can be matched to people at the required level. Tracking a person’s task levels can be beneficial as well. Giving someone tasks lower than they’re capable of will erode morale over time (and is a waste of resources), but someone who never gets lower-level tasks could probably use a break. By the same token, giving people the occasional higher-level task gives them growth opportunities, but too many can cause undue stress. If employees largely self-select tasks, a drop in level can be a warning sign of wider problems.
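To illustrate the kind of tracking I have in mind, here’s a minimal sketch. The level names come straight from the taxonomy; the tasks, people, and thresholds are invented for the example, and a real system would need far more nuance.

```python
from enum import IntEnum
from statistics import mean

# The six cognitive-domain levels, lowest to highest.
class BloomLevel(IntEnum):
    KNOWLEDGE = 1
    COMPREHENSION = 2
    APPLICATION = 3
    ANALYSIS = 4
    SYNTHESIS = 5
    EVALUATION = 6

# Hypothetical tasks, rated at the level they require.
tasks = {
    "reset a password": BloomLevel.KNOWLEDGE,
    "debug a failing backup job": BloomLevel.ANALYSIS,
    "design the new monitoring stack": BloomLevel.SYNTHESIS,
}

# Hypothetical history of the levels each person has recently worked at.
history = {
    "alice": [BloomLevel.SYNTHESIS, BloomLevel.ANALYSIS, BloomLevel.ANALYSIS],
    "bob": [BloomLevel.KNOWLEDGE, BloomLevel.KNOWLEDGE, BloomLevel.COMPREHENSION],
}

def assign(task: str, person: str) -> None:
    """Flag mismatches between a task's level and a person's recent work."""
    required = tasks[task]
    typical = mean(history[person])
    if required < typical - 1:
        print(f"{person}: '{task}' is well below their usual level (morale risk)")
    elif required > typical + 1:
        print(f"{person}: '{task}' is a stretch (growth opportunity, watch for stress)")
    else:
        print(f"{person}: '{task}' looks like a reasonable match")
    history[person].append(required)

assign("reset a password", "alice")               # well below her usual level
assign("design the new monitoring stack", "bob")  # a stretch assignment
```

A sustained drop in someone’s typical level over time would show up in exactly this kind of history, which is the warning sign mentioned above.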

Of course, such applications are not new. Bloom’s taxonomy, by its very inclusion in an IT project management exam, is clearly not newly applied. It’s just interesting to me that a taxonomy developed for education some 60 years ago could fit technology staffing so well. If it’s new to me, then it’s probably new to someone else, too.