Blog Fiasco

December 16, 2014

Calling people people. What’s in a name?

Filed under: Musings, Project Management — bcotton @ 11:12 pm

My IT service management professor once told the class “there are only two professions who have users: IT and drug dealers.” It’s interesting how the term “user” has become so prevalent in technology, but nowhere else. Certainly the term “customer” is better for a service organization (be it an internal IT group or a company providing technology services). “Customer” sounds better, and it emphasizes whose needs are to be met.

For a free Internet service, though, it’s not necessarily an apt term, if for no other reason than the rule of “if you’re not paying for it, you’re the product.” That’s why I find Facebook’s recent decision to call their users “people” interesting.

Sure, it’s easy to dismiss this as a PR move calculated to make people feel more comfortable with a company that makes a living off of the personal information of others. I don’t doubt that there is a marketing component to this, but that doesn’t make the decision meritless. Words mean things, and choosing the right word can help frame employees’ mindsets, both consciously and subconsciously.

In Fedora, contributors have been actively discussing names, both of the collected software (“products” versus alternatives) and the people involved (“contributors”, “developers”, “users”). Understanding the general perception of these terms is a critical part of selecting the right one (particularly when the chosen term has to be translated into many other languages). A clear definition of the people terms is a necessary foundation for trying to understand the needs and expectations of that group.

“People” may be too broad of a term, but it’s nice to see a major company forego the word “user”. Perhaps others will follow suit. Of course, “user” is just such a handy term that it’s hard to find a suitably generic replacement. Maybe that’s why it sticks around?

December 7, 2014

The IT of politics

Filed under: Project Management — bcotton @ 10:24 pm

Much can be (and has been) written about the politics of IT: the intra-team relationships, the relationships with customers, and the C-level maneuvering. Less is said about the IT of politics. Not necessarily the IT issues of government agencies (NSA surveillance, missing IRS emails, and the ACA enrollment website disaster being three recent examples), but the IT efforts that power the political campaigns themselves.

Ars Technica recently reported on a research paper that examined the use of social media by the Obama and Romney campaigns in the 2012 presidential election. While Obama’s social media team was given some autonomy and reacted to events as they happened, Romney’s team operated under tighter bureaucratic control and scheduling. Granted, social media is much more of a marketing issue than an IT issue, but it reinforces earlier reporting about the analytics and volunteer management platforms.

Politics aside, the IT organizations would seem to reflect the backgrounds of the candidates. The results of the respective projects, while hardly inevitable, don’t seem very surprising. There’s a wealth of project management knowledge to be gained from examining the development and deployment of Orca and Narwhal. That is, if they can be studied without the politics.

November 17, 2014

Mozilla’s new ad feature

Filed under: Linux,The Internet — Tags: , , , — bcotton @ 8:55 pm

Edited to remove erroneous statements about what gets sent to Mozilla based on Matthew Miller’s comment below.

Mozilla’s release last week of in-browser ads has caused quite the discussion on the Fedora development mailing list. Firefox now will show sponsored “tiles” on the default home screen when a new or cleared profile is used. Although Mozilla claims to collect data in such a way that it’s not personally identifiable, there are reasons to be concerned. Sure, this can be disabled, but the default behavior is the only thing most users will experience.

The reactions on Fedora-devel spanned the gamut from indifference to insistence that Firefox be removed from the repository entirely. My own take (which was already represented on the mailing list, so I refrained from “me too”-ing) is that the right answer is to disable this feature in the Firefox build that ships in Fedora, effectively making it opt-in instead of opt-out. Mozilla has a history of being a good actor and I don’t begrudge them trying to make some money. However, I’d prefer that the user have to consciously enable such tracking.
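For those who would rather not wait on a distribution build, the tiles could be switched off through preferences at the time. A user.js fragment along these lines illustrates the idea; the pref names below are from Firefox builds of this era and may differ in other releases, so verify them in about:config rather than taking this as authoritative:

```
// user.js fragment -- illustrative; check about:config for your release.
user_pref("browser.newtabpage.enhanced", false);       // turn off enhanced (sponsored) tiles
user_pref("browser.newtabpage.directory.source", "");  // don't fetch the tile directory
```

A distribution build that ships these as defaults would effectively make the feature opt-in, which is the outcome I’m arguing for.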

Though I disapprove of the implementation, I find it hard to get very worked up about this. The Internet is awash in tracking. Google and Facebook probably know more about me than I do about myself. But that’s because I decided the value I get from those sites (well, not so much Facebook) is worth the data I give them. I respect the right of others to come to their own decision, which is why opt-in is preferred.

I appreciate the opinion of those who think the only appropriate response is to remove Firefox entirely, but I find that to be a wholly impractical solution. If Fedora wants casual desktop users (and I see no reason to not court that use case), having Firefox is an important part of a welcoming environment. A great deal of casual computing is done in the browser these days and Firefox is a well-known browser (even if some people call it “Foxfire”). Sure, there are other FLOSS browsers (including IceWeasel), but few of them work as well for casual users as Firefox and none of them have the familiarity and name recognition. Given the good Mozilla has done for free software over the years, this hardly seems like a bridge worth burning.

November 10, 2014

Open source is about more than code

The idea of open source developed in a closed manner is hardly new. The first real discussion of it came, as best as I can tell, in Eric S. Raymond’s The Cathedral and the Bazaar. A culture of open discussion and decision making is still a conscious act for projects. It’s not always pretty: consensus decision making is frustrating and some media outlets jump on every mailing list suggestion as the final word on a project’s direction. Still, it’s important for a project to make a decision about openness one way or the other.

Bradley Kuhn recently announced the copyleft.org project, which seeks to “create and disseminate useful information, tutorial material, and new policy ideas regarding all forms of copyleft licensing.” In the first substantive post on the mailing list, Richard Fontana suggested the adoption of the “Harvey Birdman Rule,” which has been used in his copyleft-next project. The limited response has been mostly favorable, though some have questioned its utility given that to date the work is almost entirely Kuhn’s. One IRC user said the rule “seems to apply only to discussions, not decisions. The former are cheap and plentiful, but the latter actually matter.”

I argue that the discussions, while cheap and plentiful, do matter. If all of the meaningful discussion happens in private, those who are not privy to the discussion will have a hard time participating in the decision-making process. For some projects, that may be okay. A ruling cadre makes the decisions and other contributors can follow along or not. But I see open source as being more than just meeting the OSI’s definition (or the FSF’s definition of free software for that matter). Open source is about the democratization of computing, and that means putting the sausage-making on public display.

October 27, 2014

Another reason to disable what you’re not using

Filed under: The Internet — bcotton @ 8:15 pm

A common and wise security suggestion is to turn off what you’re not using. That may be a service running on a computer or the bluetooth radio on a phone. This reduces the potential attack surface of your device and in the case of phones, tablets, and laptops helps to preserve battery life. On the way to a family gathering over the weekend, I discovered another, less intuitive reason.

As I exited the interstate, I passed a Comfort Inn. Having stayed at Comfort Inns in the past, my phone remembered the Wi-Fi network and apparently it tried to connect. The signal was just strong enough that my phone switched from 4G to Wi-Fi, and since the Comfort Inn had a registration portal, this messed up the navigation in the maps app. Oops.

I turned the Wi-Fi antenna off for the rest of the trip. It was a good reminder to shut off what I’m not using.

October 9, 2014

The UX of a microwave

Filed under: Musings, Project Management — bcotton @ 8:34 pm

I’m not a UX expert except in the sense that I have experience using things. Still, I spend a lot of time at work serving as a proxy for users in design discussions. It’s hard to get UX right, even on relatively simple experiences like a microwave oven.

Years ago, my systems analysis professor got on a tangent about user interactions. He pointed out that it can be faster to enter a minute on a microwave as 60 seconds instead of one minute, and that 111 seconds is faster to enter than 110 (tapping the same key repeatedly is quicker than moving between different keys). Design choices (including the design of instructions and documentation) that seem obviously correct are sometimes incorrect for non-obvious reasons.

My own microwave offers a case in point. It took a little while, but I eventually discovered that the “Quick Set” menu doesn’t have pre-programmed settings; it just adds a zero to whatever code is entered. So the quick set to cook two slices of bacon (20) simply sets the time to 2:00. In that sense, it functions less as a shortcut and more as a list of cook times.

On a whim today, I tried to warm a cup of coffee by using a quick set code of 6 instead of 10. It didn’t work. Apparently the microwave requires quick set codes to be exactly two digits. For a one-minute cook time, the quick set is hardly any quicker than a manual entry.
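The observed behavior can be modeled in a few lines. This is a sketch inferred from poking at the keypad, not from any service manual, but it captures both the append-a-zero trick and the two-digit restriction:

```python
def quickset(code: str) -> int:
    """Model of the observed Quick Set behavior (inferred, not documented).

    The panel appends a zero to an exactly-two-digit code and reads the
    result as M:SS, so code "20" becomes a 2:00 cook time.  Returns the
    cook time in seconds; anything that isn't two digits is rejected.
    """
    if len(code) != 2 or not code.isdigit():
        raise ValueError("Quick Set codes must be exactly two digits")
    digits = code + "0"                                 # "20" -> "200"
    minutes, seconds = int(digits[0]), int(digits[1:])  # "200" -> 2:00
    return minutes * 60 + seconds
```

So `quickset("20")` yields 120 seconds (the bacon setting), `quickset("10")` yields 60, and the one-digit code 6 raises an error, which is exactly why my coffee experiment failed.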
My microwave came with the house, so while I don’t know exactly how old it is, I know it’s at least seven years old. Maybe recent microwaves have a more sensible UI. Or maybe it’s a problem that will never quite be solved.

September 30, 2014

Cloud detente

Filed under: HPC/HTC, Linux, The Internet — bcotton @ 8:21 am

Evident.io founder and CEO Tim Prendergast wondered on Twitter why other cloud service providers aren’t taking marketing advantage of the Xen vulnerability that led Amazon and Rackspace to reboot a large number of cloud instances over a few-day period. Digital Ocean, Azure, and Google Compute Engine all use other hypervisors, so isn’t this an opportunity for them to brag about their security? Amazon is the clear market leader, so pointing out this vulnerability is a great differentiator.

Except that it isn’t. It’s a matter of chance that Xen is the hypervisor facing an apparently serious and soon-to-be-public exploit. Next week it could be Microsoft’s Hyper-V. Imagine the PR nightmare if Microsoft bragged about how much more secure Azure is only to see a major exploit strike Hyper-V next week. It would be even worse if the exploit was active in the wild before patches could be applied.

“Choose us because of this Xen issue” is the cloud service provider equivalent of an airline running a “don’t fly those guys, they just had a plane crash” ad campaign. Just because your competition was unlucky this time, there’s no guarantee that you won’t be the loser next time.

I’m all for companies touting legitimate security features. Amazon’s handling of this incident seems pretty good, and I think they generally do a good job of giving users the ability to secure their environment. That doesn’t mean someone can’t come along and do it better. If there’s anything 2014 has taught us, it’s that we have a long road ahead of us when it comes to the security of computing.

It’s to the credit of Amazon’s competition that they’ve remained silent. It shows a great degree of professionalism. Digital Ocean’s Chief Technology Evangelist John Edgar had the best explanation for the silence: “because we’re not assholes mostly.”

September 7, 2014

FAQs are not the place to vent

Filed under: HPC/HTC, Musings, The Internet — bcotton @ 2:42 pm

I’ve spent a lot of my professional life explaining technical concepts to not-necessarily-very-technical people. Most of the time (but sadly not all of it), it’s because the person doesn’t need to fully understand the technology, they just need to know enough to effectively do their job. I understand how frustrating it can be to answer what seems like an obvious question, and how the frustration compounds when the question is repeated. That’s why we maintain FAQ pages, so we can give a consistently friendly answer to a question.

You can imagine my dismay when my friend Andy shared an FAQ entry he found recently. A quantum chemistry application’s FAQ page includes this question: “How do I choose the number of processors/How do I setup my parallel calculation?” It’s a very reasonable question to ask. Unfortunately, the site answers it thusly: “By asking this question, you demonstrate your lack of basic understanding of how parallel machines work and how parallelism is implemented in Quantum ESPRESSO. Please go back to the previous point.”

The previous question is similar and has an answer of “See Section 3 of the User Guide for an introduction to how parallelism is implemented in Quantum ESPRESSO”. Now that’s a pretty good answer. Depending on the depth of information in Section 3, it might be possible to answer the question directly on the FAQ page with an excerpt, but at least pointing the visitor to the information is a good step.

I don’t understand getting frustrated with a repeated FAQ. If the answers are so similar, copy and paste them. Or combine the questions. FAQs, user guides, and the like are great because you can compose them in a detached manner and edit them to make sure they’re correct, approachable, and not jerkish. FAQs are an opportunity to prevent frustration, not to express it.

August 28, 2014

Job requirements: often counterproductive

Filed under: Musings, Project Management — bcotton @ 7:53 pm

My friend Rikki Endsley shared an article from Quartz entitled “job requirements are mostly fiction and you should ignore them“. Based on how quickly my friends re-shared the post, it seems to have resonated with many people. The article is targeted at job applicants and the TL;DR is “apply for the job you want, even if you don’t think you’re qualified.” Job postings are written to describe ideal candidates, even if they’re not realistic, and most hiring managers would gladly take someone who meets some of the requirements. When a characteristic is listed under “requirements” instead of “preferred”, potential applicants assume that they shouldn’t bother applying.

This isn’t true in all cases, of course. In some places, the requirements are well-written and the hiring manager doesn’t consider any applications that don’t meet the requirements. Other times, the initial evaluation is done by the human resources department and they apply the requirements strictly (as an anecdote, I know I’ve been rejected for more than one position because my degree was in the wrong field, despite having experience in the position and the hiring manager asking HR specifically for my resume). In many cases, though, the “requirements” are a high bar. The Internet is full of (possibly apocryphal) stories of job postings wanting 7 years of experience in a 5-year-old programming language.

Hiring managers aren’t addressed directly in the article, but there’s a lesson here for you: be careful when writing job requirements. Apart from scaring away people you might have otherwise ended up hiring (especially women, who are more likely to pass on jobs where they don’t meet all of the qualifications), you’re robbing yourself of a good way to weed out the truly unqualified. Especially when someone else is pre-screening applicants, I prefer to craft job postings as broadly as possible. I would much rather spend extra time reviewing applicants than miss out on someone who would have been a great hire. It’s a low-risk, high-reward decision.

It’s not cheap to hire people. Especially in small organizations, you don’t want to risk hiring someone who you’ll have to get rid of in six months. But turnover isn’t cheap, either. I haven’t studied this, but speaking from my own experience, I’m much more likely to leave a position when I feel like I’ve stopped growing. By hiring someone who is 80-90% of the way there instead of 100%, you buy yourself more time with this person, reducing turnover. Sure, you get less productivity initially, but allowing an employee to grow is a cheap way to keep them interested in their work.

Likewise, I don’t want to apply for a job where I could step in on day one and do everything. If I wanted a job that I could do easily, I’d still be in my first job. I bet I’d be really good at it by now, but my skills wouldn’t have expanded. As a friend-of-a-friend said “I don’t think I’ve ever applied for a job that I was qualified for.” If employers can write job requirements aspirationally, then potential applicants should be aspirational in job applications.

How to use HTCondor’s CLAIM_WORKLIFE to optimize cluster throughput

Filed under: HPC/HTC — bcotton @ 9:10 am

Not only do I write blog posts here, I also occasionally write some for work. The Cycle Computing blog just posted “How to use HTCondor’s CLAIM_WORKLIFE to optimize cluster throughput“. This began as a conversation I had with one of our customers and I decided it was worth expanding on and sharing with the wider community.
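For the curious, CLAIM_WORKLIFE is an ordinary condor_config knob. A minimal fragment looks something like the following; the value here is an illustration, not a recommendation, so check the HTCondor manual for your version’s defaults and related settings:

```
# condor_config fragment -- illustrative value, not a recommendation.
# CLAIM_WORKLIFE limits how long (in seconds) a schedd may keep reusing
# a claim on an execute slot for successive jobs before the slot is
# released back to the negotiator.  Shorter values re-balance the pool
# more often; longer values avoid renegotiation overhead for short jobs.
CLAIM_WORKLIFE = 1200
```

The throughput trade-off between those two effects is what the Cycle Computing post digs into.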
