Zillow’s failure isn’t AI, it’s hubris

Zillow’s recent exit from the house-flipping arena was big news. In business news, the plummeting stock price and looming massive layoffs made headlines. In tech circles, the talk was about artificial intelligence, and how Zillow’s algorithms failed them. And while I love me some AI criticism, I don’t think that’s what’s at play here.

Other so-called “iBuyers” haven’t suffered the same fate as Zillow. In fact, from the reporting I’ve heard, they vastly outperformed Zillow. Now maybe the competitors aren’t as AI-reliant as Zillow and that’s why. But I think a more likely cause is one we see time and time again: smart people believing themselves too much.

Being smart isn’t a singular value. Domain and context play big roles. And yet we often see people who are very smart speak confidently on topics they know nothing about. (And yes, this post may be an example of that. I’d counter that this post isn’t really about Zillow, it’s about over-confidence, a subject I have a lot of experience with.) Zillow is really good at being a search engine for houses. It’s okay at estimating the value of houses. But that doesn’t necessarily translate to being good at flipping houses.

I’m sure there are ways the algorithm failed, too. But as in many cases, it’s not a problem with AI as a technology, but how the AI is used. The lesson here, as in every AI failure, should be that we have to be a lot more careful with the decisions we trust to computers.

A culture of overwork

The technology industry has a problem. Okay, we have a lot of problems, but there’s one in particular that I’m talking about today. We have a culture of overwork and it’s bad for everyone.

As someone who is remarkably lazy, I have a keen interest in doing as little as possible. Of course, this tends to be more of the Sisyphean “automate all of the things” as opposed to just slacking off, but the point stands. Labor productivity in the United States has grown fairly steadily over the last few decades, so why don’t we see that reflected in the tech sector? (I can think of a few reasons offhand, but I don’t want to lose focus.)

My company has what I would consider a very healthy policy. Our vacation policy is effectively “take all the time you need,” and we have a minimum annual amount in case people don’t recognize that they need to take time. It’s in the best interests of company and employee alike that everyone is rested and focused. Exceptional times require late nights or weekends, but those are understood to be the exception.

But even in well-meaning organizations, particularly startups and small companies, it’s easy to let overwork culture creep in. Sometimes getting a release out on schedule means a lot of late nights for developers. Or a production outage for a customer means your support team loses their weekend. Even an offhand comment like “well look at Alice with the quick reply on a Saturday” can slowly lead to an unspoken social expectation that it’s always time to work.

For my team, I’ve always made it clear that I don’t expect anyone to be working outside of our business hours unless they get paged. In order to lead by example, I don’t have my email auto-sync to my phone, and I don’t check Slack when I’m not working. My team knows how to reach me in an emergency, and I trust them to judge what an emergency is. Of course, the first week my most recent hire joined, I emphasized this to him and then ended up spending most of the weekend working. It was clearly an exception, but someone new to the company might wonder whether that’s the unwritten reality.

Which brings me to the written reality in some places. A recent opinion piece by Alex St. John said, in effect, that game developers should work 80-hour weeks and like it. Tech Twitter and many publications were immediately critical of St. John’s philosophy, but the fact that he could even get such an article published means it’s more of a problem than it should be.

I could never work in the kind of environment St. John advocates. Nor could many others for very long (and this is without considering his views on women in tech). I suspect even Mr. St. John would hate working for himself. I find the “just quit” reply to any complaints about a job to be incredibly myopic. That most people have practical considerations beyond the purity of an artist’s idealized life seems to have escaped him.

As an industry, we should be celebrating the incredible gains in productivity that we help make happen by shortening the work week, not lengthening it.

Student speech rights

To continue the legal theme from a few days ago (with the addition of some “old news is so exciting!”), a high school in Kansas suspended the senior class president for comments he made on Twitter. What did he say? That “‘Heights U’ is equivalent to WSU’s football team.” WSU’s football team doesn’t exist. That’s it. For that, the school deemed his initial tweet and the responses to it disruptive to the school.

It’s not clear to me whether Heights High School is acting in accordance with legal precedent (their decision is certainly unjust, but that’s another matter). The Supreme Court has affirmed and re-affirmed restrictions on the free speech rights of students. Bethel School District v. Fraser, Hazelwood v. Kuhlmeier, and Morse v. Frederick have all served to limit what students can say.

In Tinker v. Des Moines, the Court protected non-disruptive political speech, with the disruption being the critical factor. In Bethel, Hazelwood, and Morse the speech in question was part of a school-sanctioned activity even if the activity was not on school grounds (as in Morse). It would be a great stretch to consider Mr. Teague’s Twitter account to be a school-sanctioned activity, as it appears to be his personal account. To my knowledge, no Supreme Court ruling has ever addressed a school’s ability to restrict speech that occurs outside of school events.

Arguably, the concept of in loco parentis could be used to support the ability of schools to respond to behavior that happens outside the school. I don’t agree with this, but it would be interesting to see how this argument played out in the courts. In the meantime, I expect that this may end up being discussed in court rooms for years to come. If no suit is filed, it should at least be used as an exercise in high school government classes across the country.

On September 11: my memories and the role of technology in never forgetting

I really hadn’t intended to write a 9/11 post here. It doesn’t seem to fit with whatever this blog is supposed to be. But it’s all over the newspaper and it’s all over Twitter, and I’m sure if I turned on the TV I’d see 9/11 all over again. Even the Sunday comics were more touching than comic, so I guess it’s fitting that I share my thoughts.

The morning of September 11, 2001 dawned. I’m not sure how it dawned, because I was still sound asleep in my room at Purdue’s Cary Quadrangle. My alarm went off at some point to tell me to wake up and go to class, and I ignored it. A few weeks into my collegiate career, I had already decided that 8:30 chemistry lectures were optional. I didn’t wake up again until my roommate Carl came back from his morning classes. “Dude. One of the World Trade Center towers collapsed,” he told me. “Fuck off, Carl,” was my reply. I was barely awake, and I was convinced that Carl was bullshitting me.

So he turned on the TV.

I don’t remember what time it was. I don’t even remember where in the timeline it happened. All I know is that for the rest of the day, Carl and I sat on Lucy the Couch and watched CNN. We couldn’t look away. I don’t even think I left to go to the restroom until about 2:00 that afternoon. And that’s when I first started to realize the magnitude of what had happened. There were about 40 guys on my end of the floor, mostly freshmen and sophomores, and it was rarely a quiet place. Without air conditioning, we all kept our doors open to get air flow. But as I walked down the hall to the bathroom, I realized that all I could hear was the sound of everyone’s televisions.

That night, Carl and I went to go get dinner. I don’t think we went with friends as we normally did. It was more of a “we haven’t eaten all day and there’s no new news, let’s go grab a bite real quick” decision. The Cary dining hall, one of the most popular eateries in all of University Residences, was subdued. The kids of Middle Eastern descent looked nervous and ate quietly, away from everyone else. Were they afraid of misplaced retribution? To my relief, I never heard of such an occurrence at Purdue. The same could not be said for other college campuses.

Life returned to normal fairly quickly for us. No classes were cancelled. Homework was still there. Most of us, Midwesterners by and large, had few ties to New York City. While the news was horrific, it didn’t impact our daily lives. And here we are 10 years later. The political climate is soured. Our troops are still in Afghanistan. Laws passed to aid the fight against terrorism have been used largely to combat domestic drug crimes. And yet we maintain this promise to never forget.

And so I think about the other events that we, as a nation, have sworn to remember. The Alamo, the Maine, Pearl Harbor. Each of these events was a rallying cry for a moment in time, a common thought that drove the people toward a goal. But as time has passed, we seem to remember them less. The events are still recalled, but with no more clarity than a history lesson. The personal stories are fading, and continue to do so as an ever-smaller percentage of our population has first-hand stories to tell.

A decade on, the September 11 attacks are still remembered. Will they be in 2101? Certainly the history and political science texts will have much to say. But what will our national consciousness say? Does the fact that the victims were civilians instead of military personnel make this more enduring? Will the digital age help preserve our stories? Or will time simply wash this event from our collective thoughts?

As a technology enthusiast, I am intrigued by the role that technology may play in our shared history. Although social media didn’t really exist in 2001, it now provides an opportunity for shared reflection. People are able to interconnect in ways that were not possible on December 7, 1951. We’ve seen the role Twitter and Facebook can play in driving revolution in oppressive regimes. What will our Tweets, our statuses, and our blog posts do to ensure we truly never forget?