Blog Fiasco

April 30, 2010

The Terry Childs case

Filed under: Musings — bcotton @ 12:39 pm

If you pay much attention to technical news, you probably have heard of Terry Childs.  Childs is the network admin formerly employed by the City of San Francisco who was arrested in 2008 after he was fired for insubordination and subsequently refused to give his supervisor the passwords for the FiberWAN routers.  If you know this much, you probably also heard that he was found guilty of one felony count on Tuesday.  For the sake of continuing this paragraph, I’ll assume you heard that.  Since you know this, I think it’s fairly safe to assume that your response to his conviction falls into one of two summaries: “he had it coming” and “this is an outrage.”

The prevailing mood on Slashdot and elsewhere seems to favor the latter summary.  My own take is more toward the former.  I’m not sure if that’s because I’m a short-hair type (side note: in my experience, there are two broad classifications of admins — short-hair and long-hair.  There’s often a stark behavioral/mindset difference between the two.  Maybe I’ll write about that at some point.), or if it’s because I’m still a youngin, or if it’s just because I’m being more sensible than everyone else.

My opinion on the case has softened a bit since it first broke.  Initially, the city was claiming that Childs had booby-trapped systems so that they would fail if anyone tried to gain access after he left.  As it turns out, things continued to run smoothly after Childs was fired.  There was a lot of stupid surrounding this case, and neither side comes out particularly sympathetic.  InfoWorld’s Paul Venezia had a good summary of the case in July 2008.

I don’t fault Terry Childs for refusing to give the passwords to people who asked for them, as the city had a very sensible password policy in place (don’t give user or system account passwords to anyone. The End).  What he didn’t do was put the passwords in the appropriate central repository.  I can understand his reasoning — we’ve all had incompetent coworkers that we didn’t want to share a password with, but sometimes that’s what we have to do.

Perhaps the city’s biggest mistake was letting Childs “own” the FiberWAN in the first place.  By all accounts, it was a pretty brilliant design, and every artist should be proud of the work they do, but that doesn’t make it their work.  Let’s face it: except in very rare cases, the work an admin does for his employer is the property of that employer.  We all like to think of systems as “ours”, but the reality is that we’re just caretakers, even when we design the system.  Think of a gardener as an analogue.

System/network/database/whatever-else admins have access to a great deal of sensitive information — grades in education, financial or research data in the public sector, medical records in hospitals, etc.  There is definitely a compelling need to restrict access in a sensible, responsible manner, but this must also be balanced out with a need to increase the bus factor.  There should always be at least one other person who has access to the passwords in case something unfortunate happens to the person with primary responsibility, even if this person is only authorized to get the passwords in the event of an emergency.

Childs also failed to play nice with others, and that’s the only reason we’ve heard about this at all.  Allegedly, he harassed a new manager to the point where she locked herself in a room to get away from him.  Like it or not, admins have to deal with other people, and that’s often the skill that is most lacking.  However, it is also perhaps the most necessary.  Technical position or no, we all need to be able to manage our role in office politics.  I sometimes think that should be a required class for sysadmins.  Maybe someone could set up a certification program?

April 26, 2010

New recipe: veggie roll-up thingies

Filed under: Cooking — bcotton @ 8:53 pm

Since we recently spent a week being vegetarian, I had to stretch my repertoire a bit.  Vegetarian recipes are not something I have much experience with, so I had to wing it. Fortunately, my usual method of cooking involves staring at ingredients until something says “use me!” I present here the recipe that I used one night, which hasn’t received much refinement yet, so use at your own risk.

Veggie roll-up thingies

5 radishes
2-ish carrots
4oz cream cheese
6 leaves of Romaine lettuce
4 asparagus stalks
some dill

Cut the lettuce leaves in half lengthwise (you might just want to cut the …vein, spine, whatever… out so that you can roll it in a few minutes).  Chop the carrots and radishes and mix them into the cream cheese.  Add the dill and mix some more.  Cut the asparagus into 12 pieces.  Spread the cream cheese mixture on two-thirds of each lettuce leaf.  Place a piece of asparagus on the un-cheesed third of each leaf and roll toward the cheese.  When you’ve rolled all twelve, find three friends and have them join you in eating the roll-ups (or eat the roll-ups alone and miserable for four meals).

A wrap-up of our No Impact Week experiment

Filed under: Musings — bcotton @ 7:20 pm

I wrote last week about the movie “No Impact Man” and the week-long mini project we’d be undertaking.  I haven’t updated this blog since then, mostly due to laziness and time constraints, but Angie has had regular updates on her blog, so if you’re interested in the nitty-gritty, see Hippie in Training.  What follows here is more like an executive summary, with the additional comments of a less-enthusiastic participant.

I say “less-enthusiastic” because it was Angie’s idea to participate in this, and her passion that got us here in the first place.  It’s not fair to say that she dragged me into this, but I’ll admit that I participated more to be a supportive husband than for any other reason.  That’s not to say that I don’t try to be environmentally conscious; it just isn’t the ideal that I get most worked up about (it may come as no surprise that I get most worked up about freedom, especially as pertains to speech and software).

Despite my hesitance, I decided that if I was going to do this, I was going to do it sincerely.  At work, I took the stairs to my 9th floor office every day, I got water out of the tap instead of the water cooler, and I rode the bus all 5 days (normally I do this 3-4 days a week).  I even brought my coffee grounds (yeah, I guess if I was perfect, I wouldn’t have had coffee at all) home to compost instead of throwing them away.  Other than that, my work life didn’t change.

At home is where the big changes happened.  At first, I was pretty ambivalent.  We already recycle and compost most of what we use, we unplug unused appliances, and we generally don’t leave lights on when we don’t need them.  The big change initially was to eat vegetarian (since we weren’t quite equipped for local-only eating, we decided this was a reasonable modification).  Although we’ve tried to have a meatless dinner once a week, I haven’t gone a week without eating meat since I began eating solid foods.  I was pleasantly surprised at how well I handled the change (at least until about Thursday, when someone described a burger in detail and I couldn’t stand it anymore).  I’ve now gone nearly 9 days without eating meat, and I tell you — that chicken on the grill can’t cook fast enough!

Toward the end of the week, we had to nearly stop our electricity use as well.  I took the rare step of turning my computer off (except for when we did our OSMacTalk broadcast, which we did by candlelight).  Being both professionally and recreationally a computer nerd, I found it a little difficult being away from e-mail, RSS, and Twitter.  Instead, we had lengthy discussions and played board games by the light of our candles.  That was enjoyable, and we plan to make that a regular event (though perhaps with a bit more electrical lighting, at least once it gets really dark).

Where it all fell apart was on Saturday.  The day held the lure of tornadoes as near as southern Illinois, and it had been a long year since my last attempt at chasing.  Storm chasing is about as no-impact a hobby as rain forest burning or oceanic oil dumping.  I justified it to myself by arguing that the theme of Saturday was supposed to be volunteering, and if spending my own time and money to potentially save the lives of strangers 200 miles away isn’t volunteering, I don’t know what is.  Angie was leery, but she figured that since I’ve been so supportive, she should return the favor.  12 hours and 500 miles later, all we had to show was a few lackluster pictures of nothing in particular.  We tried to be as low-impact as we could, by which I mean we ate vegetarian meals.

On Sunday, we tried to make up for it by doing absolutely nothing.  Apart from a walk to the store, we mostly sat around and enjoyed the day.  Much of the conversation revolved around the week and what we planned to do for the future.  Having recently read Animal, Vegetable, Miracle by Barbara Kingsolver, Angie has decided that the food industry is not something to be admired and wants us to become locavores.  Admittedly, I find the idea of giving up some of my fast food and out-of-season loves uncomfortable.  The agreement we arrived at was that we’d eat locally when possible, but not exclusively.  I can live with that.  We also want to make Sunday evenings “eco evenings,” which means no TV, radio, or computers.

Some of the efforts we made last week we’re dropping (for example, I turned the space heater on in the bathroom this morning before my shower).  Others we’re keeping (the stairs aren’t so bad).  The point of the week wasn’t to give up everything forever, but to show us what we can do.  I’d like to think I’ve learned some stuff about myself, my wife, and my marriage.  I’d also like that grilled chicken, so if you’ll excuse me…

April 19, 2010

The impact of “No Impact Man”

Filed under: Musings — bcotton @ 2:22 pm

Three weeks ago, my wife heard about the movie “No Impact Man”: the story of one family in New York City who spend a year trying to have no net impact on the environment.  They didn’t quit everything cold turkey, of course, but phased the changes in over the course of the year.  By the end, they had given up powered transportation, electricity, and even toilet paper.  As you might expect, these changes did not come without some difficulty and sacrifice.

Their two-year-old daughter didn’t seem to object to the changes, but Colin Beavan’s wife Michelle seemed less enthusiastic.  It’s hard to distill a year into 90 minutes, but through much of the movie she seems reluctant or even opposed to many of the changes.  In fairness, it’s probably because he rarely seemed to discuss changes with her ahead of time, instead choosing to announce them as (or after!) they happened.  By the end of the year, she had embraced many of the changes, but it still makes me appreciate my wife’s habit of discussing ideas with me before we try them.

One thing the Beavan family faced was ridicule and scorn.  This is to be expected: extremism is almost always met with disdain.  Don’t get me wrong, I’ve mocked this project and the absurd lengths Beavan goes to, but I also have a degree of respect for them.  We try to be environmentally conscious, but there’s no way I could go to the lengths they did.  Or at least, I wouldn’t do it willingly.

The Beavan family didn’t do this permanently, either.  At the end of the year, the lights went back on (I think Michelle cried), and some of the changes were reverted.  They kept riding their bicycles and getting food at the local farmers’ market, but they probably resumed their use of toilet paper.  The point, Beavan says, is not that everyone has to do what they did, but that everyone should do what they’re capable of.

So what does that mean for me?  Well, first it means that I got to spend most of my afternoon at the Lafayette YWCA as the screening that Angie arranged in just three weeks went off successfully.  This week, we’ll be participating in our own mini-project (see www.noimpactproject.org for more details), and in the future we’ll try to do what we do even more.  I’ve already assembled a compost bin to make use of food waste.  Today we stopped at a local cyclery to find a bicycle for me (if you’re in the Lafayette area and have bike needs, stop by Virtuous Cycles!) for days I need to go places where the bus isn’t convenient.  I’m sure there are other changes we’ll make, in addition to the ones we’ve already made (see my wife’s blog for information on that).  And that’s what it takes as a first step: each person contributing what they can.

April 14, 2010

Playing Sopwith on the N900

Filed under: Linux — bcotton @ 9:04 am

Despite my involvement with Mario Marathon, I’m not much of a gamer.  I have more toes than I do games for my Wii, and only a few more (purchased) titles for my computers.  However, I’ve found that my phone gets the most gaming time simply because it is portable and I can play while I’m on the bus, waiting in line, etc.  The game that has been receiving the bulk of my attention these days is Sopwith.

Sopwith is a simple 2-D game where you must fly your biplane and destroy enemy buildings.  It has been ported to the Maemo platform by Mikko Vartiainen and can be installed from the Maemo Extras repository.  I had never played Sopwith before discovering this version, but my understanding is that it is very true to the original (it helps that the code was re-licensed under the GPL a few years ago) with the exception of a lack of sound.  The presence of missiles seems to be a relatively new and anachronistic feature that I can’t help but use. I never claimed to be good at the game.

Gameplay itself is quite addictive, and fortunately very simple — there are a total of 10 keys you might need to use, and I find myself only using six with any regularity.  The one disadvantage is that all of the keys are on the bottom row of the N900 keyboard, and I’ve found myself hitting the wrong key in the heat of battle. That usually ends up with a dead me.

Since Sopwith was originally written as a showcase for the “imaginet” network system, it makes sense that the Maemo version of Sopwith also has support for network games.  Unfortunately, since I’m the only person I know with an N900, I can’t test that aspect of it.  I imagine it would be fun, especially if you’re in the same room and can trade sharply-pointed barbs.

For those worried about it growing stale, there are several levels.  In the novice mode, there are at least three levels that get progressively more difficult.  I’ve nearly made it past the third level, but not quite.  In expert mode there’s at least one level, and you don’t get unlimited ammunition or automatic throttling.  There’s also an option for playing against a computer opponent, which appears to play in expert mode as well.

Sopwith clearly isn’t a sufficient reason to buy a Maemo device (if it is, then please send me some of your vast amounts of disposable income), but it is a great game to have installed for times when you need to kill a few minutes (and enemies).

April 12, 2010

My wheat bread recipe

Filed under: Cooking — bcotton @ 10:48 am

In order to provide more content (and add yet another category to my blog!), I’ve decided to start sharing some of my favorite recipes with my ones of readers when I can.  I enjoy cooking, but sometimes I just make it up as I go.  Other times, I go straight from a recipe in a cookbook or on a website.  In those cases, it’s not prudent to try to share the recipe.  In this first entry, I’m going to start with a recipe for wheat bread that I got from the Better Homes and Gardens Cookbook and then modified to better suit what I wanted.

I have to admit, I cheat and use a bread maker to do the mixing, kneading and first rise.  That’s more out of laziness than anything else.  Kneading by hand isn’t too hard, and I’ve done it before.  The consistency of the bread does seem better when I use the bread maker, but with more practice I’m sure I’d be better.  The dough for this recipe also tends to be pretty thick, so if your mixer is wimpy, be careful.

Wheat Bread

  • 1¾ cups water
  • 1/3 cup honey
  • 1 Tbsp butter
  • 1 tsp salt
  • 2 to 2½ cups all-purpose flour (divided)
  • 3 cups whole wheat flour
  • 1 package active dry yeast (or 2½ tsp canned yeast)
  1. Add ingredients to bread maker in the order listed. Before adding yeast, make a well in the flour.  Add the yeast into the well.
  2. Start the bread machine on the dough-only setting.  On mine, this takes about an hour and a half and includes mixing, kneading and the first rise.
  3. Turn the dough out onto a lightly-floured surface and divide in half.  Cover and let rest 10 minutes.
  4. Shape dough into loaves and place into greased 8x4x2-inch loaf pans.
  5. Cover and let rise in a warm place for 30-45 minutes.
  6. Bake at 375°F for 30-40 minutes or until bread sounds hollow when lightly tapped.  Immediately remove bread from pans and cool on wire racks.

And that’s all there is to it.  In just a few hours, and with minimal mess, you’ll have your own fresh-baked bread.  If you wrap it up well in plastic and keep it in the refrigerator, a loaf can last several weeks (a few seconds in the microwave wrapped in a slightly damp paper towel will warm it right up).  Slice thinly for sandwiches, or use thicker slices for sopping up stews and soups.

April 7, 2010

Building GEMPAK on Fedora

Filed under: Linux,Weather — bcotton @ 8:21 am

People who have training or experience in following severe weather rarely are content to rely on the media for severe weather information.  This isn’t to say anything against the TV and radio outlets, but we would just rather see the data for ourselves and make our own decisions.  A number of software packages exist for analyzing weather data, of varying price and quality.  For Linux and OS X users (and Windows users via Cygwin), perhaps the most powerful is GEMPAK, developed by the National Weather Service and made available by Unidata.

GEMPAK is basically what national centers, including the Storm Prediction Center and the National Hurricane Center, currently use for data analysis and visualization.  It is also a pretty old suite, written in pretty old Fortran, and kind of a bear to get installed sometimes.  Unidata formerly provided pre-built binaries, but only source code is available for the 5.11.4 release.  Since there seems to be an absence of step-by-step instructions, I thought I’d post my own.  Of course, as soon as I post this, a public release of AWIPS II will be announced.

The first step is to prepare your environment.  This basically consists of creating the appropriate user and/or directory structure.  For simplicity and security reasons, I prefer not to create a separate GEMPAK user.  I prefer to just build it in a directory that I have access to and then move it to /opt when I’m done.  If your account will be the only one using GEMPAK, you can just leave it in your home directory.  For the sake of this demonstration, let’s assume we’re doing it my way.  Then there’s no pre-work to do.

Next you’ll need to download the GEMPAK source code from Unidata’s website.  Unidata requires registration to download the software, but it is free and they don’t harass you.  While you’re waiting for the tarball to download, you’ll need to make sure you’ve got a few of the packages necessary for building GEMPAK.  Most of them can be found in the yum repository, so you can install them with:

su -c "yum install gcc-gfortran gcc-c++ libXp-devel libX11-devel xorg-x11-fonts-ISO8859-1-75dpi"

You’ll also need OpenMotif packages, but they can’t be newer than 2.3.1.  This means you cannot install the packages from the RPMFusion-nonfree repository.  You’ll have to grab them manually from MotifZone.  There aren’t builds for recent Fedora releases there, but the Fedora 9 RPMs work fine with Fedora 12.  Since these RPMs were installed outside of the repositories, you need to make sure yum won’t try to upgrade them, so exclude “openmotif*” in /etc/yum.conf.
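That exclusion amounts to a single line in the `[main]` section of /etc/yum.conf, something like:

```
[main]
exclude=openmotif*
```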

By now, the tarball should be downloaded.  You’ll need to unpack it just like you would any other. Let’s assume you downloaded it to ~/Download, so go to the directory you want to build GEMPAK in and run:

tar xfz ~/Download/gempak_upc5.11.4.tar.gz

Now cd into the GEMPAK5.11.4 directory you just created.  The first thing you’ll need to do is edit the Gemenviron.profile (if you use the Bash shell) or Gemenviron (for C-shell).  This file sets the myriad of environment variables that GEMPAK uses.  Fortunately, it is sanely built, so you only need to make a few changes.  Change the NAWIPS variable to point to the directory you’re building in (for example, $HOME/GEMPAK5.11.4).  USE_GFORTRAN should already be set to 1, but double-check that it is and that the references to USE_G77 and USE_PGI right below it are commented out.  Eventually, you’ll also need to make sure the GEMDATA and LDMDATA variables are set correctly, but that’s beyond the scope of this how-to.
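As a rough sketch, the edited Bash-flavored lines should end up looking something like this (the exact formatting in the shipped file may differ; only the variable names come from the steps above):

```shell
# Point NAWIPS at the build directory (adjust to where you unpacked):
NAWIPS=$HOME/GEMPAK5.11.4; export NAWIPS
# Build with gfortran; leave the g77 and PGI lines commented out:
USE_GFORTRAN=1; export USE_GFORTRAN
# USE_G77=1; export USE_G77
# USE_PGI=1; export USE_PGI
```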

After you’ve saved the Gemenviron[.profile] file, you’ll need to make sure config/Makeinc.common is correct (it probably is) and you’ll need to edit config/Makeinc.linux_gfortran (or config/Makeinc.linux64_gfortran if you’re on an x86_64 system.  You can check that with `uname -i`).  Change the MOTIFINC variable to “-I/usr/include/openmotif” and change X11LIBDIR to “-L/usr/lib/openmotif” (replace “lib” with “lib64” for x86_64 systems).
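If you’d rather script those two edits than open an editor, here’s a sketch using GNU sed.  It runs on a throwaway copy so it’s self-contained; in practice, point it at the appropriate config/Makeinc file (the stand-in values are made up):

```shell
# Stand-in file with the two variables as they might ship:
printf 'MOTIFINC = -I/usr/X11R6/include\nX11LIBDIR = -L/usr/X11R6/lib\n' > Makeinc.demo
# Rewrite them to point at the manually installed OpenMotif, keeping a
# .bak backup of the original:
sed -i.bak \
  -e 's|^MOTIFINC *=.*|MOTIFINC = -I/usr/include/openmotif|' \
  -e 's|^X11LIBDIR *=.*|X11LIBDIR = -L/usr/lib/openmotif|' \
  Makeinc.demo
cat Makeinc.demo
```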

There’s one more step to get things working, which appears to be a result of Fedora 12 using a newer version of autotools than the source was prepared with.  You’ll need to run the following command in both the extlibs/netCDF/v3.6.2 and extlibs/JasPer/jasper directories (return to the GEMPAK5.11.4 directory when you’re done):

autoreconf --force --install --symlink

Now that that’s done, you’ll need to load the GEMPAK environment.  For the Bash shell, use `. Gemenviron.profile` and for the C-shell use `source Gemenviron`.  The next step is to actually build the programs.  It is a lengthy process (I timed it at about 8 minutes on my system), so you might want to go get a cup of coffee or something.  Since it produces a lot of output, we want to make sure we can go back and look through it if there are any problems (it took about 5 build attempts before I got these instructions to a workable state), so we’ll save the output to a file called make.out.

make >& make.out

Once that’s completed, it’s time to install the files and clean up after ourselves:

make install; make clean

If you plan to move the built GEMPAK into somewhere like /opt, now’s the time to do that (make sure you update the NAWIPS variable in Gemenviron[.profile] appropriately).  I prefer to move it to /opt and then create an /opt/gempak symbolic link to the GEMPAK5.11.4 directory so that if I install a different version, I just need to change my link and everything else works the same.  For ease of use, you should set your shell configuration file to call the Gemenviron[.profile] script at login if you plan to use GEMPAK frequently.
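The versioned-directory-plus-symlink arrangement looks like this.  It’s demonstrated here in a scratch directory standing in for /opt, since the real move needs root:

```shell
# Scratch directory standing in for /opt:
mkdir -p opt-demo/GEMPAK5.11.4
# Point a stable name at the versioned install.  -f replaces any existing
# link and -n avoids descending into it, so re-running this after
# installing a new version just repoints the link:
ln -sfn GEMPAK5.11.4 opt-demo/gempak
ls -l opt-demo/gempak
```

With this in place, an upgrade is just a new directory and one `ln -sfn`; everything that references /opt/gempak keeps working unchanged.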

That’s all there is to it; GEMPAK is now ready to use.

Edit – 7 October 2010.  Instructions for Intel-based Mac OS X builds have been written by Kevin Tyle and posted to the Gembud mailing list.

Edit – 9 October 2010. Added the gcc-c++ package per Gerry Creager’s suggestion.

Solving the CUPS “hpcups failed” error

Filed under: Linux — bcotton @ 7:03 am

I thought when I took my new job that my days of dealing with printer headaches were over.  Alas, it was not to be.  A few weeks ago, I needed to print out a form for work.  I tried to print to the shared laser printer down the hall.  Nothing.  So I tried the color printer. Nothing again.  I was perplexed because both printers had worked previously, so being a moderately competent sysadmin, I looked in the CUPS logs.  I saw a line in error_log that read printer-state-message="/usr/lib/cups/filter/hpcups failed". That seemed like it was the problem, so I tried to find a solution and couldn’t come up with anything immediately.

Since a quick fix didn’t seem to be on the horizon, I decided that I had better things to do with my time and I just used my laptop to print.  That worked, so I forgot about the printing issue.  Shortly thereafter, the group that maintains the printers added the ones on our floor to their CUPS server.  I stopped CUPS on my desktop and switched to their server and printing worked again, thus I had even less incentive to track down the problem.

Fast forward to yesterday afternoon when my wife tried to print a handbill for an event she is organizing in a few weeks.  Since my desktop at home is an x86_64 Fedora 12 system, too, it didn’t surprise me too much when she told me she couldn’t print.  Sure enough, when I checked the logs, I saw the same error.  I tried all of the regular stalling tactics: restarting CUPS, power cycling the printer, just removing the job and trying again.  Nothing worked.

The first site I found was an Ubuntu bug report which seemed to suggest maybe I should update the printer’s firmware.  That seemed like a really unappealing prospect to me, but as I scrolled down I saw comment #8.  This suggested that maybe I was looking in the wrong place for my answer.  A few lines above the hpcups line, there was an error message that read prnt/hpcups/HPCupsFilter.cpp 361: DEBUG: Bad PPD - hpPrinterLanguage not found.
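The trick of reading a few lines above the obvious error generalizes: against Fedora’s log at /var/log/cups/error_log it’s just a grep with leading context.  It’s shown here on a captured snippet of the two messages so the commands are self-contained:

```shell
# Reproduce the two relevant log lines in a sample file:
cat > error_log.sample <<'EOF'
prnt/hpcups/HPCupsFilter.cpp 361: DEBUG: Bad PPD - hpPrinterLanguage not found
printer-state-message="/usr/lib/cups/filter/hpcups failed"
EOF
# -B5 prints up to five lines of context before each match, which is
# where the real cause was hiding:
grep -B5 'hpcups failed' error_log.sample
```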

A search for this brought me to a page about the latest version of hplip. Apparently, the new version required updated PPD files, which are the files that describe the printer to the print server.  In this case, updating the PPD file was simple, and didn’t involve having to find it on HP’s or a third-party website.  All I had to do was use the CUPS web interface and modify the printer, keeping everything the same except selecting the hpcups 3.10.2 driver instead of the 3.9.x that it had been using.  As soon as I made that change, printing worked exactly as expected.

The lesson here, besides the ever-present “printing is evil” is that the error message you think is the clue might not always be.  When you get stuck trying to figure a problem out, look around for other clues.  Tunnel vision only works if you’re on the right track to begin with.

April 5, 2010

What do I actually read?

Filed under: Musings,The Internet — bcotton @ 8:17 am

My long-time readers (I’ll call them “Matt” and “Shelley”) might recall that I wrote a post a long time ago about the importance of reading.  I’m too lazy to go find it and put a link here, but that doesn’t really matter anyway.  I know that it’s important to read, but I thought it might be interesting to see what I actually do read.  Like much of the rest of my life, I let Google handle this for me.  Google Reader has a nifty trends feature which allows you to see some information about what feeds you actually read. So what do we know?

My most popular friend is Matt Simmons, with 87 other Google Reader users subscribing to his feeds.  By comparison, I have nine.  On the other hand, there are 52 Google Reader users subscribed to this blog. Hi, everyone! I’m guessing a lot of you started reading this because of the many re-tweets I got from Friday’s post. I hope I don’t let you down.  While you’re here, you might try reading Journal & Courier reporter Amanda Hamon’s blog — I’m the only person using Google Reader to follow it.  Of course, she doesn’t update too often. Unlike Slashdot, which is the most active of my feeds with over 23 items per day.

None of that answers the question of what I read myself.  Well, in terms of absolute numbers, I’ve read more of Boiled Sports than anything else, with 47 read items in the past 30 days.  Hammer and Rails, Hitchin’ On, Slashdot, and Maemo News round out the top five.  On a percentage basis, there are several items where I’ve read every post in the past month.  Only counting feeds with 4 or more posts, I’ve read all of Hitchin’ On and Hippie In Training (the finest environmental blog I’ve read, and I’m not just saying that because my wife writes it).  I’ve also read 94% of Boiled Sports, 86% of Sara Spelled Without An ‘H’, 82% of Kassy_ and 52% of Chris Siebenmann’s blog.

From this, it seems clear that I mostly use my RSS feeds to follow sports and keep in touch with friends.  I’d like to start adding some more, especially feeds pertaining to high-performance and high-throughput computing.  I’m open to anything worthwhile and/or entertaining though (which reminds me, I need to add The Bloggess to my list) so if you have any must-reads, please let me know in the comments.

And speaking of comments, I remarked to my wife last night that I had over 50 Google Reader users subscribed and she was amazed since I never seem to have any comments.  I told her that either no one actually read my blog after subscribing or that they all felt that I say everything that needs to be said.  I like to think it is the latter.

April 2, 2010

Summary of the 2010 CERIAS Information Security Symposium

Filed under: The Internet — bcotton @ 6:46 am

Earlier this week, Purdue’s Center for Education and Research in Information Assurance and Security (CERIAS) held its annual Information Security Symposium. This year’s symposium was well-attended, and the keynote speakers perhaps had something to do with that.  The keynote speaker for the first day was the Honorable Mike McConnell, a former Director of National Intelligence, among several other posts.  The day two keynote speaker was the current Under Secretary for the National Protection and Programs Directorate in the Department of Homeland Security, the Honorable Rand Beers. Of course, the internationally-renowned director of CERIAS, Gene Spafford, was there, along with a collection of academic and industry representatives serving on three speaking panels.

With the exception of the poster session, the content of the symposium was largely non-technical.  This is fitting, since many of the greatest challenges in cyber security revolve around social or political difficulties, not technical limitations.  Both Admiral McConnell and Mr. Beers discussed at great length the interactions between the public and private sectors and the need for a mature cyber security policy.
