Blog Fiasco

May 24, 2010

A quick summary of green-er computing

Filed under: Musings, The Internet — bcotton @ 8:52 am

Last week a Twitter buddy posted a blog entry called “E-Waste not, Want not”.  In it, she raises some very good points about how the technology we consider “green” isn’t always.  She’s right, but fortunately things may not be as dire as they seem.  As computers and other electronic devices become more and more important to our economy, communication, and recreation, efforts are being made to reduce the impact of these devices.  For the devices themselves, the familiar rules apply: reduce, reuse, recycle.

Reduce

The first way that reduction is being accomplished is the improved efficiency of the components.  As processors become more powerful, they’re also becoming more efficient.  In some cases, the total electrical consumption still rises, but much more slowly than it would otherwise.  In addition, research and improvements in manufacturing technology are getting more out of the same space.  Whereas each compute core used to sit on a separate chip, nowadays it’s not unusual to have several cores on a single processor the same size as the old single-core models.  Memory and hard drives have increased their density dramatically, too.  In the space of about 10 years, we’ve gone from “I’ll never be able to fill a 20 GB hard drive” to 20 GB drives being so small that few companies sell them anymore.

As the demand for computing increases, it might seem unreasonable to expect any reduction in the number of computers.  However, some organizations are doing just that.  Earlier this year, I replaced two eight-year-old computers I had been using with a single new computer that had more power than the two old ones combined.  That might not be very impressive, but consider the case of Solvay Pharmaceuticals: by using VMware’s virtualization software, they were able to consolidate their servers at a 10:1 ratio, resulting in a $67,000 annual savings in power and cooling costs.  Virtualization involves running one or more independent virtual computers on the same physical hardware.  This means, for example, that I can test software builds on several Linux variants and two versions of Windows without having to use separate physical hardware for each variation.

Thin clients are a related reduction.  In the old days of computing, most of the work was done on large central machines and users would connect via dumb terminals: basically a keyboard and monitor.  In the late ’80s and ’90s, the paradigm shifted toward more powerful, independent desktops.  Now the shift is reversing itself in some cases.  Many organizations are beginning to use thin clients backed by a powerful central server.  The thin client contains just enough power to boot up and connect to the server.  While this isn’t useful in all cases, for general office work it is often quite suitable.  For example, my doctor has a thin client in each exam room instead of a full desktop computer.  Thin clients provide reduction by extending the replacement cycle.  While a desktop might need to be replaced every 3-4 years to keep an acceptable level of performance, thin clients can go 5-10 years or more because they don’t require local compute power.

Another way that the impact of computing is being reduced is by the use of software to increase the utilization of existing resources.  This particular subject is near and dear to me, since I spend so much of my work life on this very issue.  One under-utilized resource that can be scavenged is disk space.  Apache’s Hadoop software includes the ability to pool disk space on a collection of machines into a high-throughput file system.  For some applications, this can remove the need to purchase a dedicated file server.
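
To make that concrete, here is a minimal configuration sketch in the style of 2010-era Hadoop (0.20.x): `fs.default.name` points clients at the shared file system, and `dfs.data.dir` lets each machine contribute space from several local disks to the pool.  The host name and paths are invented for illustration:

```xml
<!-- core-site.xml: where clients find the shared file system -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode.example.com:9000</value>
  </property>
</configuration>

<!-- hdfs-site.xml: each node donates space from several local disks -->
<configuration>
  <property>
    <name>dfs.data.dir</name>
    <value>/scratch/disk1/hdfs,/scratch/disk2/hdfs</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
</configuration>
```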

In addition to disk space, compute power can be scavenged as well.  Perhaps the most widely known example is BOINC, which grew out of the SETI@home project, a very popular screen saver around the turn of the millennium.  BOINC allows members of the general public to contribute their “extra” cycles to actual scientific research.  Internally, both academic and financial institutions make heavy use of software products like Condor to scavenge cycles.  At Purdue University, over 22 million hours of compute time were harvested from the unused time on the research clusters in 2009 alone.  By making use of these otherwise wasted compute hours, people are getting more work done without having to purchase extra equipment.
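
The “scavenging” in Condor is expressed as policy in each machine’s configuration: a machine only advertises itself as available for jobs when its owner isn’t using it.  A sketch of such a policy, using macros from Condor’s standard configuration (the thresholds here are illustrative, not any site’s actual settings):

```
# Start jobs only when the console has been idle for 15 minutes
# and the machine isn't busy with its owner's work
START    = KeyboardIdle > 15 * 60 && LoadAvg < 0.3

# Suspend the job as soon as the owner comes back
SUSPEND  = KeyboardIdle < 60 || LoadAvg > 0.5

# Resume once the machine has been quiet again for five minutes
CONTINUE = KeyboardIdle > 300 && LoadAvg < 0.3
```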

Reuse

There’s such a wide range of what computers can be used for, and that’s a great thing when it comes to reusing.  Computers that have become too low-powered to use as desktops can find new life as file or web servers, networking gear, or teaching computers.  Cell phones, of course, seem to be replaced all the time (my younger cousins burn out the keyboards really quickly).  Fortunately, there’s a good market for used cell phones, and there are always domestic violence shelters and the like that will take donations of old ones.

Recycle

Of course, at some point all electronics reach the end of their useful lives.  At that point, it’s time to recycle them.  Fortunately, recycling is a common municipal service these days, and some sanitation providers handle electronics, as do many electronics stores.  Recycling of electronics (including batteries!) is especially important because the materials are often toxic and often in short supply.  The U.S. Environmental Protection Agency has a website devoted to the recycling of electronic waste.

It’s not just the devices themselves that are a problem.  As I mentioned above, consolidating servers results in a large savings in power and cooling costs.  Keeping servers cool enough to continue operating is very energy-intensive.  In cooler climates, outside air is sometimes brought in to reduce the need for large air conditioners.  Computerworld recently had an article about using methane from cow manure to power a datacenter.  This is old hat to the people of central Vermont.

It’s clear that the electronic world is not zero-impact.  However, it has some positive social impacts, and there’s a lot of work being done to reduce the environmental impact.  So while it may not be the height of nobility to include a note about not printing in your e-mail signature, it’s still better than having a stack of papers on your desk.

May 21, 2010

Why it’s not always done the right way: difficulties with preempting Condor jobs when the disk is nearly full

Filed under: HPC/HTC — bcotton @ 7:38 am

In the IT field, there’s a concept called “best practice”: the recommended policy, method, etc. for a particular setting or action.  In a perfect world, every system would conform to the accepted best practices in every respect.  Reality isn’t always perfect, though, and there are often times when a sysadmin has to fall somewhere short of this goal.  Some Internet Tough Guys will insist that their systems are rock-solid and superbly secured.  That’s crap; we all have to cut corners.  Sometimes it’s acceptable, sometimes it’s a BadThing™.  This is the story of one of the (hopefully) acceptable times.

May 7, 2010

Perl’s CGI.pm popup_menu cares how you give it data

Filed under: Linux, Web design — bcotton @ 11:53 am

Last weekend when I was working on the script that mirrors and presents radar data for mobile use, I decided the less work I had to do, the better.  To that end, I tried to make heavy use of the CGI.pm Perl module.  In addition to handling the CGI input, CGI.pm also prints regular HTML tags, so you can avoid throwing a bunch of HTML markup into your print statements.  This makes for much cleaner code and reduces the chances you’ll make a silly formatting mistake.

Everything was going well until I added the popup menu to select the radar product.  Initially I followed the example in the documentation and it worked.  As I went on, I decided instead of having two hashes for the product information, it made sense to make my hash include not only the product description, but the URL pattern I’d be using when it came time to mirror the image.  Unfortunately, when I tried to make that change, my popup form no longer had the labels I wanted.

I kept poking at it for a while and finally got frustrated to the point where I decided I’d just write a foreach loop and have that part print the HTML markup instead of using CGI.pm functions.  Fortunately, I first talked to my friend Mike about it.  I sent him the code and after a little bit of work, he realized what my problem was.  CGI.pm’s popup_menu function expects a reference to a hash for the labels, not an array (I’m not really sure why, maybe someone can explain it?).  Once that was settled, the script worked as expected and the remainder was finished in short order.
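
For anyone hitting the same wall, here’s a minimal sketch of that pattern.  The product codes, descriptions, and URL templates are invented for illustration (and the CGI module now lives on CPAN rather than in core Perl); the key detail is the backslash that passes a hash reference to -labels:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI qw(popup_menu);

# Hypothetical combined hash, like the one described in the post: each key
# is a product code, each value holds a description and a URL pattern.
my %products = (
    N0R => { desc => 'Base reflectivity', url => 'RIDGE/N0R/%s.gif' },
    N0V => { desc => 'Base velocity',     url => 'RIDGE/N0V/%s.gif' },
);

# Pull the descriptions out into a plain hash for the menu labels.
my %labels = map { $_ => $products{$_}{desc} } keys %products;

# popup_menu() wants an array ref for -values and a HASH REF for -labels.
my $html = popup_menu(
    -name   => 'product',
    -values => [ sort keys %products ],
    -labels => \%labels,    # the backslash makes this a reference
);

print $html, "\n";
```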

Sometimes, it really helps to pay attention to the data type that a function expects.

May 3, 2010

Presenting: Funnel Fiasco mobile weather

Filed under: Funnel Fiasco, Weather — bcotton @ 7:01 am

You may recall that on Saturday I mentioned some big things were coming.  Fortunately, you don’t have to wait long.  I’m proud to announce that an idea I’ve been thinking about has finally been realized this weekend.  Without further ado: the Funnel Fiasco mobile weather site.  The idea behind this site is simple: the National Weather Service makes a lot of data available, but it isn’t always in a mobile-friendly format.  Even the NWS mobile page has some bad navigation (and more importantly, doesn’t include velocity data, which is very important to chasers).  All I’ve done is re-package the data in a way that I want to see it when I’m away from the computer.

All of the data is mirrored and hosted locally to minimize my impact on the NWS servers (and thus save taxpayer money!).  The local storm reports (LSRs) are grabbed by a cron job every 10 minutes.  I use the comma-separated value (CSV) files hosted on the Storm Prediction Center’s storm report website.  The CSVs are parsed by a Perl script I wrote and then a static HTML page is generated.  For the radar data, the images are mirrored on demand and a Perl script generates the output on the fly.  The radar data piece is a fairly heavy-duty script (by my standards, at least) and so I still consider it in beta.  For now, it actually runs on my server at home and not on the main funnelfiasco.com server.  I plan to move it onto funnelfiasco.com (and hope it doesn’t kill my bandwidth limits) after further testing.
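
As an illustration of the LSR half of that pipeline, here’s a stripped-down sketch of the parse-and-generate step.  The field layout and the sample report are invented, and the real files are fetched by the cron job rather than hard-coded; the idea is just CSV in, static HTML out:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Stand-in for a mirrored SPC-style CSV file (normally fetched by cron).
my @csv_lines = (
    "Time,F_Scale,Location,County,State,Lat,Lon,Comments",
    "2055,UNK,5 W Lafayette,Tippecanoe,IN,40.42,-86.98,Tree down",
);

# First line is the header; turn it into a row of <th> cells.
my @header = split /,/, shift @csv_lines;
my $html = "<table>\n<tr>" . join('', map { "<th>$_</th>" } @header) . "</tr>\n";

# Each remaining line becomes a row of <td> cells.  A naive split is
# enough here because these fields don't contain embedded commas.
for my $line (@csv_lines) {
    my @fields = split /,/, $line;
    $html .= "<tr>" . join('', map { "<td>$_</td>" } @fields) . "</tr>\n";
}
$html .= "</table>\n";

# The real script would write this to a static page for the web server.
print $html;
```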

I have to say, I’m pretty proud of the work I’ve done, almost all in the space of a weekend.  It’s nice to be able to add some useful content to my site.  I hope that it will get some good use and continue to grow.  As I come across data that can be easily manipulated, I’ll add it to the site.  Of course, if you wonderful readers have data you’d like to see, please let me know.

May 1, 2010

A few site updates

Filed under: Funnel Fiasco — bcotton @ 9:13 pm

In the past two days, I’ve made a few minor site updates.  They mostly revolve around removing dead links and otherwise making things tidy. I also removed the CSS element that made links go all caps when you hover over them.  It was a neat effect, but it messed with page rendering sometimes.

The more major change is that I’ve removed links to my ham radio pages.  Frankly, I’m just not that interested in the hobby of amateur radio anymore and I never had much content on there to begin with.  I won’t remove the pages, since I’ve got plenty of disk space available, but I am running low on subdomains, so I’ll be retiring kc9fyx.funnelfiasco.com later this month.

Take heart, though. I’m also working on adding a lot of content (that I hope will even be useful!).  I’m not quite ready to open the curtain yet, but I’m very excited.  More to come soon!

Powered by WordPress