The Great Compromise

It's been a while since my last post. I got the crazy idea in my head that using PuTTY / WinSCP has become frustratingly inadequate for my day-to-day work. PuTTY has weird copy/paste behavior, ugly formatting, no tabs, dumb keypair management, etc. The list goes on. And using WinSCP to bridge the great divide between Windows and Linux is really getting on my nerves. Not to mention IRB on Windows totally blows. Simply put, a great deal of my day-to-day tasks would be easier if I were working on Linux. Considering that 90% of my time is spent working with Debian and Red Hat, the choice seems simple.

But simple is far from the truth. The truth is that Microsoft has all but snuffed out Linux development. Windows is ubiquitous, holding 95%-plus of the OS market. Most of the talent writes for Windows. People test software on Windows. Support resolves issues that affect Windows users first, while other OS bugs become understandable second-class citizens. So the question is:

How can I satiate my need for Linux and remain "professional" in the workplace? If my boss or my co-workers come in and say there is an issue with a website on IE6 that I should look into, it doesn't cut the mustard to say, "Oh, I run Linux now, so I don't have IE6." There are several answers, each with its own pros and cons: I can dual boot, I can virtualize, I can use two different computers, or I *could* just stick with PuTTY.

I have been mulling the options around in my head, and in light of some new technologies, the scales seem to be constantly tipping. For now, I am going to try Innotek -> Sun -> Oracle's VirtualBox, with its "seamless" mode (think VMware Fusion's "Unity") and guest 3D acceleration.


Windows 7 RC Initial Impressions

For those of you who hate Windows Vista, don't stop reading just yet. I hated Windows Vista too, but for objective reasons derived from my experiences, instead of being a sheep. I think it's important to try new things, if for nothing else than as a base of comparison.


Windows 7 won't win any speed contests here. The installation took so long that I left and made breakfast (bacon…yum), ate it, and came back to find it still "Uncompressing files". What is most interesting is that this install seemed to take LONGER than Windows Vista's. Compare that to the 15-20 minute installations offered by Ubuntu Linux.

Out of the Box Experience:

Windows has historically been a weak OS out of the box, and while Windows 7 makes improvements, it is still arguably in last place. The tools are bare-bones, with the usual roundup: Notepad, Paint, Calculator, and a few games. Interestingly, Paint has undergone a significant upgrade. Too bad no one uses or cares about Paint. The most significant changes in the OOTB experience are the revamped Windows Media Player, the Internet Explorer browser, and the Windows Explorer file manager.

I have never cared much for Windows Media Player because it has historically done WAY TOO MUCH. The result is mediocrity across the board and a difficult-to-use application. I did feel like this version has separated quality from quantity, resulting in a nicer experience. It still relies too much on contextual menus and hunting-and-clicking to perform the action you want. There is no Windows Explorer context option for "Play in Windows Media Player" when you want to play a whole folder on your computer. Also, dragging music folders onto the WMP icon doesn't play them.

Internet Explorer is now at version 8. Microsoft has supposedly spent a lot of time and effort revamping the browser experience. All I can say is that it rendered the download page for Mozilla Firefox without any noticeable problems. It may be a good browser now; I will never know firsthand, because I will not use it out of principle.

Windows Explorer has been revamped. It now starts the browsing location at the root of the drive, something earlier versions didn't do, in favor of landing you in "My Documents". I would have liked to see a consolidated file transfer dialog window that gets appended to, instead of a separate pop-up for each operation. Another annoyance is that the tree on the left no longer follows your navigation on the right, making it basically useless.

The Operating System:

The operating system layer itself is responsive and stable. Windows updates have been moved into a new Action Center component that resides in your system tray. If you care what it says, it will nag you about needing to apply updates and about scanning your computer with Windows Defender. Updates still require a full reboot, and the "reminder" to do so is still very annoying and intrusive. God forbid I need to do work; my computer is too busy bothering me about installing the latest security patch for Internet Explorer 8.

The UAC is still alive and ticking – of course you can turn it off anytime you want. It does hinder the download-and-install experience, as you have to acknowledge that you really, truly DO want to install application 'x' about four different times.

Appearance and Task Management:

The appearance of the operating system is largely a carryover from Windows Vista, but somehow less masculine. Animations have a purpose, and everything seems more fluid and organic. I would sum it up as being like Windows Vista, but with more creativity and design. That being said, it is disappointingly uncustomizable. You have themes, gadgets, and a wallpaper that can cycle backgrounds, but if you love to tinker you will be disappointed.

The taskbar now shows the icons of open applications instead of labels of what they are. It is an easy setting to change, but a poor default choice. Another task management gripe is that the task switcher's application previews are too small to even pick out details of the window you want to focus on. If the point is to show the application, then SHOW it; don't make me squint.


The core of the operating system seems to share a LARGE code base with Windows Vista, and I think that is advantageous to Windows 7. Vista gets to be the bad guy that broke everything, and Windows 7 is seen as the version that works with everything. In reality, it was just a matter of time for device drivers to become compliant with the next-gen Windows base. I doubt Windows 7 is any more compatible with software / drivers. I even tried the emulator on a Dell printer driver – after the hype it was just a letdown.

I installed Adobe Reader, Corel WordPerfect, Google Picasa, Microsoft Office, and Mozilla Firefox without any problems. The Dell printer driver was another story entirely. After fighting with installer files that reported missing PostScript drivers and OS incompatibility, I finally did a "Windows Update" for printer models. Luckily for me, my model is on the list, and the process finished successfully.

Initial Conclusion:

It is a rare day indeed when I review a new distribution of Windows. Extremely long development cycles mean the operating system appears to always be just behind the technology curve. Vendors rush in and make software to address the inadequacies of the Windows platform, and make a killing. Vendors get rich, and Microsoft sells more copies of Windows.

I view Microsoft with an overly critical eye because, while they are innovating now and trying to change, they are a very seedy company. They want to control a lot of aspects of computing, and have the money to try. They drive out competitors that offer superior products, and stifle innovation. They are also a monopoly so big that the world seems to be its hostage.

I am glad to see a breath of fresh air come into Windows development now that the dust of the disaster codenamed Vista has settled. I truly believe that Microsoft has realized their market share is not to be taken for granted anymore, and that they will have to fight to hold it.

The real test will be to see what my wife thinks. If she doesn’t ask me to reload her desktop, then I know that the product is “there”.


Life has been busy lately. I have stopped teaching in our Continuing Education program and am focusing my time on completing my degree, starting this summer.

I am currently taking two classes that are 100% online. It has been an adjustment for me for a few reasons. First, being a student again is hard. There is a lot of shit to shovel. Second, I am seeing how our new systems like the Portal, WebCT, etc. operate from the outside. I have to resist the temptation of "troubleshooting" mode, where I explore ways to make the process better, and just focus on the classes. Third, I never got really familiar with WebCT, and pacing myself and doing everything electronically is surprisingly harder than it sounds. I am having fun though, and that is what counts.

We are almost done setting up our new office. The desks are in place (including the cabinet doors, which we had to hunt down). Kristin’s new desktop is here and being loaded as I write this post. We are still picking out some more lighting, storage, etc to make the room a perfect office. I am even eyeballing one of those portable AC units to keep the temperature a little more comfortable. Pictures soon!


At work, we are on the edge of having resolved a lot of our Portal issues. In addition to performance and reliability improvements, there will be other subtle enhancements that I am anxious to look into further. These include a Facebook channel, mobile Targeted Announcements, a rich text editor, a better Email SSO experience, and resolution to some terrible technical problems that are unfixable right now. Who knows if these updates are of substance, or are just marketing bullets on a sales pitch. We are fully operational in our testing environment, so the switch should be happening within the next week, assuming testing goes well.

In other news, our garage sale has made us almost $300 so far, and the space we got back in our quaint house is quite impressive.

Windows 7 RC here I come…

Stunnel – Just goes to show, anything can be encrypted

Today I got the opportunity to play around with an open-source application named STunnel. Its purpose is simple: some applications are dumb and don't understand SSL. This program makes them.

Bullying aside, it had a perfect application today when I wanted to move two websites over to use a secure login page. Why didn't I just use Apache or IIS, you might ask? This particular website happens to be from a 3rd-party vendor who made their own web server to… get this… serve websites written in their own custom language. I don't understand why anyone would do that in this day and age, what with the 10,000+ languages already available and the 1,000+ web server platforms available. (On top of all that, they use a proprietary database!)

Long story short, the third party had a module ($695) to enable SSL in their webserver, which I will refer to as Peggle from here on out. The recommendation to use STunnel came from this vendor's website, which seems to contradict their money-making plans. My guess is someone chewed their asses out about not having SSL out of the box. Whatever the reason, I will consider them less cruel for at least dropping the recommendation on their website as a free alternative to their solution. For all I know, their solution is installing STunnel for you for $695.

Back to Peggle. It does one thing: it serves requested pages out on a specific port. Incorporating SSL is really beyond its scope. This is similar to Mongrel. But whereas Mongrel can sit behind Apache (or some other webserver) to handle SSL, STunnel just "takes care of it".

After finding a lack of documentation (and even worse documentation for Windows), I decided to write my own, much briefer, much more straightforward version. Hopefully some poor soul searching for enlightenment will wander across this article.

STunnel installation:

Make sure that you have OpenSSL available to you in some form or fashion. Watching certificate requests get generated on Windows makes babies cry, so your best bet is to take someone with a Unix machine out to lunch to get some terminal time.

Certificate Request

  1. Locate a machine with OpenSSL (Windows, Mac, Linux – it doesn’t matter)
  2. Run the command: openssl req -new -nodes -keyout server.key -out server.csr
  3. Answer the certificate questions, keeping in mind that the common name is the FQDN for your website
  4. Locate the generated server.csr file and submit this to your Certificate Authority for signing.
  5. Transfer the server.key file to the machine you are installing STunnel to, if it is not the same.
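The steps above can be condensed into one short terminal session. The -subj values below are placeholders, so substitute your own organization details (or drop -subj entirely to answer the questions interactively):

```shell
# Generate a new RSA key and a certificate signing request in one shot.
# -nodes leaves the key unencrypted so STunnel can read it unattended.
openssl req -new -nodes -newkey rsa:2048 \
  -keyout server.key -out server.csr \
  -subj "/C=US/ST=State/L=City/O=Example Org/CN=www.example.com"

# Sanity-check the request before submitting it to your CA:
openssl req -in server.csr -noout -verify -subject
```

Remember that the CN must match the FQDN your users will type into their browsers, or they will get certificate warnings.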

Installing STunnel

  1. Download Stunnel from here – note that Windows binaries are generously provided
  2. Run the STunnel installation – the default path will install to "C:\Program Files\stunnel"

Configuring STunnel

  1. Get the signed certificate returned from your Certificate Authority (above), and drop this somewhere safe. You will need to refer to this location in a moment
  2. Locate the path for the server.key file generated earlier
  3. You now need to combine your key file, and your signed certificate into one single file. I named my file “stunnel.pem”
  4. Open up the file stunnel.conf and replace its contents with the following configuration:
#Stunnel server configuration file

#This is the path to your combined key / signed certificate file
cert=C:\Program Files\stunnel\stunnel.pem

#Up the debug number to 7 to get full log details
#Leave it at 3 to just get critical error messages
debug=3
output=C:\Program Files\stunnel\output.log

#These are definitions for services. For example, to accept SSL
#connections on port 443 and hand the decrypted traffic to a local
#web server listening on port 80 (adjust both ports to suit):
[https]
accept=443
connect=80

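For step 3 above, "combining" is just concatenation. Assuming your CA returned the signed certificate as server.crt (the filename will vary by CA), on a Unix box:

```shell
# Key first, then certificate, into a single PEM file for STunnel:
cat server.key server.crt > stunnel.pem
```

On Windows, `copy server.key+server.crt stunnel.pem` from a command prompt does the same thing.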

Once you have created your configuration file, you can start STunnel by running: stunnel.exe stunnel.conf. Remember that if anything goes wrong, you will probably have to kill it with Task Manager, or by clicking on the system tray icon (in Windows). Additionally, check your server log file; it contains valuable information.

Once you have the STunnel server running, browse to your site over https (on the port you configured) and see that SSL is working. Even if there is nothing serving content at this port, and the page times out, you should still see the secure webpage icon in your browser. This SSL indicator should reflect your CA.
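If you would rather not trust the browser icon, openssl itself can check the handshake. The hostname and port here are placeholders for wherever you pointed STunnel:

```shell
# Connect, complete the SSL handshake, and print who the certificate
# was issued to and who signed it:
openssl s_client -connect www.example.com:443 </dev/null 2>/dev/null \
  | openssl x509 -noout -subject -issuer
```

If the subject matches your FQDN and the issuer is your CA, the tunnel is serving the right certificate.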

Once you have the port encrypted, it is up to the individual application to respond to requests coming in to that port. Data being sent to and from this port will be encrypted using SSL – totally transparent to the web server.

So there we have it – Peggle is now SSL enabled without ever knowing the difference. People’s data (especially login credentials) are secure once again! The crowd goes wild…

NetworkManager and PPTP in Intrepid Ibex

Of all the things that I love about our campus, trying to connect to our Microsoft VPN with a vanilla installation of Ubuntu seems to top the list. Inevitably, I forget all of the specifics of what comprises a "Microsoft VPN" connection, and am overwhelmed by the myriad of shit thrown at me by the NetworkManager VPN interface. To add to it, I seem to suffer from "VPN failed to connect" ad nauseam while trying to find the right combination of settings that will work.

Well, here it is – documented for all time.


From a vanilla installation of Intrepid Ibex, you will first need to install VPN support to be able to set up this connection. To do this, fire up Synaptic Package Manager (under System -> Administration) and search for "network-manager-pptp" (or from a terminal: "sudo apt-get install network-manager-pptp"). Once this is installed, you will need to issue a command to restart NetworkManager so it picks up its new plugin: "sudo NetworkManager restart".

For the maintainers of Ubuntu / NetworkManager / PPTP let me point out a few things:

  • Ubuntu should ship with network-manager-pptp by default.
  • NetworkManager should be accessible using a lower-cased entry inside of “/usr/sbin/” .
  • There should be a script for NetworkManager in “/etc/init.d/” like for everything else.

Moving along – now VPN should be available in NetworkManager. Click on "Add" and set up your information. Take special care to observe the following issues:

  • The gateway is just the DNS name – no “http://”, etc
  • The username should be your Microsoft domain name, followed by a backslash, then your username.
  • Leave the password blank here – it is buggy and doesn’t integrate into the Gnome Keyring as it should.
  • Leave the Domain field blank – this was taken care of by the username.

Click on “Advanced”, and check that the option for MPPE is checked. If your network requires Protected EAP (PEAP), then you will need to do one additional step (horribly not accessible from the GUI):

For PEAP, run "gconf-editor" from a terminal, and navigate to "system" -> "networking" -> "connections" -> "1" (or "2", etc.) -> "vpn". Locate the entry that corresponds to the VPN connection in question. Create a new key under this connection called "refuse-eap". Make it of type "string", and set its value to "yes".
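If you prefer the terminal to clicking through gconf-editor, the same key can be created with gconftool-2. The connection number "1" here is a guess; check gconf-editor for whichever numbered entry matches your VPN:

```shell
# Create the refuse-eap key (type string, value "yes") for connection 1.
gconftool-2 --type string --set \
  /system/networking/connections/1/vpn/refuse-eap yes
```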

Now choose to connect to the VPN connection, and type in your password, and optionally select to remember the password in the Gnome Keyring. This is a work-around for it not saving in the keyring earlier.


Is your VPN connection slow? Wouldn’t it be nice if you could just use the VPN for work resources, and leave the rest of the Internet accessible through your local connection? Welcome to the magic of routes. Despite the intimidating name, they are fairly easy to setup.

To see what routes are available, run the following from a terminal: "sudo route -n". See the network connections (eth0, etc.), and your "ppp0" connection if you have connected to a VPN? The connection with a destination of 0.0.0.0 is your "default" route – in other words, where all outbound traffic goes unless otherwise specified in this list.

Try toggling your VPN connection and looking at the output to make more sense of it.

The old NetworkManager had a place where you could specify an IP address range. For our campus network, it was “”. See this post for more information.

In the latest NetworkManager (0.7.0), this "easy" configuration has been swapped out for one that is more confusing, IMO. Go back to the VPN configuration screen, then click on the "IPv4" tab. Click on the "Routes" button. Enter in the following values if you wanted to convert "" into the new format:

  • Address:
  • Prefix: 16
  • Gateway:
  • Metric: 0

Disconnect from, then reconnect to the VPN server to have the changes take effect. Running "sudo route -n" should now show "ppp0" not as the default, but as the new route for "" addresses.

More suggestions for the maintainers:

  • For the Ubuntu maintainer: How about shipping with pre-configured VPN profiles?
  • Can the new “Routes” interface optionally include either format?
  • For the NetworkManager / PPTP maintainers: Is it possible to have less verbose, but more relevant output? Is it possible to run a check against a VPN domain and find out what settings are supported / required?

Netbeans 6.5 Rails Console

I was really excited about the release of Netbeans 6.5. It is the only solid Ruby / Rails IDE that I know of, and it runs circles around Aptana (aka RadRails). Among the new release's features is an integrated Rails console. This means that I can develop and test all in the same app without having to ALT+Tab myself to death.

Let me preface this rant by saying that I have the utmost respect for Netbeans (despite its owner) – it is rock solid, easy to use, and powerful. However, the Rails console leaves much to be desired. Here is my list of issues that I would love to see addressed (by someone with more Java finesse than myself):

  1. The Rails console doesn’t allow pre-typing. The environment may take 1 – 30 seconds to load. In this time, I should be able to start typing my command and have it fill in correctly once the prompt is available. Instead, it fills in everything typed before it was ready in a random order.
  2. The “Up” arrow on the keyboard moves the caret to the previous line, unlike in IRB / console / Bash, where it should recall the previously run command.
  3. Ctrl + L should clear the screen, just like in the development logs and the console on Linux / OS X, but here it actually renders the console useless. It removes your prompt and stops accepting keyboard input.

While someone is at this, here are some of my dream features:

  1. The same code-completion that is provided by the IDE itself. If I type in a model name, and hit “.”, it should show me a list of instance methods on that model.
  2. Syntax highlight the output just like in the IDE

Please Netbeans community, Christmas is coming – consider it a gift to the world!

SWAN Manager v2.0

It has been a trying few weeks, but I have finally rolled out the new SWAN Manager. It ended up being almost a total rewrite, and I walked away with a lesson learned:

  • Even on a total rewrite, I should have consistently been checking in my code at “checkpoints”. Instead, I waited until everything was working, then did one single massive check-in. Inevitably I missed stuff, forgot to clean up stuff, and had to resolve a couple of SVN conflicts where it just didn’t understand what the hell happened.

I have highlighted a few of the more significant changes in screenshots below:

I based the new login screen on the Google Docs login. It shows at a glance what services are offered inside the Manager, and is a little friendlier than just a login box. Also, you can see the new tabbed interface at the top.
The User model underwent the most significant changes. First, I decided that it was running way too slow, so I reimplemented the way it looks up data, from an indirect (and unreliable) method to querying the sources directly. Also, the data is cached using memcached for even more speed.

An area I am particularly proud of is the display of the icons the user should see. I take each role name, do a Net::HTTP fetch on it, check for a 200 result, and display it. This is all handled in a helper.

The channel model underwent significant changes as well. It had always directly queried for the data on each request instead of caching the results. In addition to caching and other performance tweaks, I now know a lot more about the channels themselves. The entire model operates as an “acts_as_tree”, with parent and children nodes to show the sections and sub-sections of a channel. If you can edit a section, it shows up as a link.
The announcements controller has been completely reworked as well. Before, the user didn’t have the ability to do things like send to a role, choose delivery / expiration times, or pick a destination. Now the user gets to pick all of this (Population Selection with a parameter is shown). A message can be sent with just a few clicks. The announcement model uses “acts_as_state_machine“, a seemingly dead but very useful plugin. The announcement goes through several states, with validation checking and routing handled automatically. I have to thank Matt for turning me onto the idea.

Here I have an image of the announcement wizard further down on the same page as the image above. The date selection is handled by a Rails plugin called “unobtrusive_date_picker“. It allows some cool tricks like keyboard arrow navigation, and the ability to define starting / ending date ranges and minute increments in the select box.

Additionally, once the announcement is sent, rather than me going to get a cup of coffee while the process runs (sometimes 10+ minutes), the app now backgrounds the work in a separate rake task.

All in all, I think that this is a world of difference from the previous SWAN Manager. This is stable, fast, and easy enough now that I feel it is something the campus as a whole can use without concern. And it will need to live up to its expectations as well, as we have a few departments already lined up to start using its functionality as soon as we give the green light.

After I pat myself on the back, I suppose it's time to get back to working on all those pesky channels…

The Rich-Internet-Application Bubble

It doesn’t seem to be a question of if, but when, we will see the Rich-Internet Application (RIA) bubble burst. Adobe’s Flex platform and Microsoft’s Silverlight platform promise the world, but their implementations seem flawed. The entire discussion always reminds me of the story of Java. Consider the following advantages and disadvantages of an RIA system over traditional HTML / Javascript / CSS:


Advantages:

  • RIA platforms promise “rapid application development” (so cliché in the development world). A developer can create a visually rich and interactive application inside the comfort and power of an IDE.
  • RIA platforms can consume data at a level that is not possible with existing web technologies. SOAP, XML, and other services can be consumed in a stateless web application.
  • Web applications are treated like traditional applications, offering added benefits like running offline.


Disadvantages:

  • RIA platforms are specific to an IDE – in short, controlled. Adobe’s Flex promises to be open source, but only on the client end. The “black box” will probably remain just that.
  • If open-source specifications are achieved (unlikely with Microsoft Silverlight), it will take years for alternate IDEs and editors to support the functionality on a level comparable to what is currently offered.
  • RIA platforms add another layer of complexity to an ever-growing software stack, with the need for web services to interact with database sources, since the RIAs ironically cannot.
  • HTML, Javascript, and CSS are all free, standardized, open-source, cross-platform, and ubiquitous. Microsoft’s Silverlight and Adobe’s Flex are both controlled and shaped by a single company.
  • RIA platforms require a browser plugin, since the technologies generate data that is not directly consumed by any current browser.

Hopefully we remember the failed promises of Sun’s Java VM. The marketing term “write once, deploy everywhere” has become something of an inside joke in the technology industry, as this is hardly ever the case. It should be more appropriately named “write once, deploy anywhere that supports platform ‘x’”. This is made worse by the fact that, as closed-spec platforms, the RIA browser plugin development is left to the companies pushing the platform. How well do you think Microsoft will support a Silverlight client on Linux, or iPhone?

Performance in RIA platforms has also historically been a sore subject. Java code runs by creating a host-specific virtual machine that abstracts all of an operating system’s details. This works against performance, as the virtual machine will always be the limiting factor – there is simply too much overhead involved.

What each of these companies is pitching to developers is the idea that HTML, Javascript, and CSS are old and cumbersome to work with. Further, browser inconsistencies make truly agnostic client-side scripting or styling very limited in implementation. Solutions like Prototype.js or a CSS browser reset are needed to write agnostic code.

I just don’t buy into it.

First, HTML, Javascript, and CSS are made cumbersome by the fact that no browser (except possibly Safari) adheres to the standards agreed upon by the W3C. Microsoft in particular is selling a tool to abstract browser development while their own browser consistently remains one of the least compliant available.

Second, I think that the success (undeniable when looking at their ubiquity) of HTML, Javascript, and CSS is due to the fact that they are free, accessible, open-source, and standardized. Anyone who wants to be a web developer can start right now without ever spending a penny. Developers wishing to use RIA platforms must purchase a developer license to use the IDEs that generate the code.

I hope that the developers of the Internet don’t fall prey to the seduction of RIA offerings. RIA is simply a product being pushed solely on the basis of profit, not with the promise to make the web a better place tomorrow. Instead, I encourage everyone to ride out this wave, and let the technology’s bubble burst.

Back from Catastrophe

What a horrid last few days. Around lunch yesterday I restarted my Windows box at work, and I was greeted with unpleasantness. At the login screen, I got an error informing me that some instruction in memory had performed an illegal operation. After clicking past that, I got the infamous “NT Authority says you have 60 more seconds with Windows” error.

On rebooting the machine, I was notified that “C:\Windows\System32\hal.dll” was missing. I called shenanigans, but sure enough, my entire Windows directory appeared to be empty. On closer inspection, Ubuntu informed me that it wasn’t empty; instead, it was receiving an Input/Output error when trying to “ls” the contents. I ran a “chkdsk /r”, wholly expecting that NTFS had fucked it all up again, and I seem to have been correct – at least in part.

After the chkdsk, I advanced two seconds further than my last attempt to boot Windows, only to be greeted with some cryptic error informing me that my registry looked about like that train wreck over there. Repairing was not an option, so after much fingernail biting and a few choice words, I decided that my only remaining option was a reinstallation.

Let me take a moment here to talk about the Windows reload process. My problem isn’t that I think it’s crappy and that someone should do something about it; I actually know that it’s crappy compared to any other operating system’s standards. I can’t tell you how many damn “Next” buttons I had to click. And then how many preferences I had to change. This would have been much easier if I could have used something like Synaptic to check all the programs I wanted to install in one swoop. Additionally, on a *NIX platform, all of my application preferences would have been saved under my home folder. Windows is hopeless in that department, so it took me about five hours to get back to usable.

After that got resolved, I fired up my Virtual Machine containing my webserver (cheap hosting solution, I know) and found that the MySQL database wouldn’t start. It turned out that the filesystem on my Linux box was corrupted as well. I ran “fsck” and fixed a dozen or so errors, rebooted, and realized that one of the corrupted files happened to be the MySQL user table. Long story short, I learned a lot about troubleshooting MySQL and got everything restored without losing any data.

Now I am finding other files all over the place that are 0 bytes in size. I have backups, but since the original file still exists when the backup runs, the backup is dutifully overwritten with the new (0 KB) file.

John (and I, partly) suspect VMware may be the culprit. This is an incomplete theory, however, and the entire process has left me visibly shaken. We run financial systems on these things. We run nuclear power plants with these things. My net worth is just a number sitting on some hard drive in a basement Wachovia owns somewhere. What happens when that disappears?