Apple, Computers, Hardware, Linux, Open-source, Personal, Ruby, Software, Windows

Living in an Apple World

Welcome readers to what is a first here on my blog – a review about Apple’s OS X. As some of you may know, part of my new job is working on a Mac for 8 hours a day, 5 days a week. Someone asked me about my experiences, and I feel up to sharing my findings. I want to be fair in my assessments, so if it sounds like I am starting to get a little slanted, keep me in check with a comment!

First things first – the initial impression. I have a 27″ iMac, and I was initially impressed by the machine’s appearance. The iMac screen and case are one piece, so I have plenty of room to kick around beneath my desk with minimal cord entanglement (not that it matters, because I sit cross-legged all day). The compact keyboard has an aluminum casing that matches the iMac. The mouse is the Mighty Mouse. Both are wired, which I appreciate – especially on the mouse. I hated the compact keyboard since it feels shrunken, and the addition of the “Fn” key in the bottom row meant that every time I tried to press “Control” I missed. After swapping it out for a full-sized keyboard I was much happier, and even unlearned some bad habits. The Mighty Mouse absolutely sucks. The tiny wheel stops responding at the slightest speck of dirt, and you have to turn the mouse over and rub the wheel back and forth on your jeans or the mouse pad. Its one saving feature is the ability to scroll both vertically and horizontally, which is occasionally helpful. I am a right-click fan, and though the regions are invisible, the area of the mouse that registers as a right click is about a tenth the size of the left – it feels like the very edge of the mouse.

The keyboard on a Mac differs in important ways from its PC counterparts. The “Windows” key is replaced with the Command key, which is utilized far more than the Windows key ever was. In fact, most operations are done using Command (copy, paste, new tab, close window, etc.), effectively making it closer to the “Control” key in Windows. However, the Control key remains, which introduces a whole new set of key combinations for shortcuts. The Command key sits next to the space bar, which is much more convenient than the extreme left placement of the Control key: I do copy and paste operations with my thumb instead of my pinky finger – much less strain.

The computer screen can be tilted, which is nice since the whole world seems to be moving towards annoying high-gloss screens. I can tilt it down and out of the fluorescent overhead lights. I really feel that gloss is a showroom gimmick, just like turning the brightness up to max on the TVs in the store. If I wanted to look at myself, I would sit in front of a mirror. Fortunately, I have a second non-gloss monitor, and I do most of my coding on that screen. It would be nice if the monitor had a height adjustment, as the second monitor isn’t quite the height of the iMac screen.

Enough about appearance – let’s talk hardware. This is a dual-core Intel-based processor with 2 GB of memory (later upgraded to 4 GB). The video card is decent, I suppose (though the interface can get quite “laggy” at times). I don’t know what the machine costs, but this is definitely unimpressive hardware. 2 GB of RAM is the minimum I would work with, and it being slow laptop RAM doesn’t help at all. At least there isn’t a laptop hard drive in it too.

As for the operating system, it seems pretty stripped down. This isn’t necessarily a bad thing – I can quickly find what I am looking for without going on a damn field trip through obscure dialog windows. The flip side is that it doesn’t feel very “customizable”. You use the stock features, or you don’t use a Mac. Perhaps there are a bunch of third-party utilities that I don’t know about? Sometimes I am disappointed by the lack of customization options (there are just a handful of settings for the dock). To be honest, I am not sure what I would customize, but I like to poke around, and I often leave System Preferences disappointed at not having found “setting xyz”.

I really enjoy the file system indexing – it’s the best implementation of full-text search I have seen. It doesn’t bog down the computer, and the results are instantly updated. Magic. It is effectively the starting point for all my open actions. I don’t know why it isn’t available for the first 10 minutes after a boot, but I don’t shut down that much, so it’s OK.

I was surprised by the lack of a default system-wide notification system – a gap that Growl has aimed to fill. I was also disappointed by the lack of package management on the Mac – again, third-party solutions exist. The system updates are just as annoying as in Windows, which was a disappointment. Once, the “restart” prompt stole my typing focus and proceeded to shut down the system. A few times the machine has “beach balled” (the Mac “hourglass” icon) and hard locked. Most of the time it’s fairly responsive and stable, which I can appreciate.

Another point of interest is window management. I use Exposé almost as regularly as I do the task switcher (Command + Tab), though admittedly I sometimes get lost in the special effects and forget what I was doing. There are a bunch of other window groupings, but I don’t really find them that useful. One particularly frustrating observation is that once you minimize a window, you can’t Command + Tab back to it. Isn’t that the point of the task switcher? It even shows up in the task switcher, but when it is selected, absolutely nothing happens.

As for the software available on the Mac, it is more comprehensive than on Linux, and less comprehensive than on Windows. Some of my co-workers commented that in OS X there is usually one utility to do something, whether you like it or not. I use Google Chrome, JetBrains’ RubyMine, Ruby, Terminal, Lotus Notes, Adium, and Propane almost exclusively. Because of this, I can’t really assess the state of the Mac software ecosystem, but I will say that all these programs run damn well on the Mac. The only software that crashes on me is Flash. Flash on Linux and Windows is stable; on the Mac, however, probably one use in ten causes the browser tab to lock up. I am not sure whether this is a Chrome issue or not, but something is seriously wrong with the state of Flash on my Mac. Now I understand why so many Mac users hate Flash – as a Windows user, I never experienced the constant crashing.

In summary, due to the nature of my work, I use the Mac at work in essentially the same manner I would use Linux. The terminal is where I spend my time, and I am more or less indifferent about the operating system around it, as long as I can install the system libraries I need for Ruby extensions, and it stays responsive. My next computer purchase will be a netbook, and I will install Ubuntu on it, as I can’t justify spending designer Apple prices to use a terminal and a web browser. Toe to toe with Windows and many Linux distributions, OS X excels in many areas. It’s a fantastic operating system, but I am not sure that it is worth its cost. If I could throw it on my PC at home, it would be worth $100. Buying a special machine just to run it is just silly.

Apple, Computers, Events, Hardware, Open-source, Personal, Ruby, Software, Thoughts, Web, Windows

3 Days Down, 40 Years to Go

Yesterday at 5:00pm marked the end of my first week at Beacon Interactive Systems. My coworkers are all really nice, and there is a surprising geographic mix among them. Some folks have lived in Massachusetts their whole lives, while others come from Maryland and Michigan. The cultural differences between “down South” and here are pretty minimal, unless you just feel like having a good laugh. There have been two big adjustments, however. The first is that snow is really not a big deal up here – people hardly notice it outside. The second is that restaurants don’t have sweet tea. You would have to drink sweet tea to understand why this is a big deal.

In general:

  • The job is much less stressful. Even during crunch times, you hear Southpark and Big Lebowski quotes (“I’m not your pal, guy!”).
  • The environment is a lot less structured. You come in whenever, you leave whenever. If you want to go outside and toss around the football, go for it. Good team-builder by the way.
  • The skill sets of my coworkers are all very impressive. It’s the rifle vs. shotgun approach.
  • The job area is nice – it’s next to Harvard. Getting there is rough – I have to cut across the city, and my 20-minute commute takes about an hour.
  • Developing on a Mac is an easier transition than I thought. I won’t say that I’m in love with it yet, but it’s workable. The biggest pain has been this silly bundled keyboard and mouse. No one else uses them. Also, package management on the Mac sucks compared to Linux. I think I would actually prefer to use Linux. Time will tell on this one.
  • The coffee isn’t as good.

An interesting collision of viewpoints occurred on my second day at the job, while I was shadowing a coworker on a joint project. He was showing me their (complex) system of bug detection and correction. They write up a use case, file a ticket, branch the code, create a changeset, rebase it, merge it into QA, verify it, then push it back upstream. Not coming from anything near that complex (“Hey Ben – login to the production server and change it!”), I was amazed that they spent so much time on this process. I asked if they ever just ignore a bug that would be too minimal to matter. My coworker asked me to clarify what I meant. I replied with “You know, it’s good enough for government work.” He paused and looked at me funny, then reiterated that they address all bugs that are discovered. A bug is a bug. It will take me a while to harden my resolve to be like theirs, and aim for perfection. Perfection wasn’t possible before because we had the typical scenario of overworked, underpaid, and on a deadline.

We are moving into our new building in a few weeks. When we move, there will be a train station across the street from the new building, and I will probably make the transition to riding into work. It’s about the same amount of time, but I would have the ability to sleep, read, surf the Internet, etc., all without causing an accident.

Wish me luck for next week – it’s been a difficult adjustment.

Computers, Family, Hardware, Linux, Open-source, Personal, Software, Windows

Bringing the Dead Back to Life

No, not zombies. A few days ago, a friend of mine brought me a computer and told me that it was running REALLY slow. I did some investigation, and after running some diagnostics quickly discovered that the hard drive was on the fritz. They had been wanting a new hard drive for a while anyway – one that was bigger, faster, and one that worked. I went to Newegg.com, bought a replacement SATA drive, and after it arrived, began the task of migrating to the new drive.

The installation of Windows that was on the old drive was fine (aside from the drive itself causing errors). I didn’t want to reinstall Windows, as it is such a painful and time-consuming process. Hunt down drivers, copy over the documents, reconfigure settings, install the old software again. No thanks. Instead, I decided to explore my options with the open source utility dd. Why not Norton Ghost, or one of the dozens of other programs? They all cost money, and I suspect some just use a ported version of dd to do their dirty work behind the scenes. I am cutting out the middle-man (and the cost) and going straight for dd.

“dd is a common Unix program whose primary purpose is the low-level copying and conversion of raw data” – Wikipedia

dd allows a user to copy data from a source to a destination bit for bit – an exact mirror image. Even better, if the device has corruption, you can use dd_rescue. The concept is the same, but this utility assumes that the source is likely damaged and does some special handling for bad sectors.
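As a tiny, safe illustration of the idea – on a throwaway file rather than a real disk (the paths here are just scratch files I made up):

```shell
# dd at its simplest: a byte-for-byte copy from an input file to an output file
printf 'hello world' > /tmp/src
dd if=/tmp/src of=/tmp/dst bs=1M 2>/dev/null
# cmp is silent when the two files are bit-for-bit identical
cmp /tmp/src /tmp/dst && echo "identical"
```

Swap the scratch files for block devices (`if=/dev/sda`) and you have the real thing.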

Making the backup

To make your disk copy, boot the source computer to a Linux live CD such as Ubuntu. A live CD may or may not mount your hard drive inside of the operating system. If it does, you want to unmount this device. In Ubuntu, the mount directory is in “/media”. If you browse for files here, and see a Windows partition, unmount this first. This ensures that there are no write operations occurring during the copy of the disk.
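For example – device and mount-point names here are illustrative, and yours will likely differ:

```shell
disk=/dev/sda   # the source disk (adjust for your machine)
# list anything mounted from that disk; each entry must be unmounted before copying
mount | grep "^$disk" || echo "nothing mounted from $disk"
# unmount any partition that showed up, e.g.:
# sudo umount /media/WINDOWS
```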

Next, fire up dd_rescue and use the following syntax:

sudo dd_rescue /dev/sda - | ssh user@remote_server "dd of=/path/to/save/backup"
  • sudo runs this command with elevated permissions
  • dd_rescue needs the two parameters of <source> and <destination>
  • Our source is /dev/sda. Note that we are copying the entire drive (sda) instead of a partition (such as sda1, sda2, etc). Your device may differ from “sda”.
  • Our destination is “-”, which is shorthand for standard out. On its own this would dump the binary content to the screen, but here it feeds the pipe
  • The “|” symbol (pipe) is redirecting the previous output to a new process
  • ssh is a secure way to copy files from one machine to another. In this case, we are specifying a username and a remote server (ip)
  • Finally, “dd” is executed on the remote server with the parameter “of” set to the path where the image will be saved to

Note that you will need sudo access, an ssh account on another machine, and permission to write to the path you will be saving to. Also, make sure that your replacement drive is the same size as, or larger than, your source drive. Makes sense, right?
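One way to double-check the sizes before you start: blockdev (part of util-linux) reports a device's size in bytes. The device names and sizes below are placeholders for illustration:

```shell
# returns success if the destination is at least as large as the source
check_fits() { [ "$2" -ge "$1" ]; }

# on the live CD you would feed it real numbers, e.g.:
#   src=$(sudo blockdev --getsize64 /dev/sda)
#   dst=$(sudo blockdev --getsize64 /dev/sdb)
# demo with a nominal 80 GB source and 250 GB replacement:
check_fits 80026361856 250059350016 && echo "replacement is big enough"
```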

This process will take a while, depending on the damage to the drive, and the size, network speed, etc. I maxed out my router at around 11 MBps (100 Mbps). dd_rescue will provide output, and let you know when it is complete. An 80 GB hard drive took about 2 hours to complete.
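Back-of-the-envelope math agrees: at roughly 11 MB/s, 80 GB works out to about two hours.

```shell
size=$((80 * 1000 * 1000 * 1000))   # 80 GB in bytes
rate=$((11 * 1000 * 1000))          # ~11 MB/s, the practical max of a 100 Mbps link
secs=$((size / rate))
echo "$((secs / 3600))h $(((secs % 3600) / 60))m"   # prints "2h 1m"
```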

Restoring the Backup

Once this completes, you are ready to shut down the source machine, and swap out the bad hard drive for the new hard drive. Reboot to your live CD, and run the following command (taking some values from earlier):

ssh user@remote_server "dd if=/path/to/save/backup"| sudo dd of=/dev/sda
  • We are using ssh again, this time to pull the image back across to the client
  • On the remote server, we are executing dd if. Note the if. We are using this backup as the input file, instead of the output file (of)
  • We are piping this output into our local machine
  • Using sudo, we execute dd with the of parameter. This will do a bit by bit copy of the image to the destination media.

Note again that you want to ensure the target is not mounted in any way. Also, we are copying the entire hard drive, so I am just using “/dev/sda”.

Guess what?! This process will take a while, just like the copy operation before.
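If you are paranoid (I am), you can checksum both sides to prove the restore is bit-for-bit identical. The sketch below demonstrates the idea on a throwaway file; on the real hardware you would hash the image on the server and `dd` the same number of bytes back off the new disk (if the replacement drive is bigger, limit the read with `count=`, or the hashes won't match):

```shell
printf 'pretend disk image' > /tmp/backup
# "restore" the image, then hash both the image and what we read back
dd if=/tmp/backup of=/tmp/restored 2>/dev/null
md5sum /tmp/backup /tmp/restored    # both lines should show the same hash
```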

Expand the New Partition

Note that if your replacement drive is larger, you will need to expand the Windows partition. This makes sense, since an 80 GB drive holds fewer bits than a 250 GB drive: when the (80 GB) image finishes copying, the rest of the new drive is completely untouched. To be able to use the rest of the replacement drive, we need to expand this partition.

You may need to run a “chkdsk /R” a few times in Windows if your hard drive has any bad sectors. As a note, new drives usually have a few bad sectors that need to be identified and disabled. After you run chkdsk, fire up your Live CD again, and launch gparted. This is a graphical tool for managing hard drive partitions. You should see the unallocated space at the end of your drive. Click the preceding partition (NTFS, etc) and choose to resize this partition, taking up all of the unallocated space. Apply your changes, and in a while, you have a full sized hard drive partition for Windows.

If you receive an error from gparted about bad sectors preventing ntfsresize from running, you may not be able to continue. Despite running chkdsk and restarting multiple times as instructed, I was not able to continue in gparted due to this message. ntfsresize has a --bad-sectors switch, but the gparted GUI does not let you pass it. I first tried setting an alias, but it seems the program references the full path of the command, so the alias did not work. Finally, I arrived at this solution:

# sudo su
# mv /usr/sbin/ntfsresize /usr/sbin/ntfsresize.real
# touch /usr/sbin/ntfsresize
# chmod a+x /usr/sbin/ntfsresize

Now, edit the ntfsresize file you just created so it contains something like the following:

#!/bin/bash
ntfsresize.real "$@" --bad-sectors
exit 0
  • We are becoming root with sudo su
  • We move the existing ntfsresize binary to another filename (ntfsresize.real)
  • We then create a new file named ntfsresize with execute permissions, and edit this file
  • Inside this file, we call the original ntfsresize with our --bad-sectors switch and "$@", which passes along any arguments our wrapper received to ntfsresize.real

After this, run gparted, and you should be able to continue. A resize from 80GB to 250GB took about an hour to complete.

Apple, Computers, Hardware, Software, Thoughts, Windows

What Ever Happened to Multi-tasking?

Today we take multi-tasking for granted. Right now, as you read this, you probably have a browser window open, perhaps email, an instant messenger, a music player, and anti-virus software running. If you could run one, and only one, application at a time, I think we can agree it would significantly affect your productivity. I have a difficult time even imagining computing without the ability to multi-task. However, the importance of this core computing concept seems to be called into question by new appliances.

When you look at a computer, it is very different from an appliance, such as your game console, your DVR, or your smartphone. One could argue that each type of electronic device has a purpose, and constraints that interfere with achieving this purpose. In the case of a video game console, the purpose is entertainment. A console’s constraints are the need to be consistent, and offer high performance with limited hardware. With a DVR, its purpose is to manage scheduled shows. A DVR’s constraints are optimal playback quality, and near real-time schedulers. With a smartphone, its purpose is to keep people connected while on the go. A phone’s constraints are probably the most severe with extremely limited hardware, coupled with short battery life.

In devices where performance is a key factor, such as video game consoles, and DVRs, I believe you are likely to see less control given to a user. This is to keep the running environment “pristine” and prevent non-core applications from adversely robbing the limited hardware of CPU cycles, and resources. This is the dreaded scenario of any computing device – some process (that the user may not even be aware of) is affecting the performance of the entire device.

To my knowledge, the iPhone was the first smartphone that really encouraged users to install applications onto the device. Since users can choose to modify the software running on the phone, those applications could affect the performance of the entire device. In order to work within the extreme constraints of the device, Apple made the decision to remove multi-tasking from their OS. This has been a controversial decision, but not one that has affected the success of the device.

Now other appliances are following the paradigm of no multi-tasking. Microsoft recently announced the Windows Phone 7 OS, and it is rumored to not have multi-tasking support. While this rumor may be unfounded, the Apple iPad device also offers no multi-tasking. The success of this device, and the market that this device will fill is yet to be determined, but the inroads to daily computing without multi-tasking can be seen. A tablet device has a purpose, and constraints parallel to that of a smartphone.

Video game consoles never offered multitasking until recently, and even now it is very limited. Coupled with smartphones and tablet devices, multi-tasking on computers starts to look like the exception, not the norm. It can almost be argued that the absence of multi-tasking is a feature that boosts the performance of a device.

Is this how the world will fix the “slow device” problem? Hardware and batteries are constantly improving, but we never seem to get ahead of the curve, since applications become more complex at roughly the same rate. “What Intel giveth, Microsoft taketh away.” Eliminating multi-tasking certainly gives devices an unconventional speed boost, but it seems like a step backwards. Can the human mind truly multi-task anyway?

One final note: it would seem to me that perhaps a solution lies in the Unix approach of prioritizing processes by “nicing” them.
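For the curious, that looks like this on any Unix-like system (nice and renice are standard tools; the PID below is a placeholder):

```shell
# start a job at low priority (niceness 19 = most polite)
nice -n 19 sh -c 'echo "running at niceness $(nice)"'
# lower the priority of an already-running process:
# renice -n 10 -p 1234
```

The scheduler still lets niced processes run; it just favors everyone else when the CPU is contested – which is exactly the compromise multi-tasking appliances seem unwilling to make.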

What are your thoughts?

Hardware, Personal

Chrysler Sebring vs Sony MEX-BT2700

It is done. It seems like just yesterday that I installed a custom car stereo in my Chrysler Sebring. After many months of enjoyment, someone felt they were more entitled to my stereo than I was, and made off with it. Thanks, buddy (burn in hell)! I have been riding around with my totally awesome factory stereo for too long now, and my wife got me the Sony MEX-BT2700 stereo. It has everything I wanted: HD Radio, MP3 playback, Bluetooth calling and music streaming, an auxiliary-in port on the front, and a remote control. What follows is my intriguing tale of its installation:

Chrysler’s cryptic wiring schematics look like someone went to town on the Sebring’s blueprints with a 24-count box of Crayola crayons. I decided to get a wiring harness to bypass all of that nonsense this time. I also decided that instead of bending my arm into awkward positions, I would use a jack and remove the left front tire to get at the car battery. The “maintenance free” battery placement is the most aggravating part of working on my car; the consequences of Chrysler’s placement have ensured that any future car I own will have an easily accessible battery. I also ordered a universal mount for my stereo. With these choices, I figured this would be a walk in the park. I was almost right.

The wiring harness was a snap to connect – just match the colors. The mount fit as advertised. With the wheel gone, it was much easier to get at the battery. I connected my car stereo, reconnected the battery, turned the key in the ignition, and did an inadequate test: I turned the radio to the tuner and checked that all four speakers were producing sound. They were, and I wrapped everything up and called it a day prematurely. My test drive was out to the store, and I discovered on the way that the CD player didn’t produce sound, though it looked like it should. I fiddled with all the options, and sure enough, it was spinning the disc but not producing sound.

I got home, looked at the manual, and discovered that the Bluetooth source and the auxiliary source had the same issue: my phone was connected both times, but sound failed to come out of the car speakers. I did some Googling with no results, and then did a live chat with Sony customer support. The support was slow but surprisingly helpful – it put me on the right path to resolving my issue.

The next day I cut the wheel in the right-then-left fashion needed to get at the battery without removing the tire (thanks again, Chrysler), and disconnected the battery. I then pulled the stereo out and re-did the wiring in a few of the combinations support instructed me to try. From their documentation:

If there is no sound or the amplifier turns off when playing a CD, but there is sound and the amplifier is on when playing the radio, then you may have the solid blue (antenna remote lead – ANT REM) and blue striped (amplifier remote lead – AMP REM) wires connected incorrectly. Try switching these wire connections. In vehicles designed to supply power to the amplifier through the antenna power supply, connect the solid blue antenna wire to the solid red ignition wire. In all other vehicles, connect the blue and white amplifier turn-on wire to the solid red ignition wire.

My first point of confusion is that I don’t have an amp in my car; apparently the instructions still apply. The second point of confusion is that the solid red wires should be connected to the power antenna wire along with the existing power antenna wire – meaning three wires are joined together (the two red wires, plus the blue wire). I was hesitant to do this for fear of damaging the unit, worried that the voltages would add up to 24V. (They don’t – the wires are in parallel, so it remains 12V.) Apparently this is by design, and the unit will not work unless you do this step. After this, I hesitantly reconnected the stereo, and voila – it works.

I wanted to throw this out there in case anyone else has run into a similar problem with their car, or with their stereo.

Hardware, Software, Thoughts, Windows

Windows 7 Upgrade Isn’t All Gravy

I decided to give the gift of Windows 7 for the holidays this year. My in-laws had the unfortunate luck of replacing their old desktop running Windows ME with a new computer from Dell running Windows Vista 32-bit Home Premium. It seems they have leapfrogged from one worst-OS-ever-released to the next. The fact that they kept Windows ME running for almost five years should deserve some kind of medal.

I am running the license-unencumbered Windows 7 beta on my wife’s machine, and I have been fairly pleased with the results. On the strength of this experience, I took the plunge and bought the Windows 7 Professional 64-bit upgrade. Let’s break this down for a moment. There are (at least) five editions of Windows 7:

  • Home Basic
  • Home Premium
  • Professional
  • Enterprise (formerly Business?)
  • Ultimate

The good old comparison page over at Microsoft indicates that each “step up” in edition encompasses all the functionality of the lesser editions, plus some more stuff. This is a lie.

While this may be true for features, the upgrade paths seem to have been determined with a dart board (or more likely greed). Looking at upgrade paths for Windows Vista only, we can see the following:

From Windows Vista (SP1, SP2)    Upgrade to Windows 7
Business                         Professional, Enterprise, Ultimate
Enterprise                       Enterprise
Home Basic                       Home Basic, Home Premium, Ultimate
Home Premium                     Home Premium, Ultimate
Ultimate                         Ultimate

If you were like me, you may have sprung for the extras in the “Professional” edition but stopped short of “Ultimate,” since the features it offers over Professional are pretty lame. My in-laws’ machine is currently running Windows Vista Home Premium 32-bit edition, and you can see that by spending more on Professional than Home Premium, I have lost my ability to upgrade. That’s right: you pay more and get less.

This has led to some extravagant solutions. The easiest is to go buy the Windows 7 Ultimate Upgrade, but refunds on opened software are non-existent. Plus, I do not want to give any more money to a company that I feel has already cheated me. Another option is to use a utility to change which edition you have an upgrade disc for. Windows 7 ships with all editions on each disc, and this tool will let you change which one you have. Alternately, you can download a digital copy of a different edition for free.

This doesn’t affect the product key though. A Windows 7 Professional key will only activate the same edition.

To upgrade from Home Premium, I would have to acquire a Windows 7 Home Premium Upgrade disc using one of the methods above. I would then have to install Windows 7 Home Premium, and then use Windows Anytime Upgrade to do an in-place upgrade from Windows 7 Home Premium to Windows 7 Professional. Then I can use my Windows 7 Professional product key to activate my copy.

Navigate all that, and watch out for 32-bit versus 64-bit, since they are not cross-upgradeable or interchangeable whatsoever. Even the RAM requirements are different.

So much for Windows 7 being easy.

Computers, Hardware, Open-source, Personal, Software, Thoughts, Web

Google Set for World Domination

I don’t know what has happened in the last six or so months, but my relationship with Google has drastically changed. It started out with their superior searching – though it wasn’t quick for me to ditch everything else in favor of Google’s search engine. When Gmail came out, I remember laughing about how excited people were to get a coveted invitation. To me, they were a search company using their name to advertise what should have been a sub-par email solution. Now, I am a fan of Gmail.

Next, I remember using Google Maps, then Google Docs. Then came Google Picasa, and I was clued into Google Analytics.

Now is the point in my story where we arrive at Google Calendar. The interface was really good, and I noticed that whatever task I wanted to do always seemed to be right where I was looking for it – no doubt the product of heavy usability testing. When I compared it to Outlook webmail’s calendar interface, I remember laughing at how pathetic Microsoft’s offering was.

This is when I reached critical mass with Google’s products. Quickly following came Reader, the Chrome browser, my new Android phone (a Motorola Cliq), Listen, Wave, Checkout, iGoogle, and Products!

I think Google is the only company that can provide a consistent, integrated, cohesive experience rivaling what Apple offers its users. I am at the point now where if Google moved to a subscription-based model, I would almost undoubtedly pay whatever the monthly fee was.

Congrats, Google, on continuously creating and updating solid, tested, quality products!