Apple, Computers, Hardware, Linux, Software, Windows

Using Synergy Over VPN

I’ve been watching a lot of Linus Tech Tips, and one of their sponsors sells a product called Synergy. More information is on their product page: https://symless.com/synergy . To summarize, it is a software KVM (no video, so I guess it is “KM”?) solution. The use case was perfect for me. On my desk I have an Apple work laptop, a Windows desktop, and a Linux personal laptop. Anyone who has done extensive work on a laptop keyboard and touchpad knows that it isn’t optimal. I didn’t want multiple keyboards and mice on top of my desk because of the clutter. I dropped some money for Synergy, and it just works!


That is, until I had to connect to our company VPN the next week. They use a full tunneling solution, so when I connect, I lose everything. I can’t print, I can’t access my NAS, and most importantly I can’t access my keyboard and mouse. (The video is fine because it is hard-wired to an external monitor.) What to do?

SSH to the rescue! What is SSH? Secure Shell is a protocol that allows one computer to securely interface with another. However, we will just be using it for port forwarding, not for an interactive session. The goal is to take the OS X machine (the Synergy client) and SSH into the Windows machine (the Synergy server). Using this SSH connection, we can forward ports within it – a tunnel running inside the SSH connection. This will expose local port 24800 on OS X that actually points to port 24800 on the remote server. This is the port that Synergy uses for its connections.

You will need a few tools, and a little patience. Having just gone through this, I’m sharing for posterity, or maybe for anyone who has thrown in the towel over how badly a VPN cripples access to home devices.

I have the following Synergy setup:

  • Windows 10 Synergy server (keyboard and mouse are physically connected to the desktop)
  • OS X Synergy Client
  • Linux Synergy Client
  • Router with a local area network all these devices share
  • Admin access to the router for port forwarding
  • Autossh package for OS X (available via brew)

First step, get Windows 10 up to speed with SSH. How this isn’t built in as a service in the year 2017 I have no idea. Grab the OpenSSH server package for Windows from https://www.mls-software.com/opensshd.html . After downloading, extract and run the setup file. This will create a new Windows service for OpenSSH that will run on port 22. It prompts you to generate an SSH key for the server.

Once this server is running, you will need to add your user to the list of SSH users. Open up PowerShell as an administrator and change into the C:\Program Files\OpenSSH\bin directory. Run the following commands:

mkgroup -l >> ..\etc\group
mkpasswd -l >> ..\etc\passwd

Try and connect to your SSH server from the OS X client:

ssh <user>@<server IP> # e.g. ssh Ben@192.168.1.95

You should be prompted for your Windows password. Once you can successfully log in to the server, we can set up public key authentication. This removes the need for you to type in your password because you identify yourself with an SSH public key. From your OS X machine, get your public key:

cat ~/.ssh/id_rsa.pub
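
If that file doesn’t exist yet, generate a key pair first (accepting the defaults creates ~/.ssh/id_rsa and ~/.ssh/id_rsa.pub):

ssh-keygen -t rsa # accept the default location and optionally set a passphrase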

Put the contents of this file on your SSH server in the file C:\Program Files\OpenSSH\home\<user>\.ssh\authorized_keys (the home\<user> directory is actually a symlink to C:\Users\<user>). If the .ssh directory doesn’t exist, you will need to create it first – a quick PowerShell sketch for this follows the config changes below. Now we need to configure the server to allow public key authentication. Edit the C:\Program Files\OpenSSH\etc\sshd_config file and change the following lines:

StrictModes no
PubkeyAuthentication yes
AuthorizedKeysFile .ssh/authorized_keys
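
As mentioned above, if the .ssh directory doesn’t exist you can create it and the authorized_keys file from an elevated PowerShell prompt. This is just a minimal sketch – <user> and the key text are placeholders, and the path assumes the default profile location:

New-Item -ItemType Directory -Path "C:\Users\<user>\.ssh"
Add-Content -Path "C:\Users\<user>\.ssh\authorized_keys" -Value "ssh-rsa AAAA... you@your-mac" # paste the full contents of id_rsa.pub here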

Restart the OpenSSH server for the changes to take effect:

net stop opensshd
net start opensshd

You should now be able to SSH into the server same as before but without being prompted for a password.

Now we are ready to create an SSH tunnel. Before we incorporate autossh (which handles retries and monitoring), we will do a naive attempt with plain SSH. In the following command:

  • -f backgrounds the process
  • -L does port tunneling in the format of <local port>:<remote host>:<remote port>
  • -N do not run a command – just tunnel the port

ssh -f <user>@<remote public IP> -L 24800:<remote public IP>:24800 -N

If this works, you should see a [LISTEN] entry for port 24800 when you list open files:

lsof -n -i | grep 24800

You may need to set your server as the DMZ on your network. Or, to be safer, you can simply set up port forwarding. We will need ports 22 and 24800 to resolve to the Windows server. The instructions for how to do this vary widely by router vendor. Typically it is under a WAN section, and you are prompted for a port, a destination IP, a destination port, and a protocol. You want ports 22 and 24800 to route to your server IP for both TCP and UDP.

Configure your Synergy client to use localhost instead of the remote IP. You should now be able to operate your client from the server’s peripherals via Synergy.

Everything works great until the VPN connection is made, because establishing the VPN severs the SSH connection. In order to recover automatically, I have added autossh to persist this tunnel. On the OS X client, run the following instead of the plain ssh command:

AUTOSSH_POLL=10 autossh -M 20000 -f -N <user>@<remote public IP> -L 24800:<remote public IP>:24800

Now when a VPN connection is made, or a disconnection happens, autossh will detect that the tunnel is no longer alive and retry. Because Synergy’s software also retries, after a few seconds your connectivity should begin working again.
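
If you would rather not start autossh by hand after every reboot, one option is a launchd user agent. This was not part of my original setup, so treat it as a sketch: the label and file name are placeholders, youruser/your.public.ip stand in for your own values, and the /usr/local/bin/autossh path assumes a Homebrew install. Note that -f is dropped because launchd wants the process to stay in the foreground. Save something like this to ~/Library/LaunchAgents/com.example.synergy-tunnel.plist:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>Label</key>
  <string>com.example.synergy-tunnel</string>
  <key>EnvironmentVariables</key>
  <dict>
    <key>AUTOSSH_POLL</key>
    <string>10</string>
  </dict>
  <!-- same arguments as the manual autossh command above, minus -f -->
  <key>ProgramArguments</key>
  <array>
    <string>/usr/local/bin/autossh</string>
    <string>-M</string>
    <string>20000</string>
    <string>-N</string>
    <string>youruser@your.public.ip</string>
    <string>-L</string>
    <string>24800:your.public.ip:24800</string>
  </array>
  <key>RunAtLoad</key>
  <true/>
  <key>KeepAlive</key>
  <true/>
</dict>
</plist>

Load it with launchctl load ~/Library/LaunchAgents/com.example.synergy-tunnel.plist and the tunnel should come back on its own after logins and reboots.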

Thanks to Synergy for making a solid product, and for having first class Linux support.

Apple, Computers, Ruby, Software

Upgrading Ruby with Rbenv+Homebrew

Heroku has defaulted to Ruby 2.0 for all applications, so it’s probably time you updated that crufty old version you have been running. Unfortunately the process is less than straightforward, especially when using a version manager. Assuming you are running rbenv with a Homebrew version of ruby-build, this guide will get you running the latest version of Ruby.

To begin, check which versions of Ruby Rbenv knows about. Rbenv delegates this work to ruby-build:

rbenv install --list

Best case scenario, you have a recent version of ruby-build and you see the version of Ruby you want in this listing. At the time of this writing, version 2.0.0-p247 is the most current. If your desired version is present, skip the following steps and just install that Ruby version via:

rbenv install 2.0.0-p247

If your version is not present in the list, you will need to upgrade ruby-build so rbenv knows about the more recent versions of Ruby. Assuming you installed ruby-build via Homebrew, you can update it by issuing:

sudo brew update

Issuing this command complained about untracked files within the Homebrew directory (which is actually just a git clone of the Homebrew project). This may not be the correct way to fix this problem, but I issued the following command to stash these untracked files and uncommitted changes so they don’t interfere with the upgrade process:

cd /usr/local && git add . && git stash

This should now be a clean directory, and you can issue the brew update command again.

Now that brew is updated, you should have the latest “formula” for ruby-build. You can then issue the command to update ruby-build itself:

brew upgrade ruby-build

Once this completes, we can list versions of Ruby via rbenv again to ensure our desired Ruby version is now in the list. When you see it, you can issue the following command to install it:

rbenv install 2.0.0-p247

To use your shiny new version of Ruby, you can set this to be the default version:

rbenv global 2.0.0-p247

You can also set this per project, or override it by setting an environment variable, so don’t worry if not all your projects are Ruby 2.0 ready. You can easily switch between versions – and that’s the point of version management, right?
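
For example, to pin a single project, or to override the version for just one command (my_app is a placeholder; recent versions of rbenv record the project setting in a .ruby-version file):

cd ~/code/my_app && rbenv local 2.0.0-p247   # pins this project only
RBENV_VERSION=2.0.0-p247 ruby -v             # one-off override via an environment variable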

You can confirm you are running the latest Ruby version by issuing:

ruby -v

Note that you will need to re-bundle any gems from your Gemfile against the new Ruby version, as well as rehash any rbenv shims for these gem executables:

gem install bundler && bundle install && rbenv rehash

For more information, check out the rbenv and ruby-build documentation. To discover the latest stable version of Ruby, you can peek at the official Ruby download page for the current version and patch number. Finally, check out the Homebrew docs if you are still stuck.

Apple

Unthinking Fullscreen Mode

Apple’s fullscreen windows are driving me nuts! We were co-existing until I “lost” a very important terminal session. Turns out I didn’t lose it; it was in fullscreen mode, which omitted the window from cycling via Command+~ and left it absent from the app-specific Exposé (though it was present in Mission Control – something I seldom use).

I hope they change this behavior soon. Here are some observed pain points:

  • The two-second animation when switching from your desktop to fullscreen applications via Control+left|right arrow is not only unnecessary, but frustrating. This could be much faster, and without the easing deceleration at the end of the animation. The same holds true for Command+Tab (though that animation is mercifully quicker).
  • The hotkey for taking an application into fullscreen mode should be dictated by the window manager, not left up to the developers. Why not F11? Currently it’s Control+Command+F in Google Chrome, Command+Enter in iTerm, etc.
  • Why are these windows omitted from the task switcher? These programs are still running, so they should be reachable by the switcher. Is there another switcher hotkey to toggle between windows of the same application? Command+~ doesn’t work if the other window is in fullscreen mode.
  • Command+Tabbing between applications is problematic if you have multiple windows of the same application open and one is in fullscreen mode. If I switch from my fullscreen window to another application, switching back defaults to the non-fullscreen window on the desktop instead of returning to where I was.
  • Why are fullscreen windows moved into their own workspace? Just remove the window chrome and leave the window where it is. I was working in a fullscreen window and needed to use the calculator program. This created absolute insanity switching between applications due to the unnecessary animation each time. I ended up having to take the application out of fullscreen mode just to use the calculator in an efficient manner.
  • Why is there no transparency available to fullscreen applications? I like some transparency in my black-backgrounded terminal. No such luck if you also use fullscreen mode.

So here is my suggestion: get rid of the butt-ugly fullscreen arrows in the window chrome, and make that useless green button in the upper left-hand corner (you know, the one that sometimes makes windows resize _smaller_) the fullscreen button. If it isn’t broken, don’t fix it.

In the interim, I’ve almost abandoned Apple’s fullscreen mode in favor of Cinch for positioning fullscreen windows. It works like the Windows 7 window manager (and Ubuntu’s) and lets applications go fullscreen when you drag the window chrome to the top of the screen.

Apple, Computers, Events, Family, Personal, Ruby, Software, Thoughts, Vacations, Web

Cloudy, Cold and Hip – Two Weeks of Training in Portland

I’ve really enjoyed the last two weeks. My new employer, the recently acquired Analog Analytics, flew me out to Portland, Oregon for training. Portland is quite an amazing place. Skateboarders, cyclists, and runners abound, but with a laid-back attitude. It’s the greenest city I have ever visited. Stores seem to dispense only recyclable materials, including paper bags and foods in waxed cardboard containers. The entire city is very walkable without much danger of personal harm. The food was amazing, and the drinks even better. This city knows its coffees, teas, and beers. It has to be home to the most microbreweries of any city. Needless to say I have probably gained 5 pounds, and I am super caffeinated. Also, the proximity to all these hip restaurants is giving me second thoughts about living so far outside of the city limits. No lie, I even glanced at Portland housing prices.

It took me a few days to get oriented to the city and the work environment. The company runs out of the Ford Building, in the heart of quite a few cool restaurants and bars on the Southeast side of the city. In fact, it left me a little jealous, considering the hotel is surrounded only by fast food joints. I got a shiny new MacBook Pro (which I am currently battling to make as “boring” as possible). I can’t talk too much about the work, but it does hit the sweet spot of what I was looking for – a small team feel with deep pockets, and a launch date.

Kristin and Morrigan joined me for the second week and did their own thing, and they had a blast. They visited OMSI, Powell’s Books, Finnegan’s, several parks and malls, and some tasty food joints. I’m happy they got to experience some of what makes this city awesome.

I’m enjoying several aspects of the job in particular: a remote-driven environment, and pair programming. Training isn’t the best test run of this environment, as I am in the office every day for now. Once I am set up, I pick the hours. People hop online and offline according to their time zones, availability, etc. Every piece of communication and workflow is centered around remote teams.

Pair programming makes programming social. Despite the image that telling someone you are a programmer conjures, I really enjoy interacting with people. I remember teaming up with James, John, and many others at Clayton State to tackle some large issues with our portal and other systems. Since Clayton State, I have worked on a couple of teams, and it was almost always in isolation, save for 5-10 minute high-level meetings. The best part is, it’s actually kind of fun.

Pair programming was a tough adjustment for me. I’m used to presenting a final product and defending its implementation. I have all the answers. I know what the talking points are up front, and I am comfortable because I am the authority on the subject. Pair programming is letting your guard down, and conceding as much as contributing. You are two people working on a problem together, with neither party starting off knowing the complete solution. The work is certainly slower than solo programming, as incorporating input, early refactoring, and general discussion take up time. This team takes an interesting approach to combat some of the time drain: you can either pair program and merge directly, or work solo, in which case your code requires a peer review before merging. The choice is yours. The solo programming option will probably act as a safety valve for those days when I just want some time to myself. They also encourage “switching drivers” to vary the work. Interestingly, being the passenger requires more focus than driving, as you are trying to proactively find issues with the current approach.

I’m still struggling to embrace TDD. I don’t like the zealotry in the community when the topic comes up: the only two options presented are that either you test first, or you are just ignorant, undisciplined, or apathetic to the code you write. The truth is far from it. I figure things out by moving the pieces around – not by staring at them from a distance. That is not to say that there aren’t times when testing first is extremely useful, like when clarifying requirements. The test assertions (even with missing test bodies) are often enough to help solidify an attack plan. The amount of code coverage can be a hindrance though, as real-world tests always end up being more tightly coupled than you ideally want them to be. If you make seemingly small code changes, you can end up with quite a bit of the test suite failing (although with the same few errors repeating). If you mock and stub too much, you aren’t testing much that is useful. Even worse, the workflow doesn’t seem realistic: write the tests, verify the tests fail, write the code, verify the tests pass. The reality seems to be: write the tests (heavily guessing at the exact implementation), verify they fail, write the code, refactor almost all of your tests, and verify they pass. Given the choice, I think I’d still rather write code, then test the code to verify it does what I want in all scenarios. I’ve yet to meet a dyed-in-the-wool TDDer who sees any fault with this extra refactoring step. The subject of pre-written tests needing to be refactored seems to be glossed over. Maybe my opinion will be changed yet.

Things are looking awesome for this next step in my life! I’m keeping my fingers crossed for Railsconf tickets, since they are in my employer’s backyard. There are also a few missed restaurants I am meaning to visit next time I’m back up this way…

Apple, Computers, Linux, Open-source, Ruby, Software, Thoughts, Web

PostgreSQL for Ruby on Rails on Ubuntu

My new desktop came in at work this week, and the installation was painless thanks to the great driver support of Ubuntu 11.10. For anyone setting up a Rails development box based on Linux, I have some tips to get around some pain points when using a PostgreSQL database.

Installation

Postgres can be quickly and easily installed using apt-get on Debian or Ubuntu based distributions. Issue the command:

apt-get install postgresql

Ruby Driver

In order for Ruby to connect to PostgreSQL databases, you will need to install the pg gem. This gem will need the development package of PostgreSQL to successfully build its native extension. To install the PostgreSQL development package, issue the following command:

apt-get install libpq-dev # EDIT: postgresql-dev was replaced by this package on Ubuntu 11.10

Set Up a PostgreSQL Role

You can configure PostgreSQL to allow your account to have superuser access, allowing your Rails tasks to create and drop databases. This is useful for development, but is strongly discouraged for production. That being said, we can create a PostgreSQL role by logging into psql as postgres as follows:

su postgres -c psql

This will open a PostgreSQL prompt as the database owner postgres. Next, we need to create an account for our user. This should match the response from “whoami”:

create role <username> superuser login;

We can now exit from psql by issuing “\q”. Try to connect to psql directly by issuing the following command from your shell account:

psql postgres

This should allow you to connect to the default database postgres without being prompted for credentials. You should now be able to issue the rake commands for creating and dropping the database:

rake db:create
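
For these rake tasks to talk to Postgres, your config/database.yml needs to use the postgresql adapter. This is just a minimal sketch – myapp_development and the username are placeholders (the username should match the role created above):

development:
  adapter: postgresql
  encoding: unicode
  database: myapp_development   # placeholder – use your application's name
  pool: 5
  username: <username>          # the role created above; should match whoami
  password:
  host: localhost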

RSpec Prompts for Credentials

I was being prompted by RSpec for credentials when running my test suite. If you would like to remove this credential prompt, please read the following:

There are differences in how the PostgreSQL package is configured by Homebrew on OS X and how it is packaged on Ubuntu and other distributions. One difference is the level of security configured in the pg_hba.conf file. This file is responsible for identifying which sources, using which authentication mechanisms, should be allowed or denied. By default, RSpec will cause a prompt for a password even if your shell account has trusted permissions. This is because RSpec connects not as a local process, but to localhost. To allow connections to localhost to be trusted, you will need to modify the pg_hba.conf file.

On Ubuntu, the pg_hba.conf file is located at /etc/postgresql/<version>/main/pg_hba.conf

Comment out any lines at the bottom of the file and append the following:

local   all             all                                      trust
host    all             all              127.0.0.1/32            trust
host    all             all              ::1/128                 trust

This will allow connections from the shell, as well as connections to 127.0.0.1 (localhost) using both IPv4 and IPv6.

You will need to restart PostgreSQL for the changes to this file to take effect:

/etc/init.d/postgresql restart

PostgreSQL Extensions

If you want to make use of any of the additional extensions to Postgres, including fuzzystrmatch, you will need to install the postgresql-contrib package:

apt-get install postgresql-contrib

The extensions will install to /usr/share/postgresql/<version>/extension/

Using Postgres version 9, you can create these extensions in your database by using the new CREATE EXTENSION syntax. In the case of the fuzzystrmatch extension, first connect to your database with psql:

psql <database name>

Once inside your database:

create extension fuzzystrmatch;
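
Once the extension is created, you can sanity-check it with a couple of its functions, for example:

select levenshtein('hello', 'hallo');        -- edit distance between the two strings (returns 1)
select soundex('Robert'), soundex('Rupert'); -- phonetic codes useful for fuzzy name matching
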
Apple, Computers, Linux, Open-source, Software

Tell Tar to Auto Compress Those Files!

Just discovered a handy shortcut when working with the GNU utility “tar”. Like many other Unix utilities, the switches that you pass to tar change its behavior. To create a “plain” tar file (combining multiple files into a single tape-archive file – similar to zip, but without compression) you can execute the following:

bsimpson@Saturn:/tmp$ echo "test" > test
bsimpson@Saturn:/tmp$ tar -cf test.tar test
bsimpson@Saturn:/tmp$ file test*
test:     ASCII text
test.tar: POSIX tar archive (GNU)

The “c” switch creates a new archive, and the “f” switch tells it that the name of the new archive follows. Similarly, we can untar this using the “x” switch, mutually exclusive with “c” for its create counterpart.

We can also apply compression to these new archives:

bsimpson@Saturn:/tmp$ echo "test" > test
bsimpson@Saturn:/tmp$ tar -czf test.tar.gz test
bsimpson@Saturn:/tmp$ file test*
test:        ASCII text
test.tar.gz: gzip compressed data, from Unix, last modified: Tue Jan 11 22:31:34 2011

Or alternately, we can provide the bzip2 compression:

bsimpson@Saturn:/tmp$ echo "test" > test
bsimpson@Saturn:/tmp$ tar -cjf test.tar.bz2 test
bsimpson@Saturn:/tmp$ file test*
test:         ASCII text
test.tar.bz2: bzip2 compressed data, block size = 900k

All well and good; however, tar also allows you to specify the compression (or lack thereof) based on the new filename when you use the “a” (auto-compress) switch. You call the file what you want, and tar will figure out which compression to apply. Consider the following:

bsimpson@Saturn:/tmp$ echo "test" > test
bsimpson@Saturn:/tmp$ tar -caf test.tar test
bsimpson@Saturn:/tmp$ tar -caf test.tar.gz test
bsimpson@Saturn:/tmp$ tar -caf test.tar.bz2 test
bsimpson@Saturn:/tmp$ file test*
test:         ASCII text
test.tar:     POSIX tar archive (GNU)
test.tar.bz2: bzip2 compressed data, block size = 900k
test.tar.gz:  gzip compressed data, from Unix, last modified: Tue Jan 11 22:24:43 2011

As a side note, these do not stack. For example “test.tar.gz.bz2” just produces a bzip2 encoded ASCII test file. If you *really* wanted this, you could use the pipe command to chain your compressions.
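
For example, a doubly compressed archive (for whatever that is worth) could be produced by writing the tar stream to stdout and running it through both compressors yourself:

tar -cf - test | gzip | bzip2 > test.tar.gz.bz2   # "-" sends the archive to stdout for the pipeline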

Hope this saves some time!

Apple, Computers, Hardware, Linux, Open-source, Personal, Ruby, Software, Windows

Living in an Apple World

Welcome, readers, to what is a first here on my blog – a review of Apple’s OS X. As some of you may know, part of my new job is working on a Mac for 8 hours a day, 5 days a week. Someone asked me about my experiences, and I feel up to sharing my findings. I want to be fair in my assessments, so if it sounds like I am starting to get a little slanted, keep me in check with a comment!

First things first – the initial impression. I have a 27″ iMac and I was initially impressed by the appearance of the machine. The iMac screen and case are one piece, so I have plenty of room to kick around beneath my desk with minimal cord entanglement (not that it matters, because I sit cross-legged all day). The compact-style keyboard has an aluminum casing, which matches the iMac. The mouse is the Mighty Mouse. Both are wired, which I appreciate – especially on the mouse. I hated the compact keyboard since it feels shrunken, and the addition of the “Fn” key in the bottom row meant every time I tried to press “Control” I missed. After swapping this out for a full-sized keyboard I was much happier, and even unlearned some bad habits. The Mighty Mouse absolutely sucks. The tiny wheel stops responding all the time from the slightest speck of dirt, and you have to turn the mouse over and rub it back and forth on your jeans or the mouse pad. Its one saving feature is the ability to scroll vertically and horizontally, which is occasionally helpful. I am a right-click fan, and though invisible, the region on the mouse that registers as a right click versus a left is about 10 times smaller. It feels like the edge of the mouse.

The keyboard on a Mac is different in important ways from its PC counterparts. The “Windows” key is replaced with the Command key, which is utilized far more than the Windows key ever was. In fact, most of the operations of the machine are done using Command (copy, paste, new tab, close window, etc.), effectively making it closer to the “Control” key in Windows. However, the Control key remains, which actually introduces a whole new key combination for using shortcuts more effectively. The Command key is located next to the space bar, which is much more convenient than the extreme left placement of the Control key. I do copy, paste, etc. operations using my thumb, and not my pinky finger – much less strain.

The computer screen can be tilted, which is nice since the whole world seems to be moving towards the annoying high-gloss screens. I can tilt it down and out of the fluorescent overhead lights. I really feel that gloss is a showroom gimmick, just like turning the brightness up to max on the TVs in the store. If I wanted to look at myself, I would sit in front of a mirror. Fortunately, I have a second non-gloss monitor, and I do most of my coding on that screen. Also, it would be nice if the monitor had a height adjustment, as the second monitor isn’t quite the height of the iMac screen.

Enough about appearance – let’s talk hardware. This is a dual-core Intel-based processor, with 2 GB of memory (later upgraded to 4 GB). The video card is decent I suppose (however, the interface can get quite “laggy” at times). I don’t have any idea what the machine costs, but this is definitely unimpressive hardware. 2 GB of RAM is the minimum I would work with, and it being slow laptop RAM doesn’t help at all. At least there isn’t a laptop hard drive in it too.

As for the Operating System, it seems pretty stripped down. This isn’t necessarily a bad thing – I can quickly find what I am looking for, without going on a damn field trip through obscure dialog windows. The flip side to this is that it doesn’t feel very “customizable”. You use the stock features, or you don’t use a Mac. Perhaps there are a bunch of third-party utilities that I don’t know about? Sometimes I am disappointed by the lack of customization options (there are just a handful of settings for the dock). To be honest, I am not sure what I would customize, but I like to poke around, and I often leave System Preferences disappointed, not having found “setting xyz”.

I really enjoy the file system indexing, and they have the best implementation of full-text search I have seen. It doesn’t bog down the computer, and the results are instantly updated. Magic. It effectively is the starting point for all my open actions. I don’t know why it isn’t available for the first 10 minutes after a boot, but I don’t shut down that much, so it’s OK.

I was surprised by the lack of a default system-wide notification system – something that Growl has aimed to fill. I was also disappointed by the lack of package management on the Mac – again, third-party solutions exist. The system updates are just as annoying as in Windows, which was a disappointment. Once, the “restart” prompt stole my typing focus and proceeded to shut down the system. A few times the machine has “beach balled” (the Mac “hourglass” icon) and hard locked. Most of the time it’s fairly responsive and stable, which I can appreciate.

Other points of interest are the window management. I use Exposé almost as regularly as I do the task switcher (Command + Tab), though admittedly sometimes I get lost in the special effects and forget what I was doing. There are a bunch of other window groupings, but I don’t really find them that useful. One particularly frustrating observation is that once you minimize a window, you can’t Command + Tab back to it. Isn’t that the point of the task switcher? It even shows up in the task switcher, but when it is selected, absolutely nothing happens.

As for the software available on the Mac, it is more comprehensive than Linux’s, and less comprehensive than Windows’. Some of my co-workers commented that in OS X, there is usually one utility to do something, whether you like it or not. I use Google Chrome, JetBrains’ RubyMine, Ruby, Terminal, Lotus Notes, Adium, and Propane almost exclusively. Because of this, I can’t really assess the state of the Mac software ecosystem, but I will say that all these programs run damn well on the Mac. The only software crash I have is Flash. Flash on Linux and Windows is stable; however, on the Mac probably one in ten uses causes the browser tab to lock up. I am not sure whether this is a Chrome issue or not, but something is seriously wrong with the state of Flash on my Mac. Now I understand why so many Mac users hate Flash – as a Windows user, I never experienced the constant crashing.

In summary, due to the nature of my work, I use the Mac at work in essentially the same manner I would use Linux. The terminal is where I spend my time, and I am more or less indifferent to the operating system around it, as long as I can install the system libraries I need for Ruby extensions and it stays responsive. My next computer purchase will be a netbook, and I will install Ubuntu on it, as I can’t justify paying the designer prices of Apple products to use a terminal and a web browser. Toe to toe with Windows and many Linux distributions, OS X excels in many areas. It’s a fantastic operating system, but I am not sure that it is worth its cost. If I could throw it on my PC at home it would be worth $100. Buying a special machine just to run it is just silly.