DIY Home Theater

We recently bought a house with an extra room in the basement. The other rooms all had the same paint color, carpet, lighting and other finishes. We decided to take one of the rooms and give it some personality by turning it into a home theater. We could have just ordered a TV, some speakers and a couch and called it a day, but the room needed some character. Our budget for this project was $15k, but we soon realized we would only need about half of that and still get everything we wanted.

After considering a few options we decided on a dark blue paint (Behr Superior Blue) for the entire room and an accent wall with board and batten. Aside from waiting for a TV sale, the room was transformed inside of just a few weeks. I went with an LG OLED for its superior contrast – something I’d been envious of since seeing my friends’ setups. There is no LED backlight dimming – and no glow – with an OLED. But it is costly, and this was our biggest line item on the home theater. I paired it with a pair of Klipsch R-625FA floor speakers, the R-12SW subwoofer, the R-52C center channel, and R-41 bookshelf speakers for the rear. Powering this is the Sony DH790 receiver. For furniture we chose the Larna Park Taupe sectional couch with ottoman. This was our second biggest expense, with the speakers and receiver being third. Final touches include a Quaniece console for some additional storage, a pair of Beaker Plug-In Black Wall Sconces, and a Britton series ceiling fan from Home Depot. Stay tuned for the total cost!

Day one was preparation and a supply run to Home Depot. We watched a few YouTube DIY videos and decided on 3 1/2″ primed pine board – 6 vertical battens and 2 horizontal battens, for a total of 12 of the 3 1/2″ × 8′ boards. This was roughly determined by how large we wanted each square of the grid, using an online calculator. The calculator takes the guesswork out of positioning, padding, and board widths and just tells you how far apart to space them. The first run of boards, 1 gallon of paint, liquid nails and other materials was ~$200. I was fortunate to be able to borrow a brad nailer and air tank from my father-in-law, so that was not a tool I had to purchase. To make cleanup easy I decided to buy a new painting kit for $10 and chuck it in the trash when I was done. Cleaning a paint roller takes forever, and it never works as well the second time anyway. You will also need some other basic tools – tape measure, saw, level, hammer, screwdriver, caulking gun. I’m not including these in the cost of this project since I use them for everything.
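If you’d rather skip the online calculator, the spacing math is simple enough to script yourself. Here’s a minimal Ruby sketch – note the wall width and batten count below are made-up example numbers, not our actual measurements:

```ruby
# Even spacing for vertical battens across a wall.
# Example numbers only – measure your own wall!
wall_width  = 148.0  # wall width in inches
board_width = 3.5    # 3 1/2" primed pine board
battens     = 6      # vertical battens, including one at each end

# The wall space left over after the boards is split evenly across the
# gaps between battens (battens - 1 gaps when a batten sits at each end).
gap = (wall_width - battens * board_width) / (battens - 1)
puts "Space battens #{gap.round(2)}\" apart, edge to edge"
```

With these example numbers that works out to 25.4″ between boards – adjust the batten count until the squares look right for your wall.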

First we took down the existing crown moulding on the intended accent wall. We had a decision to make: 1) take down all the trim, 2) take down just that wall’s trim and join it to the board and batten, or 3) build the frame below the existing crown moulding. We opted for number 2 after pulling down the crown from one wall and seeing a sizeable gap between the drywall and ceiling. That would have been a lot to contend with across the entire room. To cut back the crown I used a saw to make a 90 degree cut so it could sit flush against the batten frame. Brad nails about every two feet, and liquid nails on the back of the board.

The room is so plain – yuck! Day two is when the project really took off. We put up our first vertical board very carefully, measuring everything multiple times and generally moving slowly. It soon became a routine, however. Don’t assume your wall height or width is uniform – measure each cut individually. We did all the vertical boards, using brad nails and liquid nails on the board backs. Next we measured and cut the horizontal boards. A few of the videos suggested using a spacer board, but I found it unnecessary. Instead we just measured each board placement and attached it. A laser level makes the horizontal boards go much quicker. We made a line all the way across (you want true level, not parallel to the ceiling/floor). That was pretty much the second day. As a nice end-of-day bonus, the TV and the speakers both arrived.

Day three we prepped for painting. This involved some drywall repair in other sections of the room, sanding, removing face-plates, spackling, and caulking. Wood putty is applied to the batten board joints to make them look like a single piece of wood. It has to be applied, allowed to dry, then sanded flush. The spackle took a few passes – applying, letting it dry, then building on top in some of the worst places. A lesson learned here was to apply caulk to the interior squares. We were not going to do this at first, but it was super quick, and it looks fantastic – especially where the board and the wall had a gap.

Day four was a continuation of painting with an emphasis on the detail work. We taped off our edges and really tried to clean up our lines and fix some drips and sparse areas. This step seems to be the biggest factor separating what looks like a DIY project from what looks like a professional one. You just have to keep going, getting more and more precise. The goal is that anywhere you look, you see sharp lines and cleanly applied paint.

Day five the sectional couch arrived in the morning. We went with a two-tone gray and dark brown to match the dark blue paint color. The sectional dimensions work well with the room, fitting almost perfectly against the back wall in front of the batten. I absolutely love how much room the sectional provides – the entire family can comfortably sit and watch a movie together. We mounted the TV on the opposite wall with a slim profile wall bracket to make it blend in like a hanging picture.

The slim mount actually had two settings – one with a more aggressive tilt. The TV is so thin, however, that it felt better to use a very slight 5 degree tilt to optimize viewing. The TV was around 68 inches across, so we bought an 88 inch console so the console didn’t look dwarfed by the TV. It has shelves and doors to hide the controllers and cables, and provides a buffer to keep people from bumping into the TV screen.

Next up was the speaker installation. The Platin Monaco speakers were unfortunately a big disappointment. Lots of popping, dropping, crackling, and speakers getting “stuck” on a note like a frozen computer. The nail in the coffin, however, was that WiSA doesn’t support DTS – the surround sound format that virtually every Blu-ray disc uses. These went back and were replaced with a more aggressive sound system – at roughly the same cost, by catching a good sale on the speakers and using an Amazon gift card on the receiver. I took a gamble on WiSA, but if you care about your audio, move on to a wired system – there’s no comparison, and I’m not even a trained ear.

Running all the speaker wire was one of the hardest parts of the project. I had a door frame on either side of the wall with the TV, so getting to the rear speakers was an undertaking. I initially bought some 1/2″ cable raceways, painted them all, and was going to tuck the speaker wire inside them. This might have worked well, but the new sound system came with a large 10″ subwoofer that I wanted to locate on the other side of the room between the couch and the wall. Wherever the rear speaker cables went, the subwoofer’s monaural cable needed to go too. And at lengths > 30″ it is recommended it be shielded – which of course means extra bulk. The cables were too tight a fit for the raceway, and after painstakingly getting them somewhat crammed in there I realized that the adhesive was not sufficient to keep it stuck to the walls and ceiling. I didn’t want a larger raceway, and I still didn’t want cable staples with visible cable. After debating options I ended up fishing the cables under the carpet at the doorway, and tucked them under the baseboards along the rest of the wall as best I could. It was the third or fourth option, but I’m happy with it. You don’t notice unless you step on the cable without wearing shoes.

The final step was crimping all the speaker cables, connecting the speakers, and giving it a run for its money. After about 30 minutes of Dunkirk on Blu-ray I realized I wasn’t getting signal from my left speaker. Stupid mistake – on checking the speaker I realized I had neglected to connect the left channel, having connected only the Atmos/height channel further up the speaker. Easy remedy.

One other pain point was getting the PS5 to output 4K + HDR via the receiver. The receiver sits between the TV and PS5 and I get all the niceties that entails – synchronized power on/off, and the TV remote controlling the PS5 and the receiver volume. However the PS5 kept informing me that to output HDR I needed to drop to 1080p. I was thinking the receiver just wasn’t going to support this feature when I found a Reddit post from a user with the same issue, who resolved it by setting the HDMI port the PS5 is plugged into to “enhanced” mode in the AV options. After a restart the PS5 immediately picked up 4K + HDR, completing the build.

The painted cable raceways that I abandoned were not a complete loss, however. I used one of the larger ones under the TV instead of a gang plate and running cables through the wall. It pairs with the batten somewhat unintentionally, and is a whole lot easier than fishing cables. The sconce cables were also secured with some of the raceway meant for the speaker cable, which cleaned up that installation. The discarded raceways even got involved in fishing the cables under the carpet!

The ceiling fan was also a late purchase, made when we realized the old fan had a wobble at higher speeds. Instead of trying to balance it, we decided that the rest of the room had so much character that the contractor model just didn’t fit any longer. This was a relatively simple swap, with the added benefits of quieter operation and a remote control.

We are really happy with the final product, and by doing it ourselves we saved quite a bit of money. All said and done we spent about $6,800. Having done it ourselves there is an extra feeling of pride and attachment, and we are ready to show it off to family and friends. Had we paid to have it done we might have had an electrician run wiring and a new switch for the wall sconces, but the plug-in ones seem fine – especially against the dark paint. This should make a great room for movie watching, and some gaming sessions on the PS5 and Switch. I do wish the room had already been wired for cable, but fishing it under the carpet only took an hour or so and I’m happy with the result. The only real disappointment of the project was the WiSA speakers. But I took a gamble on a new technology, and instead ended up returning them and getting some seriously amazing sound as a backup. Another unintended side effect – these particular sconces rotate downward and make perfect reading lights. I’ve been starting my mornings reading my book, listening to music through the speakers, and enjoying the couch.

Future additions include floating shelves to replace the bookshelf, with space underneath for the subwoofer. The monaural cable is already run to that area because I’m planning on this happening in the near future. We also still have some movie art to hang. And a fun concession stand stocked with movie theater staples like candy, drinks, and a popcorn machine centerpiece will go right outside the door, in a little inlet next to the converted movie theater. It should pair well with our next project – the wet bar with beer and wine storage!

The Case Against Native Apps

Someone recently gave me a gift certificate to an online delivery service. There was nowhere to redeem it on their website. The FAQs had instructions describing links that didn’t even exist. Then it occurred to me – this is an app-only feature. And then it struck me – why?

I have been doing full stack development for almost two decades now, so it has been interesting to watch industry trends. When smartphones really started gaining traction we had the App Store race – even when many of those apps just served the website in a web view. It was reminiscent of the Dot Com era, when everyone needed a website even if there was no business value in having one. Steve Jobs even famously pushed back against the iPhone SDK and wanted developers to use HTML+CSS+JS to build their experiences on his device.

Let’s discuss some of the challenges of working at a company with an app:

Adding a new feature on the web is as straightforward as merging your feature branch and deploying. All clients instantly get the update the next time they visit the website. No updates required. Except with apps we threw out that amazing benefit of distributing software over the web and went back to release cycles. Worse, there is now an intermediary standing between you and your customers in the form of app store approval on Apple devices.

Related to release cycles, you also have to deal with legacy version support, as not all customers auto-update their apps – or even can update to the latest version because of OS restrictions on their devices. This means a long tail of support: you have to keep legacy code around until some threshold of your users have updated and you are comfortable disenfranchising the rest.

As a full stack developer I design a feature, and then the native app teams… redesign the same feature using native screens. This duplication costs time and resources. Worse still, it is per platform – desktop, Android, and iOS are all separate. Thank goodness Microsoft failed with their Windows phone or there would be a THIRD native app team. In many ways I think the duopoly at play here cemented the feature workflows we have today. Let’s say 10 phones had succeeded – there’s no way we wouldn’t have figured out how to build something once and have it work on all of them (hint – it’s HTML+CSS+JS). But just two vendors plus desktop? We can turn a blind eye to triplication.

One idea I’ve been thinking about recently is how these native teams integrate with the other product teams. I’ve traditionally seen the iOS and Android teams almost exclusively partitioned off as their own product teams. But they aren’t products – they are platforms! Perhaps it is better to embed native developers into the product teams from an organizational standpoint.

Now why, you may ask, do we have these native apps at all? Remember Steve Jobs famously didn’t want them. Yet in the year 2021 everything is an app. The web just serves as a funnel to the app – how many times have you been annoyed into downloading the app? Is the web not capable of these app experiences? For the most part the web is on par with native app functionality. JavaScript has come a long way: it can access sensor data and geolocation, offer speech to text, provide push notifications and camera access, and power single page app experiences that rival anything you can do in a native app. Many HTML transitions and animations are hardware accelerated, so even on a mobile device you get a super smooth experience. What about layout considerations on smaller screens? Responsive design has this solved too, with media queries to adjust your website layout to different screen sizes.

So if it isn’t technical (or at least not largely a technical hindrance), why do we need these apps?! The answer, I think, is money. Google and Apple don’t get 30% cuts on payments made on the web, because the web is an open platform. They only get that money from their walled gardens. These companies have unfathomable amounts of money and market the hell out of their app stores. I don’t see anything near that amount of money marketing the web – despite these same companies sitting on the committees approving web standards and implementing them in their own browsers (on their own operating systems!).

It feels like the web has been neglected thanks to an advertising campaign of FOMO that rivals the Dot Com era domain name grab. The more apps I try, the more I’m convinced the app I downloaded didn’t need to be an app at all. And it’s worse for it, with slower release cycles, duplicated development (effort that could have been spent elsewhere), and the general inconvenience of users being bombarded to “Download the app”. No thank you.

I’ll leave you with a wild ass prediction – in 10 years we will look back on apps the way we look back on Rich Internet Application platforms like Silverlight and Flex. It will seem silly that we rebuilt what the web already provided. If Google and Apple really wanted these features they should have integrated them into their browsers – not made a whole new platform. We will realize that the customer does NOT benefit from an app drawer of 100 icons that we have to hunt and peck through to do some task. The web was here before apps, and I believe it will be here long after apps. The native app developers will be happy to know they aren’t out of a job – they can happily find work on the web teams once the platform partitioning falls away. All it takes is a realization that the web is OK, and we don’t need to spend extra money just to give 30% of it to Google or Apple – they will be just fine.

The American Dream

I remember being twenty. I was excited by what the day held. I can work anywhere. I can go any place. My life can be anything I want it to be. I just have to finish my degree. Then my real life will start. I just need to decide where to live and buy that house. Then my real connection with my community will start. I just need to find what makes me happy to work on, then I’ll have the perfect job and Monday will feel as good as Saturday.

Now I’m in my thirties. It is a little harder to get up each morning. It takes a little longer to get going. I have committed to my job and at this point I don’t know what else I’d do and make this kind of money. A door closes on other careers. Maybe it doesn’t, but I don’t have the energy to get up and check if it is still open – even just a crack. If it is, am I passionate enough about anything to restart?

I have my own family now. I have kids and I want them to have a relationship with their grandparents while they can. I want them to have the memories that I have of going to grandma’s house and reading comics, and playing with different toys, eating cake and cookies, and staying up later than I should have. Of feeling like a VIP. That means of all the places I could have lived, the radius of where I can live with this choice is a lot smaller. Within an hour or so is about as far away as I want to get.

I remember being so excited by remote work. I thought I had found the unicorn job. This would free up so much time to pursue other hobbies and invest in my personal relationships. The work week would barely register on my mind. Except that now I don’t know as many people, and I’m not meeting more, so that social circle is shrinking every day. The connections I hoped to form with my neighbors didn’t pan out. We seldom speak, and we don’t share much in common. And the work is the same in an office or at home. The work week is still the dominating force in my schedule.

The real struggle now isn’t maintaining any of the things I set out to achieve in my twenties. I have a house, and a family, and career. What I struggle with is what’s next. Or what else. Or maybe it is the same question?

And will I have the energy to pursue the Next Thing even if I find it? Or will I never find it because, just like those other doors, I haven’t the curiosity to get up and peek? Is the next thirty years staying the course? I’m mostly happy and healthy, but I feel underutilized. On paper everything looks good. But I’m struggling to feel that sense of accomplishment internally.

I retire. I live in some dream place. Then what? I’m contemplating what the lesson is here. What should I have done? I hope at the end I have a connection to my family. That was about it. And in a few generations my great great grandkids will struggle to remember my name. Some silly job he did that is now obsolete.

I think that is what bothers me the most. Everything we’ve forgotten. Each life seems to be its own independent arc. And looking down from the apex is really making me question its purpose.

Joe Biden’s Presidential Inauguration Speech

Joe Biden Jr. was sworn in as the United States’ 46th president today. I found the speech extraordinary in its call to unity and healing. This is what a president sounds like. The following excerpt I found particularly moving; it encapsulates to me what it means to be part of the American dream:

And together we will write an American story of hope, not fear. Of unity not division, of light not darkness. A story of decency and dignity, love and healing, greatness and goodness. May this be the story that guides us. The story that inspires us. And the story that tells ages yet to come that we answered the call of history, we met the moment. Democracy and hope, truth and justice, did not die on our watch but thrive.

That America secured liberty at home and stood once again as a beacon to the world. That is what we owe our forebears, one another, and generations to follow.

So with purpose and resolve, we turn to those tasks of our time. Sustained by faith, driven by conviction and devoted to one another and the country we love with all our hearts.

President Joe Biden – inauguration speech – January 20th 2021


Gaming On Linux – The Metagame

I’ve always been a big PC gamer. But my latest hobby combines that gaming passion with Linux. I’ve been watching Linux gaming from the sidelines for years, but my assessment was always that Windows was for gaming and Linux was for Getting Shit Done. The two worlds didn’t overlap, and I was resigned to dual booting or using two entirely separate machines.

This has all changed in a relatively short period of time, and the catalysts are the DXVK project and Proton, a compatibility layer for running Windows games on Linux. Proton is a fork of WINE that really focuses on gaming compatibility. The really cool trick of DXVK is converting DirectX calls to Vulkan (DirectX to Vulkan) – a lower level graphics API. This allows games to run very close to (if not, on some titles like RDR2, even slightly better than) their DirectX counterparts.

In addition to Proton (with DXVK), two other initiatives have really propelled Linux gaming into what I would call the realm of approachability: the native Steam client, and the Lutris project.


Steam, in case you have been living under a rock, has a huge market share and influence on how games are distributed. There have been many imitators, but Steam still does it best – at least as far as Linux is concerned. A few titles have native Linux clients, but most don’t. What Steam on Linux does is seamlessly integrate its compatibility tool, Proton, with its UI. You can open a game’s properties and check “Force the use of a specific Steam Play Compatibility Tool”. (Incidentally, Deus Ex: Mankind Divided, Pillars of Eternity, and Age of Wonders III are all Linux native!)


Lutris takes a more inclusive approach. While it can leverage DXVK-enabled WINE builds (and even Proton builds), it lets you access games from all of the other game clients, including the Epic Games Launcher, Uplay, and Origin. The software operates as a database of user-maintained scripts that configure WINE environments for maximum compatibility. It creates a bottle per game, and when you uninstall, it removes the entire bottle. No leftover DLLs or files – each game exists in complete encapsulation. Even better, because they are all encapsulated, you can run one game as Windows XP and another as Windows 10 – each contains its own environment. Below you can see that I have Far Cry 5 (Uplay), Titanfall 2 (Origin), and Control (Epic Games Store) alongside standalone games (Return of the Obra Dinn is a standalone install).

I’ve been so thrilled with Lutris that I support them on Patreon.

Gaming Considerations

Caveat emptor! With all things Linux you need to be prepared to tinker. If you want to crack open a beer, boot a game, and jump right in, then you’d best look elsewhere. That being said, I’d take a wild ass guess and say 80% of games just work, 10% work with minor tinkering, and the remainder either don’t work at all or don’t work well enough to be enjoyable because of performance issues (looking at you, Control and Anno 1800). Most games run smooth as butter, and it’s easy to forget they’re not running on Windows.

The biggest performance issue seems to be what I call microstutter – a jarring experience where ~5 or so frames drop, usually while assets are loading. I can’t be sure it isn’t CPU overhead from my encrypted partition. Other people talk about caching shaders: this uses a bit of your disk to store compiled shaders so they are ready to use instead of being translated in real time.

Before purchasing a game I typically check these three things:

  • Is there a native Linux client? (The answer is usually no, but I’ve been surprised.)

  • Are there compatibility issues listed on ProtonDB?

  • (Non-Steam titles only) Is there an installer on Lutris?

Another big consideration is that, aside from Steam, most of the gaming clients are bad. I don’t think this is a problem specific to Linux – they are just bad. Often, to fix a crash or performance issue, you will need to disable all of the UI overlay garbage that these launchers try to throw on top of your game. Often they tell you to put the client into offline mode, which likely means multiplayer support is going to be limited on these titles. If you need multiplayer, Steam is probably your best bet – Linux is officially supported, the build is very stable, and I’ve played many hours of multiplayer games (Divinity 2, Tabletop Simulator, and others).


A few other thoughts and tools to get you started. MangoHUD is a great FPS overlay – it not only shows FPS, but also frametimes and CPU/GPU loads.

GameMode is a utility that is supposed to improve game performance. It changes the nice/ionice levels of your games and manages your CPU governor (if on a laptop) to maximize performance. I’m not sure how much of a difference it makes – and these are changes you can make yourself – but this tool does it automatically when a game is launched. Lutris and a few titles natively look for this executable and use it if available.

Of course you’ll want the latest nVidia or AMD GPU drivers. There are some useful tools in the nVidia control panel, like forcing a composition pipeline – a rather interesting way to deal with frame tearing that differs from V-sync.


Gaming on Linux is something I never thought I’d see happen. I’m sure cloud gaming (aka game streaming) has been propelling development in this area. After all, companies like Valve aren’t working on these projects out of the goodness of their hearts. And if you can get a Windows game to run in a DXVK environment, you’ve made it cloud ready and can scale up Linux servers better than you probably could Windows servers. That is my theory anyway.

It is also wonderful not having to switch operating systems when I want to work or game. I can do both. In fact – my workspace 3 is now the gaming workspace. I’m back to one machine, and one environment that is comprehensive for my needs.

If you’ve been on the sidelines looking on, now is a great time to dive in. With each release of Proton and Lutris, support gets better and better. And you are helping the chicken-and-egg problem – developers are more likely to support Linux if there are more Linux gamers. So try the metagame of gaming on Linux – it’s quite rewarding!

Fetching CircleCI Artifacts

Do you use CircleCI for your continuous integration workflows? Do you use their scheduled jobs? Recently we had a need to retrieve some performance benchmarks from a Lighthouse service that records page load times out to a text file. This job is scheduled to run daily.

Unfortunately it can be difficult to find a build in the CircleCI UI for a given project, since there is no search and only 20 builds are shown at a time, in order of most recently run. Fortunately CircleCI has an API that lets us automate the task of finding a build and viewing the artifacts from the run.

We can fetch up to 100 builds for a given project at a time. Some scripting lets us narrow the results down to just the build types we are interested in. From there we have the build number of the most recent build of a given type – in our case, the page load times from Lighthouse.

Once we have found a specific build for a given project, we can use the API again to ask about its artifacts. This gives us the container information and paths of any artifacts produced by the job – including our page load times.

We now have the URL for a given artifact, and it is just a matter of downloading the file by suffixing the CircleCI token to the URL.

We now have the output from our artifact. From here we can put this information into our company Slack, or even push it to a collaborative spreadsheet that the team routinely reviews. The specifics of how to automate this script, and what to do with its output, are outside the scope of this post, but I will share our Ruby script for interacting with CircleCI. It should be easily adaptable to other languages. It can be viewed below:

# Finds a CircleCI job of a given name, and retrieves an artifact from a given CircleCI project build
# Usage:
# $ API_TOKEN=xxx GITHUB_USERNAME=bsimpson GITHUB_PROJECT=some-project TARGET_JOB=lighthouse ARTIFACT=averages_pageload ruby ./circleci.rb

require "net/http"
require "json"

# Configuration comes in via environment variables (see usage above)
API_TOKEN = ENV["API_TOKEN"]
GITHUB_USERNAME = ENV["GITHUB_USERNAME"]
GITHUB_PROJECT = ENV["GITHUB_PROJECT"]
TARGET_JOB = ENV["TARGET_JOB"]
ARTIFACT = ENV["ARTIFACT"]

LIMIT = 100

# Recent builds for a single project (CircleCI v1.1 API)
# curl https://circleci.com/api/v1.1/project/github/:username/:project?circle-token=:token&limit=100&offset=0
def find_job(job_name = TARGET_JOB)
  offset = 0
  while offset < LIMIT * 10
    url = "https://circleci.com/api/v1.1/project/github/%{github_username}/%{github_project}?circle-token=%{token}&limit=%{limit}&offset=%{offset}"
    uri = URI.parse(url % {
      github_username: GITHUB_USERNAME,
      github_project: GITHUB_PROJECT,
      token: API_TOKEN,
      limit: LIMIT,
      offset: offset
    })
    response = Net::HTTP.get(uri)
    jobs = JSON.parse(response)
    matching_job = jobs.detect { |job| job["build_parameters"]["CIRCLE_JOB"].match(job_name) }
    return matching_job if matching_job
    puts "Trying offset #{offset}…"
    offset += LIMIT
  end
  puts "Exhausted pages"
end

# Return artifacts of a build
# curl https://circleci.com/api/v1.1/project/github/:username/:project/:build_num/artifacts?circle-token=:token
def find_artifacts(job, artifact_name = ARTIFACT)
  build_num = job["build_num"]
  url = "https://circleci.com/api/v1.1/project/github/%{github_username}/%{github_project}/%{build_num}/artifacts?circle-token=%{token}"
  uri = URI.parse(url % {
    github_username: GITHUB_USERNAME,
    github_project: GITHUB_PROJECT,
    build_num: build_num,
    token: API_TOKEN
  })
  response = Net::HTTP.get(uri)
  artifacts = JSON.parse(response)
  artifacts.detect { |artifact| artifact["path"].match(artifact_name) }
end

# Download an artifact by suffixing the token to its URL
def download_artifact(artifact)
  url = "#{artifact["url"]}?circle-token=%{token}"
  uri = URI.parse(url % { token: API_TOKEN })
  response = Net::HTTP.get(uri)
  puts response
  response
end

job = find_job
artifact = find_artifacts(job)
download_artifact(artifact)


Move over Router – Mesh is Here

It’s been a while since I’ve had my quality of life so dramatically improved by a device upgrade. I recently moved my home office upstairs, and with it came a shuffling of wireless equipment from the first floor to the second. The new office space is on the opposite side and floor from the living room with our smart TV. No matter how I positioned the wifi router, one side suffered. And putting the router in the middle meant it would be in the kids’ rooms.

Adding a wireless range extender practically made the problem worse, as devices tried to connect to the wrong network for their location, and speeds while connected to the range extender were terrible.

Fed up, I started researching routers with the longest range, highest speeds, etc. That is when I came across a new category of “mesh” networks. These kits offer multiple access points called nodes that promise to seamlessly shuffle clients around based on the optimal connection path. After some research I decided on the TP-Link Deco M4 3-pack. I had a promo code and the total price came out to ~$150 shipped.

After using it for a few weeks, I’m ready to review. Spoiler alert – I’m bursting with happiness. I’ll address a few main categories of these devices:


Range

I have a 2,500 sq ft house with 2 floors + a deck. The 3 nodes cover this easily. Nowhere in the house or yard do I have less than 3 bars. I have configured these in a triangular arrangement, with two nodes on opposite sides of the house on the 2nd floor (as diagonal as I could get them). The other node is on the 1st floor, halfway between the nodes on the top floor.

I haven’t tried a two node setup, which might be more representative of what the old router + range extender were delivering, but why would I? The whole point of a mesh network is that you can keep adding nodes until you have great coverage.

As an experiment I walked around the cul-de-sac and stayed connected an impressive way out. Whatever is inside these nodes (or maybe it’s them working in aggregate) has great transmitting power, all without looking garish with external antennas everywhere.


Speed

On the TP-Link Deco M4 network, I get 300+Mbps anywhere inside the house. Outside on the deck this drops to 200Mbps. For comparison, with the old ASUS RT-AC68U + Linksys RE6500 range extender I would get ~200Mbps in the same room as the router. The range extender never got above 100Mbps, and the deck (when I could get signal) would be around 20Mbps. The mesh network link speed blows away the traditional router + extender setup.

One more technical note here – the nodes are tri-band, which means you get the full bandwidth through each node instead of having it halved the way a repeater does when it receives and retransmits on the same radio.


Setup

The TP-Link (and many of the other commercial mesh kits) comes with a smartphone app to set up the devices. I was initially turned off by this. After all – everything today claims it needs you to install an app when a basic mobile website would probably be sufficient.

The app, however, is clean, and aided in setup versus the traditional laptop approach, where you potentially have to plug into the router with an ethernet cable to initially configure the network.

The nodes are all identical, so it doesn’t matter which one you connect to the modem. It correctly figured out that was the Internet connection, and even circumvented the silly tendency of modems to bind to only one MAC address. The physical setup involves nothing more than plugging each node into an AC outlet, and plugging the initial node into the modem. The app detects the node you are next to, and walks you through setting up the wireless network.

Flashing lights on each of the nodes inform you whether the node is online and working properly or experiencing an issue.

The nodes all share the same wifi network name, and devices switch between them automatically. Setup options are pretty standard (maybe even somewhat limited). You choose an SSID, create a password, and choose whether to broadcast the network. You don’t even pick between 2.4GHz and 5GHz networks – this is all managed for you. The device will use the best network available. My old laptop can’t see 5GHz networks and connected just fine. The Deco offers a few other features like QoS settings, reserved IP addresses, blacklisting, reports, etc.


Price

This looks to have been a historical weak point for mesh networks. New technologies typically come with premium price tags. I think enough time has passed that mesh network kits are about on par with a new router and range extender. I paid $140 having caught a sale that took $40 off the price.


Conclusion

I would absolutely recommend a mesh network to just about anyone, with the possible exception of someone who has advanced needs for their network setup. This feels like an evolution of the wireless router. It offers superior range and speeds relative to my previous router + range extender setup, for about the same price. Setup is painless, and it has fixed all of my wireless issues throughout the entire house. I’ve retired my router and range extender.

I’ve also retired the USB wireless adapter for my desktop, since I have a mesh node sitting on the desk, and have opted instead to connect with an ethernet cable. I’ve also managed to retire a wifi bridge for a NAS device, because again, with 3 nodes I can easily place the NAS next to a node and connect with an ethernet cable.

All said and done, I threw out more equipment than I set up. This was an absolutely great decision in my opinion, and at the risk of sounding like a sponsored post – I couldn’t be happier.

Updating database rows with a default position

I was recently tasked with making a table in a MySQL database sortable by the user. There are a few steps involved to take care of new records, while providing existing records with a default value.

New records are created with ifnull(max(`order`), 0) + 1 to assign them the next position. This was sufficient since positioning was based on created_at, and a new record is guaranteed to have the highest created_at timestamp.
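As a sanity check, the next-position expression can be sketched in plain Ruby (hypothetical data; `existing_orders` stands in for the `order` values already in the table for one user):

```ruby
# Hypothetical sketch of what ifnull(max(`order`), 0) + 1 computes:
# take the highest existing position (or 0 when there are none) and add 1.
def next_order(existing_orders)
  (existing_orders.max || 0) + 1
end

next_order([])      # => 1 (first record for this user)
next_order([1, 2])  # => 3
```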

For the existing records, I wanted to:

  • group by user_id
  • set the order (int) column to an incrementing integer
  • with an initial ordering based on a created_at timestamp

The first approach was to select groups of users, and for each group sort the records then update each record. This approach would work, but it wouldn’t have good performance. I really wanted to do this in a single update statement.

I ended up going with an approach based on this article:

set @user_id = 0, @order = 1;

update table1 as dest,
(
  select, x.user_id, x.created_at,
    @order := if(@user_id = x.user_id, @order + 1, 1) as `order`,
    @user_id := x.user_id as dummy
  from (
    select id, user_id, created_at
    from table1
    order by user_id, created_at
  ) as x
) as src
set dest.`order` = src.`order`
where = src.id;

Let’s break this down:

  1. We are using database session variables to keep track of some state. We default to @user_id = 0 and @order = 1.
  2. Working from the inner-most select, we select the id (primary key), user_id (int) and created_at (timestamp), as these are the columns needed to calculate the new order value. We order these by user_id and created_at to get the sort ready. This dataset is aliased as x (yes, I know I’m terrible at naming).
  3. The next select up pulls the same id, user_id and created_at from the inner-most select and does some calculations, persisting state in the session variables we set up in step 1. The actual work here is:
    1. Calculating the order: @order := if(@user_id = x.user_id, @order + 1, 1). The := is the assignment operator in MySQL. We set @order either to @order + 1, or back to a default of 1.
    2. Comparing users to determine what to set @order to. This is done in the conditional expression if(@user_id = x.user_id, …). Because @user_id is assigned the x.user_id value on each row, the comparison returns true until we reach the next user, at which point @order resets to 1.
  4. Finally, the outer update takes the resulting src dataset and does the updating. We constrain with where = src.id to get the correct record, and then set dest.`order` = src.`order`, our calculated value.
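If the session-variable bookkeeping is hard to follow, the same calculation can be sketched in plain Ruby (hypothetical rows; plain integers stand in for the created_at timestamps):

```ruby
# Hypothetical rows: id, user_id, created_at (integers for brevity)
rows = [
  { id: 1, user_id: 10, created_at: 100 },
  { id: 2, user_id: 10, created_at: 50 },
  { id: 3, user_id: 20, created_at: 75 }
]

orders = {}        # id => calculated order
last_user_id = nil # plays the role of @user_id
order = 0          # plays the role of @order

# order by user_id, created_at – same as the inner-most select
rows.sort_by { |r| [r[:user_id], r[:created_at]] }.each do |r|
  # @order := if(@user_id = x.user_id, @order + 1, 1)
  order = r[:user_id] == last_user_id ? order + 1 : 1
  # @user_id := x.user_id
  last_user_id = r[:user_id]
  orders[r[:id]] = order
end

orders # => {2=>1, 1=>2, 3=>1}
```

Each user’s records are numbered 1, 2, 3, … in created_at order, exactly what the single update statement computes server-side.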

Performance is looking great. On a dataset of 300k records, this runs in 3 seconds on a 4 CPU Docker container.

Bonus: If you have additional criteria (like only sorting favorite records for example) this can be easily applied to the inner-most SELECT in the where clause!

Happy ordering!

new Job().beginTraining()

It has been an exciting month for me. I put in my notice with Influence Health at the beginning of June, and served through the end of the month. After that I took 2 weeks off before my training begins with Doximity on July 16th. The training is in San Francisco, and is immediately followed by a July 30th co-location in Boulder, Colorado. The position is remote, aside from the several co-locations each year. I am excited to start with a new team, on a new project, with a mix of old and new technologies.

Over my career at Influence Health I don’t feel that I got much deeper in my knowledge of Rails. What I gained instead was a breadth of general programming skills. I configured GitHub repos, set up Jenkins, scripted a Hubot instance to assign pull requests, and made a continuous integration system. I created new policies following best practices: open a pull request with a change, write unit tests, and have one other person on the team review your changes before merging. I implemented linting, and worked with one of my coworkers to bring Webpack into Rails to rethink how we manage Javascript. I also went very deep into AWS, touching S3, Lambda, RDS, Redshift, Data Pipelines, Batch, Step, ELB, EC2, ECS, Cloudfront, and into other technologies like PostgreSQL, Docker, ElasticSearch, Capistrano, EventMachine, and Daemons. Being exposed to all of these new services has made me consider different approaches to traditional problems, and I feel it has made me a better developer.

The new job at Doximity sheds the managerial role that I was voluntold into at Influence Health. I thought I might have enjoyed it (maybe I would have, under better circumstances). At the end of the day it wasn’t being a manager that killed the deal for me. It was being a manager while still being a tech lead, an architect, a core contributor, and many other things. To manage well is a full-time job. Tacking it onto an existing role made me feel inadequate as a manager, and I don’t like that feeling. So with the managerial role off my title, the new role is back to software developer, and I’m OK with that. The compensation is right, and I felt like I was getting further away from the code than I wanted to be. At the end of the day, developing something and seeing it work is what drives me. There is a technical lead track that I might pursue in several months if I feel like I am ready.

The technology stack is a mixture of Ruby and Javascript. After working with Javascript more heavily for the last 6 months, I have mixed feelings. I’m definitely excited, because I do think the future of web development has coalesced on Javascript. And Javascript has risen to the challenge and gotten a lot better. Gone are the imposter “rich internet applications” like Silverlight and Flex. Gone are the browser plugins for languages like Java and Flash. Javascript just works. And the browsers are really blazing trails, even Microsoft, so I believe that learning Javascript is a solid career investment. There is an excitement in the ecosystem (a little too excited imo, but I’ll take that over being dead).

Popularity aside, Javascript has less magic than Ruby, which is, again, both good and bad. I appreciate seeing require statements, and knowing with absolute certainty that private methods are private. In Ruby, for everything you can do to protect something, someone else can (and I find frequently does) find a way to circumvent it. I especially appreciate the strong linting culture that mitigates entire debates on code style.

I find the syntax of Javascript unattractive coming from Ruby, but it is more consistent. All of the parentheses, semicolons, etc. are just noisy. The surface area of the language is also much smaller, which leads everyone to jump to utility libraries like Lodash, Underscore, etc. The language just needs to keep maturing and building in common methods. Date manipulation in particular is atrocious. On the bright side, with async/await it seems we finally have a clean syntax for managing asynchronous code.

I do still feel like we are fitting a square peg into a round hole by building client-side, single-page applications. This isn’t the way the web was designed (it was made for request/response), and the fat-client pattern still feels immature. Having a client-side framework like Angular (or even a library like React) does take care of managing some of the complexities. GraphQL takes the sting out of fetching all the combinations of data you might want from the server. Auth is taken care of with JWTs and services like Auth0.

On the server side, using Node has been a mixed bag. I would like to see a few big frameworks that the community standardizes on; instead the mentality seems to be to build your own framework from a collection of your favorite libraries. As a result you can’t just jump into a project and immediately know where things are. I do, however, really enjoy asynchronous execution of code. It is a little harder to write and understand, but wow can it be fast. I have already seen very positive results from batch jobs that took hours in Ruby being converted to Javascript and taking minutes. You simply don’t wait around, and it makes you think about your dependency tree in a completely new way.

At the end of the day I am excited, but a little cautious. Ruby + Javascript sounds like a killer combination if you use each tool for what it does best. I don’t see the Ruby community lasting another decade, so this is the perfect transition point to jump into Javascript. And I’m glad that it was Javascript that won out over Flex, Silverlight, JSP, etc. At least for the next 5 years, until the next shiny technology comes out and people jump ship.

Fahrenheit 451

Everyone must leave something behind when he dies, my grandfather said. A child or a book or a painting or a house or a wall built or a pair of shoes made. Or a garden planted. Something your hand touched some way so your soul has somewhere to go when you die, and when people look at that tree or that flower you planted, you’re there. It doesn’t matter what you do, he said, so long as you change something from the way it was before you touched it into something that’s like you after you take your hands away.

– Ray Bradbury