Gaming on Linux – 2023 Edition

A few years ago I wrote about Proton and DXVK and what they meant for gaming on Linux. I figure it's time to revisit how it's going.

I suspected at the time that Valve was pouring money and development into these projects with the goal of building a Linux-backed cloud gaming service to compete with nVidia's GeForce Now (aka the final resting place of the $1,200+ RTX 4080s no one is buying) and the now-defunct Google Stadia. Side note – when Sundar Pichai announced at a Stadia event that he wasn't a gamer, we should all have guessed it was dead on arrival.

Fast forward to 2023. Valve has unveiled and started shipping the Steam Deck – a portable PC running SteamOS, with Arch Linux under the hood. It promises wide compatibility with games, including Windows-only titles. Gearing up for launch, the number of Proton-compatible games exploded. Wine, DXVK, Proton, and related projects have seen rapid releases that bring big new features. There may never have been this much focus on Linux gaming. Now that there is commercial interest in building products backed by these libraries, the ecosystem is being funded and is maturing rapidly.

It has been a while since I've dual-booted into Windows to play an online game. The sole exception would be the metastatic Xbox Live service, which has spread into the Windows OS in a way that rivals the Internet Explorer debacle of Microsoft vs. the DOJ in 1998. It's a mess – even on Windows it barely works (usually: open, crash, updates needed, crash, reboot, open, crash, open, finally!). Almost every other gaming platform has some way to access and play its titles on Linux. Steam is at the forefront with a native Linux client, the Epic Games Store has support via Lutris and Heroic, and GOG offers Linux downloads of its games along with Lutris and Minigalaxy support. Other launchers like Ubisoft Connect and Origin also run via Lutris with minimal fuss.

This has all built up to my latest purchase. I've had an nVidia GPU for as long as I can remember – probably 20 years now. That span was AMD's dark ages. They've since emerged into a renaissance of Ryzen chips and a new line of fast Radeon GPUs. I initially ignored them, as the first generation of those Radeon GPUs wasn't competitive with its green counterparts. However, nVidia has made a number of decisions lately that have simply handed the game to AMD:

  • nVidia GPU pricing has no basis in reality. My price point every generation has been around $500 for a high-end (but not bleeding-edge) GPU. The 40-series, with an MSRP of $1,200, makes this a poor proposition. AMD's pricing has also gone up but remains considerably less than the competition's. Scalping and crypto have exacerbated the pricing issue, and I have yet to see the previous generation sell at even MSRP and stay in stock long enough to add to a cart. Side note: at Microcenter there were plenty of 40-series cards sitting ignored behind their locked glass case.
  • nVidia has continued to see issue after issue with its latest generation. EVGA pulled out as a board partner, further limiting consumer choice. The power draw has reportedly melted cables. The cards' dimensions approach those of a mid-sized sedan and won't fit in many existing cases. Then there was the 4080 that was really a 4070 in all but marketing. Nice try, marketing folks.
  • nVidia “kind of” supports Linux. Their approach is a proprietary module you load into the kernel, while AMD's driver lives in the kernel itself. The driver-quality discrepancy is now quite large in AMD's favor. Looking at Linux forums, things work if you have AMD and don't if you have nVidia. I have been unable to run Wayland on my GTX 1080 without artifacts, applications failing to paint, screen-capture issues, and more. There is only so much the community can do with a closed-source blob.
  • AMD was smart about ecosystem integration. nVidia only makes GPUs, and Intel is in the very early stages of making them. AMD sits comfortably on both sides of the CPU/GPU fence and supports Resizable BAR (branded Smart Access Memory) with Ryzen CPUs.

This all culminated in my decision to support AMD with my wallet again (I'm already a Ryzen owner) and go with a Radeon 6800 XT. It trades blows with an RTX 3080 in performance, aside from ray tracing and DLSS. nVidia is the clear winner there, but that doesn't outweigh everything else. I would rather have a GPU that works flawlessly under Linux at a lower retail price. And that is exactly how it played out. I bought a 6800 XT for $512 from Microcenter last weekend and put it in my Ubuntu 22.10 PC. IT JUST WORKED. Zero driver setup – I fired up a Steam game (Psychonauts 2) and it ran like butter. Windows gaming might be in the rearview mirror between my PS5 and my Ubuntu desktop.

Out of curiosity, I decided to fire up Control – one of the first games to support ray tracing – just to see how the 6800 XT fares. I had watched ray tracing from the sidelines before Linux had drivers or support for it. While the non-ray-traced version of Control is a one-click install via Lutris, setting up ray tracing took a bit more research. I wanted to share my findings in case anyone wants to try the same. In Lutris, open your system options and add the following environment variables:

  • AMD_VULKAN_ICD=RADV (there is a “Vulkan ICD loader” select list that does the same thing)
  • VKD3D_CONFIG=dxr
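If you launch games outside Lutris, the same two variables can be set in a wrapper script. This is a minimal sketch – the game command is a hypothetical placeholder, and in Lutris itself these belong under System options > Environment variables:

```shell
# Launch-wrapper sketch for ray tracing on RADV + vkd3d-proton.
export AMD_VULKAN_ICD=RADV   # pick Mesa's RADV Vulkan driver
export VKD3D_CONFIG=dxr      # enable DirectX Raytracing translation in vkd3d-proton

# Placeholder for the real game/launcher command:
GAME_CMD="${GAME_CMD:-echo launching Control}"
$GAME_CMD
```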

I got a solid 45 fps at a 2560×1080 render resolution with ray tracing on medium settings. You can adjust the render resolution independently of the engine resolution, so it is technically upscaling this to 3440×1440, with little or no impact on performance. All in all, I was impressed by the performance of a GPU that isn't known for outstanding ray tracing. Hitting these numbers on a translated Windows title is a testament to how mature this ecosystem has become.

I’m excited about the future of Linux gaming. The Steam Deck has really moved this into the mainstream. The driver support is there, the software is there, and many of the titles I’ve played recently even have native Linux builds (Warhammer III, Psychonauts 2, Hollow Knight, Wasteland, Desperados, Pillars of Eternity, Slay the Spire, Deus Ex, and more added all the time). It is hard to even know which games are native versus running in Proton/DXVK, because the performance is about the same. In fact, I played about two hours of Warhammer III before realizing I was running it through Proton needlessly, because there is now a Linux build available.

So if you are nervous about taking the plunge, curious how AMD GPUs fare, or wondering how ray tracing works on Linux – there you go! Happy gaming!


Ludicrous Cores Mode

I feel like the Intel i3/i5/i7 series processors were around for so long it’s hard to remember a time before them. I’ve always liked supporting the underdog and seeing a good comeback story. AMD really knocked it out of the park with the Ryzen launch. I grabbed a 2700X processor (8C/16T) at launch in 2018. I remember thinking at the time, “I just can’t buy yet another 4-core processor.” I had a first-gen i7, a 4th? 5th? gen i5, and I was staring down the barrel of a 10th-gen i9.

The 16C/32T 5950X

Meanwhile, competitor AMD was handing out higher core counts like Skittles! The 2700X had double the cores of what I had, and doubling again would take 16 cores to make me move. Sure enough, the 5950X launched for a small fortune, and I watched it for several years until it became what I’d consider affordable – $500. So I snagged one.

Why the heck do I need 16 cores? Well, I probably don’t. But I’m an enthusiast and I’m celebrating a job promotion, so give me a break! I do development, and tasks like running 30+ Docker containers, unit testing, linting, webpack compiling, and of course gaming will probably see an uplift in performance. At least that is how I sleep at night.

First challenge in researching? Motherboard chipset compatibility. I had an X470 chipset, and AMD has always been MUCH better at backwards compatibility with chipsets than Intel. Sure enough, two generations later, all I needed was a BIOS flash and I was good to go with my current motherboard and RAM. Intel, meanwhile, might as well solder the CPU to the chipset, because it only works with that year’s processors.

Next up? The 5950X is a beast of a processor, but it doesn’t ship with a heatsink. While I understand this isn’t written in stone anywhere, I’d say it is… unusual. Typically you get a box cooler. Then again, if you are used to Intel, that is basically a crumpled-up soda can with a fan that has the airflow of an asthmatic 90-year-old. I was debating an AIO water-cooling solution, but I found a blog post ranking the best CPU coolers, and high-end air coolers are surprisingly competitive. I also have some anxiety about liquid being suspended above expensive hardware. I snagged some Noctua case fans about six months back and loved them. They were so quiet I was paranoid they weren’t getting power – I had to crawl under my desk with a flashlight to confirm they were indeed spinning. That was all the salesmanship I needed for a Noctua D15 cooler. This thing is a beast. Look at the size of the CPU and the size of the cooler.

Wraith Prism (stock) cooler left, Noctua D15 cooler right (minus the fans)

Other things I had to verify: does this monster of a cooler fit inside my case? Yes… barely. It’s comical how close it comes to the sides of the case, but it works! One less thing to purchase to get running on 16 cores.

Install day!

First I removed the old CPU and stock cooler. Which… hat tip to you, AMD – for a stock cooler, this thing is well built. A little noisy, but compared to Noctua everything is noisy.

AMD Wraith Prism (stock) cooler. The 2700X CPU actually came out with the heatsink – it’s stuck on there.

Next I took off the stock case-fan brackets and put on the Noctua mount. I had to go with the configuration parallel to my RAM; otherwise part of the heat pipes made contact with the closest RAM module. It was smart engineering that you can rotate the mount into either configuration – that saved the day.

The Noctua brackets installed on top and bottom

Next I very, very carefully removed the 5950X from its case. The Noctua D15 kit included thermal compound, and at this point I absolutely trust they know what they are doing. After some research I applied it in an X pattern (instead of the recommended dot in the middle) for better coverage.

Next I set this monster of heat pipes and fins into the parallel-to-RAM configuration and tightened the screws as much as I dared.

Finally, I clipped on the Noctua fans, both pointing toward the rear of the case to vent hot air from the fin stacks. I originally wanted to put one behind the fin stack closest to the RAM, but I would have needed to lift the fan up over the RAM module. Noctua supports that option, but it would have pushed the fan out past the case lid, so I opted not to. Instead, both fans point in the same direction toward the back of the case, pulling air across the two fin stacks. The fan farthest from the RAM is just pulling, while the middle fan both pulls from the RAM-side stack and pushes into the other. I’m still monitoring temps, but the fans haven’t even cycled up to their highest settings under moderate work.

This heat fin sits just above my RAM modules. I ended up clipping the middle fan to pull from the left

Next – put it all back together! And a quick prayer to the BIOS POST gods. Success! Well, except I had obstructed one of the fan blades with a wire that I had to move. Then success!

Feeling extra confident, I even tried the thing I’ve been hoping to do for a while: running my 3600 MHz RAM at a full 3600 MHz. I’ve read this isn’t possible on every processor. Sure enough, it POSTed, and enabling DOCP with factory settings let me hit 3600 MHz. It’s been about an hour, so I’m hopeful we are stable. The 2700X only hit 2933 MHz – a sore point after having paid a premium for fast RAM during a period of high RAM costs.

So here are some more pictures – and of course the glorious visual of my cores. It was a fun project, a solid upgrade, and pretty painless (fingers crossed!).

32 Threads! Wondering what I should do first?
All done! These Noctua fans sure multiply

DIY Home Theater

We recently bought a house with an extra room in the basement. The other rooms all had the same paint color, carpet, lighting, and finishes. We decided to give one room some personality by turning it into a home theater. We could have just ordered a TV, some speakers, and a couch and called it a day, but the room needed character. Our budget for this project was $15k, but we soon realized we would only need about half of that to get everything we wanted.

After considering a few options, we decided on a dark blue paint (Behr Superior Blue) for the entire room and an accent wall with board and batten. Aside from waiting for a TV sale, the room was transformed inside of just a few weeks. I went with an LG OLED for its superior contrast – something I’ve envied after seeing my friends’ setups. There are no LED dimming zones or halo glow with an OLED. But it is costly – this was our biggest line item in the home theater. I paired it with a Klipsch R-625FA floor speaker pair, an R-12SW subwoofer, an R-52C center channel, and R-41 bookshelf speakers for the rear, all powered by a Sony DH790 receiver. For furniture we chose the Larna Park Taupe sectional couch with ottoman – our second-biggest expense, with the speakers and receiver third. Final touches include a Quaniece console for additional storage, a pair of Beaker plug-in black wall sconces, and a Britton series ceiling fan from Home Depot. Stay tuned for the total cost!

Day one was preparation and a supply run to Home Depot. We watched a few YouTube DIY videos and decided on 3 1/2″ primed pine boards – 6 vertical battens and 2 horizontal battens, for a total of 12 of the 3 1/2″×8′ boards. The layout was roughly determined by how large we wanted each square of the grid, using an online calculator. That takes the guesswork out of positioning, padding, and board widths and just tells you how far apart to space them. The first run of boards, 1 gallon of paint, Liquid Nails, and other materials was ~$200. I was fortunate to borrow a brad nailer and air tank from my father-in-law, so that was not a tool I had to purchase. To make cleanup easy, I bought a new $10 painting kit to chuck in the trash when done – cleaning a paint roller takes forever, and it never works as well the second time anyway. You will also need some other basic tools – tape measure, saw, level, hammer, screwdriver, caulking gun. I’m not including these in the cost of the project since I use them for everything.
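The spacing math those online calculators do is simple enough to sketch. The numbers below are hypothetical examples, not our actual wall measurements:

```python
def batten_gap(wall_width_in: float, board_width_in: float, battens: int) -> float:
    """Even gap between vertical battens, where `battens` counts both end boards."""
    total_board = battens * board_width_in   # wall width consumed by boards
    gaps = battens - 1                       # spaces between adjacent boards
    return (wall_width_in - total_board) / gaps

# e.g. a 144" wall with 3 1/2" boards and 6 vertical battens
print(round(batten_gap(144, 3.5, 6), 2))  # 24.6
```

Mark each batten's left edge one board-width plus one gap from the previous one and the grid comes out even.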

First we took down the existing crown moulding on the intended accent wall. We had a decision to make: 1) take down all the trim, 2) take down just that wall’s trim and join it to the board and batten, or 3) build the frame below the existing crown moulding. We opted for number 2 after pulling the crown from one wall and seeing a sizeable gap between the drywall and ceiling – that would have been a lot to contend with across the entire room. To cut back the crown, I used a saw to make a 90-degree cut so it could sit flush against the batten frame. Brad nails about every two feet, and Liquid Nails on the back of the board.

The room is so plain – yuck! Day two is when the project really took off. We put up our first vertical board very carefully, measuring everything multiple times and generally moving slowly. It soon became routine, however. Don’t assume your wall height or width is uniform – measure each cut individually. We did all the vertical boards using brad nails and Liquid Nails on the board backs. Next we measured and cut the horizontal boards. A few of the videos suggested using a spacer board, but I found it unnecessary; we just measured each board placement and attached it. A laser level makes the horizontal boards go much quicker – we made a line all the way across (you want true level, not parallel to the ceiling/floor). That was pretty much the second day. As a nice end-of-day bonus, the TV and the speakers both arrived.

Day three we prepped for painting. This involved some drywall repair in other sections of the room, sanding, removing faceplates, spackling, and caulking. Wood putty goes on the batten-board joints to make each run look like a single piece of wood; it has to be applied, allowed to dry, then sanded flush. The spackle took a few passes – applying, letting it dry, then building on top in some of the worst places. A lesson learned here: caulk the interior squares. We weren’t going to at first, but it was super quick and looks fantastic – especially where the board and the wall had a gap.

Day four was a continuation of the painting, with an emphasis on detail work. We taped off our edges and really tried to clean up our lines and fix some drips and sparse areas. This step seems to be the biggest factor separating a DIY look from a professional one. You just have to keep going, getting more and more precise. The goal is that anywhere you look, you see sharp lines and cleanly applied paint.

Day five the sectional couch arrived in the morning. We went with a two-tone gray and dark brown to match the dark blue paint. The sectional’s dimensions work well with the room, fitting almost perfectly along the back wall in front of the batten. I absolutely love how much room the sectional provides – the entire family can comfortably sit and watch a movie together. We mounted the TV on the opposite wall with a slim-profile wall bracket so it blends in like a hanging picture.

The slim mount actually had two settings, one with a more aggressive tilt. The TV is so thin, however, that it felt better on a very slight 5-degree tilt to optimize viewing. The TV was around 68 inches across, so we bought a console 88 inches wide so it wouldn’t look dwarfed by the TV. The console has shelves and doors to hide the controllers and cables, and it provides a buffer from the wall to keep people from bumping into the TV screen.

Next up was the speaker installation. The Platin Monaco speakers were unfortunately a big disappointment: lots of popping, dropping, crackling, and speakers getting “stuck” on a note like a frozen computer. The nail in the coffin, however, is that WiSA doesn’t support DTS – the surround-sound format virtually every Blu-ray disc uses. These went back and were replaced with a more aggressive sound system for roughly the same cost, by catching a good sale on the speakers and using an Amazon gift card on the receiver. I took a gamble on WiSA, but if you care about your audio, move on to a wired system – there is no comparison, and I’m not even a trained ear.

Running all the speaker wire was one of the hardest parts of the project. I had a door frame on either side of the TV wall, so getting to the rear speakers was an undertaking. I initially bought some 1/2″ cable raceways, painted them all, and was going to tuck the speaker wire inside them. This might have worked, but the new sound system came with a large 10″ subwoofer that I wanted to place on the other side of the room between the couch and the wall. Wherever the rear speaker cables went, the subwoofer’s monaural cable needed to go too – and at lengths > 30″ it is recommended to be shielded, which of course means extra bulk. The cables were too tight a fit for the raceway, and after painstakingly cramming them in, I realized the adhesive wasn’t sufficient to keep it stuck to the walls and ceiling. I didn’t want a larger raceway, and I still didn’t want cable staples with visible cable. After debating options, I fished the cables under the carpet at the doorway and tucked them under the baseboards along the rest of the wall as best I could. It was the third or fourth option, but I’m happy with it. You don’t notice unless you step on the cable without shoes.

The final step was crimping all the speaker cables, connecting the speakers, and giving it a run for its money. About 30 minutes into Dunkirk on Blu-ray, I realized I wasn’t getting signal from my left speaker. Stupid mistake – on checking the speaker, I saw I had neglected to connect the left channel, having connected only the Atmos/height channel further up the speaker. Easy remedy.

One other pain point was getting the PS5 to output 4K + HDR via the receiver. The receiver sits between the TV and the PS5, and I get all the niceties that entails – synchronized power on/off, and the TV remote controlling the PS5 and the receiver volume. However, the PS5 kept informing me that to output HDR I needed to drop to 1080p. I was starting to think the receiver just didn’t support the feature when I found a Reddit post from a user with the same issue who resolved it by setting the HDMI port the PS5 is plugged into to “enhanced” mode in the AV options. After a restart, the PS5 immediately picked up 4K + HDR, completing the build.

The painted cable raceways I abandoned were not a complete loss, however. I used one of the larger ones under the TV instead of a gang plate and running cables through the wall. It pairs with the batten somewhat unintentionally and is a whole lot easier than fishing cables. Some of the raceway meant for the speaker cable also secured the sconce cables and cleaned up that installation. The discarded raceways even got involved in fishing the cables under the carpet!

The ceiling fan was also a late purchase, after we realized the old one wobbled at higher speeds. Instead of trying to balance it, we decided the rest of the room had so much character that the contractor-model fan just didn’t fit any longer. This was a relatively simple swap, with the added benefits of quieter operation and a remote control.

We are really happy with the final product, and by doing it ourselves we saved quite a bit of money. All said and done, we spent about $6,800. There is an extra feeling of pride and attachment in having done it ourselves, and we are ready to show it off to family and friends. Had we paid to have it done, we might have had an electrician run wiring and a new switch for the wall sconces, but the plug-in ones seem fine – especially against dark paint. This should make a great room for movie watching and gaming sessions on the PS5 and Switch. I do wish the room had already been run for cable, but fishing it under the carpet only took an hour or so, and I’m happy with the result. The only real disappointment was the WiSA speakers – but I took a gamble on a new technology and ended up returning them for some seriously amazing sound instead. Another unintended side effect: these particular sconces rotate downward and make perfect reading lights. I’ve been starting my mornings reading my book, listening to music through the speakers, and enjoying the couch.

Future additions include floating shelves instead of the bookshelf, with space underneath for the subwoofer – the monaural cable is already run to that area because I’m planning on this in the near future. We still have some movie art to hang. And a fun concession stand stocked with movie-theater staples – candy, drinks, and a popcorn machine centerpiece – will go right outside the door in a little inlet next to the converted theater. It should pair well with our next project: the wet bar with beer and wine storage!

Updates! (January 2022)

Lifted the surround speakers off the ground with tables. Double as drink/remote holders.
Relocated the subwoofer under the floating shelves and did some cable management with some staples

Floating shelves are installed, and the rear speakers have been lifted to the same listening height as the front. Pictures are finally up on the wall, too. Very happy with how this project turned out. It took months to get all the right pieces, but it was well worth the wait.

The Case Against Native Apps

Someone recently gave me a gift certificate to an online delivery service. There was nowhere on their website to redeem it. The FAQs described links that didn’t even exist. Then it occurred to me – this is an app-only feature. And then it struck me: why?

I have been doing full-stack development for almost two decades now, so it has been interesting to watch industry trends. When smartphones really started gaining traction, we had the App Store race – even when many of those apps just served the website in a web view. It was reminiscent of the Dot Com era, when everyone needed a website even if there was no business value in having one. Steve Jobs even famously pushed back against the iPhone SDK and wanted developers to use HTML+CSS+JS to build their experiences on his device.

Let’s discuss some of the challenges of working at a company with an app:

Adding a new feature on the web is as straightforward as merging your feature branch and deploying. All clients instantly get the update the next time they visit the website – no updates required. Except we threw out that amazing benefit of distributing software over the web and went back to release cycles with app images. Worse, there is now an intermediary standing between you and your customers, in the form of app store approval on Apple devices.

Related to release cycles, you also have to deal with legacy version support, as not all customers auto-update their apps – or even can update to the latest version, because of OS restrictions on their devices. This means a long tail of support: you keep legacy code around until some threshold of your users has updated and you are comfortable disenfranchising the rest.

As a full-stack developer, I design a feature, and then the native app teams… redesign the same feature using native screens. This duplication costs time and resources. Worse still, it is per platform – desktop, Android, and iOS are all separate. Thank goodness Microsoft failed with its Windows Phone, or there would be a THIRD native app team. In many ways I think the duopoly at play here cemented the feature workflows we have today. Let’s say 10 phone platforms had succeeded – there is no way we wouldn’t have figured out how to build something once and have it work on all of them (hint – it’s HTML+CSS+JS). But just two vendors plus desktop? We can turn a blind eye to triplication.

One idea I’ve been thinking about recently is how these native teams integrate with the other product teams. I’ve traditionally seen the iOS and Android teams partitioned off almost exclusively as their own product teams. But they aren’t products – they are platforms! Perhaps it is better to embed native developers into the product teams from an organizational standpoint.

Now, you may ask, why do we have these native apps at all? Remember, Steve Jobs famously didn’t want them. Yet in the year 2021 everything is an app. The web just serves as a funnel to the app – how many times have you been nagged to download the app? Is the web not capable of these experiences? For the most part, the web is on par with native app functionality. JavaScript has come a long way: it can access sensor data and geolocation, offer speech-to-text, provide push notifications and camera access, and power single-page app experiences that rival anything you can do natively. Many HTML transitions and animations are hardware-accelerated, so even on a mobile device you get a super smooth experience. How about layout on smaller screens? Responsive design has that solved too, with media queries adjusting your website layout to different screen sizes.
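Those capabilities are all exposed through standard Web APIs on the browser's `navigator` object. A minimal feature-detection sketch (the function name is my own; availability still varies by browser, so a real page should always probe before use):

```typescript
// Report which of the capabilities above a given navigator object exposes.
function availableCapabilities(nav: Record<string, unknown> | undefined): string[] {
  const caps: string[] = [];
  if (nav && "geolocation" in nav) caps.push("geolocation");
  if (nav && "serviceWorker" in nav) caps.push("push notifications"); // via service worker
  if (nav && "mediaDevices" in nav) caps.push("camera / microphone");
  return caps;
}

// In a browser you would call availableCapabilities(navigator);
// outside one it simply reports nothing.
console.log(availableCapabilities(undefined));
```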

So if it isn’t technical (or at least not largely a technical hindrance), why do we need these apps?! The answer, I think, is money. Google and Apple don’t get 30% cuts of payments made on the web, because the web is an open platform. They only get that money from their walled gardens. These companies have unfathomable amounts of money and market the hell out of their app stores. I don’t see anywhere near that scale of money marketing the web – despite these same companies sitting on the committees approving web standards and implementing them in their own browsers (on their own operating systems!).

It feels like the web has been neglected thanks to an advertising campaign of FOMO that rivals the Dot Com era domain-name grab. The more apps I try, the more I’m convinced the app I just downloaded didn’t need to be an app at all. And it’s worse for it, with slower release cycles, duplicated development (effort that could have been spent elsewhere), and the general inconvenience of being bombarded with “Download the app.” No thank you.

I’ll leave you with a wild-ass prediction: in 10 years we will look back on apps the way we look back on Rich Internet Application platforms like Silverlight and Flex. It will seem silly that we rebuilt what the web already provided. If Google and Apple really wanted these features, they should have integrated them into their browsers – not made a whole new platform. We will realize that the customer does NOT benefit from an app drawer of 100 icons to hunt and peck through for the right one. The web was here before apps, and I believe it will be here long after. Native app developers will be happy to know they aren’t out of a job – they can happily find work on the web teams once the platform partitioning falls away. All it takes is the realization that the web is OK, and we don’t need to spend extra money just to hand 30% of it to Google or Apple – they will be just fine.

The American Dream

I remember being twenty. I was excited by what the day held. I can work anywhere. I can go any place. My life can be anything I want it to be. I just have to finish my degree – then my real life will start. I just need to decide where to live and buy that house – then my real connection with my community will start. I just need to find what makes me happy to work on – then I’ll have the perfect job, and Monday will feel as good as Saturday.

Now I’m in my thirties. It is a little harder to get up each morning. It takes a little longer to get going. I have committed to my job, and at this point I don’t know what else I’d do that would make this kind of money. A door closes on other careers. Maybe it doesn’t, but I don’t have the energy to get up and check if it is still open – even just a crack. And if it is, am I passionate enough about anything to restart?

I have my own family now. I have kids, and I want them to have a relationship with their grandparents while they can. I want them to have the memories I have of going to grandma’s house: reading comics, playing with different toys, eating cake and cookies, staying up later than I should have. Of feeling like a VIP. That means that, of all the places I could have lived, my radius is now a lot smaller. Within an hour or so is about as far away as I want to get.

I remember being so excited by remote work. I thought I had found the unicorn job. It would free up so much time to pursue other hobbies and invest in my personal relationships. The work week would barely register on my mind. Except that now I don’t know as many people, and I’m not meeting new ones, so that social circle shrinks every day. The connections I hoped to form with my neighbors didn’t pan out – we seldom speak, and we don’t share much in common. And the work is the same in an office or at home. The work week is still the dominating force in my schedule.

The real struggle now isn’t maintaining any of the things I set out to achieve in my twenties. I have a house, and a family, and career. What I struggle with is what’s next. Or what else. Or maybe it is the same question?

And will I have the energy to pursue the Next Thing even if I find it? Or will I never find it because, just like those other doors, I haven’t the curiosity to get up and peek? Is the next thirty years staying the course? I’m mostly happy and healthy, but I feel underutilized. On paper everything looks good. But I’m struggling to feel that sense of accomplishment internally.

I retire. I live in some dream place. Then what? I’m contemplating what the lesson is here. What should I have done? I hope at the end I have a connection to my family. That was about it. And in a few generations my great great grandkids will struggle to remember my name. Some silly job he did that is now obsolete.

I think that is what bothers me the most. Everything we’ve forgotten. Each life seems to be its own independent arc. And looking down from the apex is really making me question its purpose.

Joe Biden’s Presidential Inauguration Speech

Joe Biden Jr. was sworn in as the United States’ 46th president today. I found the speech an extraordinary one in its call to unity and healing. This is what a president sounds like. The following excerpt I found particularly moving, and it encapsulates to me what it means to be part of the American dream:

And together we will write an American story of hope, not fear. Of unity not division, of light not darkness. A story of decency and dignity, love and healing, greatness and goodness. May this be the story that guides us. The story that inspires us. And the story that tells ages yet to come that we answered the call of history, we met the moment. Democracy and hope, truth and justice, did not die on our watch but thrive.

That America secured liberty at home and stood once again as a beacon to the world. That is what we owe our forbearers, one another, and generations to follow.

So with purpose and resolve, we turn to those tasks of our time. Sustained by faith, driven by conviction and devoted to one another and the country we love with all our hearts.

President Joe Biden – inauguration speech – January 20th 2021


Gaming On Linux – The Metagame

I’ve always been a big PC gamer. But my latest hobby combines that gaming passion with Linux. I’ve been watching Linux gaming from the sidelines for years but my assessment was always that Windows was for gaming, and Linux was for Getting Shit Done. The two worlds didn’t overlap and I was resigned to dual boot, or use two entirely separate machines.

This has all changed in a relatively short period of time, and the catalysts are the DXVK project and Proton, a compatibility layer for running Windows software on Linux. Proton is a fork of WINE that focuses on gaming compatibility. The really cool trick of DXVK is its ability to convert DirectX calls to Vulkan (DirectX to Vulkan) – a lower-level set of GPU instructions. This allows games to run very close to (and on some titles like RDR2, even slightly better than) their DirectX counterparts.

In addition to Proton (with DXVK), two other initiatives have really propelled gaming into what I could call the realm of approachability: a native Steam client, and the Lutris project.


Steam, in case you have been living under a rock, has a huge market share and influence on how games are distributed. There have been many imitators, but Steam still does it best – at least as far as Linux is concerned. A few titles have native Linux clients but most don’t. What Steam on Linux does is seamlessly integrate its compatibility tool Proton with its UI. You can open a game’s properties and check “Force the use of a specific Steam Play compatibility tool”. (Incidentally Deus Ex: Mankind Divided, Pillars of Eternity, and Age of Wonders III are all Linux native!)


Lutris takes a more inclusive approach. While it can leverage DXVK WINE builds (and even Proton builds), it allows you to access games from all of the other game clients including Epic Games Launcher, Uplay, and Origin. The software operates as a database of user-maintained scripts that configure WINE environments for maximum compatibility. It creates a bottle per game, and when you uninstall, it removes the entire bottle. No leftover DLLs or files – they exist in complete encapsulation. Even better, because they are all encapsulated you can run one game as Windows XP and another as Windows 10 – each one contains its own environment. Below you can see that I have Far Cry 5 (Uplay), Titanfall 2 (Origin), and Control (Epic Games Store), alongside standalone installs like Return of the Obra Dinn.

I’ve been so thrilled with Lutris that I support them on Patreon.

Gaming Considerations

Caveat emptor! With all things Linux you need to be prepared to tinker. If you want to crack open a beer, boot a game, and jump right in then you’d best look elsewhere. That being said, I’d take a wild ass guess and say 80% of games just work, 10% work with minor tinkering, and the remainder either don’t work at all, or don’t work well enough to be enjoyable because of performance issues (looking at you, Control and Anno 1800). Most games run smooth as butter and it’s easy to forget they’re not running on Windows.

The biggest performance issue seems to be what I call microstutter. It is a jarring experience where ~5 or so frames drop. This usually happens while assets are loading. I can’t be sure it isn’t CPU overhead from my encrypted partition. Other people talk about caching shaders. This takes a bit of your disk space to pre-compile the shaders so they are ready to use instead of being translated in real time.

Before purchasing a game I typically check these three things:

  • Is there a native Linux client? (The answer is usually no, but I’ve been surprised.)

  • Are there compatibility issues listed on ProtonDB?

  • (Non-Steam titles only) Is there an installer on Lutris?

Another big consideration is that aside from Steam, most of the gaming clients are bad. I don’t think this is a problem specific to Linux – they are just bad. Oftentimes to fix a crash or performance problem you will need to disable all of the UI overlay garbage that these launchers try to throw overtop of your game. Often they tell you to put the client into offline mode. This likely means that multiplayer support is going to be limited on these titles. If you need multiplayer, Steam is probably your best bet. Linux is officially supported, the build is very stable, and I’ve played many hours of multiplayer games (Divinity 2, Tabletop Simulator, and others).


A few other thoughts and tools to get you started – MangoHUD is a great FPS overlay. It not only shows FPS, but it also graphs frametimes and shows CPU/GPU loads.

Gamemode is a utility that aims to improve game performance. It changes the nice/ionice levels of your games and manages your CPU governor (if on a laptop) to maximize performance. I’m not sure how much of a difference it makes – and these are changes you can make yourself – but this tool does it automatically when a game is launched. Lutris and a few titles natively look for this executable and use it if available.
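
If you want to try MangoHUD and Gamemode together, the easiest hook is Steam’s per-game launch options – Steam substitutes %command% with the actual game command, and the two wrappers do the rest. A sketch:

```shell
# Steam → right-click a game → Properties → Launch Options
mangohud gamemoderun %command%
```

Lutris exposes similar toggles in its system options, so no launch string is needed there.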

Of course you’ll want the latest nVidia or AMD GPU drivers. There are some useful tools in the nVidia control panel, like forcing a composition pipeline, which is a rather interesting way to deal with frame tearing that differs from V-sync.
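
As an aside, the composition pipeline can also be toggled from a terminal rather than the control panel. A sketch, assuming a single display on the auto-selected mode – the MetaMode string would need adjusting for multi-monitor setups, and the change does not persist across restarts unless written to xorg.conf:

```shell
# Enable the composition pipeline on the current display
nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceCompositionPipeline = On }"
```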


Gaming on Linux has been something I never thought I’d see happen. I’m sure that cloud gaming (aka game streaming) has been propelling development in this area. After all, companies like Valve aren’t working on these projects out of the goodness of their hearts. And if you can get a Windows game to run in a DXVK environment you are making it cloud ready and can scale up Linux servers better than you probably could on Windows servers. That is my theory anyway.

It is also wonderful not having to switch operating systems when I want to work or game. I can do both. In fact – my workspace 3 is now the gaming workspace. I’m back to one machine, and one environment that is comprehensive for my needs.

If you’ve been on the sidelines looking on, now is a great time to dive in. With each release of Proton and Lutris, support gets better and better. And you are helping the chicken-and-egg problem – developers are more likely to support Linux if there are more Linux gamers. So try the metagame of gaming on Linux – it’s quite rewarding!

Fetching CircleCI Artifacts

Do you use CircleCI for your continuous integration workflows? Do you use their scheduled jobs? Recently we had a need to retrieve some performance benchmarks from a Lighthouse service that records page load times out to a text file. This job is scheduled to run daily.

Unfortunately it can be difficult to find a build in the CircleCI UI for a given project since there is no search, and only 20 builds at a time are shown in order of most recently run. Fortunately CircleCI has an API that lets us automate the task of finding a build and viewing the artifacts from the run.

We can fetch up to 100 builds for a given project at a time. Some scripting allows us to narrow the results down to just the build types we are interested in. From there we have the build number of the most recent build of a given type. In our case the type is the page load times from Lighthouse.

Once we have found a specific build for a given project we can use the API again to ask about its artifacts. This allows us to get the container information and paths of any artifacts produced by the job, including our page load times.

We now have the URL for a given artifact, and it is just a matter of downloading the file by appending the CircleCI token to the URL as a query parameter.

We now have the output from our artifact. From here we can put this information into our company Slack, or even push it to a collaborative spreadsheet that the team routinely reviews. The specifics of how to automate this script, and what to do with its output, are outside the scope of this post, but I will share our Ruby script for interacting with CircleCI. It should be easily adaptable to other languages. It can be viewed below:

# Finds a CircleCI job of a given name, and retrieves an artifact from a given CircleCI project build
# Usage:
# $ API_TOKEN=xxx GITHUB_USERNAME=bsimpson GITHUB_PROJECT=some-project TARGET_JOB=lighthouse ARTIFACT=averages_pageload ruby ./circleci.rb

require "net/http"
require "json"

API_TOKEN = ENV.fetch("API_TOKEN")
GITHUB_USERNAME = ENV.fetch("GITHUB_USERNAME")
GITHUB_PROJECT = ENV.fetch("GITHUB_PROJECT")
TARGET_JOB = ENV.fetch("TARGET_JOB")
ARTIFACT = ENV.fetch("ARTIFACT")

LIMIT = 100

# Recent builds for a single project
# curl{username}/%{project}?circle-token=xxx&limit=100&offset=0
def find_job(job = TARGET_JOB)
  offset = 0
  while offset < LIMIT * 10 do
    url = "{github_username}/%{github_project}?circle-token=%{token}&limit=%{limit}&offset=%{offset}"
    uri = URI.parse url % {
      github_username: GITHUB_USERNAME,
      github_project: GITHUB_PROJECT,
      token: API_TOKEN,
      limit: LIMIT,
      offset: offset
    }
    response = Net::HTTP.get(uri)
    jobs = JSON.parse(response)
    matching_job = jobs.detect { |j| j["build_parameters"]["CIRCLE_JOB"].match(job) }
    return matching_job if matching_job

    puts "Trying offset #{offset}…"
    offset += LIMIT
  end
  puts "Exhausted pages"
  nil
end

# Return artifacts of a build
# curl{username}/%{project}/%{build_num}/artifacts?circle-token=xxx
def find_artifacts(job, artifact = ARTIFACT)
  build_num = job["build_num"]
  url = "{github_username}/%{github_project}/%{build_num}/artifacts?circle-token=%{token}"
  uri = URI.parse url % {
    github_username: GITHUB_USERNAME,
    github_project: GITHUB_PROJECT,
    build_num: build_num,
    token: API_TOKEN
  }
  response = Net::HTTP.get(uri)
  artifacts = JSON.parse(response)
  artifacts.detect { |a| a["path"].match(artifact) }
end

# Download an artifact
def download_artifact(artifact)
  url = "#{artifact["url"]}?circle-token=%{token}"
  uri = URI.parse url % { token: API_TOKEN }
  response = Net::HTTP.get(uri)
  puts response
  response
end

job = find_job
artifact = find_artifacts(job)
download_artifact(artifact)


Move over Router – Mesh is Here

Its been a while since I’ve had my quality of life dramatically improved by a device upgrade. I recently moved my home office upstairs, and with it came a shuffling around of wireless equipment from the first floor to the second. The new office space is on the opposite side and floor from the living room with our smart TV. No matter how I positioned the wifi router one side suffered. And putting the router in the middle meant it would be in the kids rooms.

Adding a wireless range extender practically made the problem worse as the devices tried to connect to the wrong network for their location, and the speeds while connected to the range extender were terrible.

Fed up, I started doing some research into routers with the furthest range, highest speeds, etc. That is when I came across a new category of “mesh” networks. These devices offer multiple access points called nodes that promise to seamlessly shuffle clients around based on the optimal connection path. After some research I decided on the TP-Link Deco M4 3-pack. I had a promo code and the total price came out to ~$150 shipped.

After using it for a few weeks, I’m ready to review. Spoiler alert – I’m bursting with happiness. I’ll address a few main categories of these devices:


I have a 2,500 sq ft house on 2 floors plus a deck. The 3 nodes cover this easily. Nowhere in the house or yard do I have less than 3 bars. I have configured these in a triangular arrangement, with two nodes on opposite sides of the house on the 2nd floor (as diagonal as I could get them). The other node is on the 1st floor, halfway between the nodes on the top floor.

I haven’t tried a two node setup, which might be more representative of what the old router + range extender were delivering, but why would I? The whole point of a mesh network is that you can keep adding nodes until you have great coverage.

As an experiment I walked around the cul-de-sac and stayed connected an impressive way out. Whatever is inside these nodes (or maybe they work in aggregate) has great transmitting power, all without looking garish with external antennas everywhere.


On the TP-Link Deco M4 network, I get 300+Mbps anywhere inside the house. Outside on the deck this drops to 200Mbps. For comparison, with the old ASUS RT-AC68U + Linksys RE6500 range extender I would get ~200Mbps in the same room as the router. The range extender never got above 100Mbps, and the deck (when I could get signal) would be around 20Mbps. The mesh network link speed blows away the traditional router + extender setup.

One more technical note here – the nodes are tri-band which means that you get the full bandwidth through each node instead of it being halved.


The TP-Link (and many of the other commercial mesh kits) comes with a smartphone app to set up the devices. I was initially turned off by this. After all – everything today claims it needs you to install an app when a basic mobile website is probably sufficient.

The app, however, is clean, and it aided in setup versus the traditional laptop approach of potentially having to plug in to the router with an ethernet cable to initially configure the network.

The nodes are all identical, so it doesn’t matter which one you connect to the modem. It correctly figured out that was the Internet connection, and even circumvented the silly tendency for modems to only bind to one MAC address. The physical setup involves nothing more than plugging in the node to an AC outlet, and for the initial node, plugging it into the modem. The app detects the node you are next to, and walks you through setting up the wireless network.

Flashing lights on each of the nodes informs you if they are online and working properly or experiencing an issue.

The nodes all share the same wifi network name, and devices will switch between them automatically. Setup options are pretty standard (maybe even somewhat limited). You choose an SSID, create a password, and choose whether to broadcast the network. You don’t even pick between 2.4GHz and 5GHz networks – this is all managed for you. The device will use the best network available. My old laptop can’t see 5GHz networks and connected just fine. The Deco offers a few other features like QOS settings, reserved IP addresses, blacklisting, reports, etc.


This looks to have been a historical weak point for mesh networks. New technologies typically come with premium price tags. I think enough time has passed that mesh network kits are about on par with a new router and range extender. I paid $140 having caught a sale that took $40 off the price.


I would absolutely recommend a mesh network to just about anyone, possibly with the exception of someone that has advanced needs for their network setup. This feels like an evolution of the wireless router. It offers superior range and speeds relative to my previous router + range extender setup for about the same price. Setup is painless, and this has fixed all of my wireless issues throughout the entire house. I’ve retired my router, and range extender.

I’ve also retired my USB wireless adapter for my desktop since I have a mesh node sitting on the desk, and have opted instead to connect with an ethernet cable. I’ve also managed to retire a wifi bridge for a NAS device that I have because, again, with 3 nodes I can easily place the NAS next to a node and connect with an ethernet cable.

All said and done I threw out more equipment than I setup. This was an absolutely great decision in my opinion and at the risk of sounding like a sponsored post – I can say I couldn’t be happier.

Updating database rows with a default position

I was recently tasked with making a table in a MySQL database sortable by the user. There are a few steps involved to take care of new records, while providing existing records with a default value.

The new records would be created with an ifnull(max(`order`), 0) + 1 to assign them the next position. This was sufficient since I was positioning based on created_at, and a new record was guaranteed to have the highest created_at timestamp.
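
In SQL form that insert looks roughly like the following – a sketch rather than the original code, reusing the table1 and `order` names from the update below, with a made-up user_id of 42:

insert into table1 (user_id, created_at, `order`)
select 42, now(), ifnull(max(`order`), 0) + 1
from table1
where user_id = 42;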

For the existing records, I wanted to:

  • group by user_id
  • set the order (int) column to an incrementing integer
  • with an initial ordering based on a created_at timestamp

The first approach was to select groups of users, and for each group sort the records then update each record. This approach would work, but it wouldn’t have good performance. I really wanted to do this in a single update statement.

I ended up going with an approach based on this article:

set @user_id = 0, @order = 1;

update table1 as dest,
(
  select, x.user_id, x.created_at,
    @order := if(@user_id = x.user_id, @order + 1, 1) as `order`,
    @user_id := x.user_id as dummy
  from (
    select id, user_id, created_at
    from table1
    order by user_id, created_at
  ) as x
) as src
set dest.`order` = src.`order`
where =;

Let’s break this down:

  1. We are using database session variables to keep track of some state. We default to @user_id = 0 and @order = 1
  2. Working from the inner-most select, we select the id (primary key), user_id (int) and created_at (timestamp) as these are the columns needed to do our calculations for the new order value. We order these by user_id and created_at to get the sort ready. This dataset is aliased as x (yes, I know I’m terrible at naming)
  3. The next select up pulls the same id, user_id and created_at from the inner-most select and does some calculations, persisting the state in the session variables we set up in step 1. The actual work here is:
    1. Calculating the order: @order := if(@user_id = x.user_id, @order + 1, 1). The := is an assignment in SQL. We set the @order to either a default of 1, or @order + 1.
    2. Comparing users to determine what to set the @order to. This is done in the conditional expression if(@user_id = x.user_id). Because @user_id is assigned the x.user_id value on each row, the comparison returns true until we reach the next user, at which point the @order resets to 1.
  4. Finally the outer update takes the resulting src dataset and does the updating. We constrain with where = to match the correct record, and then we set dest.`order` = src.`order`, which is our calculated value.

Performance is looking great. On a dataset of 300k records, this runs in 3 seconds on a 4 CPU Docker container.

Bonus: If you have additional criteria (like only sorting favorite records for example) this can be easily applied to the inner-most SELECT in the where clause!

Happy ordering!