June 26, 2021

PLANET’S MAD

BAAUER. Yes, this is the person who made “Harlem Shake” back in 2012, so seeing that BAAUER had released a new album I was skeptical. It took a while, but towards the end of 2020 someone pointed me towards the “movie” that was authored as part of the album. What movie? Yes, BAAUER commissioned a 40-minute video that consists purely of rendered footage synchronized to the music. The effort here is immense. I’ve speculated personally about how they made it, and I assume that they extracted MIDI from the tracks and imported the notes as animation events, as animating this by hand seems like a lot of effort.

What’s really cool about it is that the video is like a throwback to MOSQUITO, which aired on Swedish national television between 1998 and 2003, although with less PHONG shading and more path tracing. Back in the late 90’s, creating weird visuals with aliens was all the jam, and seeing the style re-imagined with modern technologies makes me smile. Some of the visuals are definitely rougher, especially during REACHUPDONTSTOP, but the synchronization makes up for it.

There’s really not that much more to say other than: enjoy these 40 minutes of JUNGLE TERROR.

June 1, 2021

Struggles with Limescan

In 2018 I built a small BLE (Bluetooth Low Energy) RFID reader to support our small iPad PoS app that I called Limetree. It’s been about 3 years since I created it, and looking back I’ve come far from this very simple design. It was my first 3D printed project, back when I was still designing the parts in SketchUp. I needed the NRF52832 feather in this device for another project, so I pulled it out sometime in mid-2020, as I had no need for the scanner during COVID. Since there is a high chance I get to travel to Sweden again this year (fully vaccinated now!!!), it was time to reassemble the scanner!

Well, it didn’t work out as planned. For some reason the exact same wiring and code just refused to work, and in frustration I soldered and desoldered until I damaged the Feather, which led to the NRF52 chip refusing to boot. Oh well, guess it was time to upgrade to the NRF52840 anyway. A quick primer on the differences between these two feathers: the latter of the two (hereafter referred to as the 840) has a native USB stack, which means the main chip handles the USB interface against the host computer. On the 832, a separate microcontroller handles USB and bootstrapping the device, which prohibits you from having fancy functions such as U2F. This sounded great to me. With microcontrollers it’s usually “the fewer components, the better”, so I decided to buy the 840 instead of the 832 as a replacement.

The NRF52840 arrived and I started updating the code to run on it. The board has a new RGB LED which I of course wanted to take advantage of. Once I felt reasonably certain that the code accounted for the differences, I soldered on the wires to the PN532 breakout that I had from the earlier version and booted it up. It didn’t work. After tinkering back and forth with the code I got really frustrated because it seemed to work SOMETIMES. I would change something, burn the firmware, watch it work for a few minutes, then restart it and it wouldn’t work again. If you’ve ever done programming, you know the worst problems to debug are the unpredictable ones, because there is usually very little indication of what’s going wrong.

I don’t own an SWD debugger, so most of my debugging here is print statements and blind debugging using the oscilloscope. However, the problem I seemed to be having happened before the serial line initialized properly, meaning that whatever output I wrote to the computer from the microcontroller was lost into the void. I quickly realized that I could use the on-board status LED to show different colors upon reaching certain parts of the code, and wrote a quick “status” function which incremented the color every time it was called, allowing me to see where execution stopped. In theory, this should yield the same color every time I started the chip, letting me see where it got stuck on boot. Of course, that didn’t work out either. It stopped at random points during the bringup of the chip. Almost out of ideas, I started doing really harsh “cold turkey” debugging: starting with all code commented out and bringing it back piece by piece.
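
To give an idea of the technique, here is a minimal sketch of such a breadcrumb function, assuming the Adafruit_NeoPixel library and the Feather’s on-board pixel. The pin name and palette are illustrative placeholders, not the actual Limescan code.

// Step through a fixed palette, one color per call, so the last color
// visible on the LED tells you how far boot got before the hang.
#include <Adafruit_NeoPixel.h>

Adafruit_NeoPixel statusPixel(1, PIN_NEOPIXEL, NEO_GRB + NEO_KHZ800);

static const uint32_t palette[] = {
  0xFF0000, 0xFFFF00, 0x00FF00, 0x00FFFF, 0x0000FF, 0xFF00FF
};

void status() {
  static uint8_t step = 0;
  statusPixel.setPixelColor(0, palette[step++ % 6]);
  statusPixel.show();
}

void setup() {
  statusPixel.begin();
  status();          // red: we reached setup()
  // ...radio bringup would go here...
  status();          // yellow: survived radio bringup, and so on
}

void loop() {}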

Eventually it became clear: the Adafruit Neopixel library I was using for easy control of the LEDs was somehow interfering with the BLE shim that Adafruit provides, which sits on top of Nordic Semiconductor’s BLE stack. To understand this it’s important to know that while all these microcontrollers can execute code within the “Arduino framework” (which I wrote my shit in because I couldn’t be bothered in 2018), the “framework” is implemented differently depending on which microcontroller you’re using. Coming from x86 processors for the past 25 years this is a bit hard to wrap my head around, but the Arduino “environment” is more like libc: a set of common functions you can compile against, with things like ports, GPIO and hardware functions mapped for you. It also provides a setup() function which is called once and a loop() function which is called in a loop as long as the device is powered.

On the old AVR Arduinos, from which the framework stems, you essentially controlled the entire CPU. Your code ran in a loop and the loop was bootstrapped by the framework. This means that once the processor enters loop(), you are in control of the execution of the entire processor. There is no operating system or multitasking going on, which allows you to do a lot of nifty tricks with the hardware, something that other libraries often exploited.

// This code is really from the Arduino AVR core
// https://github.com/arduino/ArduinoCore-avr/blob/master/cores/arduino/main.cpp
int main(void)
{
    init();       // low-level hardware bringup (timers, ADC, etc.)

    setup();      // user code, called exactly once

    for (;;) {    // user code, called forever; nothing else ever runs
        loop();
        if (serialEventRun) serialEventRun();
    }

    return 0;     // never reached
}

On a microcontroller like the NRF52840, or another popular chip like Espressif’s ESP32, this is no longer true. These devices need to keep a BLE or WiFi radio active, and those radio stacks are far too complex to hand over to the user while still meeting their timing requirements. For that reason these devices often run an RTOS (real-time operating system); the Arduino cores for both the NRF52840 and the ESP32 are built on top of FreeRTOS. What this means in practice is that your loop() function is scheduled onto the OS with the lowest priority, allowing the core to deal with the more important radio tasks in between the calls to your function. As these cores often run at significantly higher speeds (64 MHz/240 MHz vs 16 MHz for the AVR), there is enough headroom to run normal Arduino code and schedule the radio without impacting the Arduino libraries. The delay() function, which on AVR halted execution, instead just yields back to the scheduler with a request to be woken up after X milliseconds.
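
To make that concrete, the bootstrapping on an RTOS-based core looks conceptually like this. This is a sketch of the idea using FreeRTOS primitives, not the actual source of any vendor’s core.

// Conceptual sketch: on an RTOS core, loop() is just another task,
// scheduled at low priority next to the radio and housekeeping tasks.
#include <FreeRTOS.h>
#include <task.h>

extern void setup();
extern void loop();

static void loop_task(void *arg) {
  (void)arg;
  setup();
  for (;;) {
    loop();
    taskYIELD();  // hand the CPU back so the radio stays on schedule
  }
}

int main(void) {
  // Radio stack and OS bringup happen first; user code is one task among many.
  xTaskCreate(loop_task, "loop", 1024, NULL, tskIDLE_PRIORITY + 1, NULL);
  vTaskStartScheduler();  // never returns
  return 0;
}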

Why does all this matter? I wanted to re-use the code for this project even though I prefer the ESP32 (the ESP32 having dual cores, making it easier to schedule tasks), but since the NRF52840 already ran an RTOS I figured that I could just schedule tasks exactly like the rest of the libraries and yield() in the Arduino-bootstrapped loop. However, every time I created one additional task alongside the Neopixel code and the BLE stack, the microprocessor froze. After a lot of searching I stumbled onto a lead on the Adafruit Forums, where one person seems to acknowledge that combining the Neopixel library with the BLE stack causes a conflict which leads to the crash. Digging even deeper, it seems that the Adafruit Neopixel library is one of those libraries that try to make the most out of the AVR core. Specifically, the task it gets scheduled on runs for a long time, which is “fine” when you only have one task to defer to, but as soon as you start introducing multiple tasks, the BLE stack gets “starved” and crashes the execution as it fails to maintain realtime. After having said FML a couple of times while realizing this, I switched out the Neopixel library for another library I use on the ESP32, named FastLED, and this worked without a problem.
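
The working setup looked roughly like this. A minimal sketch assuming the FreeRTOS API that the Arduino core exposes, with FastLED driving the pixels; the pin, LED count and timings are placeholders rather than my actual code.

// LED animation in its own short-running task: do a quick burst of work,
// then sleep, so the BLE stack is never starved.
#include <FastLED.h>

#define LED_PIN  7
#define NUM_LEDS 32

CRGB leds[NUM_LEDS];

static void led_task(void *arg) {
  (void)arg;
  for (;;) {
    leds[0] = CRGB::Green;          // whatever animation state applies
    FastLED.show();                 // short bit-bang burst...
    vTaskDelay(pdMS_TO_TICKS(20));  // ...then yield back to the scheduler
  }
}

void setup() {
  FastLED.addLeds<NEOPIXEL, LED_PIN>(leds, NUM_LEDS);
  // Bluefruit.begin() and friends would go here.
  xTaskCreate(led_task, "led", 512, NULL, tskIDLE_PRIORITY + 1, NULL);
}

void loop() {
  vTaskDelay(pdMS_TO_TICKS(1000));  // nothing to do here; stay out of the way
}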

OK WE DONE HERE NOW! RIGHT? RIGHT???? right… ?

I connect it all back up to start testing against the iPad and see that the status LED still doesn’t fully go to green (the color I selected for success). The device fails to connect to the PN532 (the RFID reader)??? Down the rabbit hole again it seems. This is yet again one of those “sometimes” issues where every reboot is a gamble on whether it will come up. I connect the Feather and breakout to the oscilloscope and start trying to identify what’s going on.

Clear

Full

There is a clock signal and there are data signals going in both directions, so it’s clear that the chip is at least speaking SPI. I switch back and forth between the computer and the chip to try to figure out what’s going on. Eventually I see that the signal only bangs out “sometimes”. Thinking back to what we had before: the PN532 is wired over SPI, and I have a suspicion that the earlier problem is related. But why did this work so well with the NRF52832? After searching for a long time I eventually found the issue in the unlikely place of Adafruit’s CircuitPython implementation, as the developers of that runtime also dealt with this. You see, the NRF52832 had an “errata” (a faulty hardware implementation) for SPI, which led the developers at Adafruit to use a software SPI implementation for the older chip, documented here. So essentially what I’m discovering is that the hardware SPI on the NRF52840 is weird and fails to sync with the PN532 for some unknown reason. I’m guessing it’s some sort of assumption in the Adafruit library for the breakout, as many people have complained about similar problems with the ESP32 in the GitHub repo.
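
For reference, the Adafruit PN532 library exposes both flavors through its constructors; the pin numbers here are placeholders.

#include <Adafruit_PN532.h>

// Software (bit-banged) SPI: clk, miso, mosi, ss.
// This is the path that sidesteps the NRF52832 SPI errata.
Adafruit_PN532 nfcSoft(13, 12, 11, 10);

// Hardware SPI: only the chip select, everything else goes through
// the SPI peripheral.
Adafruit_PN532 nfcHard(10);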

At this point I couldn’t be bothered to go deeper and decided to switch the hardware protocol to I2C instead. The benefits of I2C vs SPI can be debated, but for my use case they are interchangeable apart from having to solder on the additional pullup resistors. I soldered this together, configured the code to use I2C, and it worked on the first try. The board comes up with BLE and the device now works exactly as the NRF52832 version did. At this point I had massacred the code a fair bit, and it was never in good shape to begin with. One should always leave code in better shape than one found it, so I ended up spending some time cleaning up the codebase and using the FreeRTOS task scheduling instead of the loop-based programming model I used earlier.
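
The I2C version of the bringup ends up looking something like this, assuming the library’s IRQ/reset constructor; the pins are placeholders and the pullups live on the board, not in code.

#include <Wire.h>
#include <Adafruit_PN532.h>

#define PN532_IRQ   5
#define PN532_RESET 6

Adafruit_PN532 nfc(PN532_IRQ, PN532_RESET);  // I2C constructor

void setup() {
  nfc.begin();
  if (!nfc.getFirmwareVersion()) {
    // no answer on the bus; another breadcrumb moment
  }
  nfc.SAMConfig();  // put the PN532 into normal card-reading mode
}

void loop() {}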

Because I had spent so much time digging into the code, I also finally learned how BLE “works”. My previous mental model of BLE was that it’s a “wireless serial port”, which is very far from the truth. A much better way to think about BLE is that you write values into the void, tagged as representing specific characteristics, and anyone who happens to listen to those messages can easily decode them. With that knowledge I could rewrite a lot of the earlier code that worked around BLEUART and just make my own characteristic. Code is of course on Github.
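
A minimal sketch of what “making your own characteristic” means with the Bluefruit stack; the UUIDs and payload length are placeholders, not what Limescan actually uses.

#include <bluefruit.h>

BLEService        tagService(0xFF00);
BLECharacteristic tagChar(0xFF01);

void setup() {
  Bluefruit.begin();
  Bluefruit.setName("limescan");

  tagService.begin();
  tagChar.setProperties(CHR_PROPS_NOTIFY);              // write-into-the-void
  tagChar.setPermission(SECMODE_OPEN, SECMODE_NO_ACCESS);
  tagChar.setFixedLen(7);                               // e.g. a 7-byte NFC UID
  tagChar.begin();

  Bluefruit.Advertising.addService(tagService);
  Bluefruit.Advertising.start(0);                       // advertise forever
}

// Called whenever a tag is scanned; any subscribed listener gets the UID.
void notifyTag(const uint8_t *uid, uint8_t len) {
  tagChar.notify(uid, len);
}

void loop() {}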

LET’S REASSEMBLE IT

With my luck throughout this “simple switch”, of course the first thing that happens when I reassemble it into the old 3D printed case is that I crack the pins in the case. This design was my first, and in hindsight it’s so badly designed that I’m almost ashamed of it. Sure, everyone begins somewhere, but this case was too dumb of a design to be SLA printed. Only one thing to do: open up Fusion and design a new one. I re-used the form factor from the previous design as I couldn’t be bothered to buy a new piece of frosted plexiglass.

Clear

Full

This was also an opportunity to fix up the cabling from the earlier version, which relied on adhesive tape to fixate the Feather to the center of the board for the 4x8 LED array. This time I 3D printed a separate module to hold the LED FeatherWing. I mounted it all together with some screws, and the fit is night and day compared to the older version. I do not want to re-experience the pain of debugging this microcontroller, but improving physical designs is really fun.

Mounted

Working

The closing words for this one: it’s easy to assume that other people have figured out your specific use case before you and designed for it. This might be true in the world of x86 where there are millions of users, but for smaller microcontrollers, the likelihood of someone solving a problem the way you would is very low. I’ve also realized why most developers end up transitioning away from the Arduino framework; while it’s a great way to bootstrap learning, it just makes too many assumptions about logic flow on more modern processors. I hate to “blame the hardware” and I should probably just bite the bullet and buy an SWD debugger at some point. My digital caliper also broke while I was designing the case, so I got a new one. Icing on the cake, I guess.

May 25, 2021

Biking in the Bay Area

Before moving to San Francisco I heard a lot about the different parts of the Bay Area: tech culture, bars, commute, cost of living, hiking, the Sierras, etc. But no one mentioned the insane biking that the surrounding areas give you. There are absolutely golden roads to bike less than 10 kilometers from San Francisco. For anyone who lives in SF and regularly uses Strava, these images are almost a meme at this point; it’s obligatory for any cyclist to tag their Strava uploads with them over and over. But I realized that not everyone lives in SF and uses Strava. These document my three favorite routes: Paradise Loop (90 km), Reyes Loop (132 km) and the Mt. Tam climb through Fairfax (105 km).

Paradise loop

Reyes Loop

Climbing up to Hawk Hill

Descending Hawk Hill

Heading out with Nikola

Climbing up to Mt. Tam through the Lagunitas watershed.

Up on the ridge.

Stopping for water

Looking out from the ridge.

March 8, 2021

3D PRINTED FOOT PEDAL

Working remotely is draining in many ways, but endless video conferencing is probably what one will miss the least about this entire experience. I’ve also learned that very few people seem to have grown up with long-form voice conferencing and are, as a result, absolutely terrible at muting when not speaking. I’ve sat through so many meetings where people fail at muting and have microphones that sound like they are screaming while skydiving next to a helicopter.

My theory around this is that personal video conferencing quality is tied to one’s organizational level. In general: the higher up you are in the company’s organizational structure, the shittier your AV setup is. (Coral’s law for short.)

hub

Growing up with Ventrilo -> Teamspeak -> Discord taught me the importance of using PTT (Push To Talk). It might sound annoying to have to hold a button every time you want to speak, but after a short time it becomes second nature, to the point where I will press the button even when the application I’m using at the moment doesn’t support PTT.

Using PTT can still be annoying at times, like when trying to type while talking or when your hands aren’t close to the PTT button. A friend advised me to try a foot pedal, but I couldn’t find one that fit my requirements: I wanted one that could map to any obscure button without active software to trigger it. The ones that could were too expensive, which left me with only one option: building one myself.

I’m using what I had on the shelf for this one, meaning:

I designed a very simple enclosure printed in two parts: one serving as both the base plate and the enclosure for the electronics, and the second as the hood. I took inspiration from the pedals commonly seen on sewing machines and expression controls for synthesizers.

Full

Clear

Full

Clear

Wiring it up was easy (essentially one pushbutton), and I’m using NicoHood’s HID library, which kept the code short. I’ve uploaded the project to my GitHub along with the STL files if you want to build one yourself.
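
For the curious, the whole firmware boils down to something like this; a sketch assuming the HID-Project library with the pushbutton on pin 2, and F24 standing in for whatever obscure key you pick.

#include <HID-Project.h>

const int PEDAL_PIN = 2;

void setup() {
  pinMode(PEDAL_PIN, INPUT_PULLUP);  // the pedal button shorts the pin to ground
  Keyboard.begin();
}

void loop() {
  if (digitalRead(PEDAL_PIN) == LOW) {
    Keyboard.press(KEY_F24);         // hold the key while the pedal is down
  } else {
    Keyboard.release(KEY_F24);
  }
  delay(10);                         // crude debounce
}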

March 6, 2021

BÄSKA DROPPAR

Sweden has this absolutely terrific yet horrifying and awful liquor named “Bäska Droppar”, which I make a point of introducing to as many people as possible. On top of being almost undrinkable, there is a Swedish studentesque tradition of drinking it in weird ways at weird times. The bottle doesn’t beat around the bush and proudly states:

“No one is indifferent to the characteristic taste of Bäska Droppar, which has made it a classic.”

- The tagline on the back of the bottle. Really.



bottle

There are three major milestones to keep in mind when consuming this trash liquor:

23 Besken

This is a true Swedish classic. When turning 23, one is supposed to drink 23 cl of what used to be article number 23 at Systembolaget (the Swedish liquor monopoly). 23 cl is just above a half marathon (21.0975 cl), which leads us to the next milestone.

The Marathon

If 23 cl wasn’t enough for you, you can always go the full marathon, meaning drinking exactly 42.194988 cl of this experience. I do promise that after having consumed this volume of Bäsk, you’re going to wish that you hadn’t.

The Victory Lap

After consuming 42.194988 cl you might experience a rush of adrenaline from having cleared that much of the horrible taste. If you still feel in good shape you should go for the victory lap, meaning clearing the remaining 7.805012 cl of the 50 cl bottle.

If you get the chance to consume this bottle of pure horror, do take the opportunity. It’s after all a Swedish classic.


February 8, 2021

Leaving Twitch

In February of 2021 I resigned from Twitch after 6 years of employment; tomorrow, the day after writing this post, I join Discord. It’s scary leaving any company where you’ve spent 1/6th of your life. The question has come up enough times now about why I’m leaving, and to explain that I first have to talk about how it all started.

To understand why I joined Twitch I need to start with how I met Trance. Cristian Tamas, also known as “Trance” on the internet, was one of the founders of “The GD Studio”, which back in 2012 was one of the few influential broadcasters within the Starcraft 2 esport scene. I was introduced to Trance through my work at DreamHack and helped him and James “2GD” Harding with some of the shows up until 2014. Trance was one of the first employees of Twitch in Europe, with the intent of acting as a tech/BD arm there, as esport was heavily European at the time. In 2014, the studio was awarded the contract from Valve Software to produce a “pre-show” for their DOTA2 tournament with a $10 million USD prize pool, stretching over 2 weeks. The GD Studio’s idea was to produce a 24/7 show, flying in all the talent to their house up in Tyresö, Stockholm.

hub

Trance reached out to me to help assemble the production behind the show, as there were tons of parts the studio needed to make the concept work. Although the budget was significant for the studio, they had somehow still managed to bite off more than they could chew. Apart from shuttling 20-30 people from Stockholm to Lidingö at all hours of the day, they had to cook, commentate games, entertain people and somehow also produce a show. On top of this they also had 10 newborn kittens behind the sofa. Honestly, I have no idea how they pulled it off, but the technical architecture of the show worked, and I became good friends with Trance as a result of the event.

I ended up leaving my job at DreamHack in September to pursue more freelance gigs similar to the one above. I kept talking to Trance a lot, helping him out with streaming configurations and whatnot. In November of 2014 I got a call in the middle of the night from a man named Cyrus Hall. A recruiter had reached out a while earlier, on Trance’s recommendation, asking if I was interested in working for Twitch, and I had said sure, knowing that nothing would come of it. Cyrus had scheduled to talk to me for about an hour, but we ended up spending closer to 2-3 hours on the phone, talking about networks, video and my general thinking about where esport broadcasting was headed. Now that esport broadcasting often rivals or beats traditional broadcasting, it’s hard to remember what a scrappy operation it used to be. I felt that my role in all this was to help the esport orgs with the broadcast knowledge I had, while at the same time standing on the other side of the fence, communicating to the streaming platforms the requirements that a “broadcast” has in comparison to a UGC (user generated content) gaming stream.

In February of 2015 I accepted the offer to join Twitch to help them figure out how to make esport broadcasters happy from a technical standpoint. As both esport and Twitch grew at a rapid pace, the scope of my work expanded quicker than I was able to tackle personally, so I built up a team of people who could help me build the relationships and architect the tooling. There were a lot of onsite events and meetings in this period of life, mostly to build relationships with the organisations on the other end and understand their problems.

trance

I also need to mention all the great colleagues I met at Twitch. Since Twitch encompassed my sober year and I had a lot of time on my hands, I took up playing DOTA2 with coworkers while visiting San Francisco: staying in the office until 04.00 in the morning bashing out game after game, dreading the meeting starting in less than 5 hours or the flight to some random location. I had the absolute pleasure of meeting one of the best directors (now a VP), who served partly as my boss but more as my mentor and great friend. If it wasn’t for this individual I wouldn’t have stayed at Twitch past 2016, when he flew out to Sweden. Me and Trance took him to the Twitch afterparty in Jönköping, and while I don’t have a great picture of him, I saved this gem taken at 5AM with Trance.

How the sausage is made

Everyone who has met me knows that I never shy away from solving time-sensitive problems. It’s a skill that comes from my years of working with live shows, touring gigs and broadcast. One gets used to the high pressure and develops systematic ways of approaching such situations after a while. It’s the same reason first responders can act swiftly but stay calm; they’ve practiced it so many times that it’s not foreign. The same goes for working with live shows: the tone is often tense and aggressive, but people are only interested in resolving issues as fast as possible.

Of all the weird things that happened over my time at Twitch, the best “how the sausage is made” moment I experienced was during E3 2018. During one keynote that was broadcast on Twitch with a large number of viewers, the feed started exhibiting visual artifacts. Since the viewership was significant, people got really worked up about the issue, and viewers were spamming the chat about it. I was in the production studio for another reason and got asked by an engineer if I had any idea what could be going on. In broadcast you don’t get a redo. There is no “let’s fix that one for next time” when you have a show that runs once a year with massive viewership.

When approaching broadcast problems, I’ve found that it helps to always work “left to right”: start by verifying the absolute source, then work your way “right”, down the chain, to figure out where the issue is introduced. Baseband video is funny that way; you can essentially tap the feed at any given point and view it without doing complex network routing or subscribing to a livestream, just route the matrix to an SDI output and watch the source. This goes against the common debugging strategy people often take, working right to left from where the problem is exhibited back towards where it is no longer visible. While that seems good on paper, it fails to account for all the branching paths a signal can take, which makes it hard to trace backwards. After stepping through the signals it was clear that the issue happened before the feed entered the SDI matrix. Tracing the cable backwards, I found that the actual input source was looped through a scan converter (the broadcast term for a video scaler). The actual fiber feed into the building didn’t have these artifacts; the scan converter was being used to split the signal into primary/backup, and in this case both outputs were affected.

To solve this, I realized that we needed to replace the scan converter that drove the entire feed for the studio (a single point of failure), mid-show, without affecting the video. On top of this, we had about 3 minutes to do it in. The content being shown at the time was game trailers for upcoming games, all ending on a black slate for about a second. If we could time the switchover to occur exactly during that second, no one would notice.

Doing this, however, required three physical cables to be switched in less than a second: the fiber input needed to be moved to a new scan converter, and the lines going out from the old scan converter had to be swapped onto the new one. I also realized that the feed we received had about 1.2 seconds of latency, meaning that if I dialed into the production bridge of the company whose feed we were tapping and heard their countdown, we would be able to monkey patch it and notify our production exactly in time.

Switching in the AXS studio

Here I am, sitting on the floor while waiting to switch the SDI cables. Once we felt that it was time, I notified our production that the switch was going to happen in less than 30 seconds and waited for the cue in my other headphones. The switch went flawlessly: the new scan converter locked onto the signal and no one watching saw it happen. Of course this was dumb and the show should never have split the signal in that way, but this is sadly how it often goes in broadcast. It’s a balance between budget and time, and often the right decisions get made but on the wrong risk profile.

Moving on


vegas

Twitch realized that we had built one of the world’s best platforms for live video on the internet and went on to build Amazon IVS, which I was lucky to be part of. This was a big undertaking and has taken a lot of my focus for the past years, working in a variety of different functions but primarily managing other people and eventually managers. IVS almost speaks for itself: it’s the battle-tested video system that drives Twitch, packaged up as an AWS service for customers to use in new and unexpected ways. As I had spent the prior years meeting customers, I helped the person leading the product with a number of customer meetings for IVS.

So why leave Twitch? Isn’t life great? Exciting work on the horizon? Sure, that’s the problem.

I joined Twitch to be part of the transformation from linear TV to online media, and that transformation has already happened. Society went from “old media” with news anchors and linear programming to Netflix, YouTube and Twitch. Streamers, influencers and individual content producers have become part of our daily life and have started overshadowing the traditional media machine. I’m sure there are tons of exciting new developments ahead, but the large ship has already sailed. It might not seem like it, but consider where you get the majority of your entertainment today; chances are it’s OTT or UGC services, mixed and matched to your interests.

COVID was probably an eye-opener for many people: what this worldwide event highlighted is a glimpse of how our social fabric would look if it moved onto the internet. Zoom has become a behemoth of video conferencing, and the online media services saw tremendous growth over 2020. What I think has passed under the radar for many people, though, is how the concept of hanging out with your friends has started to move onto the internet. Someone like me grew up using software like Ventrilo and Mumble to connect to VOIP chat rooms with friends all the way back in 2005, but that experience never really went mainstream.

Over the winter break I realized that Discord had finally done just that: taken the experience I had in Ventrilo when I was 15 and built a product that enables people to actually drop into a “space” and hang out with whoever happens to be there at the moment. It’s all transient, which tends to eliminate the formality of scheduling a call. I communicate with my friends back home in Sweden using Discord daily, so I personally believe they are onto something that no other service currently manages to fill. For that reason I decided to pursue an opportunity at Discord, ending my tenure at Twitch.

In the words of Primal Scream:

“I’m movin’ on up now.
Getting out of the darkness.”


img