Neural network racing cars around a track

During Christmas of 2019 I got curious about machine learning, so I put together an experiment: I wanted to see how well ML could drive a car around a racing track. I made a video of the application running and put it on YouTube, and it got quite a lot of attention!

2.5 million views… Just wow…

I kind of got a job offer or two as a result 😉 and the video also caught the attention of Google’s AI guru David Ha, who posted about it on LinkedIn! Lots of people had questions about it: how it worked, what the ML algorithm was, whether the source code was available… and honestly I was a bit overwhelmed and haven’t really taken the time to answer them.

So what else is up? I started another machine learning project which IMO is a lot cooler… 😉 but as usual, real life intervened with work and obligations, so that’s currently in limbo. If/when something gets done, I’ll post something on the YouTube channel…

The current situation, January 2020

Well, I finally got the blog up and running again after some issues with Traefik and SSL certs. I think I’ve figured it out now.

I removed the front blog; it now redirects directly here to the main blog.

I’ve been looking through the blog. Most of it is horribly outdated, incomplete, or broken, and fixing it is just not a priority right now. There’s not much more to say about that.

Getting CORS to work on C# WebAPI

There are many guides for this, and it should be a simple thing, but there is one possible complication: duplicate header entries.

So I’ll make this quick. Refer to for more information.

“Where can I get System.Web.Http.Cors?”
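It ships in the Microsoft.AspNet.WebApi.Cors NuGet package, which provides the System.Web.Http.Cors namespace. From the Package Manager Console:

```
Install-Package Microsoft.AspNet.WebApi.Cors
```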


“How do I enable CORS?”

Add to WebApiConfig.Register():
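Something like this (a sketch assuming the usual Web API 2 setup; the three `"*"` arguments mean any origin, any header, any method — lock the origins down in production):

```csharp
using System.Web.Http;
using System.Web.Http.Cors;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // EnableCorsAttribute(origins, headers, methods).
        // "*" everywhere is fine for local testing only.
        var cors = new EnableCorsAttribute("*", "*", "*");
        config.EnableCors(cors);

        // ... your existing route registrations stay here ...
    }
}
```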


But now, depending on your other settings, you might get a duplicate Access-Control-Allow-Origin header. This is how you remove it:

Add to WebApiApplication in Global.asax.cs:
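A sketch of the idea, assuming IIS integrated pipeline and that the duplicate comes from two sources (say, IIS custom headers plus the CORS attribute) both emitting the header. When ASP.NET reads duplicate response headers back through the collection indexer they appear as one comma-joined string, so we keep only the first value just before the headers are sent:

```csharp
using System;
using System.Web;

public class WebApiApplication : HttpApplication
{
    protected void Application_PreSendRequestHeaders(object sender, EventArgs e)
    {
        var headers = HttpContext.Current.Response.Headers;
        // Duplicates read back comma-joined; keep only the first value.
        string origin = headers["Access-Control-Allow-Origin"];
        if (origin != null && origin.Contains(","))
        {
            headers.Set("Access-Control-Allow-Origin",
                        origin.Split(',')[0].Trim());
        }
    }
}
```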


That’s it. Should work now.



I wrote this poem somewhere between ’92 and ’95, I think…

Damn bug, damn bug, where are you hiding?
I dig through the sauce but can’t find you,
the guru meditates every now and then,
and I thought this was going to be easy…

Damn bug, damn bug, where are you hiding?
how does the debugger work? It won’t answer me…
maybe I’m writing to memory where I’m not allowed,
maybe I changed something but forgot to save?

Damn bug, damn bug, where are you hiding?
I’m sick of this crap, I’ll go hang myself…
But what if I soon manage to squash the louse,
then it would be silly to abruptly put out the lights…

Damn bug, damn bug, where are you hiding?
You devour my strength, I curse you!
That time must be wasted on such drivel,
hunting for lice, one can only mourn…

Damn bug, damn bug, where are you hiding?
Now to hell with it all, to hell with you,
to hell with the program, I’ll sell my machine
and buy a colorful curtain instead.


While cleaning up I found this text I wrote in 2005, about a compression algorithm I was working on… I’m putting it here in the blog instead (for laughs) and removing yet another hardcoded HTML page from the unfathomable depths of…


“Mission Goal: Creating the Ultimate Compression Algorithm. My priority is compression ratio, i.e. creating small files. Compression speed is more or less irrelevant. Decompression speed is more important, but less so than compression ratio.

050314: So there I was, re-compressing my MAME ROM collection using ZipMax, to squeeze out a few more saved bytes… My friend Mgt was working on his assignment, creating a Huffman encoder. Fascinating stuff. I read up on the Huffman algorithm on the web, as well as the LZW algorithm, to get a better idea of what we were talking about.

Previously I kind of had the idea that we were near the limits of compression, but I saw many problems with the algorithms and I had some ideas on how to improve them. I lay awake at night thinking up an improvement on the Huffman algorithm.

050315: Some researching on Huffman revealed that my new improved Huffman algorithm already existed. It’s called “Adaptive Huffman”, and it was even a bit smarter than my own “new” algorithm. Darn. Back to the drawing board. Luckily I still have a few ideas on improvements, both on Huffman and LZW.

050315: I had an idea on how to improve the Adaptive Huffman algorithm. Unfortunately, I now found that, again, I was too late:

“For adaptive Huffman coding, Gallager suggests an “aging” scheme, whereby recent occurrences of a character contribute more to its frequency count than do earlier occurrences [Gallager 1978]. This strategy introduces the notion of locality into the adaptive Huffman scheme. Cormack and Horspool describe an algorithm for approximating exponential aging [Cormack and Horspool 1984]. However, the effectiveness of this algorithm has not been established.”

Strange, the effectiveness has not been established? Just code it and try it out! What’s stopping them? It has been 20 years!!! Anyway, my algorithm is simpler and smarter than “approximating exponential aging”. Maybe I’ll implement my algorithm and run a few tests.
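To illustrate what “aging” means here — this is just an illustrative sketch with made-up names, not Cormack and Horspool’s algorithm nor the one hinted at above — a simple way to make recent occurrences count more is to periodically halve all frequency counts, so old observations decay roughly exponentially:

```csharp
using System;
using System.Collections.Generic;

class AgingFrequencyTable
{
    private readonly Dictionary<byte, long> counts = new Dictionary<byte, long>();
    private long total;
    private readonly long rescaleThreshold;

    public AgingFrequencyTable(long rescaleThreshold = 4096)
    {
        this.rescaleThreshold = rescaleThreshold;
    }

    public void Observe(byte symbol)
    {
        counts.TryGetValue(symbol, out long c);
        counts[symbol] = c + 1;
        total++;
        // Halving every count once the total grows large means a
        // symbol seen long ago weighs exponentially less than one
        // seen recently -- the "locality" idea from the quote above.
        if (total >= rescaleThreshold) Rescale();
    }

    private void Rescale()
    {
        total = 0;
        foreach (var key in new List<byte>(counts.Keys))
        {
            long halved = Math.Max(1, counts[key] / 2); // keep seen symbols alive
            counts[key] = halved;
            total += halved;
        }
    }
}
```

An adaptive Huffman coder would then rebuild or update its tree from these decayed counts instead of raw lifetime frequencies.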

A few links:

Adaptive Huffman:

The original LZW paper by Terry Welch:

LZW article by Mark Nelson in DDJ:

ZIP format (by PKWARE):

050315: I’ll definitely use my own format for the compressed archive. It’s time to get rid of ZIP and old, ugly, inefficient, bloated file formats like that.

050318: I searched the net for a decent tree class, but found nothing… Nothing, I tell you, nothing! My old jeTree class that I coded a long time ago was not exactly what I needed, so I started coding a new jeTree class. The base class is now complete and I will soon finish the jeAdaptiveHuffmanTree class…

I learned that I am not alone in my quest. Mgt’s Power Packer 2005 is coming along nicely and could be a serious threat. We shall have to wait and see which will be the Ultimate Compressor…”

The modesty, it burns! Anyway, it’s safe to say that I will never finish that project. Same old story: No time, no money. Gotta prioritize.