Sunday, September 23, 2018

Another grumpy post, this time about Application Insights

What is Application Insights, and why is it awful?

Hey, maybe for some people it's not a terrible time suck. Maybe for some people, it's providing value. How many ways has it sucked up my time so far? Let me count the ways:
  1. When I made my Windows App, Microsoft not very helpfully added a reference to it in the project. I spent time to understand what it was (it is supposed to send up "telemetry" to some dashboard somewhere). I decided to keep it because I didn't want to offend the Awful Scary Microsoft Monsters.
  2. When I got a scary email saying that I'm about to spend hundreds of dollars a month to get some alleged insights, because Microsoft decided that hosting a ton of data was expensive and they didn't want to foot the bill forever. Luckily my actual usage is almost nothing (I think).
  3. When my updated projects simply refused to compile. Except when they did compile. But they never compile when I'm making an app store submission.
  4. And just now, when I finally ripped off the bandage and removed the Application Insights packages from my app.
  5. Oh, and while I'm at it, I figured that I might as well look and see what awesome insights Application Insights is giving me. It turns out that whatever data has been sent up is gone; when I go to my Azure Portal, there's a grand total of no data at all, nor even a hint that any data has ever been sent up.
Total wastage: too much.

Friday, June 1, 2018

Ethereum uses what base encoding?

Base 58 is about the dumbest thing ever

I've been learning Ethereum (because, you know, bitcoin). Being a networking kind of person, I'm looking at the networking protocols. Let's leave aside questionable choices like using Keccak-256, which can be argued is "forward looking" rather than "completely unsupported by major languages".

No, let's look at encoding. Each Ethereum address is, of course, a big binary number. It's written out as hex (arguably silly, but whatever). It's then translated using not Base-64 but Base-58. As far as I can tell, this is something they just made up.

I'll ignore the lack of support in major languages.

Base-64 has the nice property that it's exactly 2^6. This means that one byte transforms into one complete Base-64 value with 2 bits left over. Three bytes transform neatly into four output characters. A reader or writer can deal with small, finite-sized, easily handled values.
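You can see the clean 3-bytes-in, 4-characters-out alignment with nothing more than the standard library (the input string here is just the classic Base-64 demo value, nothing special):

```python
import base64

# Three input bytes = 24 bits, which splits exactly into four 6-bit
# Base-64 values. No bits left over, no padding needed.
raw = b"Man"                      # 3 bytes
encoded = base64.b64encode(raw)
print(encoded)                    # b'TWFu' -- exactly 4 characters
```

Because 24 is a multiple of both 8 and 6, an encoder can walk the input three bytes at a time without ever carrying state across chunks.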

Base-58, on the other hand, carries about 5.858 bits per character (log2 of 58). That's nothing useful. It means that any hand-crafted library is more likely to be wrong than to be right. The supposed benefit? So that a few characters that might be misinterpreted can be dropped.