Grinding through my mal-js implementation, can't decide between using js objects or a bunch of classes.

JS doesn't have types that map neatly onto mal's 'symbol' or 'keyword' (as distinct from strings) or 'vector' (as distinct from arrays), but I can add a type property for the times when things are different.
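
Something like this, maybe (just a sketch of the tagged-object option, not code from the repo):

    // Symbols, keywords and vectors become plain objects carrying a `type` field;
    // anything that already maps cleanly (numbers, strings, arrays-as-lists,
    // functions) stays a native JS value.
    const sym = (name) => ({ type: 'symbol', name });
    const keyword = (name) => ({ type: 'keyword', name });
    const vector = (items) => ({ type: 'vector', items });

    // Type checks are then just property sniffs:
    const isSymbol = (x) => x !== null && typeof x === 'object' && x.type === 'symbol';
    const isVector = (x) => x !== null && typeof x === 'object' && x.type === 'vector';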

Writing my own types is theoretically safer: since mal code won't be calling js methods directly, there's less surface area for naughty code to fall through.
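
And the classes option, same caveat (a sketch, not the real thing):

    // My own types: mal values are instances of these classes, so eval never
    // hands mal code a bare string or array whose built-in methods it could
    // poke at.
    class MalSymbol {
      constructor(name) { this.name = name; }
    }
    class MalKeyword {
      constructor(name) { this.name = name; }
    }
    class MalVector {
      constructor(items) { this.items = items; }
    }

    // Type checks become instanceof tests:
    const isSymbol = (x) => x instanceof MalSymbol;
    const isVector = (x) => x instanceof MalVector;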

Dunno. Going to sleep on it.


It's been a bit odd at work the last few days. I've ported a couple of tools from Visual Basic for Applications (VBA - Microsoft Excel macros) and PowerShell to modern C# (for reasons that I haven't paid a lot of attention to, but I think are related to the security people at work tightening up execution policies).

The code isn't very complex - download a few files, calculate and compare a hash, unzip them, and move them into place - but the team are all like "OMG they're so fast!"

The download/install now takes two or three minutes with a good network and fast disk, compared to maybe half an hour previously. That's a useful speed-up, sure, but the amount of praise I'm getting feels very out of proportion to the amount of work it took (especially as I didn't put any thought into speed).

Husband's advice is to just accept the praise, and I'm doing my best, but I'd far prefer someone to gush over my clever stuff.

Ah, well.


I do have a couple of ideas about how I could speed up the install, but I'd have to test them to see if they're worth it.

Mostly, it's to not bother saving the zips to disk, but instead unzip the download stream on the fly, teeing it off into the hash calculation as we go.
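
The real tool is C#, but the shape of the idea is easier for me to sketch in JS. Hypothetical names throughout, and the unzipper package (with its Extract({ path }) stream) is an assumption; any streaming zip extractor would do:

    // Tee the download: every chunk feeds the hash *and* the extractor, so the
    // zip itself never touches the disk.
    const https = require('node:https');
    const crypto = require('node:crypto');
    const unzipper = require('unzipper'); // third-party, assumed

    function downloadUnzipAndHash(url, destDir, expectedSha256) {
      return new Promise((resolve, reject) => {
        https.get(url, (res) => {
          if (res.statusCode !== 200) {
            return reject(new Error(`HTTP ${res.statusCode} for ${url}`));
          }

          const hash = crypto.createHash('sha256');
          const extract = unzipper.Extract({ path: destDir });

          res.on('data', (chunk) => hash.update(chunk)); // tee into the hash...
          res.pipe(extract);                             // ...and into the unzip

          extract.on('close', () => {
            const actual = hash.digest('hex');
            if (actual === expectedSha256) {
              resolve(actual);
            } else {
              reject(new Error(`hash mismatch: got ${actual}`));
            }
          });
          extract.on('error', reject);
          res.on('error', reject);
        }).on('error', reject);
      });
    }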

There are four zips: a root, and then three children that each unzip into a folder under the root. At the moment I'm unzipping all four, each into its own folder, moving the children into place under the root, and then swapping the root into its final position.

If I go for the stream unzip, I'd skip that first step and unzip all the files straight into place (so skipping one of the moves). I'm not sure how much it would save: the moves are all on the same filesystem, so they should be cheap/fast (and watching the process supports this). It would also add complexity, as files from any given folder might end up in more than one of the child zips (don't ask! I tried to get some clarity on this but just got white noise), so there might be conflicts/races when creating folders.
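
The folder races might have a cheap answer, at least: make directory creation idempotent and ask for the full path before writing each file. Sketching it in JS again (though I believe .NET's Directory.CreateDirectory is similarly happy if the folder already exists):

    // mkdir with { recursive: true } doesn't throw if the directory (or any
    // parent) already exists, so concurrent extracts creating overlapping
    // folder trees shouldn't trip over each other.
    const fs = require('node:fs/promises');
    const path = require('node:path');

    async function ensureDirFor(filePath) {
      await fs.mkdir(path.dirname(filePath), { recursive: true });
    }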

It's a well-known trope, apparently, that users really don't care about fancy coding tricks but will very much like the duck animation they get while they're waiting...

