
Posts

Showing posts from 2013

TimeMachine encryption performance: Synology's vs. Mac OS X's

A quick experiment to see which is faster for encrypting a TimeMachine network volume: Synology's folder encryption or OS X's automatic TM encryption. Copying a directory with 1200 files and directories in it (source files and compilation products), about 125 MB in total:

Unencrypted: 86 sec
Synology's encryption: 90 sec
OS X's encryption: 10 sec

So, not much to discuss about it: OS X's encryption wins. Notes: This was done by the verily scientific method of trying a couple of times and counting seconds aloud. At the beginning I was thinking about doing it in some scripted, repeatable, strict way to control for things like caching and blah blah. But, a 1:9 difference? That's convincing enough for me, kthxbai. I simulated OS X's TM encryption by creating an encrypted disk image (AES 256 bit, sparsebundle) in an unencrypted folder on the Synology server. This might not be the same as the real TM encryption, since that seems to be done by F

Codility

Lately I came across Codility, a company that presents itself quite simply: “we test coders”. I had been thinking for some time about trying my hand at some coding competition, and was in the process of searching for a new job, so I became curious. But after a quick look at their website the curiosity turned into disdain. It all sounded like boasting about being a coarse-grained filter to separate the chaff from the wheat, quickly. Only that the chaff are people too. It sounded pretty dystopian. Probably this is a First World problem, but hey, after interviewing at a couple of places, things already seemed sufficiently bleached and dead inside without an automatic filter to dehumanize the interviewing / testing even more. Typically in my interviews I missed feeling anything that would make me want to work with the interviewers; I wanted to find someone with passion, instead of someone boringly fulfilling the recruiter role for a hollow company. And the Codility premise didn’t seem to hel

Estimation questions in job interviews

I had heard about those questions in job interviews where you are asked to estimate something on the spot, despite having no real data about it. There was some example about estimating the number of gas stations in the whole USA; I seem to remember it was for a Google job application. The guy had the guts to throw out numbers on a hunch, with the handwaviest idea of why they might be good numbers… and, luckily, in the end the result was not even a bad approximation. I realize now that at the time, when I read that, the “luckily” part is what stuck with me. While it’s true that the guy had the wit to back up the numbers and/or the chutzpah to pick them out of thin air, and even though he also had some brilliant moments, just turning around to cross-check some imagined number by using another set of assumptions… the thing is that in the end he was near the true number. Luckily. It was fascinating, and rather intimidating. How in hell would I be able to do something like that?

Type punning, aliasing, unions, strict-aliasing, oh my!

Imagine that you have a struct s which sometimes needs to be volatile (because it maps to a set of hardware registers) but other times you'd rather have it non-volatile (because you are working on a bunch of such structs that have been stored, so they are no longer used to interface to the hardware and you could do without the performance penalty of volatile). That is easy enough: you can declare your variables volatile or not depending on the situation. But what happens when you have a function to deal with those structs? Could it be implemented so that it not only works, but does the right thing for both volatile and non-volatile structs? The trivial way to get that is to define 2 versions of the function (with different names, of course) with the differently qualified parameters; but another possibility is to define a union type in which one member is the plain type and the other is the volatile-qualified type, and make the function parameters use that
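The excerpt cuts off before any code, but the union idea it describes can be sketched along these lines (a minimal illustration with my own made-up names; struct regs, union regs_any and read_status are not the post's identifiers):

    #include <stdint.h>

    /* Stand-in for the real register-mapped struct (fields made up). */
    struct regs {
        uint8_t status;
        uint8_t data;
    };

    /* Both members overlay the same bytes; one view is volatile-qualified. */
    union regs_any {
        struct regs s;
        volatile struct regs v;
    };

    /* One function serves both cases: callers wrap either kind of
     * struct in the union instead of duplicating the function. */
    static uint8_t read_status(union regs_any *r)
    {
        /* Going through the volatile member is correct for the hardware
         * case, and merely forgoes some optimization for stored copies. */
        return r->v.status;
    }

Whether this construction stays on the right side of the strict-aliasing rules is exactly the question the post's title hints at.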

My first hardware bug report?

I have found what looks like a hardware bug in the AT90CAN128! I guess I could be happy in a geeky way, but the number of hours spent trying to track the problem down makes it a bit difficult :P. The thing is that aborting a pending MOb in the CAN controller embedded in the AT90CAN128 can leave CONMOB in an unexpected state. I am not totally sure whether this is a hardware bug… or whether it could be Atmel's at90CANlib_3_2 library being buggy AND misleading.
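The excerpt doesn't show the code involved, but the kind of check being described might look roughly like this (a hypothetical sketch assuming avr-gcc's <avr/io.h> names for the AT90CAN128 CAN registers; abort_mob_and_readback is my own name, not Atmel's):

    #include <stdint.h>
    #include <avr/io.h>

    /* Disable (abort) a message object and read back the CONMOB field
     * of CANCDMOB to see what state it was actually left in. */
    static uint8_t abort_mob_and_readback(uint8_t mob)
    {
        CANPAGE = (uint8_t)(mob << 4);  /* MOBNB field, bits 7:4: select the MOb */
        /* Writing 00 to CONMOB requests that the MOb be disabled. */
        CANCDMOB &= (uint8_t)~((1 << CONMOB1) | (1 << CONMOB0));
        /* Expected: 00 (MOb disabled). The surprise described above is
         * finding something else here after aborting a pending MOb. */
        return CANCDMOB & ((1 << CONMOB1) | (1 << CONMOB0));
    }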

Atmel Studio 6 and blank Solution Explorer

Atmel Studio 6, which is (incomprehensibly!*) based on Visual Studio (2010?), suddenly stopped showing anything in the Solution Explorer pane. The pane was there; it just remained empty, blank, apart from the "Properties" button (and the "Show all files" button, which shows up too depending on the frontmost pane).

Poor image quality on T-Mobile

This is my first blog post in Polish, because it concerns users of T-Mobile Polska. Sorry for my Polish, I am not a Pole ;P. For a few weeks, maybe even a few months, I had been noticing that the quality of images on the internet in general was simply poor. Everywhere! Every web page had overly compressed JPEGs. I always assumed it was a matter of sloppy websites making some low-quality local copy of a better original image... ...but today I noticed that even generally well-made websites, which should have proper photos, also had very poor JPEGs. What is going on?? Every site with the same silly problem?? Or maybe it's not about every site, but about my operator...? Yes, that's it. T-Mobile uses the "Blueconnect compressor". It has existed for a long time, but if I remember correctly, it only used to kick in when browsing on the phone; at least, I never noticed it also working on a computer (connected to

Taking a snapshot of the ports in an AVR (AT90CAN128)

A small inline function to take a snapshot of all the ports as simultaneously as possible, which means 7 cycles from first read to last, which means less than 0.5 usec from first to last at 16 MHz. Can't be faster. It had to be done in inline asm because otherwise gcc prefers to interleave the input instructions with store instructions. Done this way, the inputs come first and the stores come later, generated automatically.
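The listing itself is cut off from the excerpt; here is my reconstruction of how such a function could look under avr-gcc (a sketch, not the post's code — it assumes all seven PINx registers of the AT90CAN128 sit in the lower I/O space and are reachable with IN, which holds for this part's I/O map):

    #include <stdint.h>
    #include <avr/io.h>

    typedef struct {
        uint8_t a, b, c, d, e, f, g;  /* PINA..PING */
    } port_snapshot;

    /* Read all seven PINx registers back to back. A single asm block
     * keeps gcc from interleaving the reads with the stores; the
     * stores into the struct get emitted afterwards, automatically. */
    static inline port_snapshot snapshot_ports(void)
    {
        port_snapshot s;
        __asm__ __volatile__(
            "in %0, %7"  "\n\t"
            "in %1, %8"  "\n\t"
            "in %2, %9"  "\n\t"
            "in %3, %10" "\n\t"
            "in %4, %11" "\n\t"
            "in %5, %12" "\n\t"
            "in %6, %13" "\n\t"
            : "=r" (s.a), "=r" (s.b), "=r" (s.c), "=r" (s.d),
              "=r" (s.e), "=r" (s.f), "=r" (s.g)
            : "I" (_SFR_IO_ADDR(PINA)), "I" (_SFR_IO_ADDR(PINB)),
              "I" (_SFR_IO_ADDR(PINC)), "I" (_SFR_IO_ADDR(PIND)),
              "I" (_SFR_IO_ADDR(PINE)), "I" (_SFR_IO_ADDR(PINF)),
              "I" (_SFR_IO_ADDR(PING))
        );
        return s;
    }

Seven IN instructions at one cycle each are consistent with the 7 cycles from first read to last mentioned above.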

A hacky fix for SuperDuper running out of space: "delete first" with rsync

Rsync can delete the to-be-deleted files in the destination before it starts syncing the files that actually need syncing. That ability is sorely missing in Shirt Pocket's SuperDuper!, which is otherwise a nice backup program, and for some time was one of the few sane options available for making fully reliable backups. SuperDuper! just starts copying blindly, and can then find itself in the situation where the backup destination can't hold the new files plus the old files that should be deleted but still weren't. So that problem would be solved with rsync's "delete first" behaviour. I see people have been complaining about this in Shirt Pocket's forums for at least 5 years, and the developers seem to only say "yes, we will do something sometime". But they still haven't. So, this is the command line to use (--existing and --ignore-existing together stop rsync from creating or updating any files, so with --delete it only removes the extraneous ones): sudo rsync --delete --existing --ignore-existing --recursive /Volumes/ORIGINAL_VOLUME/ /Volumes/BACKUP_VOLUM