explog
Clever encoding of numbers that makes arithmetic efficient, always magical
The COVID-19 situation is the first such in my life. It has affected me and my family in various ways, but boy am I lucky.
I am Indian and have been living in Europe for a few years now. When they announced the lockdown where I live, I was still allowed to go out and exercise by the river. Managers at work explicitly acknowledged the difficulty of switching to WFH. I was asked not to worry at all about being as productive as I was in the office.
When I moved to Zürich last year, I was drawn to percussion in general. I played around with a drum-kit, and I loved it. The coordination required to keep time and play the right notes takes my mind somewhere else. I get to focus only on the current moment, because if I don’t, I’m going to miss the next beat.
I find the tabla unlike many other percussive instruments in that:
The 36th Chaos Communication Congress took place in Leipzig from 26-30 December, 2019. I’d always wanted to be part of the congress, but it is quite difficult to score a ticket if you’re not part of a hackerspace. Last year though, I was determined, and managed to get a ticket to 36c3 during the second presale.
I was quite excited, but also quite nervous, as I had no idea what to expect from this mega-congress of 17,000 people!
The borrow checker is arguably one of the biggest sources of frustration when learning Rust. Once the Rust compiler is satisfied with the syntax, it gets down to the real stuff: proving that your program is free from data races caused by aliasing of memory.
However hard it might seem, the borrow checker is really right, and once you internalize its ways, data-race-prone code written in any language will jump out at you, even without actually running the code.
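A minimal sketch of the aliasing rule the borrow checker enforces: while a shared borrow is live, the borrowed value cannot be mutated. (The example and the `longest` helper are my own illustrations, not from the post.)

```rust
// A function like this only compiles because the lifetime annotation
// proves the returned reference cannot outlive either input.
fn longest<'a>(a: &'a str, b: &'a str) -> &'a str {
    if a.len() >= b.len() { a } else { b }
}

fn main() {
    let mut scores = vec![10, 20, 30];
    let first = &scores[0];         // shared borrow of `scores` begins
    // scores.push(40);             // rejected: a push may reallocate the
    //                              // buffer and leave `first` dangling
    println!("first = {}", first);  // last use: the shared borrow ends here
    scores.push(40);                // fine now -- no other borrows remain
    assert_eq!(scores, vec![10, 20, 30, 40]);
    assert_eq!(longest("hi", "there"), "there");
}
```

Uncommenting the marked `push` makes the compiler reject the program at the exact point where mutation would alias a live borrow.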
Suppose we have a Perl module that allows us to query a set of business objects (for example, purchases made on our website). The API of the module looks something like:
```perl
my $collection = Purchase::Collection->new();
my @purchases_2018 = $collection
    ->date_after('2017-12-31')
    ->date_before('2019-01-01')
    ->region(['EU', 'US'])
    ->revenue_collected(1)
    ->with_amount({ '>', 1000 })
    ->fetch();
```
So all the selector methods like date_after or revenue_collected update internal filter state of the collection object, and all these filters are applied only at the end with the fetch().
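The same deferred-filter builder pattern can be sketched in Rust — selector methods only record predicates, and nothing is evaluated until `fetch`. (The `Purchase`/`Collection` types and method names here are hypothetical stand-ins for the Perl module, not its actual API.)

```rust
#[derive(Debug, Clone, PartialEq)]
struct Purchase {
    date: String,   // ISO-8601 dates compare correctly as strings
    region: String,
    amount: u32,
}

struct Collection {
    items: Vec<Purchase>,
    filters: Vec<Box<dyn Fn(&Purchase) -> bool>>, // accumulated, not yet applied
}

impl Collection {
    fn new(items: Vec<Purchase>) -> Self {
        Collection { items, filters: Vec::new() }
    }

    // Each selector records a closure and returns self for chaining.
    fn date_after(mut self, d: &str) -> Self {
        let d = d.to_string();
        self.filters.push(Box::new(move |p| p.date > d));
        self
    }

    fn region(mut self, r: &str) -> Self {
        let r = r.to_string();
        self.filters.push(Box::new(move |p| p.region == r));
        self
    }

    fn with_amount_over(mut self, min: u32) -> Self {
        self.filters.push(Box::new(move |p| p.amount > min));
        self
    }

    // Only here are all accumulated filters actually applied.
    fn fetch(&self) -> Vec<Purchase> {
        self.items
            .iter()
            .filter(|p| self.filters.iter().all(|f| f(p)))
            .cloned()
            .collect()
    }
}
```

The chained call `c.date_after("2017-12-31").region("EU").with_amount_over(1000).fetch()` then evaluates every predicate in one pass.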
Files already have a lot of user-accessible metadata associated with them – the last time of modification, access control bits, etc.
Extended File Attributes (xattr) are a mechanism to store extra metadata on files, in the filesystem. This metadata takes the form of key:value pairs, with certain (platform-dependent!) size restrictions. As of now, xattr is not a POSIX standard, but is independently supported by many modern filesystems, like ext2, ext3, ext4, XFS, ReiserFS, etc.
The HAMCYCLE problem
If a graph $G\left(V, E\right)$ contains a Hamiltonian cycle, we can pick a subset $E' \subseteq E$ of edges in the graph such that:
- All vertices in $G$ appear in the resulting subgraph, and
- starting at any vertex, we are able to visit each vertex exactly once, except for the starting vertex, which is also the last vertex in the cycle.
The Hamiltonian cycle problem asks if a graph $G$ contains a Hamiltonian cycle – i.e., whether such a subset $E'$ exists.
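Although finding a Hamiltonian cycle is hard, a candidate cycle serves as a certificate that can be checked in polynomial time. A minimal Rust sketch of that verifier (the graph and cycle representations are my own choices):

```rust
use std::collections::HashSet;

// Polynomial-time verifier for HAMCYCLE on an undirected graph with
// vertices 0..n: check that `cycle` visits every vertex exactly once
// and that consecutive vertices (including last -> first) share an edge.
fn is_hamiltonian_cycle(n: usize, edges: &HashSet<(usize, usize)>, cycle: &[usize]) -> bool {
    if cycle.len() != n {
        return false; // wrong number of vertices
    }
    let distinct: HashSet<usize> = cycle.iter().copied().collect();
    if distinct.len() != n || distinct.iter().any(|&v| v >= n) {
        return false; // a repeated or out-of-range vertex
    }
    (0..n).all(|i| {
        let (u, v) = (cycle[i], cycle[(i + 1) % n]); // wrap around at the end
        edges.contains(&(u, v)) || edges.contains(&(v, u)) // undirected
    })
}
```

This check is what places HAMCYCLE in NP: the certificate is short and verifiable in time polynomial in the size of the graph.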
Lockfiles are commonly used for process-level mutual exclusion. For example, a cronjob processing hourly logs can hold a lock so that, in the event it ends up taking more time than an hour, the next hourly job does not clobber the working directory. Databases like Postgres also use lockfiles in their data directories to ensure at most one serving process is handling the data.
On Unix, a very simple way of doing this is to open a file at the desired path with O_CREAT and O_EXCL specified (typically alongside O_RDWR); O_EXCL makes the open fail atomically if the file already exists:
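A small sketch of this idiom using Rust's standard library, where `OpenOptions::create_new(true)` maps to O_CREAT | O_EXCL on Unix (the `try_lock` helper and path are my own illustration):

```rust
use std::fs::OpenOptions;
use std::io::ErrorKind;
use std::path::Path;

// Try to acquire a lockfile. The create_new(true) open is atomic:
// exactly one process can create the file; everyone else sees
// AlreadyExists. The lock is released by unlinking the file.
fn try_lock(path: &Path) -> std::io::Result<bool> {
    match OpenOptions::new().write(true).create_new(true).open(path) {
        Ok(_file) => Ok(true), // we created the file, so we hold the lock
        Err(e) if e.kind() == ErrorKind::AlreadyExists => Ok(false), // held by someone else
        Err(e) => Err(e), // genuine I/O error
    }
}
```

Because existence of the file is the lock, a crashed holder leaves a stale lockfile behind — which is why tools like Postgres also record the holder's PID inside the file to detect staleness.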
I decided to read interesting deep learning papers often and try to summarize them to aid my own understanding of the topics.
This paper, ImageNet Classification with Deep Convolutional Neural Networks, demonstrates a record-breaking result on the ImageNet LSVRC-2012 competition. The authors participated in the competition under the name SuperVision (which is extremely difficult to Google, especially in the “deep learning” context, where Google surfaces supervised-learning results instead).