Public key infrastructure and digital certificates


In this post, I will talk about public key cryptography: how encryption and decryption happen in asymmetric key cryptography, and how digital signatures and certificates are useful. Public key cryptography Also known as asymmetric key cryptography: asymmetric because there are two keys, a public key and a private key. The public key is available to anyone, but the private key remains with its owner. The key pair can be used to encrypt/decrypt data or to verify the authenticity of data (digital signatures).…
Read more ⟶
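As a taste of the asymmetric idea in the excerpt above, here is a toy RSA sketch in Python with textbook-sized primes. The numbers are illustrative assumptions (the classic 61/53 example), the key sizes are hopelessly insecure, and real code would use a vetted library; the point is only that one key undoes the other, in both directions.

```python
# Toy RSA with tiny textbook primes -- illustrative only, never use in practice.
p, q = 61, 53
n = p * q                    # modulus, shared by both keys
phi = (p - 1) * (q - 1)
e = 17                       # public exponent, coprime with phi
d = pow(e, -1, phi)          # private exponent: modular inverse of e (Python 3.8+)

# Encrypt with the public key (e, n); decrypt with the private key (d, n).
message = 42
ciphertext = pow(message, e, n)
decrypted = pow(ciphertext, d, n)
assert decrypted == message

# A signature is the reverse direction: sign with the private key,
# verify with the public key. (Real schemes sign a hash with padding;
# `digest` here is just a stand-in number.)
digest = 99
signature = pow(digest, d, n)
assert pow(signature, e, n) == digest
```

The same modular-exponentiation operation serves both purposes; only which key stays secret differs, which is exactly why the private key can both decrypt and sign.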

What is Wi-Fi anyway?


You might ask what I even mean by ‘What is Wi-Fi?’ :confused: after all, I connect my device to the Wi-Fi network and boom, I am accessing the internet. Well, I held on to this abstraction for quite some time, and the time had come to delve into a lower one, because why not? There is no harm in knowing some background details. For the sake of not losing my mind, I won’t dive into the super low-level details, nor have I looked into them :smiley:.…
Read more ⟶

HyperLogLog


Table of contents: Streaming and sketching algorithms; Count Distinct problem; HyperLogLog (HLL); HLL improvements; Demo. Streaming and sketching algorithms Algorithms for data streams that examine each element only once or a few times and have low memory requirements. Storing the individual elements is not possible for data streams, as in most cases the size is enormous. We try to get approximate or “good enough” estimates while storing only a “sketch” of the data (i.e., some compact representation) rather than the data itself.…
Read more ⟶
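To make the “sketch, not the data” idea concrete, here is a minimal HyperLogLog-style estimator in Python. It is a simplified sketch under my own assumptions: the parameter names are mine, SHA-1 stands in for the hash function, and the small- and large-range corrections from the original algorithm are omitted, so it is only reasonable for counts well above the register count.

```python
import hashlib

def hll_estimate(items, p=10):
    """Minimal HyperLogLog sketch: p index bits -> m = 2**p registers."""
    m = 1 << p
    registers = [0] * m
    for item in items:
        # 64-bit hash of the item (SHA-1 truncated, as a stand-in hash).
        x = int.from_bytes(hashlib.sha1(str(item).encode()).digest()[:8], "big")
        idx = x >> (64 - p)                # first p bits pick a register
        rest = x & ((1 << (64 - p)) - 1)   # remaining 64 - p bits
        # Rank = position of the leftmost 1-bit in the remaining bits.
        rank = (64 - p) - rest.bit_length() + 1
        registers[idx] = max(registers[idx], rank)
    alpha = 0.7213 / (1 + 1.079 / m)       # bias correction constant for m >= 128
    return alpha * m * m / sum(2.0 ** -r for r in registers)

# 10,000 distinct items in ~1 KB of registers instead of storing them all.
est = hll_estimate(range(10000))
```

With m = 1024 registers the standard error is about 1.04/√m ≈ 3%, so the estimate lands close to 10,000 while the memory stays fixed regardless of how many elements stream through.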

Spark aggregation with native APIs


Table of contents: Spark aggregation overview; TypedImperativeAggregate[T] abstract class; Example. Spark aggregation overview User Defined Aggregate Functions can be used, but they are restrictive and require workarounds even for basic requirements. Aggregates are unevaluable expressions and cannot have eval or doGenCode methods. A basic requirement would be to use user-defined Java objects as the internal Spark aggregation buffer type, and to pass extra arguments to aggregates, e.g. aggregate(col, 0.24). Spark provides the TypedImperativeAggregate[T] contract for such requirements (imperative as in expressed in terms of imperative initialize, update, and merge methods).…
Read more ⟶
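To get a feel for that create/update/merge/eval lifecycle without pulling in Spark itself, here is a plain-Python analogy of the contract's shape. The class and method names loosely mirror the Scala ones but are illustrative assumptions, not Spark's actual API; the buffer is an arbitrary user-defined object, which is the freedom the excerpt is after.

```python
class AverageBuffer:
    """User-defined object serving as the aggregation buffer."""
    def __init__(self):
        self.total = 0.0
        self.count = 0

class TypedAverage:
    """Python analogy of an imperative typed aggregate (not Spark's API)."""

    def create_aggregation_buffer(self):
        # Fresh buffer per partition, like createAggregationBuffer.
        return AverageBuffer()

    def update(self, buf, value):
        # Fold one input row into the partition's buffer.
        buf.total += value
        buf.count += 1
        return buf

    def merge(self, buf, other):
        # Combine partial buffers produced by different partitions.
        buf.total += other.total
        buf.count += other.count
        return buf

    def eval(self, buf):
        # Produce the final result from the merged buffer.
        return buf.total / buf.count if buf.count else None

# Simulate two partitions aggregated independently, then merged.
agg = TypedAverage()
b1 = agg.create_aggregation_buffer()
for v in [1.0, 2.0, 3.0]:
    agg.update(b1, v)
b2 = agg.create_aggregation_buffer()
for v in [4.0, 5.0]:
    agg.update(b2, v)
result = agg.eval(agg.merge(b1, b2))
```

In real Spark the contract additionally requires serialize/deserialize so buffers can cross executor boundaries, but the shape above is the core of why a typed imperative aggregate can carry an arbitrary object where a UDAF cannot.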