The Transformation Layer

Ramblings about tools and techniques for effective data transformation.
In the past decade most of the heavy lifting in data was abstracted away; what remains is a need for practical data modeling.

Why is data transformation even needed?

Considering the state of modern technology, how is it that we still spend most of our waking hours crunching numbers in spreadsheets, or signing and reviewing piles of forms and protocols?

The rise of data transformation "frameworks"

The rise of Hadoop and Spark brought a mix of configurable, reliable heavy lifting together with a dynamic approach to modeling data in transit.

Over time this layer was abstracted into services specialized in data ingestion, plus a whole platform layer of abstracted cloud resources. That separated the heavy lifting of distributed computing from the computing model used to reason about data, even more so for structured data.

Now we have the flexibility to use compute and storage in a distributed fashion, with the freedom to reach for more succinct interfaces such as SQL, Python, and JavaScript.
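As a toy sketch of what that succinct interface looks like (local and single-machine, not distributed, with an invented `orders` table), the same declarative SQL style that distributed engines expose can be tried with Python's built-in sqlite3 module:

```python
import sqlite3

# An in-memory database stands in for whatever storage backend
# the platform layer would normally abstract away.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("ana", 10.0), ("ana", 5.0), ("bo", 7.5)],
)

# A typical transformation: aggregate raw rows into a modeled summary,
# expressed declaratively rather than as hand-written loops.
rows = conn.execute(
    "SELECT customer, SUM(amount) AS total "
    "FROM orders GROUP BY customer ORDER BY customer"
).fetchall()
print(rows)  # [('ana', 15.0), ('bo', 7.5)]
```

The point is not the engine but the interface: the same few lines of SQL describe the transformation whether they run on a laptop or get pushed down to a distributed query engine.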