In 1999, I made the "iPic", which became known as the "World's Smallest Web-Server".
It implemented a complete TCP/IP stack and an HTTP server in just 256 assembly instructions on a 6-pin, 8-bit microcontroller.
The implementation demonstrates the power of De-Layering.
Unlimited TCP connections were squeezed out of 5 bytes of RAM by re-purposing network latency as protocol state memory. ("Delay Line Memory rises like the Phoenix"?). Supposedly "non-information-bearing junk" in protocol headers was commandeered to calibrate a lousy, drifty RC clock, conjuring 4% stability out of thin air. (Hmm... a "Software Defined PLL over HTTP"?). The wasteful bureaucracy that is an OS was optimised away by reifying shared entropy across the layers, through formal refinements and cross-layer code transformation.
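The first trick, keeping per-connection state in the network rather than in RAM, can be sketched in miniature. This is my own illustration in the spirit of SYN cookies, not the iPic assembly: the server folds everything it needs to remember into the 32-bit sequence number it transmits, and recovers it when the peer's ACK echoes that number back. All names here (`SECRET`, `cookie`, `syn`, `ack_valid`) are hypothetical.

```python
# Sketch only: stateless handshake state, SYN-cookie style.
# The server holds NO per-connection memory; the connection's identity
# travels inside the sequence number and comes back in ack = seq + 1.
import hashlib
import time

SECRET = b"demo-secret"  # illustrative secret, not from the original


def cookie(src: str, client_isn: int, t: int) -> int:
    """Fold connection identity + coarse time into a 32-bit sequence number."""
    h = hashlib.sha256(
        SECRET + src.encode() + client_isn.to_bytes(4, "big") + t.to_bytes(8, "big")
    ).digest()
    # high byte: coarse timestamp; low 24 bits: keyed hash of the connection
    return ((t & 0xFF) << 24) | int.from_bytes(h[:3], "big")


def syn(src: str, client_isn: int) -> int:
    """Answer a SYN: encode all state into our ISN, keep nothing locally."""
    return cookie(src, client_isn, int(time.time()) >> 6)  # 64 s time slots


def ack_valid(src: str, client_isn: int, ack: int) -> bool:
    """On the final ACK, reconstruct the state from ack - 1 and verify it."""
    seq = (ack - 1) & 0xFFFFFFFF
    t_now = int(time.time()) >> 6
    # allow one time slot of drift between SYN and ACK
    return any(cookie(src, client_isn, t) == seq for t in (t_now, t_now - 1))
```

A genuine SYN-ACK round trip validates, while a forged or mismatched ACK does not; the "memory" for the connection lives in flight between the peers, which is the delay-line analogy in the text.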
De-Layering can squeeze extreme efficiency out of implementations without compromising formal correctness and maintainability, and without sacrificing modularity or software engineering principles.
The same techniques can also be applied at the high-performance end of the systems-software spectrum. This has led to high-performance stream processing engines and ML algorithms on FPGAs. Applied to enterprise software, De-Layering has boosted storage compression and ad-hoc query performance by one to two orders of magnitude.
Source-code modules necessarily mirror the social and corporate structures that create them (Conway's law). But should post-optimization artifacts needlessly continue to carry that burden in their decomposition?
In the evolution of man-made systems, must ontogeny still recapitulate phylogeny?