Compilers are the New Frameworks
The bytes that get shipped to browsers will bear less and less resemblance to the source code that web developers write.
In the same way that a compiled Android binary bears little resemblance to the original Java source code, the assets we serve to users will be the aggressively-optimized output of sophisticated build tools. The trend started by minifiers like UglifyJS and continued by transpilers like Babel will only accelerate.
This is a loss in some ways (who else got their web development start with View Source?) but is a huge win for users, particularly in emerging markets.
But that’s not to suggest that the task is as simple as just porting good ideas to web APIs. The constraints are very different.
Native code tends to have the luxury of not really caring about file size—a "small" 40MB iOS app would get you laughed out of the room on the web. And AAA game titles accept minutes-long load times in exchange for consistent 60fps performance, but I shudder to think what a 30-second load time would do to the conversion rate of your e-commerce site, 60fps or not.
Our job now is figuring out how to adapt the ideas of high-performance native code while preserving what makes the web great: URLs, instant loading, and a security model that allows us to forget that we run thousands and thousands of untrusted scripts every day.
So here’s my advice for anyone who wants to make a dent in the future of web development: time to learn how compilers work.
When we’re trying to speed up some part of our code, we want quick, targeted feedback about how our changes perform against the initial implementation. It’s common practice to write a microbenchmark: a small program that runs just the code you’re interested in and measures how well it performs. But be warned: microbenchmarks are fraught with peril, even for experts.