It's good because it tells people, again, what they should be practising but so many aren't: optimise your assets. However, it's not great because it's the same conversation we've been having for over a decade now. The problems of the turn of the century, when people would "go crazy" and design for 56k modem users, leaving those of us on 28.8k connections languishing through ever-longer download times, are the problems of now...but on a more global scale.
I had high hopes...
A visitor to your website might be using a high-PPI tablet or phone from the comfort of her couch, or from the middle of the Arizona desert. Likewise, those brand-new Retina Macbook Pros could be connected to the internet via Google Fiber, or tethered to a 3G hotspot in an airport. We must be careful about our assumptions regarding pixels and bandwidth.
...but ultimately this avenue simply wasn't pursued. And thus we have the problem with discussion of proper responsive web design today: it focuses *only* on the front end development team.
I love front end development; the creativity and freedom of expression are fantastic...but I also love that when there is a problem front end development simply can't address, I can fall back on my server side knowledge.
So when I see this...
Other attempts exist, such as bandwidth detection, cookie setting, server-side detection, or a mixture of all three. As much as I’d like robots to solve my problems, these solutions have a higher barrier to entry for your average web developer. The major pain point with all of them is that they introduce server/cookie dependencies, which have been historically troublesome.
We need a purely front-end solution to high resolution images.
...I just have to groan in despair. (emphasis is mine)
You see, the problem of serving the right image size for the right device is indeed a front end problem. You measure the size of the screen you've got, you take the resolution into account, and you put the right image in to keep everything looking pretty and engaging.
Bandwidth though, bandwidth... that is a server side problem (for now), and one we have ignored for too long. The idea that you can have a purely front-end solution to the paradoxes of bandwidth management is unrealistic right now, and may always be. I guess that's why the rest of the article ignored this issue.
Take, for example, your "average web developer", who is on holiday in the moors. They have, at best, a 2G/EDGE connection over the mobile network, but their company has just called them to sort out a critical problem with a website.
They pull out the (hypothetical, but soon to exist) large-screen laptop they develop on, with its 200DPI display, knowing this kind of call was possible. The purely front-end-driven website loads some catastrophically large images that take ages, and cost a fortune, to download and assess.
On the flip side, you have someone with an old iPhone, sitting at home on their WiFi, with bandwidth exceeding anything you'd need to stream your favourite episode of Breaking Bad without a stutter, and they are being served heavily optimised images just in case they have little network capacity.
Each person receives the opposite of the experience they deserve, all because our purely front-end approach doesn't take into account the main factor we're trying to be responsible about.
Browsers need to take more responsibility. Our only communication with the user's situation is through the browser. We are sitting in a room with an earpiece on that is connected to a single person. That person can talk to anyone they want, but we can only listen to, and speak to, them.
Browsers need to start taking information from the user's machine, and from their browsing itself, to generate reliable, medium-term data on that person's bandwidth. Access to the device's inner workings, complemented by short-term calculations of something as simple as a running average download speed across all sites, would provide our websites with the information they need to do right by the user.
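To make the idea concrete, here is a minimal sketch of such a rolling estimate. The `BandwidthEstimator` class and its update rule are my own illustrative assumptions about how a browser might do this internally, not any real browser API:

```python
# Hypothetical sketch: how a browser might keep a running estimate of
# download speed across page loads. An exponentially weighted moving
# average means recent downloads matter more, but a single spike can't
# swing the whole estimate.

class BandwidthEstimator:
    def __init__(self, alpha=0.3):
        self.alpha = alpha          # weight given to the newest sample
        self.estimate_kbps = None   # KB/s; None until the first sample

    def record_download(self, size_kb, seconds):
        """Fold one completed download into the running estimate."""
        sample = size_kb / seconds
        if self.estimate_kbps is None:
            self.estimate_kbps = sample
        else:
            self.estimate_kbps = (self.alpha * sample
                                  + (1 - self.alpha) * self.estimate_kbps)
        return self.estimate_kbps
```

A 500KB download in 10 seconds would seed the estimate at 50KB/s; a faster follow-up download nudges the figure upwards without replacing it wholesale.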
This could be as simple as sending that information along with every standard HTTP request. A numeric value for the current likely download speed would let server-side developers take that value and guide what is served to the user, by managing which CSS files are delivered, or more directly through a server-side image-processing solution such as Adaptive Images.
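On the server, acting on such a value could be as simple as a threshold lookup. A sketch, assuming an imagined `X-Bandwidth` request header carrying the estimate in KB/s (the header name, thresholds, and filenames are all hypothetical):

```python
# Hypothetical server-side selection: read a download-speed estimate
# (KB/s) from an imagined "X-Bandwidth" request header and pick an
# image variant. No such header exists today; this is an assumption.

VARIANTS = [
    (400, "large-2.jpg"),   # >= 400 KB/s: high-resolution
    (100, "large-1.jpg"),   # >= 100 KB/s: standard
    (50,  "med-1.jpg"),     # >= 50 KB/s: reduced
]

def choose_image(headers):
    try:
        speed = float(headers.get("X-Bandwidth", ""))
    except ValueError:
        return "small-1.jpg"   # no usable estimate: serve the safe default
    for threshold, filename in VARIANTS:
        if speed >= threshold:
            return filename
    return "small-1.jpg"
```

Crucially, a missing or malformed value falls back to the smallest image, so the worst case is the conservative one.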
But it's not unreasonable to think that the browser developers themselves could go one step further. If we are to implement the srcset or picture aspects of HTML5, why not future-proof the web and consider that a value for bandwidth might also be included?
If a media-query-style selector were available for bandwidth, such as "min-bandwidth" and "max-bandwidth" (while dealing with the potential problem of that bandwidth value changing, and thus changing content), we could add better granularity to our front-end image requests.
Looking at the Apple-pushed srcset attribute, and using a value that represents the current average download speed in KB/s, we could see this (taken from the A List Apart article's example and modified)...
<img alt="Cat Dancing" src="small-1.jpg"
     srcset="small-2.jpg 2x 50b,    // this is pretty cool
             large-1.jpg 100w 100b, // meh
             large-2.jpg 100w 2x 400b"> // meh@2x
You'll see that I'm following the same nonsensical syntax that has been proposed, by saying that "b" represents the bandwidth availability of the client, just as "w" represents the pixel width.
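To show the proposed grammar is at least mechanically parseable, here is a sketch of a parser for it. The "b" descriptor is this article's invention, not part of any spec, and the candidate structure is my own illustrative choice:

```python
# Sketch of parsing the extended srcset grammar above: "w" is a pixel
# width, "x" a pixel-density multiplier, and the proposed "b" a minimum
# bandwidth in KB/s. "b" is hypothetical; only "w" and "x" were in the
# actual proposals of the time.

def parse_srcset(value):
    candidates = []
    for part in value.split(","):
        tokens = part.split()
        if not tokens:
            continue
        # First token is the URL; the rest are descriptors.
        entry = {"url": tokens[0], "w": None, "x": 1.0, "b": 0.0}
        for desc in tokens[1:]:
            if desc.endswith("w"):
                entry["w"] = int(desc[:-1])
            elif desc.endswith("x"):
                entry["x"] = float(desc[:-1])
            elif desc.endswith("b"):
                entry["b"] = float(desc[:-1])
        candidates.append(entry)
    return candidates
```

Given `"small-2.jpg 2x 50b, large-1.jpg 100w 100b"`, this yields one candidate at density 2 requiring 50KB/s, and one at width 100 requiring 100KB/s.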
If we were to look at the "A List Apart" example, that incorporates the picture element, it may look more like this:
<picture alt="Cat Dancing">
  <source media="(min-width: 45em) and (min-bandwidth: 400kb)" srcset="large-1.jpg 1x, large-2.jpg 2x">
  <source media="(min-width: 45em) and (min-bandwidth: 200kb)" srcset="large-1.jpg 1x">
  <source media="(min-width: 18em) and (min-bandwidth: 100kb)" srcset="med-1.jpg 1x, med-2.jpg 2x">
  <source media="(min-width: 18em) and (min-bandwidth: 50kb)" srcset="med-1.jpg 1x">
  <source media="(min-bandwidth: 50kb)" srcset="small-1.jpg 1x, small-2.jpg 2x">
  <img src="small-1.jpg">
</picture>
This would solve a whole load of problems, aside from being even more verbose than the already-verbose suggestion! For a start, it would begin by looking for a small, generic image that everyone can use. If the screen is a high-resolution display, it will go for the higher-resolution image, but only if the bandwidth measurements say it is responsible to do so.
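The selection the browser would perform over that markup can be sketched as ordinary first-match logic. The matching rules below are my assumption about how a browser might behave; the widths, thresholds, and filenames mirror the example above:

```python
# Illustrative source selection for the <picture> example: walk the
# sources in order and take the first whose min-width and min-bandwidth
# conditions both hold, falling back to the small image. How a real
# browser would resolve this is an assumption here.

SOURCES = [
    # (min-width em, min-bandwidth KB/s, image name, offers a 2x variant)
    (45, 400, "large", True),
    (45, 200, "large", False),
    (18, 100, "med", True),
    (18, 50,  "med", False),
    (0,  50,  "small", True),
]

def pick_source(viewport_em, bandwidth_kbps, pixel_ratio):
    for min_width, min_bw, name, has_2x in SOURCES:
        if viewport_em >= min_width and bandwidth_kbps >= min_bw:
            # Reach for the 2x asset only when the matched source offers
            # one, i.e. when bandwidth cleared the higher threshold.
            suffix = "-2.jpg" if pixel_ratio >= 2 and has_2x else "-1.jpg"
            return name + suffix
    return "small-1.jpg"   # fallback: no estimate, or too slow
```

A wide Retina screen on a fast connection gets `large-2.jpg`; the same screen on a 250KB/s link matches the second source and gets only `large-1.jpg`, which is exactly the responsible degradation the markup encodes.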
The end result is that the developer with the large laptop in the hills gets low-resolution imagery, but gets it fast...while the person at home gets crystal-clear, high-quality imagery, also delivered fast.
By being more holistic in how we make decisions, whether front-end or server-side, we can deliver better experiences. By adding the ability to control content by an average bandwidth value, ideally measured at the start of a user's browsing session, we can control load times. We can ensure that no image we serve is likely to take more than 3 seconds to load, regardless of the quality delivered, because we are deciding on that quality based on the user's bandwidth limitations.
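That 3-second guarantee reduces to simple arithmetic: the largest image we should serve is the measured speed multiplied by the time budget. A trivial helper (names and the 3-second default are my own):

```python
# The time-budget rule as arithmetic: at a measured speed in KB/s, the
# largest image that loads within the budget is speed * budget_seconds.

def max_image_kb(bandwidth_kbps, budget_seconds=3):
    """Largest image size (KB) that should load within the time budget."""
    return bandwidth_kbps * budget_seconds
```

At 50KB/s that caps images at 150KB; at 400KB/s, 1200KB, which is why the same 3-second promise can mean wildly different image quality for different users.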
EDIT: There is a further look at this, and how much broader we can go beyond screen dimensions, on Matthew Palmer's site.