@John B
That’s pretty much what I was thinking, but I don’t know very much about CDNs so I wasn’t sure quite how it’d work. For example, let’s assume that someone’s proxy is looking at an AI-enabled site – what size image does the proxy get? If that proxy machine is fed a 301/302 redirect, does it simply cache the redirected image instead of the source URL? If so, then it’s broken and won’t work, in just the same way it’d be broken if you let any intermediate cache store an AI image. I just don’t think there’s a way to make AI work with CDNs/proxies, because one of the design decisions was that the mark-up doesn’t change, which in turn means the request URI doesn’t change, which in turn means any cache that isn’t the end user’s will be wrong most of the time.
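Just to make that concrete, about the only header-level defence I can think of is to tell shared caches not to store the response at all, and leave caching to the end user’s browser. A rough PHP sketch of what I mean (the header values are my guess at something sane, not necessarily what AI actually sends):

```php
<?php
// Rough sketch: the URI is identical for every screen size, so the safest thing
// to tell proxies/CDNs is "don't store this", while still letting the browser cache it.
// These values are illustrative - I don't know exactly what AI sends by default.
header('Cache-Control: private, max-age=86400'); // private = end user's browser cache only
header('Vary: Cookie');                          // the size decision rides on a client-side cookie
// ...then output the appropriately sized image as normal.
```

Of course that only stops shared caches serving the wrong size – it doesn’t make AI play nicely with a CDN, it just takes the CDN out of the picture for those images.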
I’m not a PHP expert by any means (until AI it had been six years since I’d done any PHP), and I don’t understand why images served through it are seen to be ‘slow’. If the image exists in the cache, all PHP is doing is flinging the image out. If the slowness is simply instantiating PHP, then that’s already been done via the .htaccess, and there’s no real penalty for serving an image versus having the PHP issue an HTTP redirect (which the browser then has to deal with). Also, the PHP is required to check that the cached version of the image is newer than the source file – otherwise AI could serve outdated cached images if someone changed the source image after the cache had been generated.
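For what it’s worth, that freshness check is only a couple of filesystem calls, so I can’t see it being the slow part. A minimal sketch of the kind of thing I mean (paths and variable names are made up for illustration – this isn’t the actual AI source):

```php
<?php
// Minimal sketch of the "is the cache still fresh?" check described above.
// $source and $cached are made-up paths, not AI's real variable names.
$source = '/var/www/images/photo.jpg';
$cached = '/var/www/ai-cache/480/photo.jpg';

if (file_exists($cached) && filemtime($cached) >= filemtime($source)) {
    // The cached copy exists and is at least as new as the original: just fling it out.
    header('Content-Type: image/jpeg');
    header('Content-Length: ' . filesize($cached));
    readfile($cached);
    exit;
}

// Otherwise the script would (re)generate the resized image before serving it.
```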
Wouldn’t it be the case that HTTP redirects are going to be considerably slower than simply supplying an image? We’d double the number of HTTP calls, and that means a lot of latency, especially on mobile networks where it’s not so much bandwidth that’s the problem as latency. More HTTP calls would surely be worse? Isn’t that why we package up libraries and use CSS sprites?
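To put a very rough number on that: if a mobile connection has a round-trip time of, say, 300 ms, a 302 means the browser has to complete the first request/response before it can even start the second, so each redirected image pays roughly one extra round trip. Ten images on a page and that’s around three seconds of added latency before a single extra byte of image data has arrived. (Back-of-the-envelope figures, obviously.)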
I think there are a couple of potential options, but without some serious testing it’s hard to know which is better. My gut is telling me that redirects won’t work out too well, but I would love to be better informed.
As a side-thought (and philosophical question): is it really a bad thing to send different sized versions of the same content from one URL? It’s not ideal, but is it that bad? We are sending the same semantic content, after all – it’s not as if we’re sending a semantically different document based on the size of the user’s screen. Likewise, when exactly would someone need or want to save a different sized version of an image than the one they were already looking at? Does it actually matter?