Someone has probably thought about this before; if so, why don't browsers have it already? ;-)
If we could use file hashes, the way DAOS does, to find out whether we have downloaded the exact same file before, and the resource tag could carry a file hash set by the web server, we could get a much faster web.
For example:
<link href="http://Dominoserver/database.nsf/mycss.css" rel="stylesheet" type="text/css" filehash="78AD782EA9292">
If the web browser finds that file hash in its cache, the same resource could be reused regardless of which server it originally came from. There are probably some security aspects to this, of course, but if it worked the web would be much faster.
We could even drop most of the caching logic in browsers, because content without a filehash would simply use the old way of caching. But when a filehash is present, the browser could always tell whether the same content already exists locally: as soon as you change the content, your users download the new version, and they never need to download content/resources they already have locally.
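To make the idea concrete, here is a minimal sketch of how such a lookup could work, assuming a hypothetical filehash attribute and a browser cache keyed by hash instead of URL. Nothing like this exists in today's browsers; the names and logic are illustrative only.

// Hypothetical: a cache keyed by content hash instead of URL,
// so identical content is reused no matter which server served it first.
const hashCache = new Map<string, string>(); // filehash -> cached body

async function loadResource(url: string, filehash?: string): Promise<string> {
  // If the page declares a filehash we already have, skip the network entirely.
  if (filehash && hashCache.has(filehash)) {
    return hashCache.get(filehash)!;
  }

  // Otherwise fall back to a normal download.
  const response = await fetch(url);
  const body = await response.text();

  // Remember the body under its hash so any later page, on any server,
  // that references the same filehash never downloads it again.
  if (filehash) {
    hashCache.set(filehash, body);
  }
  return body;
}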
This is just an idea, nothing that today's web browsers or web servers support, but perhaps in the future.
Update on post
There is an HTTP response header called ETag that is part of the HTTP 1.1 spec; it adds a checksum-like validator for the content. Great news, I didn't know about it, but we still have the problem that we only want to load the same content once, regardless of which server we loaded the resource from, to get maximum utilization of the cache.
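For comparison, here is a minimal sketch of how ETag revalidation works today (the function name and handling are illustrative, not a real browser implementation): the client sends the cached ETag back in an If-None-Match header, and the server answers 304 Not Modified if the content is unchanged. Note that this check is tied to a single URL, which is exactly why it does not give the cross-server reuse described above.

// Sketch of ETag revalidation as it exists in HTTP 1.1 today.
async function revalidate(url: string, cachedEtag: string, cachedBody: string): Promise<string> {
  const response = await fetch(url, {
    headers: { "If-None-Match": cachedEtag },
  });

  // 304 Not Modified: the server confirms our cached copy is still valid,
  // so no body is transferred again.
  if (response.status === 304) {
    return cachedBody;
  }

  // The content changed; use the fresh body (and remember its new ETag).
  return response.text();
}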