A few things I’d like to see in the near future:
– the ability to mark a link as spam in a search engine so that I won’t ever get a result from that page again;
– the ability to put specific images into the browser cache and be able to choose what is deleted when emptying the cache;
– finally a version of IE that handles CSS the right way (almost sounds like a utopian vision);
– the standardization of the WHATWG specifications;
– the merge of DomainKeys with SPF and the resulting protocol’s propagation;
– the ZRTP protocol becoming a public standard (wait, the public standard was presented at the IETF conference in Seoul only a few days ago… cool!) and its integration into Skype (still have to wait for that one to happen though);
– IPTV finally having a breakthrough;
– VoD in Luxembourg now that digital TV is coming;
– affordable HDTVs;
– a real alternative to Google;
– QuickTime transcoding movies as fast as ffmpegX does, otherwise there's just no sense in buying the Pro version;
– no more party.lu, hot.lu, weekend.lu or anything alike (and no, they are not funny anymore, they’re just stupid);
– an ISP like this in Luxembourg;
– the video store in iTunes becoming available outside of the US (particularly here in Luxembourg, where Apple after all has its European registered office).
Please feel free to add other things you want to have in the comments. :)
I think it’s possible to solve no. 1 fairly simply with greasemonkey. Although search engines are good enough that if you see a spam site among the results you know you wouldn’t find what you’re looking for anyway.
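A user script for this could look roughly like the sketch below. Everything in it is an assumption for illustration: the blacklist entries are placeholders, and the way results are found on the page (just scanning all links) is a guess, since search engines change their markup often.

```javascript
// ==UserScript==
// @name        Hide blacklisted search results (sketch)
// @include     http://www.google.*/search*
// ==/UserScript==

// Hypothetical personal blacklist -- these domains are placeholders.
var blacklist = ['spam.example', 'link-farm.example'];

// Returns true when the URL's host equals, or is a subdomain of,
// one of the blacklisted domains.
function isBlacklisted(url, list) {
  var host = url.replace(/^\w+:\/\//, '').split('/')[0];
  for (var i = 0; i < list.length; i++) {
    if (host === list[i] ||
        host.indexOf('.' + list[i]) === host.length - list[i].length - 1) {
      return true;
    }
  }
  return false;
}

// When running inside the browser, hide every link that points at a
// blacklisted domain (a crude approximation of "never show me that page").
if (typeof document !== 'undefined') {
  var links = document.getElementsByTagName('a');
  for (var i = 0; i < links.length; i++) {
    if (isBlacklisted(links[i].href, blacklist)) {
      links[i].style.display = 'none';
    }
  }
}
```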
I hadn't heard of Greasemonkey before, but it sounds like quite an interesting extension; I'm surely going to test it. Thanks for the tip!
Yes, it is fairly easy to recognize spam sites, but it would still be nice if the search engine offered you the opportunity to block them (and now that everyone's so crazy about AJAX, that would be something which could change my opinion about Web 2.0). I know I'm kind of obsessed with clean and nice websites. :)
What I meant is that spam sites are generally not considered relevant by search engines. If you search for something and get spam sites in the first few results it’s very likely that you won’t come up with anything remarkable using that particular search key, even if you browse dozens of result pages. (In other words: the spam problem ought to be solved by search engines getting smarter, not by you saying “this is spam”).
p.s. If you didn’t know about greasemonkey, chances are you don’t know about bookmarklets. Try them, especially the “zap” ones, they’re really good at fixing some of the most annoying website design flaws.
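For anyone else who hasn't seen one: a bookmarklet is nothing more than a snippet of JavaScript stored as a bookmark URL. The example below is a simplified sketch in the spirit of the "zap colors" idea (not the original code): it forces every element back to black text on a white background.

```javascript
// A bookmarklet is an ordinary bookmark whose URL starts with "javascript:".
// This sketch imitates the idea behind "zap colors": reset every element
// to black-on-white so hard-to-read pages become legible again.
var zapColors =
  'javascript:(function(){' +
  'var e=document.getElementsByTagName("*");' +
  'for(var i=0;i<e.length;i++){' +
  'e[i].style.color="black";' +
  'e[i].style.background="white";' +
  '}})();';
```

Saved as a bookmark, clicking it runs the wrapped function on whatever page you are currently viewing.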
What I'd like to see is some functionality of dmoz.org integrated into the big search engines such as Google or Yahoo!. I perfectly understand that it is impossible to check millions of websites and that by human control alone you'll never get all the webpages into a search engine, but a combination would be great, i.e. bots browsing the web, an algorithm checking on what's spam and what's not, and finally users having the possibility of refining the lists (the search engines' algorithms might improve, but the spammers' techniques unfortunately do too…). In my opinion, fighting spam ought to take place at several levels and shouldn't be limited to one single method.
No, I didn’t know about bookmarklets yet either, but it seems as if they are going to improve my surfing experience a lot. Thanks!