QUOTE(CharlotteWebb @ Sat 31st July 2010, 1:19pm)
Seems kinda hypocritical for them to even study that sort of thing, considering that the article page itself attempts to load the following items, few of which serve any legitimate purpose.
QUOTE(AdBlockPlus)
That's the modern Web. You can't browse it with a slow connection and an old computer anymore, because every damn commercial website wants to load scores of cross-site scripts. A Windows 98 PC with 64 megabytes of RAM was more than adequate for web browsing a mere 10 years ago. Anyone remember tinyapps.org? In fact, I used to have a Mac Quadra with only 36 megabytes; it ran Netscape 4 successfully, albeit a bit slowly.
Today, it would probably hang the instant it hit the WSJ main page. People turn their noses up at a PC with less than 2 GB of memory, and they have to, because current applications love to waste main memory. And this is considered "progress"...
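You can see the cross-site bloat for yourself. Here's a quick sketch, using only the Python standard library, that counts how many `<script src=...>` tags on a page point at hosts other than the page's own. The sample HTML and the hostnames in it are made up for illustration; for a real page you'd feed in the actual source.

```python
# Hypothetical sketch: count third-party <script> tags in a page's HTML.
# Uses only the standard library; the sample page below is invented.
from html.parser import HTMLParser
from urllib.parse import urlparse

class ScriptCounter(HTMLParser):
    """Collects the hosts of script tags whose src is not the page's own host."""

    def __init__(self, page_host):
        super().__init__()
        self.page_host = page_host
        self.third_party = []

    def handle_starttag(self, tag, attrs):
        if tag != "script":
            return
        src = dict(attrs).get("src")
        if not src:
            return  # inline script, not cross-site
        host = urlparse(src).netloc
        if host and host != self.page_host:
            self.third_party.append(host)

# Made-up example page: one first-party script, two cross-site ones.
sample = """
<html><head>
<script src="https://example.com/app.js"></script>
<script src="https://ads.tracker.net/t.js"></script>
<script src="https://cdn.widgets.io/w.js"></script>
</head></html>
"""

counter = ScriptCounter("example.com")
counter.feed(sample)
print(counter.third_party)  # the two cross-site hosts in this made-up page
```

Run that against the source of any big commercial site's front page and the list gets long fast.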
Wikipedia doesn't use a lot of tracking cookies and JS (it didn't use any at all until the new "scheme" went live about 3 months ago). But I would not be surprised if they add more and more scripts in the future.
That's what programmers do in this hellish age of fatware: they look for ways to waste computing resources.