FOAM TOTEM


"Do Not Track" is Mostly Useless

There's a push for web analytics services to respect a "Do Not Track" option which would disable the use of cookies (or whatnot) for collecting and collating information about you across websites. This initiative has come up again (I think) because of Google's recent privacy policy change, which treats everything owned by Google as a single "site," hence allowing them to aggregate tracking across all of their properties. (To be honest, I was surprised that they weren't already.) But that's just a tangent.

The way it works is that you tell your web browser that you don't want to be tracked. It then tells every web server it contacts that you don't want to be tracked. Then the web server respects this and doesn't do any tracking or logging or whatever for you. This won't work, because it's entirely voluntary. Even if it were somehow legislated via the FTC, I doubt that one would even be able to detect being tracked against one's wishes.
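The whole mechanism boils down to a single HTTP request header: the browser sends "DNT: 1" with every request, and it's entirely up to the server to notice and honor it. Here's a minimal sketch in Python of what the honor-system check looks like on the server side (the function name and example headers are my own invention, purely for illustration):

```python
# Sketch: how a server-side analytics script *might* honor the
# "DNT" request header. Nothing forces a real server to do this --
# that's exactly the problem.

def should_track(headers):
    """Return False when the client sent 'DNT: 1'."""
    # HTTP header names are case-insensitive, so normalize first.
    normalized = {k.lower(): v for k, v in headers.items()}
    return normalized.get("dnt") != "1"

# Example request headers, as a browser with DNT enabled would send them:
request_headers = {
    "Host": "example.com",
    "User-Agent": "ExampleBrowser/1.0",
    "DNT": "1",
}

print(should_track(request_headers))  # False -- the server *should* skip tracking
```

Note that the server that ignores the header behaves identically, from the browser's point of view, to one that respects it. That's why I doubt anyone could detect a violation.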

Another problem is that it's not clear (at least to me when reading through the materials) exactly what is disallowed. In some places in the various write-ups, it seems like cross-site tracking is disallowed. In others, it seems to extend even to cross-session tracking (meaning it doesn't aggregate this visit to a site with what you did the last time you visited the same site).

Still, it's a nice idea, and I'm sure the big companies will respect it. But it won't really stop tracking if there's money to be gained by doing it. The only way to completely stop something like this on the internet is to make it technologically impossible. And that will be very, very difficult, as the existence of zombie cookies proves.

As people live more of their lives online, their privacy is eroding. This isn't necessarily bad. The biggest problem I see with this is knowing when you're leaking information. When you buy things with your credit card, it's probably not a surprise to you that the credit card company knows every place you've bought something, though they don't know exactly what you bought.

(Though even data collection by a single company can be surprising. For example, see the interesting tale of Target inferring a customer's pregnancy from their purchases. They can even estimate the due date.)

Online, though, things we naturally feel are separate really aren't. Tracking your behavior across websites, for example. You feel like you're leaving one site and going to another, but it's not like the real world where you can only be in one place at a time. I think many people don't yet understand how different the real world is from the digital world. All your common-sense rules are broken. (Just look at how the movie and video industries are thrashing around as they refuse to learn this.)

Every single web site you visited today has your IP address. Your internet provider may have logged every single IP address your computer communicated with (including that clandestine visit to howtoharvestorgans.com). It's in your best interest to know who has access to this information. I don't; do you? Does your provider's terms of service (which I know I didn't read) protect this information from being harvested and used? Would you be shocked to get a coupon in the mail from "The Organarium" a couple of days after visiting a competitor's site?

The debate over our online/digital rights has been going on for a while. I expect that it will only grow more contentious as time goes on. Nothing is really decided yet and I predict some knock-down drag-out privacy wars in the coming years.

[Updated to fix a typo and also rescind the statement that credit card companies know what you've bought. At the same time, I also added the story about Target.]

Retro Metro

I find Microsoft's Metro interface oddly retro. They've done everything they can to simplify the look. It's flat and square-cornered. There are no shadows or bevels. Gradients are subtle to the point of non-existence. The icons are one-color stencils. It's an interface based on flat rectangles and typography. It reminds me of the text-mode UIs of the 80s and 90s.

As a tactile interface that one would touch, it makes a lot of sense. It makes each tile clear, the colors are inviting, and the icons are big enough to be unambiguous. The simple shapes and structure mean that it's not difficult to draw, which is a boon to telephones and tablets with limited battery life.

The Metro Start Screen

For a desktop interface, though, there seem to be many places it goes wrong.

First off, every Metro screen I've seen is a rainbow of various colored tiles. As far as I can tell, these colors have no structure and don't mean anything. I find that they keep my eye from being able to settle on any one thing. Worse, many tiles are constantly scrolling or fading in new information. For example, the mail tile shows the most recent message and a twitter tile constantly updates with new tweets. I fear that it will be a bit too kinetic.

And why, on a desktop, would I want to look through the lens of a tiny tile when I could flip to the app? I think the idea of desktop widgets largely failed. Windows has had variations on them since Win95, and I don't think anyone has ever really used them on the desktop.

Those who know me will also know that I find the unused space on the screen vexing. I'm not saying that it needs to be packed to the gills or that empty space isn't an important organizing mechanism, but Metro is often simply wasteful. Over a third of the available vertical space in the screenshot above is blank, for example. That seems excessive. The giant space plus title ("Start" above) seems to be a common Metro cliché. And in the examples I've seen, the whole thing is often redundant. Does this screen need "Start" on it? Does the Photos library need "Photos" on it?

Pretty, but lacks utility

I'm also surprised that Microsoft has hidden important things behind gestures. How does one shut down the computer given the Start picture above? Well, you put your mouse in the lower-right corner and a menu pops out. There's no icon or anything to indicate this. Likewise, when you're in an app you put the mouse in the lower-left corner to get back to the Start page. There's nothing wrong with these gestures (in fact, they're great in terms of Fitts's law), but how would anyone ever learn them without being told? I watched several videos of people showing off the consumer preview, and all of them had a moment where the reviewer couldn't figure out how to exit the app they were running and go back to the desktop.

In short, I think Metro looks great and is well-designed for tablets. It's meant for touching. In a touch environment, one needs the empty space for gesturing. But on a desktop, I fear it'll be a mess. Though Windows 8 lets you use non-Metro apps in what looks like a Windows 7 desktop, it seems like they're really pushing everything into Metro. Even the newest Office seems to be going that direction. It's a daring, though I think bad, move.