Please Upgrade Your Please Upgrade Page

If you have visited windows.microsoft.com lately using Internet Explorer 7, you will probably see the “It’s time to upgrade your browser” nag, which explains that IE7 and IE6 are no longer supported and blocks you from browsing the site until you upgrade.

This is a great step, as even with XP support going away, Vista shipped with Internet Explorer 7, so it will not be dead for some time. When they first started doing this on the Windows site, I thought it was cool that they were finally doing something to clean up the mess they created with their fragmented browser ecosystem.

However, Internet Explorer 8 is still a pretty bad browser…certainly better than IE7, but that isn’t saying much.

If you are going to break your website to force an upgrade, it would be great to use that as a platform to get users onto the latest version of Internet Explorer they can run. So, if they are on Vista, go ahead and tell them to upgrade to IE9. Better yet, add an optional tool they can use to verify that Automatic Updates is turned on, because if they are running IE7, they may not be getting security updates either. And, if they are on IE8, go ahead and add a nag for that too! Although, I think that one may be trickier, as in corporate environments upgrading past Internet Explorer 8 may not be possible. So, rather than fully breaking the site, a nice warning would probably suffice. It would be a nice kick in the butt for companies that haven’t upgraded yet as well.
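As a rough illustration of the idea (not anything Microsoft actually does), the nag page could use something as simple as a user-agent check to tailor its suggestion; showUpgradeNag is a hypothetical helper:

    // Windows NT 6.0 is Vista and NT 6.1 is Windows 7, both of which can run IE9;
    // Windows XP (NT 5.1) tops out at Internet Explorer 8.
    var ua = navigator.userAgent;

    if (/MSIE [67]\./.test(ua)) {
      if (/Windows NT 6\.[01]/.test(ua)) {
        showUpgradeNag("Internet Explorer 9"); // hypothetical helper that renders the nag
      } else {
        showUpgradeNag("Internet Explorer 8");
      }
    }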

This seems like the right thing to do, especially as dropping support for IE8 has already begun on a number of popular websites. Even Microsoft’s Office 365 has recently announced they are no longer supporting IE8.

While doing a fresh XP install this morning and waiting on a few updates to finish, I fired up IE6 to get a few of the downloads out of the way.

However, MSN.com, the default landing page for a clean XP install, is unusable in Internet Explorer 6. To be sure, I did a reboot too, with the same effect, as I had been installing drivers[1][2].

Unusable to the point that it simply freezes and errors out each time I open it. This isn’t the first time I have run into this, but the furthest I can generally get before IE6 crashes is typing a word or two in the search box.

The work-around is to go into the Control Panel and change the default home page to Google, since even just opening IE6 with MSN.com set as the home page crashes the browser as soon as the page tries to load.

As a web developer, I find IE is often a source of pain. I don’t generally run into layout issues, aside from the occasional bug, certainly not as often as when I first started developing, and much less since the rise of IE7/8. However, it is still quite problematic and ties one’s hands, not to mention the obscure JavaScript issues and limitations, especially when you start working with forms or want to get even a little fancy.

I dropped free IE6 support some time ago too, now providing only IE7/8 legacy (occasionally very minimal) support with new designs, so I am more than ready to let the time-sink that is Internet Explorer 6 fade into obscurity.

So, I do see the humor in writing a post bemoaning the lack of IE6 support for anything.

However, it was Microsoft that saddled us with the abomination that is Internet Explorer 6 and then essentially dropped it on us for 5+ years, halting innovation basically as soon as Netscape (the competition) went belly up. There is no telling the developer hours that move alone cost, not to mention the huge security problems Internet Explorer is responsible for. IE7/8 were fairly bad too; it is just that the horror that was IE6 made them look so much better.

As such, I don’t think it is unreasonable to expect Microsoft to, at minimum, ensure that their flagship pages will load in IE6. Even better, providing a simple landing page to facilitate upgrades could make life so much easier.

A Windows XP install still has at least two more years of official support (long-term support being a selling point of Windows), and being able to do a search or open the browser from a fresh install doesn’t really seem like asking a lot, especially when there is so much opportunity there to get people to upgrade to IE8.

[1] Because I was curious, I checked again after installing Service Pack 3. MSN.com is still unusable; it just redirects to a blank page that appears to be related to a Facebook error, unless you click on something before the redirect, but at least it isn’t crashing anymore.

[2] Don’t get me started on having to install a LAN driver in 2012; it is so fundamental and a basic requirement for doing anything else. I understand it with XP, but even with Windows 7 I sometimes find that I need to dig up a network driver.

Today, Microsoft went on the offensive and called Google out for abusing a well known security hole in Internet Explorer that lets websites set 3rd party cookies, despite Internet Explorer being set to reject 3rd party cookies by default.

Rather than take the opportunity to fix the bug, which has been publicly known for well over two years, affects multiple versions of Internet Explorer, has been promoted as an IE fix on Microsoft’s own support website, and has been abused by a number of large websites, including Google and Facebook, Microsoft instead used it to attack Google and suggest blacklisting Google domains.

The timing of Microsoft’s post about this well-known issue nicely ties into a recently published article, which the MSDN blog links to, describing how Google was bypassing a Safari privacy setting.

Read the MSDN Statement: blogs.msdn.com/b/ie/archive/2012/02/20/google-bypassing-user-privacy-settings.aspx

Abusing a 2+ Year Old IE Bug

The protocol in question is P3P, the Platform for Privacy Preferences Project, which is only really supported in any meaningful way by Internet Explorer. It is basically intended to allow websites to provide a privacy policy with their cookies, the idea being that the privacy policy states how the user’s information will be used and then, depending on the browser’s P3P settings, the cookie is allowed or denied.

In Internet Explorer, even though third-party cookies are blocked by default, a website can get around this by sending an invalid P3P header along with its cookie. As a result, Internet Explorer’s default cookie policy is easily bypassed via P3P.
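To make the mechanics concrete, here is a minimal sketch, written with Node’s built-in http module purely for illustration, of how a third-party site could slip a cookie past IE’s default policy. The header value mimics the kind of self-declared, non-P3P text sites have actually sent; the endpoint itself is hypothetical:

    var http = require('http');

    // Hypothetical third-party endpoint (e.g. a tracking pixel). IE's default
    // setting blocks third-party cookies without a satisfactory P3P compact
    // policy, but a header full of tokens IE does not recognize is treated as
    // acceptable, so the cookie is stored anyway.
    http.createServer(function (req, res) {
      res.writeHead(200, {
        'Content-Type': 'text/plain',
        'P3P': 'CP="This is not a real P3P policy"', // invalid compact policy
        'Set-Cookie': 'tracker=abc123; path=/'
      });
      res.end('tracking response');
    }).listen(8080);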

In addition, the W3C suspended further work on P3P, with the 2006 Platform for Privacy Preferences 1.1 specification appearing to be the last work done on what is a somewhat complicated protocol.

The problems with Internet Explorer’s P3P implementation are well known and were reported by the New York Times in September 2010, which stated that “Large numbers of Web sites, including giants like Facebook, appear to be using a loophole that circumvents I.E.’s ability to block cookies.”

You can read the paper the New York Times Article was based on here: The Misrepresentation of Website Privacy Policies through the Misuse of P3P.

In the above research paper, published in 2010, they state “We discovered that Microsoft’s support website recommends the use of invalid [P3P Cookies] for problems in IE.” They go on to state that the code on Microsoft’s support website was found in about 25% of all invalid cookies they tested.

In this case, Google was taking advantage of (exploiting?) this bug to set a third-party cookie.

Fix the Underlying Problem or Start a Campaign Against Your Competitor’s Services?

Now, anyone who does any work online should know that you can’t trust people to do the right thing. In a perfect world, we would not need anti-virus software, locks on our houses, or to provide driver’s licenses when we withdraw money from our bank accounts.

However, we don’t live in a perfect world and if something can be abused, it will be.

So, the logical solution would be to fix the gaping hole in Internet Explorer that is opened up by P3P, as it is openly being abused by multiple websites and has been for some time.

Even if you completely ignore the 2010 security paper, if Google is abusing it and it is, especially now, a very well-known bug, it is safe to say that A LOT of much shadier websites and businesses are probably abusing it too.

However, Microsoft decided to go another route, suggesting that users blacklist 12 Google domains to keep them from setting these cookies.

For the purpose of this post, we will disregard the fact that blacklisting is an oft ill-advised solution that can be cumbersome, ineffective, and easily bypassed.

However, this solution conveniently targets their competitor’s services, while making no effort to address other websites or the underlying problem. So, rather than preventing websites from being able to abuse this now even more well-known bug, they are suggesting IE users block the web services of their search, advertising, phone, operating system, and browser competitor.

Ultimately, while I don’t like to be tracked period, I am much less worried about the 12 Google domains they block than shady ad-networks that make money selling malware adverts, which could potentially be abusing this bug. These sorts of companies will not be stopped by Microsoft’s “fix.”

Do We Need to Pull Out the Pitchforks for Google?

Obviously, assuming Microsoft’s report is accurate, which, given the independent research and Google’s own P3P policy, it probably is, Google is abusing a bug!

This is not acceptable and is dishonest. They are, arguably, taking advantage of an exploit in a browser to serve their cookies.

We can, and should, hold Google and other companies to a higher standard, especially when they are in the business of collecting personal information.

Even if their intention is, as they state in their P3P privacy policy, to get around a limitation in Internet Explorer, at the end of the day they are taking advantage of a bug, and that is not good business. At the very least this is a dishonest move by Google, at worst a malicious attempt to circumvent browser security settings.

So, Google is certainly not without blame.

Broken By Design

It would be interesting to see how many Microsoft services rely on P3P to poke holes in Internet Explorer’s cookie policy, because that is the only reason I can see for keeping it in place, especially after work on it was suspended by the W3C; although Microsoft’s history of honoring web standards is another discussion.

The ideas behind P3P are logical and could even be a nice addition to the way we browse the web. You visit a website, it says it uses its cookies for x, y, and z, and you can block or accept the cookie without having to read through 10 pages of legalese.

However, while a neat idea, this relies too much on trust in a world that is filled with people that are more than willing to abuse it.

If setting a P3P header stating that the website does not intend to track is all it takes to bypass user-cookie settings, a dishonest ad-network or website is not going to think twice about abusing it. With sites like Google, or Facebook, both of which have abused this bug in the past, there is a good chance for shaming them into doing the right thing. However, there are a lot more sites out there that do not care about reputation management.

Tracking is Big Business: Pot Meet Kettle

While exploiting a bug is not acceptable, all companies go out of their way to track users and Microsoft is no exception. There is big money in tracking and companies use whatever means they can to get user data.

For example, do you block scripts and disable third-party cookies?

Microsoft is still tracking you via Omniture, using a tracking image within a noscript block.

Their premise is that Google is deliberately bypassing a security policy, yet they go out of their way to poke a hole in a user’s security policy too, because this type of data is valuable to them. All large websites, like Facebook, Microsoft, and Google go out of their way to collect user data.

This is also not, as evidenced above, something that Microsoft just figured out or suddenly noticed. This is a well known bug and in the past, Microsoft suggested exploiting it on their own support website to get around IE bugs.

It is possible that they only just found out Google was doing it, though that is very unlikely given Google’s public P3P privacy policy. I think it is more likely they thought now would be a good time to capitalize on the Safari privacy issues Google has been having.

How to Fix

Microsoft provided a blacklist that will disable third-party cookies on certain Google domains, but this is short-sighted. Aside from appearing to be a nice way to hurt one of their main competitors in search, advertising, etc., I would be more worried about the unknown websites and much less reputable ad-networks that can and will abuse this bug.

If they did not know about it before, they do now and it is a whole lot harder, if not impossible, to create a blacklist for all the shady websites that could abuse this.

Until they patch their browser and re-evaluate the largely unsuccessful P3P protocol, you can disable third-party cookies completely via Internet Explorer’s settings. I have not tested this, but apparently if you explicitly disable them, instead of relying on the default cookie policy, these sorts of cookies will get blocked.

Or, install a better browser, like Firefox…

Test Test Test

February 12, 2012

As the saying goes, you should never assume, because you make an Ass out of You and Me.

Recently, I had the opportunity to work behind a local web design company who had built a website for a local restaurant.

When they developed the website, they added a contact form, using jQuery Datepicker and Contact Form 7, so visitors could make a reservation at the restaurant.

However, the hours and dates they were open were hard-coded in a javascript file.

The web design company went in and blocked out Christmas and a few other days for the next five years, to ensure the design had a bit of a lifespan. However, if you wanted to change any of the days/times, you would need to go in and edit the custom javascript file.

I was given the job of creating a way for the restaurant owners to open/close hours and days, blocking reservations for certain times as needed, without having to use a developer for each edit.

A Nice Inexpensive Solution

Using the base JavaScript they developed, I used custom fields in WordPress to allow the owners to manually block days, as well as set custom hours for any given day.

They wanted to keep costs down, so I did not develop a full plugin, just a simple text-based solution.

The end result was a PHP-generated JavaScript file, with nice formatting, which pulled data from the custom fields of a post to open and close days and hours.

I did it programmatically, so the JavaScript output was controlled by a few functions, making changes easier. Although the logic behind it took a bit to get right, I think it ended up being elegant enough and easy to update.
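The generated file looked roughly like the sketch below; the variable names and field format are hypothetical, but the idea is that the PHP side serializes the post’s custom fields into plain JavaScript data, which the existing jQuery Datepicker setup then consumes:

    // Data emitted by the PHP file from the post's custom fields (names made up here)
    var closedDates = ["2012-12-25", "2012-12-31"];          // fully blocked days
    var customHours = { "2012-02-14": ["17:00", "22:00"] };  // per-day overrides, used by the time dropdown builder (not shown)

    jQuery("#reservation-date").datepicker({
      minDate: 0, // no reservations in the past
      beforeShowDay: function (date) {
        // Compare against the closed-date list in the same format it is stored in
        var key = jQuery.datepicker.formatDate("yy-mm-dd", date);
        // A day is selectable only if it is not blocked
        return [jQuery.inArray(key, closedDates) === -1, ""];
      }
    });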

Testing the Solution

I did a fair amount of testing, including a bunch of checking in IE8, Firefox, Chrome, Android, and iPhone.

However, I did not do as much as I normally would, because I assumed that the company I worked behind had done some testing as well.

Ultimately, IE7 slipped through the cracks, and I did not check the code in Internet Explorer 7.

If I had, I would have found that the disabled select options were, obviously, not working at all in IE7 and certain versions of IE8. I have run into this bug before, but let my guard down a little this time.

This is something I am not proud of, as it is out of character for my often rather obsessive website testing practices.

It Always Fails When You Need It

As is often the case with Murphy’s Law, faults in the reservation form did not become apparent for a few months, until they finally started using the tool.

It soon became clear that people were able to make reservations whenever they wanted.

I added some testing and soon narrowed the iPad down as the main culprit. So, I disabled it for the iPad and added some more checks to see which browsers people were using.

However, I soon found that people on IE7, as well as some IE8 users, could also make reservations for any date. It was then that I tested the design in Internet Explorer 7 for, apparently, the first time ever and found I had overlooked a glaring bug.

The original web design company used attr(“disabled”, “disabled”) on the select options to block out times. However, this would never work in IE7, because disabling individual options that way is simply not supported in certain versions of Internet Explorer. The bug extends to Safari Mobile too, which is why it was not working on the iPad.

The Fix

Once I saw what I had overlooked, JavaScript that could literally NEVER work in IE7 on XP, I had a head-meets-wall moment. I have run into this bug before and it is very well documented.

There are a few JavaScript-based fixes, including adding a CSS “disabled” class to the options or changing them to an optgroup. However, in this case, a lot of JavaScript was already in place and those types of solutions would have been A LOT more complicated.
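For reference, the class-based workaround mentioned above generally looks something like this sketch (selectors and class names are hypothetical); it rejects closed slots after the fact, since IE7 and older Mobile Safari simply ignore disabled option elements:

    // Closed slots carry a marker class instead of the disabled attribute,
    // and a change handler rejects them when picked.
    jQuery("#reservation-time").change(function () {
      var picked = jQuery(this).find("option:selected");
      if (picked.hasClass("closed")) {
        alert("Sorry, that time is not available.");
        this.selectedIndex = 0; // snap back to the placeholder option
      }
    });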

So, the fix ended up being a change to the logic of how business hours were displayed.

Instead of showing ALL the hours and then disabling closed ones, I completely removed all the hours and ONLY added open times.
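In practice, that meant the time dropdown went from being filtered by disabling options to being rebuilt with only the available slots. A simplified sketch of the approach (selectors and data are hypothetical):

    // Open slots for the selected date, as produced by the generated config
    var openTimes = ["17:00", "17:30", "18:00", "18:30"];

    // Rebuild the <select> with only the times that are actually available,
    // instead of the original attr("disabled", "disabled") approach that
    // IE7 and Mobile Safari silently ignore.
    var timeSelect = jQuery("#reservation-time").empty();
    timeSelect.append(jQuery("<option>").val("").text("Select a time"));
    jQuery.each(openTimes, function (i, time) {
      timeSelect.append(jQuery("<option>").val(time).text(time));
    });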

Looking for an Out

It is easy to want to blame someone else and for many this is human nature.

In fact, we reached out to the original web design company as soon as we found the iPad issue.

They were quick to blame it on me, the owner suggesting that his coders could fix my “patch” and “…write our own query in java…” to fix my broken code. Whether this was ignorance or he was just trying to use technical jargon to confuse my client, I do not know, but the point is he saw an out and used it.

Despite the code being broken when it shipped and me providing a test site with their original code on it, they have not yet reached out to admit any fault.

No One To Blame But Myself / Test Test Test!

At the end of the day, however, I share just as much blame as the original web design company.

I should have tested it like I do everything else I design, which means in essentially EVERY browser I can get my hands on.

Even though it shipped broken, I should have checked and caught this when I rolled out my update. Instead of being lax, because I assumed they would not have shipped broken code, I should have tested it just like any of my other code.

So, the moral of the story is ALWAYS TEST TEST TEST!