Tethering is exhilarating

February 23, 2010 10:25 am | 40 Comments

I love my iPhone. That sentiment doubled the day I followed Nat Torkington’s pointer to Ten Second iPhone Tethering. Later that month I flew to Boston and basked in the freedom of having an Internet connection at the airport and hotel without paying wifi fees.

Then it all came crashing down. iPhone 3.1 came out. I had to choose between visual voicemail and tethering or consider jailbreaking my iPhone. Tech support in my household is limited (me) so I said goodbye to tethering. I’m back to paying hotels $10 per day to use their wifi, or signing up for a day of T-Mobile Hotspot usage at Starbucks.

Then I got my Nexus One. I really like it. It’s a huge improvement over the G1 I got last year. iPhone is still my dominant phone, but I carry the Nexus One and Palm Pre with me and am spending more time on the Nexus One.

I’m gearing up for some travel, so I revisited the topic of tethering. I was stunned when I spoke to AT&T tech support two days ago and they told me they support tethering. How did I miss this?! Then the guy said I had to jailbreak my iPhone. It seems weird to have tech support recommend jailbreaking. I guess that’s a result of the AT&T/Apple love/hate relationship. Same story with the Palm Pre – gotta jailbreak it.

My hopes rose when I found articles saying you could tether with the Nexus One. I installed PdaNet, and that went smoothly. It works on Mac and Windows. I’m Mac at home, but when I travel I take my Windows laptop, so that’s the critical platform for tethering. I’m always wary of new installations bogging down Windows, but PdaNetPC.exe uses only 17 MB of memory and 0% of CPU when not in use, so I’m fine with it running in the background.

I tested it last night at home, but the real test was this morning. I stopped for coffee at Peet’s, booted up Windows, tethered my Nexus One, opened an SSH session, and drove to work. At every stoplight I verified my SSH session was still active. I was reading email and surfing the web. It was exhilarating. I know that’s incredibly geeky to say, but I revel in the freedom it gives me. +1 for tethering without jailbreaking. All smartphones should do this.


new Browserscope security tests

February 19, 2010 10:25 am | 3 Comments

Browserscope is an open source project based on my earlier UA Profiler project. The goal is to help make browsers faster, safer, and more consistent. This is accomplished by having categories of tests that measure how browsers behave in different areas. Browserscope currently has these test categories: Network, Acid3, Selectors API, Rich Text, and Security.

The project is led by Lindsey Simon. Today he blogged about updates to the Browserscope security tests. The security test category was created by Collin Jackson (CMU) and Adam Barth (UC Berkeley). They worked with David Lin-Shung Huang and Mustafa Acer (both from CMU) on today’s release of tests for HTTP Origin Header, Strict Transport Security, Sandbox Attribute, X-Frame-Options, and X-Content-Type-Options. Check out their blog post for more details on what these tests actually do.

There are other new features in today’s release. We’ve updated the list of “top” browsers (notice we dropped IE 6). Lindsey added a dropdown menu to each test category for easier navigation. I run the Network test category. In that area I broke the overloaded parallel script loading test into four more specific tests that measure whether external scripts load in parallel with images, stylesheets, iframes, and other scripts. Brian Kuhn (Google) contributed a test for measuring whether the SCRIPT ASYNC attribute is supported.

One of the key aspects of Browserscope is that all the data is crowdsourced. This is critical. It allows the project to run without requiring a dedicated test lab. And the data is gathered under real world conditions. But to be successful, we need people in the web community to participate. When you’re done reading this post, point your browser to the Browserscope test page and click “Run All Tests”. It’ll only take a few minutes and you can sit back while it walks through all the tests automatically. We’re all in this together. Join us in making the web experience faster, safer, and more consistent.



Performance of 3rd Party Content

February 17, 2010 5:17 pm | 55 Comments

Last month Jesse Robbins and I co-hosted the Velocity Summit. For the last three years we’ve gathered 30-40 industry gurus in the area of web performance and operations to help identify important topics to highlight at Velocity.

One session at this year’s Velocity Summit was called “Performance for the Masses.” It was led by Pat Meenan of WebPagetest.org fame. The discussion focused on the question “what improvements can we make in the world of web performance that will have a big impact for a lot of people?” The performance problems of ads came up, and everyone in the room got pretty excited. I might even say incensed.

From the beginning of my work in web performance back in 2004, web site owners have cited ads as the cause of web sites being slow. Embedding third party content in a web page is a performance challenge. In the best of situations, it’ll have some performance impact. In the worst of situations, third party content can make your page unusable. JavaScript errors, blocked rendering, HTTP timeouts, numerous resources, and just plain huge files can prevent users from seeing the main content of your page.

The problem of third party content is even more complicated for today’s web sites. In addition to ads, there are also widgets and analytics. I had a concall last month with Mashable to talk about performance. Their page has 40+ widgets including TweetMeme, Facebook, and Digg. Talk about a performance challenge! In the world of analytics the biggie is Google Analytics. That’s why I was so psyched to see them release their async snippet. Loading the GA code asynchronously reduces its impact on the main page’s performance.
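
For reference, the async snippet looks roughly like this (“UA-XXXXX-X” is a placeholder for the site’s account ID; check Google’s documentation for the current version):

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']);
_gaq.push(['_trackPageview']);
(function() {
  var ga = document.createElement('script');
  ga.type = 'text/javascript';
  ga.async = true;
  ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
  var s = document.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(ga, s);
})();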

During the Velocity Summit discussion, Alex Russell (Dojo, ChromeFrame) suggested creating a new tag that could wrap 3rd party code snippets and make them perform better. Alex called this the FRAG tag since it would be like creating a document fragment. The biggest performance gain this tag would support is asynchronous document.write. Many (most?) 3rd party snippets use document.write to modify the main page, which blocks the page from rendering and downloading further resources. Figuring out a way that the browser could continue to load the main page in parallel with 3rd party content containing document.write would be a big win. Jonas Sicking (Firefox), Alex, and I have had some follow-up discussion. We’ve also been looking at the proposed changes to IFRAME for HTML5, including the seamless and sandbox attributes. This might be a possible path, although we’d have to figure out how to make it degrade gracefully in older browsers, something a new tag would do nicely.

To help ground the discussion and make sure any proposal solves what’s wrong, I promised to review a dozen or so examples of popular ads, widgets, and analytics. Over the next month I’ll publish a series of posts that looks at specific code snippets. Some that I’m thinking about: Google AdSense, TweetMeme, Facebook Sharer, Digg, Meebo, Google Analytics, and Quantcast. Let me know what other popular 3rd party content snippets you think I should look at.

OK. Time to go look at snippets…


Browser Performance Wishlist

February 15, 2010 4:25 pm | 28 Comments

What are the most important changes browsers could make to improve performance?

This document is my answer to that question. This is mainly for browser developers, although web developers will want to track the adoption of these improvements.

Before digging into the list I wanted to mention two items that would actually be at the top of the list if it weren’t for how new they are: SPDY and the FRAG tag. Both require industry adoption and possible changes to specifications, so it’s too soon to put them on an implementation wishlist. I hope these ideas gain consensus soon, and to help that along I describe them here.

SPDY
SPDY is a proposal from Google for making three major improvements to HTTP: compressed headers, multiplexed requests, and prioritized responses. Initial studies showed 25 top sites loading 55% faster. Server and client implementations are available, and other organizations and individuals have built their own as well. The protocol draft has been published for review.
FRAG tag
The idea behind this “document fragment” tag is that it would be used to wrap 3rd party content – ads, widgets, and analytics. 3rd party content can have a severe impact on the containing page’s performance due to additional HTTP requests, scripts that block rendering and downloads, and added DOM nodes. Many of these factors can be mitigated by putting the 3rd party content inside an iframe embedded in the top level HTML document. But iframes have constraints and drawbacks – they typically introduce another HTTP request for the iframe’s HTML document, not all 3rd party code snippets work inside an iframe without changes (e.g., references to “document” in JavaScript might need to reference the parent document), and some snippets (expando ads, suggest) can’t float over the main page’s elements. Another way to mitigate these issues is to load the JavaScript asynchronously, but many of these widgets use document.write and so must be evaluated synchronously.

A compromise is to place 3rd party content in the top level HTML document wrapped in a FRAG block. This approach degrades nicely – older browsers would ignore the FRAG tag and handle these snippets the same way they do today. Newer browsers would parse the HTML in a separate document fragment. The FRAG content would not block the rendering of the top level document, and snippets containing document.write would work without blocking it either. This idea just started getting discussed in January 2010. Much more use case analysis and discussion is needed, culminating in a proposed specification. (Credit to Alex Russell for the idea and name.)
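
To make the idea concrete, here’s a purely hypothetical sketch – no browser or specification supports a FRAG tag today, and the ad URL is just a placeholder:

<frag>
  <!-- parsed in its own document fragment; a document.write inside this
       script wouldn't block the top level document -->
  <script src="http://www.adnetwork.com/show_ad.js"></script>
</frag>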

The List

The performance wishlist items are sorted highest priority first. The browser icons indicate which browsers need to implement that particular improvement.

download scripts without blocking
In older browsers, once a script started downloading, all subsequent downloads were blocked until the script was downloaded and executed. It’s critical that scripts be evaluated in the order specified, but they can be downloaded in parallel. This significantly improves page load times, especially for pages with multiple scripts. Newer browsers (IE8, Firefox 3.5+, Safari 4, Chrome 2+) incorporate this parallel script loading feature, but it doesn’t work as proactively as it could. Specifically:

  • IE8 – downloading scripts blocks image and iframe downloads
  • Firefox 3.6 – downloading scripts blocks iframe downloads
  • Safari 4 – downloading scripts blocks iframe downloads
  • Chrome 4 – downloading scripts blocks iframe downloads
  • Opera 10.10 – downloading scripts blocks all downloads

(test case, see the four “|| Script [Script|Stylesheet|Image|Iframe]” tests)

SCRIPT attributes
The HTML5 specification describes the ASYNC and DEFER attributes for the SCRIPT tag, but the implementation behavior is not fully specified. Here’s how the SCRIPT attributes should work (a markup example follows the list).

  • DEFER – The HTTP request for a SCRIPT with the DEFER attribute is not made until all other resources in the page on the same domain have already been sent. This is so that it doesn’t occupy one of the limited number of connections that are opened for a single server. Deferred scripts are downloaded in parallel, but are executed in the order they occur in the HTML document, regardless of what order the responses arrive in. The window’s onload event fires after all deferred scripts are downloaded and executed.
  • ASYNC – The HTTP request for a SCRIPT with the ASYNC attribute is made immediately. Async scripts are executed as soon as the response is received, regardless of the order they occur in the HTML document. The window’s onload event fires after all async scripts are downloaded and executed.
  • POSTONLOAD – This is a new attribute I’m proposing. Postonload scripts don’t start downloading until after the window’s onload event has fired. By default, postonload scripts are evaluated in the order they occur in the HTML document. POSTONLOAD and ASYNC can be used in combination to cause postonload scripts to be evaluated as soon as the response is received, regardless of the order they occur in the HTML document.
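
In markup the three attributes would look like this (POSTONLOAD is the attribute I’m proposing and isn’t implemented anywhere):

<script src="a.js" defer></script>
<script src="b.js" async></script>
<script src="c.js" postonload></script>  <!-- proposed, not yet supported -->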

resource packages
Each HTTP request has some overhead cost. Workarounds include concatenating scripts, concatenating stylesheets, and creating image sprites. But this still results in multiple HTTP requests. And sprites are especially difficult to create and maintain. Alexander Limi (Mozilla) has proposed using zip files to create resource packages. It’s a good idea because of its simplicity and graceful degradation.
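
As I recall the proposal, the markup is roughly a single LINK pointing at the zip file – treat this as a sketch, not the final syntax:

<link rel="resource-package" type="application/zip" href="/static/site-resources.zip" />
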
border-radius
Creating rounded corners leads to code bloat and excessive HTTP requests. Border-radius reduces this to a simple CSS style. The only major browser that doesn’t support border-radius is IE. It has already been announced that IE9 will support border-radius, but I wanted to include it nevertheless.
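
For comparison, the CSS is just a few lines (vendor prefixes are still needed for current Firefox and WebKit releases):

.rounded {
  -moz-border-radius: 8px;      /* Firefox */
  -webkit-border-radius: 8px;   /* Safari, Chrome */
  border-radius: 8px;           /* CSS3 */
}
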
cache redirects
Redirects are costly from a performance perspective, especially for users with high latency. Although the HTTP spec says 301 and 302 responses (with the proper HTTP headers) are cacheable, most browsers don’t support this.

  • IE8 – doesn’t cache redirects for the main page and for resources
  • Safari 4 – doesn’t cache redirects for the main page
  • Opera 10.10 – doesn’t cache redirects for the main page

(test case)

link prefetch
To improve page load times, developers prefetch resources that are likely or certain to be used later in the user’s session. This typically involves writing JavaScript code that executes after the onload event. When prefetching scripts and stylesheets, an iframe must be used to avoid conflict with the JavaScript and CSS in the main page. Using an iframe makes this prefetching code more complex. A final burden is the processing required to parse prefetched scripts and stylesheets. The browser UI can freeze while prefetched scripts and stylesheets are parsed, even though this is unnecessary as they’re not going to be used in the current page. A simple alternative solution is to use LINK PREFETCH. Firefox is the only major browser that supports this feature (since 1.0). Wider support of LINK PREFETCH would give developers an easy way to accelerate their web pages. (test case)
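
The markup is one line per resource (the URL is a placeholder):

<link rel="prefetch" href="/js/next-page.js">
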
Web Timing spec
In order for web developers to improve the performance of their web sites, they need to be able to measure their performance – specifically their page load times. There’s debate on the endpoint for measuring page load times (window onload event, first paint event, onDomReady), but most people agree that the starting point is when the web page is requested by the user. And yet, there is no reliable way for the owner of the web page to measure from this starting point. To address this, Google has submitted the Web Timing draft proposal for built-in browser support for measuring page load times.
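
A sketch of what that could enable from JavaScript, using a timing interface along the lines of the draft (property names may change as the proposal evolves):

window.onload = function() {
  var t = window.performance && window.performance.timing;
  if (t) {
    // time from the user requesting the page to the start of the onload event
    alert('page load time: ' + (t.loadEventStart - t.navigationStart) + 'ms');
  }
};
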
remote JS debugging
Developers strive to make their web apps fast across all major browsers, but this requires installing and learning a different toolset for each browser. In order to get cross-browser web development tools, browsers need to support remote JavaScript debugging. There’s been progress in building protocols to support remote debugging: WebDebugProtocol and Crossfire in Firefox, Scope in Opera, and ChromeDevTools in Chrome. Agreement on the preferred protocol and support in the major browsers would go a long way to getting faster web apps for all users, and reducing the work for developers to maintain cross-browser web app performance.
Web Sockets
HTML5 Web Sockets provide built-in support for two-way communications between the client and server. The communication channel is accessible via JavaScript. Web Sockets are superior to comet and Ajax, especially in their compatibility with proxies and firewalls, and provide a path for building web apps with a high degree of communication between the browser and server.
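
A minimal sketch of the API (the URL is a placeholder):

var ws = new WebSocket('ws://example.com/updates');
ws.onopen = function() {
  ws.send('hello');                             // client pushes data to the server
};
ws.onmessage = function(event) {
  console.log('from server: ' + event.data);    // server pushes data to the client
};
ws.onclose = function() {
  console.log('connection closed');
};
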
History
HTML5 specifies history.pushState and history.replaceState. With these, web developers can dynamically change the URL to reflect the web application state without having to perform a page transition. This is important for Web 2.0 applications that modify the state of the web page using Ajax. Being able to avoid fetching a new HTML document to reflect these application changes results in a faster user experience.
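
A sketch of how an Ajax application might use it (loadArticle is a hypothetical function that fetches content via XHR):

function showArticle(id) {
  loadArticle(id);                                      // hypothetical Ajax fetch
  history.pushState({id: id}, '', '/articles/' + id);   // update the URL without a page transition
}
window.onpopstate = function(event) {
  if (event.state) {
    loadArticle(event.state.id);                        // restore state on back/forward
  }
};
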
anchor ping
The ping attribute for anchors provides a more performant way to track links. This is a controversial feature because of the association with “tracking” users. However, links are tracked today, it’s just done in a way that hurts the user experience. For example, redirects, synchronous XHR, and tight loops in unload handlers are some of the techniques used to ensure clicks are properly recorded. All of these create a slower user experience.
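
The markup is a single attribute on the anchor (URLs are placeholders); the browser sends the ping in the background so the navigation isn’t delayed:

<a href="http://example.com/story" ping="http://tracker.example.com/click">Read the story</a>
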
progressive XHR
The draft spec for XMLHttpRequest details how XHRs are to support progressive response handling. This is important for web apps that use data with varied response times as well as comet-style applications. (more information)
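
A sketch of progressive handling with XHR – in browsers that support it, readyState 3 fires repeatedly as data arrives (handleChunk is a hypothetical function):

var xhr = new XMLHttpRequest();
var received = 0;
xhr.open('GET', '/stream', true);
xhr.onreadystatechange = function() {
  if (xhr.readyState === 3) {                          // partial response available
    var chunk = xhr.responseText.substring(received);
    received = xhr.responseText.length;
    handleChunk(chunk);                                // hypothetical handler
  }
};
xhr.send(null);
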
stylesheet & inline JS
When a stylesheet is followed by an inline script, resources that follow are blocked until the stylesheet is downloaded and the inline script is evaluated. Browsers should instead lookahead in their parsing and start downloading subsequent resources in parallel with the stylesheet. These resources of course would not be rendered, parsed, or evaluated until after the stylesheet was parsed and the inline script was evaluated. (test case see “|| CSS + Inline Script”; looks like this just landed in Firefox 3.6!)
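
This is the pattern that triggers the blocking (file names are placeholders):

<link rel="stylesheet" href="main.css">
<script>var foo = 1;</script>   <!-- inline script after a stylesheet -->
<img src="photo.jpg">           <!-- blocked until main.css arrives -->
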
SCRIPT DEFER for inline scripts
The benefit of the SCRIPT DEFER attribute for external scripts is discussed above. But DEFER is also useful for inline scripts that can be executed after the page has been parsed. Currently, IE8 supports this behavior. (test case)
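
For example (initWidgets is a hypothetical function):

<script defer>
  initWidgets();   // deferred until the page has been parsed (IE8 today)
</script>
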
@import improvements
@import is a popular alternative to the LINK tag for loading stylesheets, but it has several performance problems in IE:

  • LINK @import – If the first stylesheet is loaded using LINK and the second one uses @import, they are loaded sequentially instead of in parallel. (test case)
  • LINK blocks @import – If the first stylesheet is loaded using LINK, and the second stylesheet is loaded using LINK that contains @import, that @import stylesheet is blocked from downloading until the first stylesheet response is received. It would be better to start downloading the @import stylesheet immediately. (test case)
  • many @imports – Using @import can change the download sequence of resources. In this test case, multiple stylesheets loaded with @import are followed by a script. Even though the script is listed last in the HTML document, it gets downloaded first. If the script takes a long time to download, it can cause the stylesheet downloads to be delayed, which can delay rendering. It would be better to follow the order specified in the HTML document. (test case)

(more information)
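
For the first case, this is the combination that loads sequentially in IE instead of in parallel (file names are placeholders):

<link rel="stylesheet" href="first.css">
<style>
  @import url("second.css");   /* doesn't start downloading until first.css is done */
</style>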

@font-face improvements
In IE8, if a script occurs before a style that uses @font-face, the page is blocked from rendering until the font file is done downloading. It would be better to render the rest of the page without waiting for the font file. (test case, blog post)
stylesheets & iframes
When an iframe is preceded by an external stylesheet, it blocks iframe downloads. In IE, the iframe is blocked from downloading until the stylesheet response is received. In Firefox, the iframe’s resources are blocked from downloading until the stylesheet response is received. There’s no dependency between the parent’s stylesheet and the iframe’s HTML document, so this blocking behavior should be removed. (test case)
paint events
As the number of DOM elements and the amount of CSS grow, it’s becoming more important to be able to measure the performance of painting the page. Firefox 3.5 added the MozAfterPaint event, which opened the door for add-ons like Firebug Paint Events (although early Firefox documentation noted that the “event might fire before the actual repainting happens”). Support for accurate paint events will allow developers to capture these metrics.
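
A sketch of listening for the Firefox event:

window.addEventListener('MozAfterPaint', function(event) {
  // event.clientRects lists the regions that were just repainted
  console.log('painted ' + event.clientRects.length + ' rectangle(s)');
}, false);
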
missing schema, double downloads
In IE 7 & 8, if the “http:” schema is missing from a stylesheet’s URL, the stylesheet is downloaded twice, which makes the page render more slowly. Omitting the protocol from URLs isn’t pervasive yet, but it’s being adopted more widely because it reduces download size and the URL resolves to “http://” or “https://” as appropriate. (test case)


5e speculative background images

February 12, 2010 6:09 pm | 13 Comments

This is the fifth of five quick posts about some browser quirks that have come up in the last few weeks.

Chrome and Safari start downloading background images before all styles are available. If a background image style gets overwritten this may cause wasteful downloads.

Background images are used everywhere: buttons, background wallpaper, rounded corners, etc. You specify a background image in CSS like so:

.bgimage { background-image: url("/images/button1.gif"); }

Downloading resources is an area for optimizing performance, so it’s important to understand what causes CSS background images to get downloaded. See if you can answer the following questions about button1.gif:

  1. Suppose no elements in the page use the class “bgimage”. Is button1.gif downloaded?
  2. Suppose an element in the page has the class “bgimage” but also has “display: none” or “visibility: hidden”. Is button1.gif downloaded?
  3. Suppose later in the page a stylesheet gets downloaded and redefines the “bgimage” class like this:
    .bgimage { background-image: url("/images/button2.gif"); }

    Is button1.gif downloaded?

Ready?

The answer to question #1 is “no”. If no elements in the page use the rule, then the background image is not downloaded. This is true in all browsers that I’ve tested.

The answer to question #2 is “depends on the browser”. This might be surprising. Firefox 3.6 and Opera 10.10 do not download button1.gif, but the background image is downloaded in IE 8, Safari 4, and Chrome 4. I don’t have an explanation for this, but I do have a test page: hidden background images. If you have elements with background images that are hidden initially, you should hold off on creating them until after the visible content in the page is rendered.
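
One way to do that is to create the hidden element after onload – a sketch reusing the .bgimage class from above (createHiddenPanel is a hypothetical name):

function createHiddenPanel() {
  var div = document.createElement('div');
  div.className = 'bgimage';          // class with the background image
  div.style.display = 'none';
  document.body.appendChild(div);
}
if (window.addEventListener) {
  window.addEventListener('load', createHiddenPanel, false);
} else {
  window.attachEvent('onload', createHiddenPanel);   // IE
}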

The answer to question #3 is “depends on the browser”. I find this to be the most interesting behavior to investigate. According to the cascading behavior of CSS, the latter definition of the “bgimage” class should cause the background-image style to use button2.gif. And in all the major browsers this is exactly what happens. But Safari 4 and Chrome 4 are a little more aggressive about fetching background images. They download button1.gif on the speculation that the background-image property won’t be overwritten, and then later download button2.gif when it is overwritten. Here’s the test page: speculative background images.

When my officemate, Steve Lamm, pointed out this behavior to me, my first reaction was “that’s wasteful!” I love prefetching, but I’m not a big fan of most prefetching implementations because they’re too aggressive – they err too far on the side of downloading resources that never get used. After my initial reaction, I thought about this some more. How frequently would this speculative background image downloading be wasteful? I went on a search and couldn’t find any popular web site that overwrote the background-image style. Not one. I’m not saying pages like this don’t exist, I’m just saying it’s very atypical.

On the other hand, this speculative downloading of background images can really help performance and the user’s perception of page speed. Many web sites have multiple stylesheets. If background images don’t start downloading until all stylesheets are done loading, the page takes longer to render. Safari and Chrome’s behavior of downloading a background image as soon as an element needs it, even if one or more stylesheets are still downloading, is a nice performance optimization.

That’s a nice way to finish the week. Next week: my Browser Performance Wishlist.

The five posts in this series are:

  • 5a Missing schema double download
  • 5b document.write scripts block in Firefox
  • 5c media=print stylesheets
  • 5d dynamic stylesheets
  • 5e speculative background images


5d dynamic stylesheets

February 12, 2010 1:13 am | 7 Comments

This is the fourth of five quick posts about some browser quirks that have come up in the last few weeks.

You can avoid blocking rendering in IE if you load stylesheets using DHTML and setTimeout.

A few weeks ago I had a meeting with a company that makes a popular widget. One technique they used to reduce their widget’s impact on the main page was to load a stylesheet dynamically, something like this:

var link = document.createElement('link');
link.rel = 'stylesheet';
link.type = 'text/css';
link.href = '/main.css';
document.getElementsByTagName('head')[0].appendChild(link);

Most of my attention for the past year has been on loading scripts dynamically to avoid blocking downloads. I haven’t focused on loading stylesheets dynamically. When it comes to stylesheets, blocking downloads isn’t an issue – stylesheets don’t block downloads (except in Firefox 2.0). The thing to worry about when downloading stylesheets is that IE blocks rendering until all stylesheets are downloaded1, and other browsers might experience a Flash Of Unstyled Content (FOUC).

FOUC isn’t a concern for this widget – the rules in the dynamically-loaded stylesheet only apply to the widget, and the widget hasn’t been created yet so nothing can flash. If the point of loading the stylesheet dynamically is to not mess with the containing page, we have to make sure dynamic stylesheets don’t block the page from rendering in IE.

I created the DHTML stylesheet example to show what happens. The page loads a stylesheet dynamically. The stylesheet is configured to take 4 seconds to download. If you load the page in Internet Explorer the page is blank for 4 seconds. In order to decouple the stylesheet load from page rendering, the DHTML code has to be invoked using setTimeout. That’s what I do in the DHTML + setTimeout stylesheet test page. This works. The page renders immediately while the stylesheet is downloaded in the background.
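
The change is small – it’s essentially the snippet shown earlier wrapped in a timer:

setTimeout(function() {
  var link = document.createElement('link');
  link.rel = 'stylesheet';
  link.type = 'text/css';
  link.href = '/main.css';
  document.getElementsByTagName('head')[0].appendChild(link);
}, 0);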

This technique is applicable when you have stylesheets that you want to load in the page but the stylesheet’s rules don’t apply to any DOM elements in the page currently. This is a pretty small use case. It makes sense for widgets or pages that have DHTML features that aren’t invoked until after the page has loaded. If you find yourself in that situation, you can use this technique to avoid the blank white screen in IE.

The five posts in this series are:

  • 5a Missing schema double download
  • 5b document.write scripts block in Firefox
  • 5c media=print stylesheets
  • 5d dynamic stylesheets
  • 5e speculative background images

1 Simple test pages may not reproduce this problem. My testing shows that you need a script (inline or external) above the stylesheet, or two or more stylesheets for rendering to be blocked. If your page has only one stylesheet and no SCRIPT tags, you might not experience this issue.


5c media=print stylesheets

February 11, 2010 5:27 pm | 2 Comments

This is the third of five quick posts about some browser quirks that have come up in the last few weeks.

Stylesheets set with media=”print” still block rendering in Internet Explorer.

A few weeks ago a friend at a top web company pinged me about a possible bug in Page Speed and YSlow. Both tools were complaining about stylesheets he placed at the bottom of his page, an obvious violation of my put stylesheets at the top rule from High Performance Web Sites. The reasoning behind this rule is that Internet Explorer won’t start rendering the page until all stylesheets are downloaded1, and other browsers might produce the Flash Of Unstyled Content (FOUC). It’s best to put stylesheets at the top so they get downloaded as soon as possible.

His reason for putting these stylesheets at the bottom was that they were specified with media="print". Since these stylesheets weren’t going to be used to render the current page, he wanted to load them last so that other more important resources could get downloaded sooner. Going back to the reasons for the “put stylesheets at the top” rule, he wouldn’t have to worry about FOUC (the stylesheets wouldn’t be applied to the current page). But would he have to worry about IE blocking the page from rendering? Time for a test page.

The media=print stylesheets test page contains one stylesheet at the bottom with media="print". This stylesheet is configured to take 4 seconds to download. If you view this page in Internet Explorer you’ll see that rendering is indeed blocked for 4 seconds (tested on IE 6, 7, & 8).
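
Here’s the kind of markup involved (the URL is a placeholder):

<link rel="stylesheet" type="text/css" media="print" href="print.css">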

I’m surprised browsers haven’t gotten to the point where they skip downloading stylesheets for a different media type than the current one. I’ve asked some web devs but no one can think of a good reason for doing this. In the meantime, even if you have stylesheets with media="print" you might want to follow the advice of Page Speed and YSlow and put them in the document HEAD. Or you could try loading them dynamically. That’s the topic I’ll cover in my next blog post.

The five posts in this series are:

  • 5a Missing schema double download
  • 5b document.write scripts block in Firefox
  • 5c media=print stylesheets
  • 5d dynamic stylesheets
  • 5e speculative background images

1 Simple test pages may not reproduce this problem. My testing shows that you need a script (inline or external) above the stylesheet, or two or more stylesheets for rendering to be blocked. If your page has only one stylesheet and no SCRIPT tags, you might not experience this issue.


5b document.write scripts block in Firefox

February 10, 2010 5:58 pm | 9 Comments

This is the second of five quick posts about some browser quirks that have come up in the last few weeks.

Scripts loaded using document.write block other downloads in Firefox.

Unfortunately, document.write was invented. That problem was made a bzillion times worse when ads decided to use document.write to insert scripts into the content publisher’s page. It’s one line of code:

document.write('<script src="http://www.adnetwork.com/main.js"><\/script>');

Fortunately, most of today’s newer browsers load scripts in parallel including scripts added via document.write. But a few weeks ago I noticed that Firefox 3.6 had some weird blocking behavior in a page with ads, and tracked it down to a script added using document.write.

The document.write scripts test page demonstrates the problem. It has four scripts. The first and second are inserted using document.write. The third and fourth are loaded the normal way (via HTML using SCRIPT SRC). All four scripts are configured to take 4 seconds to download. In IE8, Chrome 4, Safari 4, and Opera 10.10, the total page load time is ~4 seconds. All the scripts, even the ones inserted using document.write, are loaded in parallel. In Firefox, the total page load time is 12 seconds (tested on 2.0, 3.0, and 3.6). The first document.write script loads from 1-4 seconds, the second document.write script loads from 5-8 seconds, and the final two normal scripts are loaded in parallel from 9-12 seconds.

The issues with document.write are getting more well known. Some 3rd party code snippets (including Google Analytics) are switching away from document.write. But most 3rd party snippets still use document.write to insert their code into the publisher’s page. Here’s one more reason to avoid document.write.
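
The usual alternative is to insert the SCRIPT element with DOM methods instead of document.write – a sketch using the same example URL:

var script = document.createElement('script');
script.src = 'http://www.adnetwork.com/main.js';
document.getElementsByTagName('head')[0].appendChild(script);   // downloads without blocking other resources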

The five posts in this series are:

  • 5a Missing schema double download
  • 5b document.write scripts block in Firefox
  • 5c media=print stylesheets
  • 5d dynamic stylesheets
  • 5e speculative background images


5a Missing schema double download

February 10, 2010 5:12 pm | 15 Comments

This is the first of five quick posts about some browser quirks that have come up in the last few weeks.

Internet Explorer 7 & 8 will download stylesheets twice if the http(s) protocol is missing.

If you have an HTTPS page that loads resources with “http://” in the URL, IE halts the download and displays an error dialog. This is called mixed content and should be avoided. How should developers code their URLs to avoid this problem? You could do it on the backend in your HTML template language. But a practice that is getting wider adoption is protocol relative URLs.

A protocol relative URL doesn’t contain a protocol. For example,

https://stevesouders.com/images/book-84x110.jpg

becomes

//stevesouders.com/images/book-84x110.jpg

Browsers substitute the protocol of the page itself for the resource’s missing protocol. Problem solved! In fact, today’s HttpWatch Blog posted about this: Using Protocol Relative URLs to Switch between HTTP and HTTPS.

However, if you try this in Internet Explorer 7 and 8 you’ll see that stylesheets specified with a protocol relative URL are downloaded twice. Hard to believe, but true. My officemate, Steve Lamm, discovered this when looking at the new Nexus One Phone page. That page fetches a stylesheet like this:

<link type="text/css" rel="stylesheet" href="//www.google.com/phone/static/2496921881-SiteCss.css">

Notice there’s no protocol. If you load this page in Internet Explorer 7 and 8 the waterfall chart (nicely generated by HttpWatch) looks like this:

Notice 2496921881-SiteCss.css is downloaded twice, and each time it’s a 200 response, so it’s not being read from cache.

It turns out this only happens with stylesheets. The Missing schema, double download test page I created contains a stylesheet, an image, and a script that all have protocol relative URLs pointing to 1.cuzillion.com. The stylesheet is downloaded twice, but the image and script are only downloaded once. I added another stylesheet from 2.cuzillion.com that has a full URL (i.e., it starts with “http:”). This stylesheet is only downloaded once.

Developers should avoid using protocol relative URLs for stylesheets if they want their pages to be as fast as possible in Internet Explorer 7 & 8.

The five posts in this series are:

  • 5a Missing schema double download
  • 5b document.write scripts block in Firefox
  • 5c media=print stylesheets
  • 5d dynamic stylesheets
  • 5e speculative background images


Browser script loading roundup

February 7, 2010 12:12 am | 8 Comments

How are browsers doing when it comes to parallel script loading?

Back in the days of IE7 and Firefox 2.0, no browser loaded scripts in parallel with other resources. Instead, these older browsers would block all subsequent resource requests until the script was received, parsed, and executed. Here’s how the HTTP requests look when this blocking occurs in older browsers:

The test page that generated this waterfall chart has six HTTP requests:

  1. the HTML document
  2. the 1st script – 2 seconds to download, 2 seconds to execute
  3. the 2nd script – 2 seconds to download, 2 seconds to execute
  4. an image – 1 second to download
  5. a stylesheet – 1 second to download
  6. an iframe – 1 second to download

The figure above shows how the scripts block each other and block the image, stylesheet, and iframe, as well. The image, stylesheet, and iframe download in parallel with each other, but not until the scripts are finished downloading sequentially.

The likely reason scripts were downloaded sequentially in older browsers was to preserve execution order. This is critical when code in the 2nd script depends on symbols defined in the 1st script. Preserving execution order avoids undefined symbol errors. But the missed opportunity is obvious – while the browser is downloading the first script and guaranteeing to execute it first, it could be downloading the other four resources in parallel.
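
A typical example of why execution order matters (file names are placeholders):

<script src="jquery.js"></script>
<script src="app.js"></script>   <!-- calls functions defined in jquery.js -->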

Thankfully, newer browsers now load scripts in parallel!

This is a big win for today’s web apps that often contain 100K+ of JavaScript split across multiple files. Loading the same test page in IE8, Firefox 3.6, Chrome 4, and Safari 4 produces an HTTP waterfall chart like this:

Things look a lot better, but not as good as they should be. In this case, IE8 loads the two scripts and stylesheet in parallel, but the image and iframe are blocked. All of the newer browsers have similar limitations in how far they go in loading scripts in parallel with other types of resources. This table from Browserscope shows where we are and the progress made to get to this point. The “Compare” button recently added to Browserscope made it easy to generate this historical view.

While downloading scripts, IE8 still blocks on images and iframes. Chrome 4, Firefox 3.6, and Safari 4 block on iframes. Opera 10.10 blocks on all resource types. I’m confident parallel script loading will continue to improve based on the great progress made in the last batch of browsers. Let’s keep our eyes on the next browsers to see if things improve even more.
