#fronteers14

Dave Olsen - Optimizing web performance

Fronteers 2014 | Amsterdam, October 10, 2014

Today, a web page can be delivered to desktop computers, televisions, or handheld devices like tablets or phones. While a technique like responsive design helps ensure that our web sites look good across that spectrum of devices we may forget that we need to make sure that our web sites also perform well across that same spectrum. More and more of our users are shifting their Internet usage to these more varied platforms and connection speeds with some moving entirely to mobile Internet.

In this session we’ll look at the tools that can help you understand, measure and improve the web performance of your web sites and applications. The talk will also discuss how new server-side techniques might help us optimize our front-end performance. Finally, since the best way to test is to have devices in your hand, we’ll discuss some tips for getting your hands on them cheaply.

Slides

Transcript

Big warm welcome to Dave Olsen! Thank you very much, Jake.

Thank you very much to the Fronteers folks for inviting me here today to talk to you guys about performance.

Give you guys a little bit of an idea of what we're going to talk about, what kind of a road map, a very ambitious road map for today's talk.

We're going to talk a little bit about why we should care about web performance.

I think it really should just be a default, but hopefully this will give you some things you can take back to your team, or to your management, to explain why we have to put this into the process-- or at the front of the process.

We're going to talk about how we can diagnose performance problems, some easy tools to do that.

Some easy web performance optimization wins, again, that you can get during your process to make your life a little-- a lot easier so you don't have to think about it as much.

Some other tools to help test and automate perf optimizations.

And, finally, because the only real way to test performance-- especially mobile performance-- is on a real device, how you can set up a device lab yourself.

Or even better, because you guys are very lucky in Europe, you have lots of device labs around to go find devices.

So why should we care about web performance? Why should we care, like the Care Bears? The fact of the matter is-- well, Ethan Marcotte laid out three simple rules regarding responsive web design.

And putting them together you have flexible grids, flexible images and media, and media queries.

And together they give you flexible layouts.

And flexible layouts are fantastic.

They allow us to make one design that, essentially, can be viewed everywhere, and can kind of shift-- you know that old analogy of water moving between glass containers-- this one layout, it goes everywhere.

But the problem is this-- shifting this layout is only one part of doing web development.

And Jason Grigsby does a really good job of summing it all up, where he says, "The way in which CSS media queries have been promoted for mobile hides tough problems and gives developers a false promise of a simple solution for designing for small screens."

Simply changing a layout is not enough when designing for a small screen.

There's so much more we have to be thinking about and this is just a tiny, tiny little part of that.

We're gonna be thinking about platform optimization.

If we're designing one layout that we're shifting everywhere, we also have to deal with IE8 users-- I'm in higher education in the States, and we still have a lot of legacy that we have to deal with.

So platform optimization.

Content strategy, content choreography.

How do we make sure our content is coming out in the right order, based on different devices? Ads, making sure those are going the correct places.

So it's not just a simple matter of just squishing things.

It's a matter of figuring out where those things should go when they've been squished.

And finally, I think for me what the talk is going to be about-- one thing that we tend not to think about a lot is performance.

And that's because we tend to develop and do testing, all at our desktops.

And we go like, yeah, this stuff's really, really fast, and this is awesome.

Instead we need to be thinking about, kind of, what our current dev practices are.

And they've led to-- to this.

Absolutely massive bloat for our web pages.

This is HTTP archive.

This is for last month's results. The average home page, across the 200,000-some-odd sites that they index, is at 1.8 megs.

That's two floppy disks now, to bring somebody any content.

I just-- absolutely blows my mind.

Maybe I'm dating myself by knowing how big a floppy disk is, but-- And the massive portion of this is images and JavaScript-- JavaScript to a lesser degree, but images.

We have to have this big glossy hero image.

We have to have this carousel.

And we have to have this amazing, just, visual experience.

And it works fantastic on a desktop.

And it works really great in a static comp when you're talking to a client.

It makes a huge difference when we start talking about, this layout's gonna go literally everywhere, to every device, over every network.

And we have to be thinking a lot more cognizant about that.

And just using responsive design is not a cure-all.

And as kind of proof that these current dev practices are still failing in responsive design, Guy-- hopefully I'm not completely butchering his name-- Guy Podjarny did a study in 2013 where he found that for almost 3/4 of responsive websites, the small-screen design weighed the same as the large-screen design.

There was absolutely no thought put into-- or probably very minimal thought put into how these things are different, and how we're delivering to a different network and having, honestly, a different use case.

But definitely delivering to a different network and how that might affect the web pages that we're trying to do.

And why we need to be thinking about this is because our users think the web, is the web, is the web.

The same experience, and the same content, and the same everything that they get on desktop, they want on mobile.

And they don't care that they're on a shitty network.

It doesn't even come into their mind.

They're just like, OK, it's a website.

I'm going to access this website.

And they expect it to load the same, regardless of where they are.

And so everything we have to be thinking about is, how do we make it load fast everywhere? And not just, again, keep doing this, it worked well enough on desktop, it probably will work OK on mobile.

How do we figure it out? And that 5.4 seconds number is actually kind of important.

Users expect a website to load in less than five seconds, regardless of platform-- which is kind of incredible-- or else they leave.

And for us who are trying to generate revenue, that's kind of a really important number for us to hit.

And kind of a good case study comes from Strangeloop.

They did this a couple years ago, where they worked with a client where they actually slowed down requests.

So they held on to requests for a little bit longer before sending it off.

And they did this little study where they held on to requests for 200 milliseconds, held on to requests for 500 milliseconds, and then 1,000, for a second.

And they saw what kind of effect that had on revenue, how much revenue they could make, what the bounce rate was.

And it had massive implications.

A simple one second delay-- I mean, one second, that's like nothing-- had almost a 4% drop in conversions-- 4% drop in purchases.

They lost almost 10% in page views.

People weren't even seeing it.

They just gave up and went away, wouldn't even record the page view.

And that's from simply a one second delay on a page.

If we're talking about mobile networks, we can be talking two, three, four second delays if we're not optimizing properly.

So you're losing that much more.

So if we decide that we're going to go mobile first-- if we're gonna design our layouts for mobile first, we're gonna start with that small screen and work out-- we also have to keep thinking that's also performance first.

And I'm not trying to coin a phrase, or coin a design-- anything.

But, really, if you're going to go mobile first, you have to be starting and thinking right at the get-go about what performance means for the website as well.

So what are some of the primary performance issues for responsive design? And I'm breaking these down into two buckets.

The first bucket is something that we can control, as designers and developers.

And the second bucket is just the stuff you have to kind of realize you're going to be dealing with, and there you have nothing you can do about.

So try as much as you can to deal with the first bucket.

So a lot of things that happen is Download and Hide.

The easy trick in responsive design seems to be to use display: none.

I'm just gonna take this CSS, and I'm going to-- or I take this div, I'm gonna hide it.

The reality is a browser is still going to try to parse and download whatever is in there.

So if you have an image and you set display: none on it, the browser still went and fetched it, just no one got to see it.

So you're just forcing the browser to take an action it didn't need to take.

Download and Shrink.

Again we have this big, massive hero image.

And I set height and width to 100% and 100%.

Oh, fantastic, everything just kind of moves.

You sent a ton of extra bytes to someone who doesn't actually get to take advantage of them, because the image is displayed so much smaller now.

So you're trying to think about-- especially now that responsive images have finally sort of arrived, with srcset and the picture element-- how we can actually properly size images to what the final output's going to be.
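As a rough sketch of what that markup looks like (the file names, widths, and breakpoint here are hypothetical, not from the talk):

```html
<!-- srcset: the same image at multiple resolutions; the browser picks
     the smallest candidate that fits the rendered size -->
<img src="hero-small.jpg"
     srcset="hero-small.jpg 480w, hero-medium.jpg 960w, hero-large.jpg 1600w"
     sizes="100vw"
     alt="Hero image">

<!-- picture: for art direction, i.e. serving a different crop per breakpoint -->
<picture>
  <source media="(min-width: 60em)" srcset="hero-wide-crop.jpg">
  <img src="hero-square-crop.jpg" alt="Hero image">
</picture>
```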

And then something that happens a lot, I think more with JavaScript than anything else, which is Download and Ignore. So we have a certain feature that we're going to use on the desktop.

And it didn't quite fit, or we don't want to quite do the UI at a mobile size.

But we still had the user download the JavaScript anyway because we just have one layout that we're sending to everybody.

So the browser has, A, downloaded JavaScript it doesn't really need, and B, parsed that JavaScript to make it available to you-- which, again, you're never going to use.

So you burn battery, and you burn bandwidth, and you made the website slower for-- for no reason.

So that's sort of the things that we can control.

Kind of, ideas of things that we want to address.

On the poor network side, high latency.

If you're on a cable modem, and you're at your desk doing design, you're probably seeing like a 20 millisecond latency, in terms of getting a request, or whatever.

And it's an order of magnitude more on a network.

So, basically, latency is the time from when a user requests an asset to when the asset starts coming back to the user.

So on a desktop, it's 20 milliseconds.

Hey, I want that image, and it comes back.

And on a mobile network, it's 200 milliseconds.

I want that image, and it comes back.

And something you can't beat, there's nothing-- no way to get around it.

And so when you start adding up that 200 milliseconds over every request that you're trying to make, it adds up massively fast.

When we're talking about one second in terms of, that was a 10% drop in page views, you know, five images at 200 milliseconds, that's a second right there that you've lost, that you have zero control over.
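That arithmetic is worth spelling out. A minimal sketch, using the talk's own example numbers:

```javascript
// Rough cost of round-trip latency alone, ignoring download time.
// 200 ms per request is the talk's mobile-network figure, 20 ms the
// desktop/cable figure.
function latencyCost(requestCount, latencyMs) {
  return requestCount * latencyMs;
}

const mobile = latencyCost(5, 200);  // 5 image requests on a mobile network
const desktop = latencyCost(5, 20);  // the same requests on a cable modem

console.log(mobile);  // 1000 ms -- a full second lost before any bytes arrive
console.log(desktop); // 100 ms
```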

So try to think about how you can avoid latency.

We have variable bandwidth, and we have packet loss, to a lesser degree.

So again, trying to think about things you can't, absolutely can't overcome, and trying to avoid them as much as you can.

So reviewing why we should care.

We have a set of current dev practices that sort of encourages bloat, that sort of encourages this big, glossy web fonts-- big design.

And we test on a desktop, and we look at it on a desktop, and we tend to review with the client on a desktop.

And we sort of say, hey, by the way, it does work on mobile.

And maybe we haven't started with mobile first design, but at the end of the day we've tested everything on a desktop machine.

And so we didn't get kind of a true idea of what the performance was, and where we're going to hit bottlenecks, and how that might actually affect someone.

And these two things coupled together lead to the web performance blues.

And that makes LEGO man very, very sad.

And I didn't quite realize how big he would be.

He's gonna eat me.

So how do we go about diagnosing web performance issues? How can we start figuring out what problems we have currently? What tools can we use to do that? And I've sort of broken this down, kind of via iTunes essentials sort of schema.

So where are the basics? So the very basic tool-- most basic tool you can use to get a really good idea of how your page is performing, or what things you can improve of an existing website if you want to kind of go retrofit an existing solution and see what your problems are.

PageSpeed Insights, from Google, is-- should be your number one stop.

It does a fantastic job.

You pop in your URL, give it a minute, and it'll give you two separate reports.

One, based on mobile.

It'll give you tips and tricks that you should-- things you should be thinking about for mobile, both from a performance perspective and a user interaction perspective, which is really massive.

And then the flip side, it'll also give you desktop.

But obviously the mobile report is the really useful one, the thing I think you should actually be focused on more-- they should probably just make the mobile set of rules the default.

And then it links off to all the great documentation that is in the Google Developer website.

So that's the basic-- most basic thing you can do.

The next step that you can do as you actually develop your websites, to try to think about how performance is working, is use the Chrome Developer Tools.

It's much more than Elements View.

It's much more than just being able to, hey, I can change the style on the fly, I can look at a source map.

That's all well and good, but there's also the Network panel, there's a Timeline panel.

You can actually run your PageSpeed audit here too, and try to figure out what's going on.

And I'm not going to cover it too much, but there is actually a Chrome DevTools course completely focused on web performance.

And it's taught by Ilya Grigorik, who is basically the man when it comes to web performance-- along with Steve Souders, of course. He did a fantastic course.

It's free, actually, as long as you don't care about getting a certificate or anything from Udacity, you can get the free course.

It's supposed to take six hours or so, I guess.

And you can look through it and learn from the man about how you can actually use Chrome DevTools, beyond just using it for styles and managing markup.

And the Deep Cuts.

The absolute best tool, I think, and really worth your time to learn and go through and figure out how to actually use it.

And as I go through the rest of this talk, actually, I'm going to be highlighting features of WebPagetest.

WebPagetest is essentially created by performance developers, for performance developers with a lot of love and care.

The interface isn't always the best, it's a little ugly and clunky, but it has some absolutely amazing tools that'll give you great insight into what's going on with your web pages.

So in the same way that you can go to PageSpeed Insights and type in your URL, you can do the same thing with WebPagetest and get a really basic report.

So this is a basic report for my home page.

And I was very embarrassed by the F, but I figured I would leave it there.

I have to go change an Apache configuration somewhere.

But it gives you a quick little grade, so a quick visual about what's going on.

And it gives you, again, more links back to tips about how you can make sure your website is performing better.

So a bunch of the features of WebPagetest, which is what makes it so amazing, is that you can test real browsers from multiple locations.

So you can see, how's my web performance in IE? How's my web performance in Android browser? How is my web-- without having to actually have all that stuff on hand.

This is actually-- they're hooked up to these real devices with real browsers and testing everything.

And latency is also a factor of pure distance-- the speed of light contributes to latency.

So you can see how your website performs from, say, South Africa if you're serving it from California.

And to give you an idea of, maybe you want to put a CDN in.

So if you see you have a lot of traffic from a particular location, you can actually test it and see how people are doing.

You can modify connection speed so you can essentially shape traffic and figure out, hey, on a 3G network this is probably what it's going to be like, or this is what it's going to be looking like on a cable, or DSL.

I know for me, in West Virginia, I actually have less than a meg DSL.

And so it's kind of good to see how things might affect me.

Trying to go to certain websites is a challenge at times.

My cell connection's faster than my home connection.

It does video capture.

You can do content blocking.

You can get really, really nerdy and script the session, and do all sorts of fun things.

And I'll show you an example of one of those later on.

You can actually add WebPagetest to any kind of continuous integration platform that you have.

So if you're running Travis, it can run WebPagetest and give you a performance audit as part of your continuous integration.

So you don't have to think about it as much.

You can just be like, hey, ran the test and it came back-- oh, wait, maybe I should go fix something.

So it becomes more a part of your process.

And I think that's the beautiful thing about web performance, is trying to figure out, not just tacking it on the end, but how can we integrate performance into every part of what we do.

And so WebPagetest allows you to do it with continuous integration.

You can also collect tests over time, and kind of record them and see how your performance gets better, or doesn't do as well.

And, finally, it's free so you can go ahead and use it.

About the only problem is, at times, it can be a little busy, and you have to kind of wait in a queue for the test to run.

But, still, it's a fantastic way of learning more about how your website performs.

So to review some diagnostic tools, fast and easy, thing to go back tonight and play with, PageSpeed Insights.

If you're going to do local development, Chrome DevTools.

And I would highly encourage you to take the time-- to take a day to do that course on Udacity from Ilya.

And if you want customization, complete integration with your process, and massive amounts of data and results, go with WebPagetest.

So what are some easy web optimization wins? What are some things that you could do during your process to actually improve the performance of your website, without having to think so much? The very first thing I would recommend is a web performance budget.

So a budget is a guide: in the same way that you would do a creative brief with a client at the very beginning of a project, you and your team-- or, even better, you and the client-- decide what your performance numbers should be.

What metrics should you reach, from a web performance perspective, by the end of the project?

You could do something as simple as, we want to keep it under a certain number of kilobytes.

So for my own personal website I've set a performance budget of 50k.

And that sort of defined for me, as I made decisions, how does that fit within this budget that I kind of gave myself, things that I want to think about.

And it's not a hard and fast limit, it should be a guide.

I really wanted web fonts.

I thought, hey, my web page looks better with web fonts, they're kind of sexy.

And that broke my budget, web fonts will tend to do that, so now my web page is a little under 100k.

So it's not horrible but I made a compromise.
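A budget like this can even be sketched as a simple check in a build step. The 50 KB figure mirrors the talk's example; the asset list and sizes here are made up for illustration:

```javascript
// Page-weight budget check: sum the assets you plan to ship and flag
// when the total blows past the budget. Numbers are hypothetical.
const BUDGET_BYTES = 50 * 1024;

function checkBudget(assets, budgetBytes) {
  const total = assets.reduce((sum, asset) => sum + asset.bytes, 0);
  return { total, over: total > budgetBytes };
}

const assets = [
  { name: "index.html", bytes: 6 * 1024 },
  { name: "site.css", bytes: 9 * 1024 },
  { name: "site.js", bytes: 12 * 1024 },
  { name: "webfont.woff", bytes: 48 * 1024 }, // the budget-breaker
];

const result = checkBudget(assets, BUDGET_BYTES);
console.log(result.total, result.over); // 76800 true -- the font broke the budget
```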

And that's a lot of what performance is.

Performance tweaks are about compromises.

They're about figuring out what's gonna work best for you and your client.

You can't just do everything and hope it's going to work.

So performance budget works really well.

If you take anything away from this talk beyond, like, PageSpeed Insights, create a performance budget to work with folks.

And another good way of actually figuring out what a performance budget would be, especially with a client and somebody who makes money, is to look at their competitors.

Figure out what their PageSpeed score is, or whatever.

You get that from PageSpeed Insights.

And then figure, say, that the threshold for our performance budget is 25% better than that baseline, so we know our website's gonna perform better than the competitors'.

And that way you're going to make more money, and everyone's gonna be a lot happier.

So that's a way of also doing a performance budget: figure out who the competitors are, figure out what their baseline is, and do something like 25% better than that.
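That 25% rule is simple arithmetic; here's a sketch, using the HTTP Archive's 1.8 MB average from earlier as a stand-in for a competitor's page weight (the function name and numbers are illustrative, not from the talk):

```javascript
// Derive a budget from a competitor's baseline: aim a fixed fraction
// below whatever they ship.
function budgetFromCompetitor(competitorBytes, improvement = 0.25) {
  return Math.round(competitorBytes * (1 - improvement));
}

const competitorWeight = 1.8 * 1024 * 1024;         // ~1.8 MB page weight
console.log(budgetFromCompetitor(competitorWeight)); // 1415578 -- about 1.35 MB
```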

So the three things to keep in mind, as kind of tenets-- things that you should have in the back of your head as you make tweaks with web performance.

Number one is, obviously, reduce requests.

Fewer requests, less chance you're gonna run into latency issues, and the faster everything's going to be.

Reduce asset size-- clear the browser pipeline that much faster to go get more assets.

Smaller is better.

We'll talk a little bit more about that in a bit.

And then speed up page render.

You actually can get a lot of benefit out of making things look like they're filling in faster than they really are.

So trying to figure out how you can make a page look like it's rendering more quickly.

So again, getting back to the first tenet-- the best request really is no request.

The fewer requests you make, the much, much happier everyone is.

So the more you can cut out the better.

And probably the number one easy win to make that happen is making sure browser cache is properly enabled on your server, and making sure that elements are being cached on the browser.

A really easy way of figuring out if browser caching is enabled: you can go to redbot.org and it'll tell you whether the server for your particular website is caching assets or not.

If it's not, start figuring out how to actually set that up with Apache, or whatever you're using, so that the browser can hold on to assets.

Because, basically, what the cache allows the browser to do is say, hey, this asset hasn't changed, or I've been told not to go request this for a while.

And the browser makes less requests, which means your web page appears to be faster.

So browser cache is definitely the number one easy web perf win.

As a front end developer, you have zero to do.

Just go tell a server admin, please do this.

And that's sometimes easy and sometimes hard.

I've been waiting for years on some things at my university, so.

Compress HTML and CSS, use mod_deflate.

Again, this is a really easy win, making sure things are compressed and as small as they possibly can be.

And this is just, again, set up on a server.
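A minimal sketch of those two server-side wins, assuming an Apache setup with mod_deflate and mod_expires enabled-- the cache lifetimes here are placeholders, not recommendations:

```apache
# Compress text assets on the way out (mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Tell browsers to cache static assets instead of re-requesting them
# (mod_expires); pick lifetimes that match how often your assets change
ExpiresActive On
ExpiresByType image/png  "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType text/css   "access plus 1 week"
```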

One thing that I think is a little harder, and maybe isn't strictly necessary, is avoiding AJAX requests.

If you know your web page is going to be making a request for content eventually, and that content's not going to change based on an action the user has taken, there's nothing wrong with embedding that content as part of the web page to begin with.

And maybe putting it into a script tag with the type set to text/html, and then using that content to populate the AJAX area instead.

So you're essentially kind of concatenating markup together.

So it's a way of thinking about how to improve performance.

It's not really a server thing, but something to keep in mind.
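A sketch of that embedded-template pattern-- the ids and content here are hypothetical, not from the talk. Browsers ignore script blocks whose type they don't recognize, so the markup costs bytes but never renders or executes until you use it:

```html
<script id="related-articles" type="text/html">
  <ul>
    <li><a href="/article-1">First related article</a></li>
    <li><a href="/article-2">Second related article</a></li>
  </ul>
</script>

<div id="related"></div>

<script>
  // Populate the target from the embedded template: no extra request.
  document.getElementById('related').innerHTML =
    document.getElementById('related-articles').innerHTML;
</script>
```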

So, image easy wins.

Again, no request is the best request-- try to avoid images.

Think about how you can use blocks of CSS color-- gradients, patterns, all done with CSS-- rather than trying to use an image to do it.

The number one thing, for me, is image compression.

That's the most difficult conversation I have with designers: you don't have to save it as an 80%-quality JPEG.

Not everything has to be saved as PNG-24.

You can actually kind of decrease that, get smaller sizes and still be good.

People don't notice artifacts, they really don't, they don't care.

They're actually probably more worried about your textual content than your image content anyway.

So being a little more critical about what compression you use can save massive amounts of bandwidth.

On responsive images-- I'm just getting into this-- use srcset.

It's an addition to what you're using with your regular images.

The picture element should, kind of, only be used for art direction-- and I'm not sure how much people use art direction.

And again, avoid dark matter.

If you have an element that has an image in it, and you're using display none, it's really bad.

You're still downloading the image.

You haven't done anything, you've just essentially enforced a penalty on your end user to make something a little bit easier for yourself.

And really, honestly, in responsive design, if you're using display none, you're probably doing something wrong anyway because we should be showing as much as we can to people.

Desktop and mobile should be the same.

So what kind of tools can we use, in terms of images, to make your life easier? If you're not using a task runner, it is, after a performance budget, the thing you should be using to make your life easier from a performance perspective.

So imagemin: instead of a designer having to think about image compression, imagemin will take care of a little bit of it for you.

image-resize will resize images for use with srcset.

Spritesmith for sprites; svgmin, if you're using SVG.

If you're not using a task runner, you can also use services like kraken.io or Smush.it to compress images for you, to get the most out of them.

And imgix.com-- I guess that's kind of the replacement for Sencha.io Src-- will automatically resize images for you on the fly if you just add a URL.

So these are some tools that you can use in terms of images, and work into your workflow.

Again, Grunt and Gulp have made web performance so much easier-- it becomes a lot less thought.

So, web perf in JavaScript.

So this is our second, kind of massive thing that we're dealing with.

I would avoid using bulky frameworks if you can.

I know jQuery is awesome, and makes life great, but you're getting penalized for two things when you use JavaScript.

A, it's the download, and then B, the browser still parses it, which takes time, to try to make all of that available for you to use.

So everything in jQuery is available to you to use at any time.

And, of course, for the most part people tend to ignore the vast bulk of it.

It's gotten better with jQuery 2, but still-- we had a developer the other day who used jQuery UI, pulled in the entire framework, 200 kilobytes, just so he could have an accordion on his web page.

And that's the kind of stuff you have to be a little more critical of, and be thinking about that, hey, I don't need all of this to do a really simple thing.

And so a better thing to do would be microjs.com, which has a handful of these really tiny little packages that do really specific things.

I think it's very much in that kind of node style of thinking about front end JavaScript.

Or, even better, learn Vanilla JS and get really familiar with Mozilla Developer Network.

There's some really great things that you can write, just really simply.

If all you're doing is selecting elements on a page, it's a lot easier to write document.getElementById than it is to include jQuery just to get Sizzle.

Kind of an easy trade off there.

Try to avoid DOM re-flows and repaints.

Using JS to modify the DOM is something we all do.

But if you're going to be inserting elements into the document, build them off-canvas with a document fragment, and then insert them all at once.

Or if you're going to add styles, add classes.

Don't add actual styles to the elements, just always add a class.

So as you add more properties to the class, it's A, a lot easier to keep updated, and B, you're applying all of those style properties to an element at once, instead of: select element, apply style; select element, apply style; select element, apply style.
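The off-DOM idea can be sketched as: build all the markup first, then touch the live document once. This helper is a pure-string version so it runs anywhere-- in a real page you'd build a DocumentFragment with document.createDocumentFragment() and append it in a single call; the item data and class name are made up:

```javascript
// Build the markup for every item up front, so the live DOM -- and a
// reflow -- is only touched once (e.g. one innerHTML write, or one
// DocumentFragment append in the browser).
function buildItemsHTML(items) {
  return items.map((item) => `<li class="result">${item}</li>`).join("");
}

const html = buildItemsHTML(["one", "two", "three"]);
console.log(html);
// <li class="result">one</li><li class="result">two</li><li class="result">three</li>
```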

Use Touch or FastClick.

FastClick, I think, would be the best bet to improve perceived performance, in terms of touch events.

And CDNs to serve common JS libraries.

Google does a fantastic job of having a CDN for, basically, everything.

And that makes your website just, again, that much faster, because people already have it cached.

And that takes care of the caching for you.

So some tools to think about using.

Uglify, which will minify things for you.

Concat, for concatenation-- use with discretion.

Taking all of your files and putting them into one request is not always the best thing.

There's actually kind of a break point, where having all that stuff into one massive request that takes a really long time to download, is actually worse than just having two or three requests that are a little bit smaller.

Something you have to kind of find on a per-project basis, what is gonna make sense for you.

So don't just like be-- concat is one part of a Gulp task.

You have to think about maybe splitting that up a little bit.

Closure Compiler; and google-cdn-- google-cdn is pretty cool, because what it basically does is take all of your local library references and, if they exist on Google's CDN, replace those local requests in your markup with the CDN version.

And so that's an amazingly handy tool.

I would only use it if you have, like a production task in Gulp.

But, yeah, that would make, again, your life easy.

You don't have to figure out what exists there, and doesn't exist there and take advantage of it.

You can just use it right away.

And, again, on the web, in terms of things to learn, microjs.com.

I can't stress enough how awesome it is, in terms of being able to find tools that do exactly the right thing, the exact thing you're looking for.

And learn from developer.mozilla.org and developers.google.com, because there's a massive amount of documentation and write-ups there right now about what's going on.

And here's what a content breakdown looks like in WebPagetest, so you can get an idea of where you might want to focus with your various tools.

In terms of-- that's pretty straightforward.

Web fonts are by far my biggest issue.

And then, third party code easy wins.

A lot of times we don't really think, if we're relying on a third party, about how that affects us.

Hey, I just include this little snippet that was this like button, it was like one line, it's gotta be awesome and fast, right? It's only one line.

But behind the scenes, they're loading 100k of JavaScript with 15 requests, or something along those lines.

So you're taking a really big hit by including third party.

If you want to test it, you can use spof-o-matic, which is something that's available in the Chrome Web Store and becomes a little thing that's part of Chrome.

And you can test to see, hey, if Facebook went away, how would my page react? How much longer would my page take to load? What would happen? And it's, again, a way of testing it.

And I would also avoid social media widgets.

I mean, is everyone really excited to show that they have zero likes on every article? So a good way of avoiding that is what Heise, in Germany, does: a two-click social widget.

So it kind of says, hey, if you click this, you'll get your chance to like this page.

But it increases your privacy and drastically increases the performance of your web page, by not relying on that social media widget being included.

WebPagetest also allows you to test SPOF.

There's actually a little tab where you can type in all the URLs that you essentially want dropped to /dev/null, and see how your page reacts.

What kind of loads, what doesn't load, and when it loads.

So if the best request was no request, then the worst request is one that blocks the parser.

And so it's really important to also understand not just that we're-- what we're requesting, but how the browser actually handles and draws what we've requested.

And the most important thing here is to understand the Critical Rendering Path.

So what the critical rendering path basically states is: the browser goes and fetches your markup and starts creating your DOM tree.

It's not laying out all of your elements yet, but it's at least building the tree.

As part of that, it says, hey, there's CSS.

I should go download the CSS, so it downloads your CSS.

So the parser is still creating the DOM tree, but it's waiting on the CSS.

Because it doesn't know how to actually draw that stuff out-- the CSS is gonna tell it how to draw everything out.

And so once the CSS is done your rendering can start.

So that's render blocking.

CSS render blocks your page.

Once the CSS is done, your page can start being rendered.

And then if you have JavaScript, it finds your JavaScript.

It'll actually block everything, because JavaScript can affect both the DOM and CSS, so the browser says, hey, I'm going to stop.

So what you want to understand is how to optimize to make sure your website actually starts loading as quickly as it can for the most important thing the person can see-- above the fold, what's actually in the viewport.

You want that to load as fast as you can.

Anything that's not in view should be able to be deferred and delayed for later.

And maybe your web page actually takes quite a while to load, but you've deferred everything and the user's starting to actually interact with content.

So you want to optimize for the critical rendering path.

In the waterfall view-- and you have to learn to love the waterfall if you want to learn web performance-- you try to move that Start Render line (it's not amazingly obvious) as far to the left as you possibly can.

You want the user to actually start seeing something happening.

That's what Start Render is.

The page went from a white block-- nothing is there, I don't know what's going on-- to, hey, things are starting to happen.

You want to make sure that that start render is as far to the left as you can possibly get it, as close to DOM Content Loaded as you can make it.

So people know that after the DOM has been loaded, and after CSS has been loaded, hey, we can start seeing things.

So you want to start figuring out where these lines are and trying to make them as good as you can.

And this is kind of an example of WebPagetest, actually has this filmstrip view.

So it can show you where that cut-off is: between nothing-- 0% loaded at two seconds in on my web page-- and the point where at least something has started to load.

And this is on a 3G connection, which is why it seems a little slow.

Well, this is a great way-- like, you never would really get this out of DevTools or PageSpeed Insights.

But this is a way of trying to figure out, hey, this is actually how it would look on this particular device, or this is how it actually would look loading on this particular connection from this particular location.

And so you can get an idea of how we can better do that.

So the filmstrip view is a really good look at critical rendering path as well as percent complete.

[BUMPS MICROPHONE] Whoops, sorry.

So again, you can start figuring out, with these massively useful charts, how can I get these things faster and better and farther to the left, so things are going on.

So some easy wins in terms of the critical rendering path would be to defer loading of JavaScript.

Probably the easiest thing, the best thing you can still do, and the best practice, is to put your JavaScript right before the closing body tag.

Regardless of the multiple HTML5 attributes that you can use now, it's going to be the best and simplest thing to do.

Otherwise you would probably use defer if you have to have JavaScript elsewhere, to at least make it act like it sits down by that closing body tag.

And then async.

That's the order I would do, and I think most people actually-- others would say async comes before defer.

I would say defer before async.

And you can also insert script elements into the DOM using the onload event.

So if you know something isn't needed at all until after the onload event-- until the page has finished loading-- the best thing to do is insert the element after the onload event.
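That pattern can be sketched in a few lines-- a minimal illustration, not a library. The window and document objects are passed in as parameters just to keep it easy to test, and the script URL is a hypothetical example:

```javascript
// Sketch: defer a script until after the page's load event, so it can't
// block the initial render. win/doc are parameters for testability.
function loadScriptAfterOnload(win, doc, src) {
  win.addEventListener('load', function () {
    var script = doc.createElement('script');
    script.src = src;
    doc.body.appendChild(script);
  });
}

// Usage in a page (URL is a hypothetical example):
// loadScriptAfterOnload(window, document, '/js/non-critical.js');
```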

Placing critical CSS directly within your document and loading the rest of the CSS after onload, with JavaScript, is another way of optimizing the critical rendering path-- making sure CSS isn't blocking anything in that viewport.

And defer the rest.

A good tool to use for this is Penthouse.

Penthouse is also, I believe, available as a Grunt and Gulp plug-in that you can use to figure out what your critical CSS is and have it embedded in your web page.
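Loading the rest of the CSS after onload can look something like this-- a minimal sketch, not Penthouse's own API. The parameters are passed in for testability, and the stylesheet path is a hypothetical example:

```javascript
// Sketch: the critical CSS is assumed to be inlined in the <head>; the full
// stylesheet is attached only after the load event, so it can't block the
// first render. win/doc are parameters for testability.
function loadCssAfterOnload(win, doc, href) {
  win.addEventListener('load', function () {
    var link = doc.createElement('link');
    link.rel = 'stylesheet';
    link.href = href;
    doc.head.appendChild(link);
  });
}

// Usage in a page (path is a hypothetical example):
// loadCssAfterOnload(window, document, '/css/site.css');
```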

And then for anything that's above the fold-- the fold is coming back in a massive way for web performance-- use the same host name.

One host name for all of it reduces DNS look-ups, which are, again, affected by latency.

So how slim can you make it? And could you possibly try to have a single request for all above-the-fold content?

So have your CSS inlined, and there's no real need for images-- you can use data URIs if you have really small images.

Could you do that all in, essentially, one package of markup, deliver that to the client, and then defer everything else after that, to be amazingly fast? Because the ultimate goal is a narrower and shorter waterfall-- but also focus on getting that fast initial render, to at least give the user something, instead of them just waiting and waiting and waiting for something to happen.

Give them something to look at and the rest of it can be loaded afterward.
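On the data-URI point above: a data URI is just the file's bytes base64-encoded straight into the stylesheet or markup, so a really small image costs zero extra requests. A Node-flavored sketch (the build-step path in the comment is a hypothetical example):

```javascript
// Sketch: turn a really small image into a data URI so it ships inside the
// CSS or markup instead of costing its own HTTP request.
function toDataUri(buffer, mimeType) {
  return 'data:' + mimeType + ';base64,' + buffer.toString('base64');
}

// e.g. in a build step (path is a hypothetical example):
// var uri = toDataUri(require('fs').readFileSync('img/icon.png'), 'image/png');
```

Base64 inflates the bytes by roughly a third, which is why this only pays off for really small images.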

And it's not enough to be like, hey, I did one test.

You also want to make sure you test what I call the squishy.

So test at all your break points and see how your performance is working, and make sure you're using the correct media queries, like min-width versus max-width.

They can have massive implications if you use them incorrectly.

And so you can use scripting and custom view ports with WebPagetest.

So you can actually go in, and type in and say, hey, this is my break point, I want you to test at this point and see what happens.

And, again, save that.

And go through each iteration of all your break points, test each of them individually, and see how your web performance works from break point to break point.

And again this is what, if you scripted it and set it at 324, it would give you a filmstrip view in a separate report.
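For reference, a WebPagetest script for that kind of run is only a couple of lines. This is a minimal sketch-- the fields are tab-separated, the URL is a placeholder, and you should check the exact command syntax against WebPagetest's scripting documentation:

```
setViewportSize	324	480
navigate	http://www.example.com/
```

Save a script per break point and you get a separate filmstrip report for each one.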

WebPagetest, I've given you kind of like an amazingly high level view, the things that I find massively useful in WebPagetest.

But Andy Davies and Aaron Peters go into much more detail about what you can get out of WebPagetest.

It's a massively useful, massively customizable tool to figure out what's going on.

And they did this-- and this was actually, I think, one of the top 10 talks in Europe last year, or two years ago.

And they do just a fantastic job, so this is how you can learn more.

So reviewing tips for easy wins.

Number one, I would say, is a budget, to give yourself and a client a performance goal.

And if you can, base that on competitors.

Figure out what their baseline performance is and make sure your client's website performs better than the competitors'.

Enable cache headers, really easy win.

Enable compression, again, amazingly easy win.

Just talk to a server admin.
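For example, on Apache the server admin might enable something like this-- a sketch only; module availability and the right MIME types and lifetimes vary by server:

```apache
# Compress text responses (mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Far-future cache headers for static assets (mod_expires)
ExpiresActive On
ExpiresByType image/png "access plus 1 month"
ExpiresByType text/css  "access plus 1 week"
```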

Properly format or reduce images.

Defer as much as you can.

Just start thinking about how much you can put off until after the onload event.

So the web page is at least there and usable, and then everything else can kind of fill in after the fact.

This includes images, actually.

You can defer images until just after onload if they're below the fold, right? If you can't see it, does it need to be loaded? And then, finally, use a task runner to build web performance into your workflow.

If you're not using a task runner, it's gonna be the easiest way to make all this stuff work.

And there's tons of tools, and tips and tricks about how to make sure web performance is working that way.

Because I mentioned it in my description, I wasn't sure how much time I was going to have, I'm just going to cover this really quickly.

Another way of kind of slicing into performance, if you're not as-- if you don't feel as comfortable, I guess, with the front end stuff, is to use a technique called RESS, which stands for Responsive Design and Server-Side Components.

I don't know what happens to the D and the C. You'll have to take that up with Luke Wroblewski.

I have no idea how he actually came up with RESS out of that amazingly long description.

But, basically, what it does is give you an interesting way of sort of swapping elements in and out, based primarily on the user agent string, to make things perform a little better.

So in our case, we used it on our homepage, because we didn't feel quite comfortable doing a responsive slide show or carousel.

Politically, we had to have a carousel.

Who doesn't love a carousel? It makes everyone's lives amazing and easier, I don't know.

But we didn't have, I think, the in-house talent to kind of make it responsive and fast and performant.

So we used the technique RESS to actually swap that out on the server.

So for a mobile device we gave you a slightly different carousel, and on the front end, for desktop, we gave you a different carousel.
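The server-side decision itself can be as simple as a user-agent check-- a toy sketch of the RESS idea. The regex and the component names here are hypothetical; real projects lean on proper device-detection libraries rather than a one-line regex:

```javascript
// Toy sketch of RESS component swapping: pick which carousel variant the
// server renders based on the User-Agent string. The regex and component
// names are hypothetical examples.
function pickCarousel(userAgent) {
  var looksMobile = /Mobile|Android|iPhone|iPad/i.test(userAgent || '');
  return looksMobile ? 'carousel-mobile' : 'carousel-desktop';
}
```

On the server you'd render the template that `pickCarousel` names; the responsive layout around it stays the same.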

So it can be a scalpel.

It doesn't replace what you're doing with responsive design; responsive design is still the base.

Flexible layouts, all that kind of stuff is still the base.

If you don't feel quite as comfortable on the front end, you can use RESS to actually swap things out.

I did a presentation a long time ago, called An Evolution of Responsive Design.

It's definitely not an evolution-- I think that was a bit of hubris on my part several years ago.

Makes me seem like a real ass.

But, basically, it goes into tips and tricks and techniques about how you can actually use RESS, or use the server to help you in delivering a performant website, beyond just the cache headers, or whatever it would be.

So some more web performance tools.

Perf-tooling.today.

If you're using Gulp and Grunt, this is the resource you'd want to use to find all the tools to make your life easier.

It's a really great, handy list of resources to automate your process, to just kind of slide in these things to figure out how well they're going to work for you.

As important as it is to think about performance during development, you also have to monitor performance and make sure your web performance is still holding up.

So Google Analytics has site speed.

So if your website is using Google Analytics to monitor your hits and stuff like that, you can also track how your site speed is doing, good or poor or whatever.

And this is-- the great thing about this, this is real user data.

It's not you trying to mimic it with WebPagetest, or whatever it is-- where, you know, what you end up doing is mimicking a real request.

This is real requests.

This is real feedback.

And you can start narrowing it down.

Say, hey, from particular-- you can cut and slice this data the same way you would page views.

So you can say, hey, from this particular country we're seeing a big uptick in purchases.

You know, how's our performance doing? Is there anything we can do to tweak that to make it load that much faster, and maybe get more conversions, get more page views? Be better for them.

Another tool for using performance monitoring would be SpeedCurve, which is a tool from Mark Zeman.

Would recommend that.

Not only can you monitor your own performance, but, again, you can look at competitors and see how their performance is doing.

It'll tell you, and then as their performance changes, you can go, hey, maybe I need to start tweaking my own site to stay ahead of them, to be better than the competitor.

So we can, again, beat them.

And he just released, I think literally yesterday, this perf map, which is kind of a different way of looking at web performance.

So I haven't used it yet, but it's another way of looking at what assets on your web page may actually be the slowest, or what's taking the longest time, and kind of categorizing it in a really nice visual way instead of just having to learn the waterfall and sort of visualize in your own head.

This gives you a really nice tool for seeing individual bits, and how long they were taking, and maybe where you can make changes.

More real user testing would be Boomerang.

It's something, again, you have to set up yourself.

But you can figure out, hey, this is how performance is working for real end users, and what I have to tweak.

Because performance is not something you just tack on to the end of a process, and it's not something you give up on when you hand a project off to a client.

Or, if you're internal, it's not something where you can just say, I'm done, I used these tools and I'm set.

You have to keep monitoring it, and see how things do.

When new browsers come out, they may be better or worse at performance.

You never know, may do different things.

So you want to keep on top of it the same way you would keep on top of any of your operations or how things are performing.

And last but not least, in terms of tools, is mod_pagespeed.

This is something that you can just have a server admin turn on, and it takes care of a whole lot of the stuff that you would normally take care of with Gulp or Grunt.

It'll convert images for you, it will handle caching, it will inline your CSS for you to get a faster page render time-- things you don't have to think about.

The good is you don't have to think about it, and it's kind of just always running.

The bad is it can be-- I found it to be kind of annoying on the caching side, where you have to keep trying to clear things out.

And then also, I mean, it's not as intelligent as you would be, in terms of what the trade-offs are.

It's just going to kind of go in and do stuff, and kind of cross your fingers and hope that it all works.

So it's great because it's automated.

The flip side is it's automated.

So devices.

Finally, to think about what we have in terms of how devices can work well together, and how we can actually get our hands on devices.

Because this is the ideal-- to actually test, not just in a browser with Chrome DevTools to see how performance is running, or with WebPagetest, kind of hoping that all that stuff is set up correctly.

It's to actually have a device in your hand that's connected to a network, and to actually be able to see things, edit things, and see how the performance is running.

Because you'll have access to the same Chrome DevTools on a remote device that you would have on a desktop.

So one of the things you want to think about is how you can slow things down if you happen to have a device locally. And you have two options.

On Windows you probably want to end up using Charles, which is an HTTP proxy.

So you can set it up to basically say, hey, slow down all my network connections to, like, a 3G speed or slower.

Muck around with latency, all that kind of good stuff.

So on a local device in your hand, you can start mimicking poor network performance, even if you have great network performance.

If you're on a Mac you can actually use Network Link Conditioner.

It's actually a tool that ships as part of the Mac developer tools setup.

And you can install that if you have the Mac Developer Tools.

And that way, again, locally on your own machine you can start mucking around with network connections and seeing how network connections affect your performance.

And it has a bunch of built-in profiles.

So this is a 3G connection with a 200 milli-- or 300-- probably 350 millisecond latency.

What happens? And it gives you a really great idea, locally, of what's going on.

Jason Grigsby, hate to go back to him again, but-- Jason did a really good write-up, in terms of setting up Charles Proxy for iOS apps.

But the basic instructions that are there work for Windows, work for Mac, and explain how you can go about using it-- using Charles Proxy.

And if you want to get your hands on real devices: eBay, and mobilekarma.com-- I know that works really well in the States; we use it.

Basically it's a reseller of older devices.

You're never gonna get the latest, greatest, but-- it's a reseller of devices you can actually get your hands on for very cheap.

Again, in the States, cell phone store leftovers.

But finally, open device labs.

And this is where you guys are massively lucky.

You actually have an open device lab here in Amsterdam.

It's got 39 devices, Front Lab.

So if you're like, hey, I want to go test something, it's there.

It's open for you to go ahead and go use.

And there's eight other open device labs in the Netherlands alone.

So you can go ahead and figure that out.

And they're all over Europe. It's a really big movement here in Europe, where people are being great and sharing devices.

And not only do you get this chance to kind of go and use the device, but it's also a chance to go network with other people and learn, hey, maybe some tips and tricks.

If you have a problem, they can go on a device, maybe they'll be able to help you out.

So it's a good way of-- kind of a geek coffee shop.

Congregate, figure out what's going on.

So, again, Front Lab here has a massive device lab-- 39 devices.

At least that's what they list, so I hope they still have it.

So summing it all up, on this massive talk, all the various things that you can do with web performance.

Definitely open conversations with a client or with a team using a performance budget right from the get-go on a project.

That gets you in that mindset and culture of thinking about performance from the start.

At the same time as you'd say to a client, we're going to build you a snazzy website with these brand colors, you can say, and we're going to beat your competitor by 25%.

And it's something you can add on as part of your project.

First render speed is probably more important than pure resource size.

So if you're going to focus on anything, I would definitely focus on getting the initial page render as fast as you can, to show a user that something actually is happening.

And the rest of it can all be deferred.

You can still have a massive web page, just make sure stuff loads somewhat quickly on the front end.

Try as much as you can to integrate performance into your workflow.

Again, this comes down to, I think, a task runner has just become the thing that you have to use.

No matter-- I'm on the back end more so than anything, and I still use a task runner to make my life easier.

But also to take care of the kinds of manual tasks that, I think, are what stop us from doing these things in the first place.

And it's why we always try to push them off to the end-- because they're a chore.

To minify a file is annoying.

It used to be that you had to have this Yahoo thing to do it-- YUI Compressor.

And now you can actually make that just part of your project, every time you build the project.

Every time you update a file, it gets minified.

So definitely integrate that performance into your workflow.

Test, evaluate, and most importantly, monitor.

It's not just a matter of having done that during the development phase; you want to make sure that as new browsers come out, as new features come out, as you get traffic from new places, you're optimizing your performance as best you can.

So some performance tweeps to follow.

Definitely Ilya.

Steve is in the crowd, or was in the crowd.

Andy Davies, I think, is a fantastic follow.

He shares an amazing amount of great stuff.

Tammy Everts and Pat Meenan would also be good people to follow on Twitter, if you're on the Twitters.

And so, with all that, thank you very much for listening.

I appreciate you guys learning more about web performance.

[APPLAUSE] Almost forgot my microphone.

Right, well, your talk over-ran a bit so I'm going to revoke your chill-out lounge privileges.

Oh, that's fine.

But we will do one question.

What's there? Okay, we'll do this one.

So, of all the performance advice you gave there, how does HTTP/2 change things? Ah, jeez.

It's a short question, so I presume it'll be a short answer, right? It will change things massively.

And I think, for the short term, focus on the old tech, because you're still delivering to the old tech.

And just keep doing what you're doing-- with everything else that's going on now, all the tools are changing so quickly, just get the basics right first, and then worry about that stuff after the fact.

Cool.

I know lots of you had questions, but we're going into a break and it'll be lunch time as well.

So bother Dave with your questions then.

Dave Olsen! Thank you.