.net, C# tip, IIS, MVC, Non-functional Requirements, Performance, Web Development

More performance tips for .NET websites which access data

I recently wrote about improving the performance of a website that accesses a SQL Server database using Entity Framework, and I wanted to follow up with a few more thoughts on optimising performance in an MVC website written in .NET. I’m coming towards the end of a project now where my team built an MVC 5 site, and accessed a database using Entity Framework. The engineers were all pretty experienced (and scarred survivors of previous projects), so we were able to implement a lot of non-functional improvements during sprints as we went along. As our site was data driven, looking at that part was obviously important, but it wasn’t the only thing we looked at. I’ve listed a few of the other things we did during the project – some of these were one-off settings, and others were things we checked regularly to make sure problems weren’t creeping in.

Compress, compress, compress

GZip your content! This makes a huge difference to your page size, and therefore to the time it takes to render your page. I’ve written about how to do this for a .NET site and test that it’s working here. Do it once at the start of your project, and you can forget about it after that (except occasionally when you should check to make sure someone hasn’t switched it off!)

Check your SQL queries, tune them, and look out for N+1 problems

As you might have guessed from one of my previous posts, we were very aware of how a few poorly tuned queries or some rogue N+1 problems could make a site grind to a halt once there were more than a few users. We tested with sample data which was the “correct size” – meaning that it was comparable with the projected size of the production database. This gave us a lot of confidence that the indexes we created in our database were relevant, and that our automated integration tests would highlight real N+1 problems. If you don’t have “real sized data” – as often happens when a development database just has a few sample rows – you can’t expect to discover real performance issues early.
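As a sketch of the shape of problem we were hunting for – the model names here are hypothetical, but the pattern with Entity Framework is typical:

```csharp
using System;
using System.Data.Entity; // needed for the lambda form of Include
using System.Linq;

// Hypothetical model: an Order entity with a Customer navigation property.
// N+1 problem: one query fetches the orders, then lazy loading fires
// one additional query per order as each Customer is touched.
var orders = context.Orders.ToList();
foreach (var order in orders)
{
    Console.WriteLine(order.Customer.Name); // separate query on every iteration
}

// Fix: eagerly load the related rows in a single joined query.
var ordersWithCustomers = context.Orders
    .Include(o => o.Customer)
    .ToList();
```

With real-sized data, the difference between these two shapes shows up immediately – a few sample rows would hide it completely.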

Aside: Real sized data doesn’t have to mean real data – anonymised/fictitious data is just as good for performance analysis (and obviously way better from a security perspective).

Use MiniProfiler to find other ADO.NET bottlenecks

Just use it. Seriously, it’s so easy, read about it here. There’s even a NuGet package to make it even easier to include in your project. It automatically profiles ADO.NET calls, and allows you to profile individual parts of your application with a couple of simple lines of code (though I prefer to use this during debugging, rather than pushing those profile customisations into the codebase). It’s great for identifying slow parts of the site, and particularly good at identifying repeated queries (which is a giveaway symptom of the N+1 problem).
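For reference, wrapping a section in a profiling step looks like the sketch below – the controller and label are illustrative, but the `Step` extension method is MiniProfiler’s real API, and it’s safe to call even when profiling is disabled and `MiniProfiler.Current` is null:

```csharp
using System.Web.Mvc;
using StackExchange.Profiling;

public class DashboardController : Controller
{
    public ActionResult Index()
    {
        // Times everything inside the using block and shows it in the
        // MiniProfiler popup under the label "Load dashboard data".
        using (MiniProfiler.Current.Step("Load dashboard data"))
        {
            // ... ADO.NET / Entity Framework calls made here are recorded,
            // and repeated identical queries are flagged as duplicates
        }
        return View();
    }
}
```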

Reduce page bloat by optimising your images

We didn’t have many images in the site – but they were still worth checking. We used the Firefox Web Developer Toolbar plugin, and the “View Document Size” item from the “Information” menu. This gave us a detailed breakdown of all the images on the page being tested – and highlighted a couple of SVGs which had crept in unexpectedly. These were big files, and appeared in the site’s header, so every page would have been affected. They didn’t need to be SVGs, and it was a quick fix to change them to GIFs, which made every page served a lot smaller.

For PNGs, you can use the PNGOut utility to optimise images – and you can convert GIFs to PNG as well using this tool.

For JPEGs, read about progressive rendering here. This is something where your mileage may vary – I’ll probably write more about how to do this in Windows at a future time.

Minifying CSS and JavaScript

The Web Developer Toolbar saved us in another way – it identified a few issues with our JavaScript and CSS files. We were using the built-in Bundling feature of MVC to combine and minify our included scripts – I’ve written about how to do this here – and initially it looked like everything had worked. However, when we looked at the document size using the Web Developer Toolbar, we saw that some documents weren’t being minified. I wrote about the issue and solution here, but the main point was that the Bundling feature was failing silently, causing the overall page size to increase very significantly. So remember to check that bundling/minifying is actually working – just because you have it enabled doesn’t mean it’s being done correctly!
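For context, a typical bundle registration looks like the sketch below (bundle names and file paths are examples). One thing worth knowing: minification only kicks in when optimisations are enabled – either because `compilation debug="false"` in web.config, or because it’s forced explicitly – which is part of why problems here can go unnoticed during development:

```csharp
using System.Web.Optimization;

public class BundleConfig
{
    // Typically called from Application_Start in Global.asax.
    public static void RegisterBundles(BundleCollection bundles)
    {
        bundles.Add(new ScriptBundle("~/bundles/jquery").Include(
            "~/Scripts/jquery-{version}.js"));

        bundles.Add(new StyleBundle("~/Content/css").Include(
            "~/Content/site.css"));

        // Forces combining and minification even when debug="true" –
        // useful for checking locally that bundling actually works.
        BundleTable.EnableOptimizations = true;
    }
}
```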

Remember to put CSS at the top of your page, and include JavaScript files at the bottom.
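In an MVC layout that typically means something like this (the bundle names here are illustrative):

```html
<!-- _Layout.cshtml sketch: stylesheets rendered in the head so pages style
     immediately; scripts rendered just before the closing body tag so they
     don't block the initial render. -->
<head>
    @Styles.Render("~/Content/css")
</head>
<body>
    @RenderBody()
    @Scripts.Render("~/bundles/jquery")
    @Scripts.Render("~/bundles/site")
</body>
```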

Check for duplicated scripts and remove them

We switched off bundling and minification to see all the scripts being downloaded, and noticed that we had a couple of separate entries for the jQuery library, and also for some jQuery UI files. These were big files, and downloading them once is painful enough, never mind unnecessarily doing it again every time. It’s really worth checking to make sure you’re not doing this – not just for performance reasons, but because if you find this is happening it’s also a sign that there’s maybe an underlying problem in your codebase. Finding it early gives you a chance to fix it.

Do you really need that 3rd party script?

We worked hard to make sure that we weren’t including libraries just for the sake of it. There might be some cool UI feature which is super simple to implement by just including that 3rd party library…but every one of those 3rd party libraries adds to your page size. Be smart about what you include.

Tools like jQuery UI even allow you to customise your download so that the script is exactly as big or small as you need it to be.

Is your backup schedule causing your site to slow down?

I witnessed this on a previous project – one of our team had scheduled the daily database backup to happen after we went home…which meant that users in a later time zone saw a performance deterioration for about half an hour at the same time every day. Rescheduling the daily backup to later in the day caused us no problems and removed a significant problem for those users.

Is someone else’s backup schedule causing your site to slow down?

There’s a corollary to the previous point – if you’re seeing a mysterious performance deterioration at the same time every day and you’re absolutely sure it’s not something that you or your users are doing, check whether your site is on shared hosting. When I contacted our hosts and requested that our company’s VMs be moved onto a different SAN, it miraculously cleared up a long-standing performance issue.


There are a few tips here which really helped us keep our pages feeling fast for our users (and some other tips that I’ve picked up over the years). We didn’t do all of this at the end of the project – it was something we focussed on all the way through. It’s really important to make sure you’re checking these things during sprints, and to make them part of your Definition of Done if possible.


Some useful headers to add in IIS to improve security.

We use the OWASP ZAP tool to do some quick penetration testing on our site. This is a great free tool, and can be used as part of your continuous integration suite.

One of the things it looks for is whether your web application has some useful security-related HTTP headers. OWASP has a good list here, and there are three that I think are particularly important for you to configure in IIS.

You can look at the headers for a site using http://cyh.herokuapp.com/cyh. This is a really excellent site and application – it highlights the headers that it recognises as correctly configured, and warns about those which might be configured wrongly or are just plain missing. Even better than that, it recommends what the header should be – very nice constructive criticism!

The suggestions below obviously aren’t comprehensive – just part of what you could/should be doing.

Help Protect against Clickjacking

X-Frame-Options: DENY

This makes sure your pages can’t be rendered inside a frame on another site, so an attacker can’t overlay your content and trick users into clicking things they didn’t intend to.

Help Protect against Cross Site Scripting (XSS)

X-XSS-Protection: 1; mode=block

Modern web browsers have some XSS protection built in by default, but having this header on your site is a good belt-and-braces approach to making sure it’s active (in case it has been disabled for some reason).

Help Protect against Drive-by-Downloads

X-Content-Type-Options: nosniff

This makes sure that IE and Chrome won’t inspect the content and try to “sniff” the MIME type, which could cause content to be treated as an executable.
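All three headers can be configured in one place in your site’s web.config – a minimal fragment, assuming you want the values discussed above applied to every response:

```xml
<!-- web.config: adds the three security headers to every response IIS serves -->
<system.webServer>
  <httpProtocol>
    <customHeaders>
      <add name="X-Frame-Options" value="DENY" />
      <add name="X-XSS-Protection" value="1; mode=block" />
      <add name="X-Content-Type-Options" value="nosniff" />
    </customHeaders>
  </httpProtocol>
</system.webServer>
```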

Final note – a colleague of mine who’s another Technical Architect suggested this – a way of using PowerShell to add these headers to your IIS instance, so this can be part of your continuous deployment practice. This way you’ll not forget those headers when you set up a new environment!
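One possible shape for such a script – this is a sketch using the WebAdministration module, assuming an elevated shell on the IIS machine; it sets the headers at server level so every site inherits them:

```powershell
Import-Module WebAdministration

$headers = @{
    'X-Frame-Options'        = 'DENY'
    'X-XSS-Protection'       = '1; mode=block'
    'X-Content-Type-Options' = 'nosniff'
}

foreach ($h in $headers.GetEnumerator()) {
    # Adds each header to applicationHost.config; this will error if a
    # header with the same name already exists, so remove any old value
    # first with Remove-WebConfigurationProperty.
    Add-WebConfigurationProperty -PSPath 'MACHINE/WEBROOT/APPHOST' `
        -Filter 'system.webServer/httpProtocol/customHeaders' `
        -Name '.' `
        -Value @{ name = $h.Key; value = $h.Value }
}
```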


Compress your static and dynamic content in IIS and improve the performance of your website.

A quick way to give your site a performance boost is to enable compression of your content, so a smaller payload is sent over the wire to the browser. I’ve seen this described in a complex way and I think it’s really quite easy.

First – check that the “Static Content Compression” and “Dynamic Content Compression” role services are available in your version of IIS.

By default, Static is available but Dynamic is not. If you’re on Windows Server, you need to add this role service through Server Manager. If you’re on a standard Windows install, you can do this through “Turn Windows features on or off” under Programs in Control Panel.
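If you’d rather script this than click through the UI, the feature can be enabled from an elevated prompt – the feature names below are the standard ones, but check them against your Windows version:

```
# Windows Server (elevated PowerShell):
Install-WindowsFeature Web-Dyn-Compression

# Desktop Windows:
dism /online /enable-feature /featurename:IIS-HttpCompressionDynamic /all
```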

Second – You need to modify your web.config file. Find the <system.webServer> node, and add the line:

<urlCompression doStaticCompression="true" doDynamicCompression="true" />

Note that both attribute values are set to true.

Third – Restart your server (this bit’s only necessary if you installed the Dynamic Content Compression role in step 1).

And refresh your web page. It’s really that simple.

You can test that your site is using GZIP by using the Web Developer Toolbar in Chrome or Firefox – go to Information -> View Response Headers, and look for a header with the text below:

Content-Encoding: gzip
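If you’d rather check from the command line, asking for gzip explicitly and inspecting the response headers works too (swap in your own site’s URL):

```
curl -s -I -H "Accept-Encoding: gzip" https://www.example.com/ | grep -i "content-encoding"
```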

Lots more detailed information here: https://www.iis.net/configreference/system.webserver/urlcompression