
More performance tips for .NET websites which access data

I recently wrote about improving the performance of a website that accesses a SQL Server database using Entity Framework, and I wanted to follow up with a few more thoughts on optimising performance in an MVC website written in .NET. I’m coming towards the end of a project now where my team built an MVC 5 site which accessed a database using Entity Framework. The engineers were all scarred survivors of previous projects – in other words, pretty experienced – so we were able to implement a lot of non-functional improvements during sprints as we went along. As our site was data driven, looking at the data access code was obviously important, but it wasn’t the only thing we looked at. I’ve listed a few of the other things we did during the project – some of these were one-off settings, and others were things we checked regularly to make sure problems weren’t creeping in.

Compress, compress, compress

GZip your content! This makes a huge difference to your page size, and therefore to the time it takes to render your page. I’ve written about how to do this for a .NET site and test that it’s working here. Do it once at the start of your project, and you can forget about it after that (except occasionally when you should check to make sure someone hasn’t switched it off!)
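
As a quick sanity check, something like the following C# sketch works for me – it requests a page with an Accept-Encoding: gzip header and reports whether the response came back compressed (the URL is a placeholder, and automatic decompression is switched off so the Content-Encoding header is left intact for inspection):

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class CompressionCheck
{
    static async Task Main()
    {
        // Disable automatic decompression so the Content-Encoding header
        // stays on the response for us to inspect.
        var handler = new HttpClientHandler { AutomaticDecompression = DecompressionMethods.None };
        using (var client = new HttpClient(handler))
        {
            // Tell the server we're happy to receive gzip-compressed content.
            client.DefaultRequestHeaders.AcceptEncoding.ParseAdd("gzip");

            // Placeholder URL - point this at your own site.
            var response = await client.GetAsync("https://www.example.com/");

            var encoding = string.Join(", ", response.Content.Headers.ContentEncoding);
            Console.WriteLine(string.IsNullOrEmpty(encoding)
                ? "No Content-Encoding header - compression may be switched off."
                : $"Content-Encoding: {encoding}");
        }
    }
}
```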

Check your SQL queries, tune them, and look out for N+1 problems

As you might have guessed from one of my previous posts, we were very aware of how a few poorly tuned queries or some rogue N+1 problems could make a site grind to a halt once there were more than a few users. We tested with sample data which was the “correct size” – meaning that it was comparable with the projected size of the production database. This gave us a lot of confidence that the indexes we created in our database were relevant, and that our automated integration tests would highlight real N+1 problems. If you don’t have “real sized data” – as often happens when a development database has just a few sample rows – you can’t expect to discover real performance issues early.
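
To make the N+1 pattern concrete, here’s a small Entity Framework sketch – the entities and context are made up purely for illustration. The first method issues one query for the orders and then one extra query per order through lazy loading; the second brings everything back in a single query with Include:

```csharp
using System;
using System.Data.Entity; // Entity Framework 6 - Include() lives in this namespace
using System.Linq;

// Hypothetical entities and context, purely for illustration.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class Order
{
    public int Id { get; set; }
    public virtual Customer Customer { get; set; } // virtual => lazy loading
}

public class ShopContext : DbContext
{
    public DbSet<Order> Orders { get; set; }
    public DbSet<Customer> Customers { get; set; }
}

public class OrderReport
{
    // N+1: one query for the orders, then one extra query per order the
    // first time each lazy-loaded Customer property is touched.
    public void PrintCustomersSlowly(ShopContext db)
    {
        foreach (var order in db.Orders.ToList())
        {
            Console.WriteLine(order.Customer.Name); // triggers a separate query each time
        }
    }

    // Eager loading: a single query brings the customers back with the orders.
    public void PrintCustomersQuickly(ShopContext db)
    {
        foreach (var order in db.Orders.Include(o => o.Customer).ToList())
        {
            Console.WriteLine(order.Customer.Name); // no additional queries
        }
    }
}
```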

Aside: Real sized data doesn’t have to mean real data – anonymised/fictitious data is just as good for performance analysis (and obviously way better from a security perspective).

Use MiniProfiler to find other ADO.NET bottlenecks

Just use it. Seriously, it’s so easy – read about it here. There’s even a NuGet package to make it easier to include in your project. It automatically profiles ADO.NET calls, and allows you to profile individual parts of your application with a couple of simple lines of code (though I prefer to use this during debugging, rather than pushing those profiling customisations into the codebase). It’s great for identifying slow parts of the site, and particularly good at identifying repeated queries – a giveaway symptom of the N+1 problem.
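
For reference, wrapping individual pieces of work in named steps looks roughly like the sketch below – the step names and methods are invented for illustration, and MiniProfiler.Current is set up by the MiniProfiler bootstrap code in your site:

```csharp
using System.Collections.Generic;
using System.Threading;
using StackExchange.Profiling; // from the MiniProfiler NuGet package

public class ReportBuilder
{
    // A sketch of wrapping individual pieces of work in MiniProfiler steps so
    // each one shows up as a separate timing in the profiler results.
    public List<string> Build()
    {
        List<string> rows;

        using (MiniProfiler.Current.Step("Load report data"))
        {
            rows = LoadReportData();
        }

        using (MiniProfiler.Current.Step("Apply business rules"))
        {
            ApplyBusinessRules(rows);
        }

        return rows;
    }

    private List<string> LoadReportData()
    {
        Thread.Sleep(200); // stand-in for a slow database call
        return new List<string> { "row 1", "row 2" };
    }

    private void ApplyBusinessRules(List<string> rows)
    {
        Thread.Sleep(50); // stand-in for some processing
    }
}
```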

Reduce page bloat by optimising your images

We didn’t have many images in the site – but they were still worth checking. We used the Firefox Web Developer Toolbar plugin, and the “View Document Size” item from the “Information” menu. This gave us a detailed breakdown of all the images on the page being tested – and highlighted a couple of SVGs which had crept in unexpectedly. These were big files, and they appeared in the site’s header, so every page would have been affected. They didn’t need to be SVGs, and it was a quick fix to change them to GIFs, which made every page served a lot smaller.

For PNGs, you can use the PNGOut utility to optimise images – and you can convert GIFs to PNG as well using this tool.

For JPEGs, read about progressive rendering here. This is an area where your mileage may vary – I’ll probably write more about how to do this on Windows in a future post.

Minifying CSS and JavaScript

The Web Developer Toolbar saved us in another way – it identified issues with a few JavaScript and CSS files. We were using the built-in bundling feature of MVC to combine and minify our included scripts – I’ve written about how to do this here – and initially it looked like everything had worked. However, when we looked at the document size using the Web Developer Toolbar, we saw that some files weren’t being minified. I wrote about the issue and the solution here, but the main point was that the bundling feature was failing silently, significantly increasing the overall page size. So remember to check that bundling/minification is actually working – just because you have it enabled doesn’t mean it’s being done correctly!
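
For context, a typical MVC 5 bundle registration looks something like the sketch below – the bundle names and file paths are examples rather than anything from our project. Setting BundleTable.EnableOptimizations to true forces bundling and minification even when debug="true" in web.config, which makes it easier to check locally that minification is really happening:

```csharp
using System.Web.Optimization;

public class BundleConfig
{
    // Called from Application_Start via BundleConfig.RegisterBundles(BundleTable.Bundles).
    public static void RegisterBundles(BundleCollection bundles)
    {
        // {version} is a wildcard, so upgrading jQuery doesn't break the bundle.
        bundles.Add(new ScriptBundle("~/bundles/jquery").Include(
            "~/Scripts/jquery-{version}.js"));

        bundles.Add(new ScriptBundle("~/bundles/site").Include(
            "~/Scripts/site.js"));

        bundles.Add(new StyleBundle("~/Content/css").Include(
            "~/Content/site.css"));

        // Force bundling/minification even in debug builds so you can verify
        // in the browser that it's actually working.
        BundleTable.EnableOptimizations = true;
    }
}
```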

Remember to put your CSS links at the top of the page (in the head) and your JavaScript files at the bottom (just before the closing body tag), so the browser can style the page as it renders and script downloads don’t block it.
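
In an MVC 5 site that usually means rendering the style bundles in the head of the layout and the script bundles at the end of the body – a cut-down _Layout.cshtml sketch (using the example bundle names from above) looks roughly like this:

```cshtml
<!DOCTYPE html>
<html>
<head>
    <title>@ViewBag.Title</title>
    @* Stylesheets in the head so the page can be styled as it renders *@
    @Styles.Render("~/Content/css")
</head>
<body>
    @RenderBody()

    @* Scripts just before the closing body tag so they don't block rendering *@
    @Scripts.Render("~/bundles/jquery")
    @Scripts.Render("~/bundles/site")
    @RenderSection("scripts", required: false)
</body>
</html>
```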

Check for duplicated scripts and remove them

We switched off bundling and minification to see all the scripts being downloaded, and noticed that we had a couple of separate entries for the jQuery library, and also for some jQuery UI files. These were big files, and downloading them once is painful enough, never mind doing it again unnecessarily every time. It’s really worth checking that you’re not doing this – not just for performance reasons, but because if it is happening it’s also a sign that there may be an underlying problem in your codebase. Finding it early gives you a chance to fix it.

Do you really need that 3rd party script?

We worked hard to make sure that we weren’t including libraries just for the sake of it. There might be some cool UI feature which is super simple to implement by just including that 3rd party library…but every one of those 3rd party libraries adds to your page size. Be smart about what you include.

Tools like jQuery UI even allow you to build a custom download containing only the components you need, so your script is exactly as big or small as it has to be.

Is your backup schedule causing your site to slow down?

I witnessed this on a previous project – one of our team had scheduled the daily database backup to happen after we went home…which led to some of our users in a later time zone seeing a performance deterioration for about half an hour at the same time every day. Rescheduling the daily backup caused us no problems and removed a significant issue for those users.

Is someone else’s backup schedule causing your site to slow down?

There’s a corollary to the previous point – if you’re seeing a mysterious performance deterioration at the same time every day, and you’re absolutely sure it’s not something that you or your users are doing, check whether your site is on shared hosting. When I contacted our hosts and asked for our company’s VMs to be moved onto a different SAN, it miraculously cleared up a long-standing performance issue.

Summary

There are a few tips here which really helped us keep our pages feeling fast for our users (and some other tips that I’ve picked up over the years). We didn’t do all of this at the end of the project – it was something we focussed on all the way through. It’s really important to make sure you’re checking these things during sprints, and to make them part of your Definition of Done if possible.