IIS, Non-functional Requirements, Performance

Compress your static and dynamic content in IIS and improve the performance of your website.

A quick way to give your site a performance boost is to enable compression of your content, so that a smaller payload is sent over the wire to the browser. I’ve seen this process described in complicated ways, but it’s really quite easy.

First – check that the “Static Content Compression” and “Dynamic Content Compression” role services are available in your version of IIS.

By default, Static is available but Dynamic is not. On Windows Server, you add this role service through Server Manager. On a standard Windows install, use “Turn Windows features on or off” under Programs in Control Panel.

Second – You need to modify your web.config file. Find the <system.webServer> node, and add the line:

<urlCompression doStaticCompression="true" doDynamicCompression="true" />

Note that both attribute values are set to true.
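For reference, here’s how that line sits inside a minimal web.config – only the compression element is shown, and your file will have other children under `<system.webServer>`:

```xml
<configuration>
  <system.webServer>
    <!-- Enable compression for both static files and dynamic responses -->
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  </system.webServer>
</configuration>
```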

Third – restart your server (this step is only necessary if you installed the Dynamic Content Compression role service in step one).

And refresh your web page. It’s really that simple.

You can test that your site is using GZIP with the Web Developer Toolbar in Chrome or Firefox – go to Information -> View Response Headers, and look for this header:

Content-Encoding: gzip

Lots more detailed information here: https://www.iis.net/configreference/system.webserver/urlcompression

C# tip, Clean Code, MVC, Non-functional Requirements, Performance

Performance in ASP.NET and C# – Bundling and Minification – Part #2

This is a very quick follow up to my earlier post on bundling and minification – an issue we found and the solution.

While examining the content of the (supposedly) bundled and minified resources, I noticed one of my CSS files wasn’t minified. It had the error message at the top of the file:

/* Minification failed. Returning unminified contents.

(2628,2): run-time error CSS1031: Expected selector, found '@charset'

After some investigation, we found that our CSS files (originally generated using SASS) had a declaration at the top of each file: @charset "utf-8";

Removing these lines allowed the contents to be minified. Deleting the declaration was safe, as we had the charset defined in the meta data of our layout page, so the encoding was still declared for every page.
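If you have a lot of generated CSS files, stripping the offending line by hand gets tedious. Here’s a quick sketch of doing it with sed – the file below is a stand-in for a SASS-generated stylesheet; run this against your own files, and keep the .bak backups until you’ve checked the result:

```shell
# Create a stand-in for a SASS-generated stylesheet with a leading
# @charset declaration, then delete that line in place (keeping a .bak).
printf '@charset "utf-8";\nbody { color: #333; }\n' > site.css
sed -i.bak '/^@charset/d' site.css
cat site.css
```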

Hopefully this helps someone out there who’s having the same issue.

C# tip, MVC, Non-functional Requirements, Performance

Performance in ASP.NET and C# – Bundling and Minification

Another quick post – about a really useful feature of MVC that everyone has heard of… and then forgets to use in practice.

Don’t forget about Bundling and Minification – Rick Anderson explains here how to do it and why it’s important, and a picture is worth a thousand words when you see network timings before and after switching on bundling.

Remember, if you don’t see bundling working on your MVC project:

  1. Make sure the compilation element in your Web.config’s system.web node has debug="false" set;
  2. Check the RegisterBundles class, and check whether the BundleTable.EnableOptimizations value is set (when true, it forces bundling and minification even in debug builds).
    • I don’t actually like this being in my RegisterBundles class – I’d prefer to set it through configuration rather than have it embedded in my C# code;
  3. Make sure that the bundling/minification configuration that you’ve set up for your development environment isn’t being copied across to your other environments – you might have planned to debug locally, but you probably don’t want that preference copied across to your acceptance, demonstration or production environments.
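To illustrate the configuration side of the list above, here’s a minimal Web.config fragment. The appSettings key is a hypothetical name of my own – the idea being that RegisterBundles could read it (e.g. via ConfigurationManager.AppSettings) instead of hard-coding EnableOptimizations in C#:

```xml
<configuration>
  <appSettings>
    <!-- Hypothetical key: read this in RegisterBundles rather than
         hard-coding BundleTable.EnableOptimizations in C# -->
    <add key="EnableBundleOptimizations" value="true" />
  </appSettings>
  <system.web>
    <!-- Bundling and minification are skipped while debug="true" -->
    <compilation debug="false" />
  </system.web>
</configuration>
```

Because this lives in configuration, it can be transformed per environment, which addresses point 3 as well.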

This is a really quick and simple way to improve your site’s performance – try it!

Accessibility, Continuous Integration, Non-functional Requirements

Accessibility and Continuous Integration

There are some great tools out there already to test if your page conforms to accessibility standards – HTML_CodeSniffer is one of the best I’ve seen – but I can’t run this as part of my automated CI process.

There are some tools that allow you to submit a URL, such as the WAVE tool on the WebAIM site, but if you’re developing a site on an intranet, or you’re working with confidential data, this isn’t useful either.

From the W3C WAI Tools list, I discovered AccessLint, which audits your code using JavaScript from Google’s Accessibility Developer Tools. This post is a quick recipe for running it against a web page from the command line on Windows.

  1. Download PhantomJS, extract to somewhere on your hard drive, and add the binary’s path to your environment variables.
    • Grab Phantom JS from here.
    • I extracted the zip file to C:\PhantomJS_2.0.0 so the actual PhantomJS.exe sits in C:\PhantomJS_2.0.0\bin. I created an environment variable called PHANTOM_JS, and then added “%PHANTOM_JS%\bin” to the end of my PATH.
  2. Download the installer for Ruby and run it.
  3. Download the Access Lint code.
    • Grab the Access Lint code from here. Pull using Git, or download the zip – either way works.
    • I have the code in C:\Access_Lint so I can see the access_lint ruby file in C:\Access_Lint\bin.
  4. Install the Ruby gem.
    • Open a command prompt, browse to where the access_lint ruby file is saved (as above, I have it in C:\Access_Lint\bin), and enter:
gem install access_lint

And we’re ready to go!

Now you can open a command prompt, and enter a command like:

access_lint audit http://w3.org

The audit output will render to the command prompt window as JSON.

You can now check the accessibility of a web page on your integration server as part of a CI process.

Criticisms

The process isn’t perfect.

  • To test each page, you’d have to audit it individually – it would be better if we could crawl the site. (We could work around this by running a batch of audit commands, specifying each page to be audited on a separate line.)
  • The JSON output has some odd artefacts – it uses “=>” instead of “:”, and the first and last lines in the file are console logs. (We could work around this with some simple post-processing on the output.)
  • JSON isn’t particularly readable. (We could work around this by using the JSON as a data source, and using another tool to render the results in a more readable format.)
  • And most significantly, if this tool doesn’t report failures, it doesn’t mean your page is accessible. (No real workarounds beyond manually checking the page.)
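The first two workarounds above can be sketched in a few lines of shell. Since the exact console-log lines vary, the sample file below stands in for one raw `access_lint audit` output – in a real run you’d loop over your URLs and capture each result first:

```shell
# In a real run you'd generate raw output per page, e.g.:
#   access_lint audit http://w3.org > raw_audit.txt
# Here, sample.txt stands in for that raw output.
printf 'Auditing http://w3.org ...\n{"status" => "PASS", "elements" => []}\nAudit complete.\n' > sample.txt

# Drop the first and last console-log lines, then turn "=>" into ":"
# so the result parses as ordinary JSON.
sed '1d;$d' sample.txt | sed 's/ => /: /g' > sample.json
cat sample.json
```

From there, sample.json can feed whatever reporting tool you prefer, which also goes some way towards the readability criticism.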

But this is the starting point of a journey. Accessibility can sometimes be an afterthought. It should be a natural part of the development process, and part of a team’s definition of done.