Browsers can only download a limited number of files at a time from the same host when loading a web page. This is as low as 2 files on IE 6 & 7.
To allow the page to appear faster, a best practice is to prioritise important files such as the main CSS, template images and the main content. Since the print stylesheet is rarely used, especially within the first few seconds of page load, I recommend loading this file last.
This isn’t as simple as referencing the file at the end of your HTML, since in some browsers the page is not rendered until all CSS is loaded. A JavaScript solution is required. Add the following JavaScript, to run on page load.
function LoadPrint() {
var headTag = document.getElementsByTagName("head")[0];
var printCss = document.createElement("link");
printCss.type = "text/css";
printCss.rel = "stylesheet";
printCss.href = "/css/print.css";
printCss.media = "print";
headTag.appendChild(printCss);
}
<noscript>
<link rel="stylesheet" type="text/css" href="/css/print.css" media="print" />
</noscript>
I’ve avoided using jQuery’s appendTo as IE7 didn’t like it during my testing; barebones JavaScript will do. For visitors without JavaScript, use the noscript tag in the head to load the print stylesheet in the normal position.
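To actually trigger LoadPrint, hook it to the window's load event. A minimal sketch; `wireLoadHandler` is a helper name of my own, and the `attachEvent` branch covers IE 6-8, which lack `addEventListener`:

```javascript
// Register a handler to run once the page has finished loading.
// attachEvent is the fallback for IE 6-8, which lack addEventListener.
function wireLoadHandler(win, handler) {
  if (win.addEventListener) {
    win.addEventListener("load", handler, false);
  } else if (win.attachEvent) {
    win.attachEvent("onload", handler);
  }
}

// In the page: wireLoadHandler(window, LoadPrint);
```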
The savings won’t be huge but every little helps so why not.
See also: Delay loading your print css - Stoyan Stefanov
I wanted to illustrate this point about people sharing bad experiences caused by slow websites. Have a go at searching Twitter for 'slow website' and you will be amazed at the number of people who have vented their frustrations within the last few hours.
Here are a few I've seen in recent days.
Trying to book tickets on @SJ_AB, but the site is so slow...
— Thomas Dahlberg (@tdhlbrg) January 20, 2011
the o2.ie website is so slow and frustrating #agghhhhh
— ronan mcloughlin (@jedi_vincent) January 20, 2011
Why is Gap's website slow? Argh.
— Wordy PR Girl (@WordyPRGirl) January 20, 2011
Websites like Twitter and Facebook make it so easy these days to have a wee moan about the frustrations caused by an underperforming site. We pay good money for a speedy internet connection, so when we can't take full advantage of this due to slow websites, the user experience is quickly soured.
The ETag is sent with the initial download of the file
Etag:"0221921cfdfca1:5b19"
When checking for modification the etag is sent in the If-None-Match header
If-None-Match:"0221921cfdfca1:5b19"
So what’s the problem then? The issue arises when hosting a component on multiple servers. When the browser validates the tag against a server different from the one the file was downloaded from, the tag can be different, as tags are generated using server-specific attributes. In IIS 6 the tag includes a modification counter, which again is likely to differ from server to server.
So in essence, rather than the server just telling the browser to pull the file from the cache, the file will be re-downloaded.
Since Last-Modified is already in place, let’s just stick with that. If you’re not splitting your site across multiple servers, you can at least benefit from the reduced header size.
Let’s get to switching the bugger off in IIS 6. Luckily, just as in Apache, it’s really easy.
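For reference, the Apache side of that comparison really is a one-liner in your config or .htaccess:

```apache
# Stop Apache emitting ETag headers entirely
FileETag None
```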
public class Actor
{
public virtual int Id { get; set; }
public virtual string Name { get; set; }
public virtual IList&lt;Movie&gt; Movies { get; set; }
}
public class Movie
{
public virtual int Id { get; set; }
public virtual string Name { get; set; }
public virtual IList&lt;Actor&gt; Actors { get; set; }
}
So obviously a movie can have many actors and an actor can star in multiple movies. Your database will consist of three tables: an Actors, a Movies and a Cast table. The Cast table is pretty much just our link, with ActorId and MovieId foreign keys.
Finally here’s the mapping.
public class ActorMap : ClassMap&lt;Actor&gt;
{
public ActorMap()
{
Id(x => x.Id);
Map(x => x.Name);
HasManyToMany(x => x.Movies)
.WithTableName("Cast")
.WithParentKeyColumn("ActorId")
.WithChildKeyColumn("MovieId")
.LazyLoad()
.Cascade.SaveUpdate();
}
}
public class MovieMap : ClassMap&lt;Movie&gt;
{
public MovieMap()
{
Id(x => x.Id);
Map(x => x.Name);
HasManyToMany(x => x.Actors)
.WithTableName("Cast")
.WithParentKeyColumn("MovieId")
.WithChildKeyColumn("ActorId")
.Inverse();
}
}
To turn this effect off, open Cordova.plist, which can be found in the Supporting Files directory. Look for the UIWebViewBounce entry and change it to NO. Magic.
// This returns Invalid Date in iOS.
var yearMonth = "2012-08";
var date = new Date(yearMonth);
document.write(date);
Split the string into parts and pass the first two items of the array into the Date constructor. Since January is 0 and December is 11, remember to take 1 from the month value. Should work.
// This works in iOS.
var yearMonth = "2012-08";
var dateParts = yearMonth.split("-");
var date = new Date(dateParts[0], dateParts[1] - 1);
document.write(date);
If anyone has a cleaner method to do this, let me know. Much appreciated.
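In the meantime, the workaround wraps up neatly as a reusable helper (`parseYearMonth` is a name of my own):

```javascript
// Parse a "YYYY-MM" string into a Date, working around iOS's
// refusal to parse the bare year-month format directly.
function parseYearMonth(yearMonth) {
  var parts = yearMonth.split("-");
  // Months are zero-based: January is 0, December is 11
  return new Date(Number(parts[0]), Number(parts[1]) - 1);
}

// parseYearMonth("2012-08") -> 1 August 2012, local time
```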
I was recently tasked with finding a lightbox plugin for a high-traffic e-commerce site I was building from scratch. Given the impact page weight can have on conversion, bounce rates and a bunch of other delicate analytics metrics, it was essential that every piece of JavaScript selected for the site was as lightweight as possible, along with being responsive.
Most lightbox plugins that met the responsive requirement came bundled with functionality to display videos, each transitioning from slide to slide more elaborately than the last. I just wanted to pop up an image.
I decided to build my own.
Requires jQuery >= 1.4 and < 3
<html>
<head>
<link rel="stylesheet" href="jquery.lightbox.css" />
</head>
<body>
<!-- Link to the image -->
<a href="myimage.jpg" data-caption="Optional caption attribute" rel="lightbox">Click me</a>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.2.4/jquery.js"></script>
<script src="jquery.lightbox.min.js"></script>
<script>
$(function () {
$('[rel="lightbox"]').lightbox();
});
</script>
</body>
</html>
The culprit
<httpCookies requireSSL="true" />
A quick web.config transform so this is set to false on localhost saved the day. Huzzah!
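One way such a transform can be arranged (a sketch, not necessarily the exact setup from the post): keep requireSSL="false" in the base web.config for local work, and let a Web.Release.config transform switch it on for deployed builds.

```xml
<!-- Web.Release.config (sketch): enforce secure cookies in release builds only -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <system.web>
    <httpCookies requireSSL="true" xdt:Transform="SetAttributes(requireSSL)" />
  </system.web>
</configuration>
```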
When visiting a snappy site like Amazon, you don't begrudge clicking to view a related item. You know you won't be waiting long for content to appear. In a set time period, the less time spent loading, the more ground covered.
Bounce rate is the percentage of visitors who hit your site but leave, only having viewed one page.
Google can use your bounce rate in its ranking. A site might be completely relevant to what you were looking for but if its sluggishness is forcing visitors to hit the back button, it's understandable when you get knocked down a place.
Google looks at your site's speed as a quality indicator. You'd prefer the faster of two identical sites, so Google factors this into their ranking algorithm.
If your customers leave satisfied they'll tell a friend, tweet, share on Facebook, blog etc. Lots of links lead to lots of new visitors.
A frustrating experience is remembered, making me think twice before clicking again next time your site appears on Google. Conversely, a positive experience gets you bookmarked.
Why is #onedirection's website so slow, driving me crazy???!!! how to move on if we never break down, #iUltimateQuotes
— Donnie Withers (@TpYoshiko) October 13, 2013
One less fan.
Like me, at some point you'll have visited Amazon to quickly check the price of a new watch or something, and before you know it you've burned twenty minutes and your basket is full of stuff from every other department. Nice snappy pages suck you in. See also YouTube and Wikipedia.
Your visitors will only dedicate so long to your site. The less time spent loading, the more time left to explore and absorb your content, leading them to your finish line, whether that's a checkout or a new user signup.
Speed is an indicator of quality and therefore trust. You'll need to have established this by the time you're asking for card details.
If you’re nice and organised you’ll no doubt be splitting your CSS across multiple .less files, compiling to a single .css file with the power of @imports, e.g. @import "components/some-component.less";. It becomes a pain once you get into the browser if you need to trace a line in the output back to where it appears in one of your many .less files.
debug="true"
will kindly add a comment above every declaration, pointing you to the file and the line where it originates from.
Edit your web.config to add the new attribute.
<dotless minifyCss="false" cache="false" web="true" debug="true" />
For one-off tools there's npx. However, there are some utility packages I use again and again. To install one globally, either npm i -g <package-name> or, even better, add it to ~/.nvm/default_packages.
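As an aside on that file: nvm reads it each time a new Node version is installed and installs everything listed, one package name per line. A sample file might look like this (using a few of the packages covered here):

```
rimraf
http-server
nodemon
```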
npm i -g rimraf
rimraf [path]
It's rm -rf
but works wherever you have node - macOS, Windows, WSL.
npm i -g http-server
http-server [path]
Quickly serve a directory through localhost:8080 with zero-config.
npm i -g netlify-cli
netlify deploy
Deploy a site to Netlify from the command line. I'll use this to deploy a temporary site when I need to demo work in progress.
npm i -g nodemon
nodemon app.js
This will run your code in Node and automatically restart it if it detects any file changes in the directory. Great to have installed globally for quick scripts.
npm i -g sort-package-json
sort-package-json
Predictably, this sorts the package.json of your current directory. Tidy package.json, tidy mind.
npm i -g fkill-cli
fkill node --force
Cross-platform process killer. Useful when you have node running in the background and you've closed the terminal window that started it.
npm i -g concurrently
concurrently "command1 arg" "command2 arg"
Run multiple commands concurrently.
Run this command to see what you have installed globally.
npm ls -g --depth=0
Now that the decade is coming to a close and "top x of the decade" lists are popping up everywhere, I thought I'd add to the pile with my favourite movies of the last 10 years.
Film | Year | Rating |
---|---|---|
Baby Driver | 2017 | 7.6 |
Coherence | 2013 | 7.2 |
Deadpool | 2016 | 8.0 |
Dunkirk | 2017 | 7.9 |
Edge of Tomorrow | 2014 | 7.9 |
Ex Machina | 2014 | 7.7 |
Get Out | 2017 | 7.7 |
Gone Girl | 2014 | 8.1 |
Inception | 2010 | 8.8 |
Moonrise Kingdom | 2012 | 7.8 |
Scott Pilgrim vs The World | 2010 | 7.5 |
Shutter Island | 2010 | 8.1 |
Skyfall | 2012 | 7.7 |
Spider-Man: Into the Spider-Verse | 2018 | 8.4 |
Star Wars: The Force Awakens | 2015 | 7.9 |
The Social Network | 2010 | 7.7 |
The Wolf of Wall Street | 2013 | 8.2 |
Whiplash | 2014 | 8.5 |
Wind River | 2017 | 7.7 |
@media (prefers-color-scheme: light|dark|no-preference) { ... }
Picture a webpage with a white background and mostly black text. If a visitor has their operating system's colour scheme set to dark, a few lines of CSS are all that's needed to make the website fall in line.
@media (prefers-color-scheme: dark) {
body {
background: #000;
color: #ccc;
}
}
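The same preference can also be read from script, which is handy when theme colours are applied in JavaScript. A sketch; `pickTheme` is a made-up helper, and in the browser its argument would come from the real matchMedia API:

```javascript
// Map the user's colour-scheme preference to a set of theme colours.
// In the browser, prefersDark would come from:
//   window.matchMedia("(prefers-color-scheme: dark)").matches
function pickTheme(prefersDark) {
  return prefersDark
    ? { background: "#000", color: "#ccc" }
    : { background: "#fff", color: "#000" };
}
```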
Git-Bash is installed as part of Git for Windows, so I'd like to add it to the list and set it as the default shell.
Open settings with Ctrl+,
or via the little down arrow. This will open profiles.json in your preferred text-editor.
Add the following to the profiles array. Set guid to something unique.
{
"guid": "{abc00000-0000-0000-0000-000000000000}",
"name": "Git-Bash",
"commandline": "%PROGRAMFILES%\\Git\\bin\\bash.exe",
"icon": "%PROGRAMFILES%\\Git\\mingw64\\share\\git\\git-for-windows.ico",
"startingDirectory" : "~"
},
Save profiles.json and Terminal should now display Git-Bash in the choice of tabs.
To set Git-Bash as the default rather than Powershell, replace the defaultProfile value with your Git-Bash guid.
"defaultProfile" : "{abc00000-0000-0000-0000-000000000000}",
Before code-splitting, your webpack build might generate a single file for all of your application code.
Imagine a React application with multiple pages and components. One of these pages is MyPage.
import React from "react";
import HeavyComponent from "./HeavyComponent";
const MyPage = () => (
<div>
<h1>I am the page</h1>
<HeavyComponent />
</div>
);
export default MyPage;
Let's lazy load the HeavyComponent so its code isn't downloaded by the browser until a visitor hits MyPage for the first time.
import React, { Suspense, lazy } from 'react';
const HeavyComponent = lazy(() => import('./HeavyComponent'));
const MyPage = () => (
<div>
<h1>I am the page</h1>
<Suspense fallback={<p>Loading...</p>}>
<HeavyComponent />
</Suspense>
</div>
);
export default MyPage;
HeavyComponent now gets split into a separate chunk with a numbered filename in your build directory, e.g. 1.js
1.js isn't a very helpful filename. Fortunately with webpackChunkName
in an inline comment, we can instruct webpack to name the file something much friendlier.
const HeavyComponent = lazy(() => import(/* webpackChunkName: "HeavyComponent" */ './HeavyComponent'));
Now the chunk will be named HeavyComponent.js
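If you also want control over where these named chunks land, that's the output section of your webpack config. A sketch, assuming a webpack 4-style configuration:

```javascript
// webpack.config.js (sketch)
// webpackChunkName comments feed the [name] placeholder below,
// so the HeavyComponent chunk is emitted as HeavyComponent.js.
const config = {
  output: {
    filename: "[name].js", // entry bundles
    chunkFilename: "[name].js", // lazily-loaded chunks
  },
};

module.exports = config;
```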
If you like this kind of content, check out uses.tech for more developers' setups.
Press Ctrl+F5 or select IIS Express: Start Website from the command palette. Once you've run through the install wizard you're good to go.
What podcasts are you listening to?
A twice weekly "Tasty Treats Podcast for Web Developers."
Top Episode: What's new in web development
Development, design, performance, accessibility, tooling, a little bit of everything!
Top Episode: Greenfield
Hilarious guide to being a professional footballer by the big man himself.
Top Episode: That Holidays Episode
A roundtable movie podcast on some of the most rewatchable movies ever made.
Top Episode: Gone Girl
Two lads having an unplanned chat about everything and anything.
Top Episode: The Curse of the Colonel
Five episodes a week covering previews and reviews of the week's action.
The prefers-reduced-motion media feature is used to detect if the user has requested that the system minimize the amount of animation or motion on the page.
Value: no-preference | reduce
@media (prefers-reduced-motion: reduce) {
/* Turn off your motion */
}
@media (prefers-reduced-motion: no-preference) {
/* Or enable animations */
}
You should find a setting for this in the accessibility settings in recent versions of Android, iOS, macOS, and Windows.
For Android: Settings -> Accessibility -> Remove animations
For iOS: Settings -> Accessibility -> Vision -> Motion -> Reduce motion
From Andy Bell's A Modern CSS Reset
/* Remove all animations and transitions for people that prefer not to see them */
@media (prefers-reduced-motion: reduce) {
* {
animation-duration: 0.01ms !important;
animation-iteration-count: 1 !important;
transition-duration: 0.01ms !important;
scroll-behavior: auto !important;
}
}
If the browser finds your OS setting set to reduce motion, the cartwheeler below should not be spinning.
Eleventy doesn't have out-of-the-box SCSS/Sass pre-processing, so it's up to you to bring your own process. Taking inspiration from the excellent Hylia starter kit, here's how I do it with just a few npm scripts.
npm i --save sass
npm i --save-dev npm-run-all
My package.json file has 5 commands defined in the scripts section. Luckily I only ever need to run npm start
directly.
npm run sass
compiles the root .scss into a .css file in the _includes directory
"sass": "sass --style=compressed src/scss/styles.scss src/_includes/css/styles.css",
I'm using nunjucks for templating so I inject the CSS in my base.njk template.
<style>
{% include "css/styles.css" ignore missing %}
</style>
Running npm start
spins up the site on http://localhost:8080 and auto-reloads whenever any of my code changes.
npm start
"watch:eleventy": "eleventy --serve",
"watch:sass": "npm run sass -- --watch"
"start": "npm-run-all sass --parallel watch:*"
npm run build
compiles CSS and builds the entire project into the output directory. Netlify runs this command on any new commit to the master branch.
"build": "npm run sass && eleventy"
By default, anything listed in .gitignore or .eleventyignore will be ignored by Eleventy's watch process. If the compiled CSS is ignored, eleventy --serve won't rebuild the HTML whenever your Sass is recompiled.
Luckily, there's a workaround. You can opt out of Eleventy inspecting .gitignore with the following:
module.exports = function (eleventyConfig) {
eleventyConfig.setUseGitIgnore(false);
};
Start by creating sitemap.njk
in your source directory.
---
permalink: /sitemap.xml
eleventyExcludeFromCollections: true
---
<?xml version="1.0" encoding="utf-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
{% for page in collections.all %}
{% if not page.data.draft %}
<url>
<loc>{{ site.url }}{{ page.url | url }}</loc>
<lastmod>{{ page.date.toISOString() }}</lastmod>
<changefreq>{{ page.data.changeFreq if page.data.changeFreq else "monthly" }}</changefreq>
</url>
{% endif %}
{% endfor %}
</urlset>
<loc>{{ site.url }}{{ page.url | url }}</loc>
While we could always hardcode our domain, it's a nice idea to lean on 11ty's global data and pull it from there. My global JSON file is at src/_data/site.json. Now "https://www.belter.io" is prepended to each page's path.
{
"name": "Duncan McDougall",
"url": "https://www.belter.io",
...
}
<changefreq>{{ page.data.changeFreq if page.data.changeFreq else "monthly" }}</changefreq>
There are some pages that change more frequently than others, and it's worth hinting to search engines that you'd like them to recrawl these pages more often. Achieve this by adding a changefreq value to these pages' front matter data.
Valid values are: always, hourly, daily, weekly, monthly, yearly and never.
As I've been evolving this site quite a bit recently, I've set my homepage's changefreq to daily.
Once you have a working sitemap, your next step should be to let search engines know where to find it. The simplest way is to add a reference in your robots.txt file.
Sitemap: https://www.belter.io/sitemap.xml
User-agent: *
Disallow:
You can then point Google there with Search Console.
I have a collection of 100 markdown documents, each about a different movie. Their frontmatter data has a title, excerpt, and a list of genres. This is what we'll add to the index.
---
title: "Parasite"
excerpt: "All unemployed, Ki-taek's family takes peculiar interest in the wealthy and glamorous Parks for their livelihood until they get entangled in an unexpect..."
genres:
- Comedy
- Drama
- Thriller
---
We're going to create an 11ty filter. We'll use this to convert our movies collection into an elasticlunr index, which will be output as JSON at the URL /search-index.json.
const elasticlunr = require("elasticlunr");
module.exports = function (collection) {
// what fields we'd like our index to consist of
var index = elasticlunr(function () {
this.addField("title");
this.addField("excerpt");
this.addField("genres");
this.setRef("id");
});
// loop through each page and add it to the index
collection.forEach((page) => {
index.addDoc({
id: page.url,
title: page.template.frontMatter.data.title,
excerpt: page.template.frontMatter.data.excerpt,
genres: page.template.frontMatter.data.genres,
});
});
return index.toJSON();
};
const searchFilter = require("./src/filters/searchFilter");
module.exports = function(config) {
...
config.addFilter("search", searchFilter);
config.addCollection("movies", collection => {
return [...collection.getFilteredByGlob("./src/movies/**/*.md")];
});
...
};
---
permalink: /search-index.json
---
{{ collections.movies | search | dump | safe }}
Now we have an index, let's use it on a page with just HTML and a little JavaScript. With the code below, we're instructing the browser to fetch search-index.json, load it into elasticlunr, search it, and finally add the results to an <ol>
element.
<div class="field">
<label for="searchField">Search</label>
<input type="search" placeholder="Search..." id="searchField" />
</div>
<ol id="searchResults" />
<div id="noResultsFound" style="display: none">
<p>No results found.</p>
</div>
<!--Only 5.7kb GZipped. You may want to bundle this with your application code. -->
<script src="https://cdnjs.cloudflare.com/ajax/libs/elasticlunr/0.9.6/elasticlunr.min.js"></script>
(function (window, document) {
"use strict";
const search = (e) => {
const results = window.searchIndex.search(e.target.value, {
bool: "OR",
expand: true,
});
const resEl = document.getElementById("searchResults");
const noResultsEl = document.getElementById("noResultsFound");
resEl.innerHTML = "";
if (results.length) {
noResultsEl.style.display = "none";
results.forEach((r) => {
const { id, title, excerpt } = r.doc;
const el = document.createElement("li");
resEl.appendChild(el);
const h3 = document.createElement("h3");
el.appendChild(h3);
const a = document.createElement("a");
a.setAttribute("href", id);
a.textContent = title;
h3.appendChild(a);
const p = document.createElement("p");
p.textContent = excerpt;
el.appendChild(p);
});
} else {
noResultsEl.style.display = "block";
}
};
fetch("/search-index.json").then((response) =>
response.json().then((rawIndex) => {
window.searchIndex = elasticlunr.Index.load(rawIndex);
document.getElementById("searchField").addEventListener("input", search);
})
);
})(window, document);
Keep in mind the number of pages in your collection and how much content you're indexing. Since this is client-side search, the entire index will be downloaded by the browser. My demo site of 100 pages has an average of ~150 words indexed per page. The index is 26.5kb GZipped.
Once you've got search up and running, it's worth checking out the ElasticLunr docs and having a play with things like query-time boosting, boolean logic and token expansion.
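Query-time boosting, for example, is just extra search config. A sketch weighting title matches over excerpt matches, with field names matching the index built earlier; `boostedConfig` is a name of my own:

```javascript
// Weight title matches twice as heavily as excerpt matches.
// bool/expand behave as in the search snippet earlier:
// "OR" matches any term, expand treats terms as prefixes.
const boostedConfig = {
  fields: {
    title: { boost: 2 },
    excerpt: { boost: 1 },
  },
  bool: "OR",
  expand: true,
};

// In the page: window.searchIndex.search(query, boostedConfig);
```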
Use npm outdated and npm update to check for and update to newer versions of your installed node modules using npm's built-in commands.
- Current is what you have installed
- Wanted is the latest version that satisfies the semver range in package.json
- Latest is, you guessed it, the latest

Red items mean the wanted version is also the latest.
This will update all packages to the wanted
version.
My package.json has "react": "^16.13.1" listed as a dependency. npm update would change this to "react": "^16.14.0", not v17.
Running npm outdated again would show that every package is updated to the wanted version.
Individual packages can be updated by listing the package names in the command e.g. npm update react react-dom
npm update
only gets you as far as your semver range. If you want to break above that and get the latest, append @latest
to the package name. e.g. npm install react@latest
All three commands work globally with the addition of -g
or --global
npm outdated -g
npm update -g
npm install -g package@latest