Agenda

SEO Basics

Search Engine Optimization

Users commonly rely on search engines to navigate the vast quantity of content available online. Web content is first indexed by a search engine, and sophisticated algorithms then assess how well the indexed pages match the keywords in a search query. Search engine optimization (SEO) refers to the best practices and techniques that influence your pages' potential ranking.

SEO ties closely into both semantics and standards. The more effectively your markup's semantics describe the content, the better search engine algorithms can interpret that content, which results in better rankings.

Long gone are the days of SEO snake oil, when shady practices could still produce positive results. Today, most search engine algorithms are so well developed that the best policy is simply well marked up content!

Search Engines

How search engines work

Search engine limitations

Best Practices

Here are some best practices for markup and site design that support optimal SEO.

Site Load Time

To be effective for search engines (and impatient users), a page and all of its resources should load in the client browser in under 4 seconds. There are several online services for testing page speed from servers around the globe (e.g. pingdom.com's page performance analysis). The following tools and techniques can reduce the size and quantity of files sent to the client.

Minified Code

Cache Control

GZip Compression

You can test GZip compression on any URL to see if it is enabled and how much bandwidth was saved.
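
Exactly how compression is enabled depends on your server. As a rough sketch, on an Apache server with mod_deflate available, an .htaccess rule like the following compresses common text-based responses before they are sent (the MIME types listed here are just an example):

<IfModule mod_deflate.c>
	# compress text-based responses before sending them to the client
	AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>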

Site Structure

The names of your files and folders, as well as the parent-child structure of the folders, should be descriptive and indicative of the site's information architecture. For example, a URL like example.com/products/running-shoes/ tells both users and search engines more than example.com/page3/item27.html.

Content

Content within your site is one of the biggest factors you get ranked on. Be sure it is accurate and consistent with the other page elements, such as the meta name="description", the title, and any headings.
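
For example, the title, meta description, and main heading of a page can all reinforce the same topic (the page and wording below are invented for illustration):

<!-- title, description and heading all describe the same content -->
<title>Handmade Leather Boots | Example Shoe Co.</title>
<meta name="description" content="Handmade leather boots, crafted to order and shipped across Canada.">

<!-- later, in the body -->
<h1>Handmade Leather Boots</h1>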

Keywords

The keywords and phrases a user provides when querying a search engine are the basis for determining the best match. A website's target keywords and phrases should appear organically within the site content.

Choosing useful keywords

Submit URLs To Search Engines

Search engines regularly crawl the content of the web to keep their rankings up to date. During this process, it is only a matter of time before new pages are 'discovered' and added to the index, eventually becoming candidates to appear in search query results. New websites, especially commercial enterprises, will want to expedite this process by explicitly notifying the major engines that their site is online and ready to be indexed.

Search Engine Submissions

Disallowing SEO

As a general rule, all publicly available web content should be optimized for search engines. However, if there are sections of your site you do not want indexed, there are several options.

Using meta Tag To Disallow Indexing

You can use the robots meta tag to prevent search engines from indexing a page.

<!-- don't index this page (noindex), don't follow the links on this page (nofollow) -->
<meta name="robots" content="noindex, nofollow">

Using robots.txt To Disallow Indexing

To disallow indexing using robots.txt, place a text file named robots.txt in your server root that includes the following:

User-agent: *
Disallow: /

Alternatively, robots.txt can also disallow indexing on specific folders or files:

User-agent: *
Disallow: /privateFolder/
Disallow: /tmp/file.html

Disallow Per Link

If your pages include any links that you don't want factored into the ranking algorithm, you can add a rel="nofollow" attribute to them, e.g.:

<!-- don't pass this page's reputation to the page this links to -->
<a href="http://www.bcit.ca" rel="nofollow">BCIT</a>

Why do this? When you link to another website, you pass some of your site's reputation along. If, for instance, you allow users to add comments in an open forum, someone might post links to external sites, some of which may not deserve your site's reputation.

SEO Resources

Great documents on SEO

Google's own Search Engine Optimization guide

The Periodic Table of Search Engine Optimization

Front End Toolsets

Front end toolsets provide a starting foundation for new web development projects.

HTML5 Boilerplate

HTML5 Boilerplate is a front end template, essentially a collection of starter code useful for small web projects. For large-scale projects, it may be better not to rely on prepackaged CSS rules and web server configurations.

HTML5 Boilerplate includes several useful files, styles and server configuration files:

To get the most out of HTML5 Boilerplate, review the various files and features available:

Bootstrap

Bootstrap is a front end framework that provides various responsive layout solutions for developers to build on. A small- to medium-sized web project may benefit from the prewritten styles, which are especially useful for rapid prototyping of page designs.

Bootstrap features:

There are two options for using Bootstrap:

  1. Download Bootstrap to your server and use the files locally.
  2. Load Bootstrap via a CDN.

Begin with an HTML starter template and apply Bootstrap classes and ids to your semantic HTML. Bootstrap's primary layout construct is the .container, which holds one or more .row elements, each of which can be subdivided into up to 12 columns. The Bootstrap grid system is responsive by default, but can be further customized to developer preferences.
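
As a minimal sketch (assuming Bootstrap's CSS is already loaded, either locally or from a CDN), the grid pattern looks like this, using Bootstrap's col-md-* column classes:

<!-- a .container holds .row elements; each row divides into up to 12 columns -->
<div class="container">
	<div class="row">
		<div class="col-md-8">Main content (8 of 12 columns)</div>
		<div class="col-md-4">Sidebar (4 of 12 columns)</div>
	</div>
</div>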

Other Front End Tools And Templates

CMS

CMS (Content Management Systems) provide rich functionality along with well tested code. Customizing templates can be challenging for some of these platforms, but they do provide a client-friendly interface suitable for content updates.

Blog CMS

Other CMS

SASS

SASS (Syntactically Awesome StyleSheets) is a CSS preprocessor and scripting language that provides programming-like capabilities to CSS. You can use variables and mixins to write easier-to-maintain, more powerful CSS code.

SASS documents use a CSS-like syntax, and are written as .scss files. A SASS preprocessor is used to compile the .scss into a .css file, which can then be applied to HTML as usual.

SASS Installation On VS Code

Writing SASS with VS Code is easiest using one of the many available plugins:

Other SASS Installation Options

The Ruby programming language is required to use SASS. Mac users already have Ruby installed. Windows users can use the Ruby Installer.

With Ruby installed, SASS can be run from the command line.
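
For example, the sass command can compile a single file, or watch it and recompile whenever it changes (the file names here are placeholders):

# compile style.scss into style.css
sass style.scss style.css

# recompile automatically whenever the source changes
sass --watch style.scss:style.css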

If you want to avoid using the command line when using SASS, try one of these user-friendly GUIs:

Using SASS

Workflow

  1. The developer writes a .scss file instead of .css
  2. SASS compiler translates the .scss into .css
  3. Apply CSS to HTML as usual

Remember: when using SASS, you do not edit your .css files. Edit .scss files and use the SASS compiler to update the CSS.

Commenting

SASS supports double-slash, single-line comments in addition to the usual CSS commenting syntax. Double-slash comments are removed from the compiled CSS, while standard CSS comments are kept in the (uncompressed) output.

/*
SASS: standard CSS commenting is supported
*/

//SASS: double-slash single line comments are also supported

Variables

Possibly the most used feature of a preprocessor like SASS is the ability to use variables in your CSS. Variables provide an easy mechanism for editing and updating CSS rules that affect the color scheme or layout.

//define variables with $
$flexItemWidth: 50%;
$defaultColor: blue;

//apply variables as values to CSS properties
.item{
	width: $flexItemWidth;
	color: $defaultColor;
}
.another_item{
	background-color: $defaultColor;
}
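
After compilation, each variable is replaced by its value; the generated CSS looks roughly like this (exact whitespace depends on the compiler's output settings):

.item{
	width: 50%;
	color: blue;
}
.another_item{
	background-color: blue;
}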

Nested Rules

CSS rules can be nested to indicate a descendant selector relationship. This can be useful for module-specific styles. Rules can be nested as deeply as needed, but more than 3 nested levels tends to produce code that is difficult to read and understand.

//rules for the header
header{
	color:blue;
	//rule for selector 'header img'
	img{
		max-width:100%;	
	}
	//rule for selector 'header nav'
	nav{
		float:left;
		//rule for selector 'header nav a'
		a{
			text-decoration:none;
		}	
	}
}
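
Each nested rule compiles into a descendant selector, so the generated CSS is roughly:

header{
	color:blue;
}
header img{
	max-width:100%;
}
header nav{
	float:left;
}
header nav a{
	text-decoration:none;
}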

Nested Namespaces

CSS properties with shared namespaces (font, background, border, margin, padding etc.) can be nested.

//assign several properties at once 
body{
	font: {
		size: 1.25em;
		weight:normal;
		family:"verdana", sans-serif;
	}
}
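
The nested namespace compiles into the individual longhand font-* properties:

body{
	font-size: 1.25em;
	font-weight: normal;
	font-family: "verdana", sans-serif;
}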

Referencing Parent Selectors

A nested rule can reference its parent using the & character.

//parent selector is defined...
a{
	color:blue;
	//reference parent selector with &
	&:hover{
		color:yellow;
	}
}
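
The & is replaced by the parent selector, so the compiled CSS is roughly:

a{
	color:blue;
}
a:hover{
	color:yellow;
}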

Mixins

A collection of CSS properties that is re-used throughout the stylesheet can be defined once as a mixin, then applied whenever needed.

//define mixin
@mixin nice-styles{
	box-shadow:5px 5px 10px #5d5d5d;
	border-radius:30px;
	padding:10px;
}
//apply mixin 
section{
	@include nice-styles;
}

//define mixin with parameters
@mixin shaded-box($xoffset, $yoffset, $rounded-corners){
	box-shadow:$xoffset $yoffset 10px #5d5d5d;
	border-radius:$rounded-corners;
}
//apply mixin with arguments
aside{
	@include shaded-box(5px, 5px, 20px);
}
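
In both cases the mixin's declarations are expanded in place, so the compiled CSS is roughly:

section{
	box-shadow:5px 5px 10px #5d5d5d;
	border-radius:30px;
	padding:10px;
}
aside{
	box-shadow:5px 5px 10px #5d5d5d;
	border-radius:20px;
}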

Imports

A .scss file can import other .scss files. All of them are compiled together, producing a single CSS file even when there are several source files.

//import css reset
@import "cssreset.scss";

//import mixins
@import "my-mixin.scss";

Explore SASS Further

The SASS preprocessor provides useful features for reducing repetitive code bloat and increasing the ease of maintenance. The larger the web project, the more useful SASS capabilities become.

To Do

The session 10 quiz will be closed book, written.