Users commonly use search engines to navigate the vast quantity of content available online. Web content is first indexed by a search engine, and sophisticated algorithms are used to assess the best match of indexed pages to search query keywords. Search engine optimization (SEO) refers to the best practices and techniques that influence your pages' potential ranking.
SEO ties closely into both semantics and standards. The more effectively the semantics explain the content, the more sense search engine algorithms can make of it, resulting in superior rankings.
Long gone are the days of SEO snake oil, when shady practices could produce positive results. Today, most search engine algorithms are so well developed that the best policy is: well-marked-up content!
Here are some of the best practices for proper markup and site design for optimal SEO:
Use a descriptive title tag, typically combining the company or site name and the page title:
<title>companyName : pageTitle</title>
For the meta tag, the name="description" attribute is used by search engines; name="keywords" is not:
<meta name="description" content="The first several words of this attribute are displayed along with many search engine results" />
The description attribute is the content users see in many search engine results. Be accurate and descriptive, but be sure to summarize the content of the page.
To be effective for search engines (and impatient users), a page and all its resources should load in the client browser in under 4 seconds. There are several services online for testing page speed from various servers around the globe (eg: pingdom.com's page performance analysis). The following tools and techniques can reduce the size and quantity of files sent to the client.
Browser caching of static resources can be enabled by setting Cache-Control headers in an .htaccess file. Cache duration is defined in seconds (eg: 1 month is 2628000 seconds, 1 year 31536000, etc). For example:
#add this code to your .htaccess
#the pipe delimited lists describe which file types to target
#max-age=31536000 describes the duration of the cache in seconds
#cache image files for one year
<filesMatch "\.(jpg|jpeg|png|gif|ico)$">
Header set Cache-Control "max-age=31536000, public"
</filesMatch>
#cache css and js files for one month
<filesMatch "\.(css|js)$">
Header set Cache-Control "max-age=2628000, public"
</filesMatch>
#if your html content is regularly updated, set a 1 day cache
<filesMatch "\.(html)$">
Header set Cache-Control "max-age=86400, public"
</filesMatch>
#if needed, you can also disable caching for specific file types
<filesMatch "\.(pdf)$">
Header set Cache-Control "max-age=0, no-cache, no-store, must-revalidate"
</filesMatch>
If a cached file changes before its cache duration expires, rename (version) the file so browsers fetch the new copy. For example, changing href="/styles/style_1.3.css" to href="/styles/style_2.0.css" ensures style_1.3.css will not be used even if its cache duration has yet to expire.
GZip compression reduces the size of text-based files sent to the client. Enable it by adding the following to your .htaccess file:
#the file types in the pipe delimited list here
#will ensure browsers that can't handle GZip won't get it
<IfModule mod_headers.c>
<FilesMatch "\.(js|css|xml|gz|html)$">
Header append Vary Accept-Encoding
</FilesMatch>
</IfModule>
#files types defined here will use GZip compression
#for any client browsers that can handle GZip
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/xml
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript
You can test GZip Compression on any URL to see if it is enabled, and how much bandwidth was saved.
The names of your files and folders, as well as the parent-child structure of the folders should be descriptive and indicative of the site's information architecture.
A URL like /url?ejamcirkanfmnadcmgnskd is not very descriptive; /products/dogfood/specials is better.
Avoid placing important content inside iframes, flash files or java applets, as search engines may not be able to index it.
The content within your site is one of the biggest elements that you get ranked on. Be sure it is accurate and reflects the other page elements like the meta name="description", title and any headings.
img tags should include descriptive alt attributes, and can be wrapped in figure / figcaption tags that summarize the content.
The keywords and phrases a user provides when querying a search engine are the basis for determining the best match. A website's target keywords and phrases should appear organically within the site content.
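The img/alt and figure/figcaption guidance above can be sketched as follows (the file name, alt text and caption are hypothetical):

```html
<!-- the alt text describes the image for search engines and screen readers -->
<figure>
  <img src="images/husky-puppy.jpg" alt="A grey and white Siberian Husky puppy" />
  <figcaption>Husky puppies: this week's featured breed.</figcaption>
</figure>
```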
Search engines regularly crawl the content of the web to keep their rankings up to date. During this procedure, it is only a matter of time before new pages are 'discovered' and added to the index, eventually becoming candidates to appear in search query results. New websites, especially commercial enterprises, will want to expedite this process by explicitly notifying the major engines that their site is online and ready to be indexed.
As a general rule, all publicly available web content should be optimized for search engines. However, if there are sections of your site you do not want indexed, there are several options.
You can use a meta tag to prevent search engines from indexing a page:
<!-- don't index this page (noindex); don't index pages that this page links to (nofollow) -->
<meta name="robots" content="noindex, nofollow">
To disallow all indexing using robots.txt, place a text file named robots.txt
in your server root that includes the following:
User-agent: *
Disallow: /
Alternatively, robots.txt
can also disallow indexing on specific folders or files:
User-agent: *
Disallow: /privateFolder/
Disallow: /tmp/file.html
If your pages include any links to pages that you don't want included in the ranking algorithm, you can add a rel="nofollow"
attribute, eg:
<!-- don't include the page this links to when assessing this page -->
<a href="http://www.bcit.ca" rel="nofollow">BCIT</a>
Why do this? When you link to another website, you are passing your site's reputation along. If, for instance, you are allowing users to add comments in an open forum, someone might post links to external sites, some of which may not be deserving of your site's reputation.
For further reading, see Google's own Search Engine Optimization guide.
Front end toolsets provide a starting foundation for new web development projects.
HTML5 Boilerplate is a front end template, essentially a collection of starter code useful for small web projects. For large scale projects, it may be better not to rely on prepackaged configurations of CSS rules and web server settings.
HTML5 Boilerplate includes several useful files, styles and server configuration files:
To get the most out of HTML5 Boilerplate, review the various files and features available:
Bootstrap is a front end framework that provides various responsive layout solutions for developers to build on. A small to medium sized web project may benefit from the prewritten styles, especially useful for rapid prototyping of page designs.
Bootstrap features:
There are two options for using Bootstrap:
Begin with an HTML starter template and apply Bootstrap classes and ids to your semantic HTML. Bootstrap's primary layout style is the .container class, which contains one or more .row elements, each of which can be subdivided into up to 12 column sections. The Bootstrap grid system is responsive by default, but can be further customized to developer preferences.
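As a sketch, a two-column layout using this grid might look like the following (the col-md-* class names assume Bootstrap's standard 12-column grid; the content is hypothetical):

```html
<div class="container">
  <div class="row">
    <!-- 8 of 12 columns on medium screens and up -->
    <main class="col-md-8">Main content</main>
    <!-- the remaining 4 of 12 columns -->
    <aside class="col-md-4">Sidebar</aside>
  </div>
</div>
```

On narrow screens the two columns stack vertically by default; the col-md-* breakpoint controls when they sit side by side.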
CMS (Content Management Systems) provide rich functionality along with well tested code. Customizing templates can be challenging for some of these platforms, but they do provide a client-friendly interface suitable for content updates.
SASS (Syntactically Awesome StyleSheets) is a CSS preprocessor and scripting language that provides programming-like capabilities to CSS. You can use variables and mixins to write easier-to-maintain, more powerful CSS code.
SASS documents use a CSS-like syntax, and are written as .scss files. A SASS preprocessor is used to compile the .scss into a .css file, which can then be applied to HTML as usual.
Writing SASS with VS Code is easiest using one of the many available plugins (eg: EasySass). With a .scss file open, press F1 to open the command line and type Compile all SCSS/SASS files in the project. Whenever you save a .scss file, VS Code will compile the appropriate .css; EasySass will also create a .min.css minified version. Compiler output can be reviewed under View > Output > EasySass.
The Ruby programming language is required to use SASS. Mac users already have Ruby installed; Windows users can use the Ruby Installer.
With Ruby installed, SASS can be run from the command line.
If you want to avoid using the command line when using SASS, try one of these user-friendly GUIs:
With these tools you edit the .scss file instead of the .css, and the tool compiles the .scss into .css for you.
Remember: when using SASS, you do not edit your .css files. Edit .scss files and use the SASS compiler to update the CSS.
SASS supports double-slash, single line comments in addition to the usual CSS commenting syntax.
/*
SASS: standard CSS commenting is supported
*/
//SASS: double-slash single line comments are also supported
Possibly the most-used feature of a preprocessor like SASS is the ability to use variables in your CSS. This provides an easy mechanism for the editing and updating of CSS rules that affect the color scheme or layout.
//define variables with $
$flexItemWidth: 50%;
$defaultColor: blue;
//apply variables as values to CSS properties
.item{
width: $flexItemWidth;
color: $defaultColor;
}
.another_item{
background-color: $defaultColor;
}
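Compiled, the SCSS above produces plain CSS with the variable values substituted in:

```css
.item {
  width: 50%;
  color: blue;
}
.another_item {
  background-color: blue;
}
```

Changing a single variable definition (eg: $defaultColor) updates every rule that uses it on the next compile.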
CSS rules can be nested to indicate a descendant selector relationship. This can be useful for module-specific styles. Rules can be nested as deeply as needed, but more than 3 nested levels tends to result in code that is difficult to read and understand.
//rules for the header
header{
color:blue;
//rule for selector 'header img'
img{
max-width:100%;
}
//rule for selector 'header nav'
nav{
float:left;
//rule for selector 'header nav a'
a{
text-decoration:none;
}
}
}
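The nested rules above compile to ordinary descendant selectors:

```css
header { color: blue; }
header img { max-width: 100%; }
header nav { float: left; }
header nav a { text-decoration: none; }
```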
CSS properties with shared namespaces (font, background, border, margin, padding, etc.) can be nested.
//assign several properties at once
body{
font: {
size: 1.25em;
weight:normal;
family:"verdana", sans-serif;
}
}
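The namespace nesting above compiles to the individual longhand properties:

```css
body {
  font-size: 1.25em;
  font-weight: normal;
  font-family: "verdana", sans-serif;
}
```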
A nested rule can reference its parent using the &
character.
//parent selector is defined...
a{
color:blue;
//reference parent selector with &
&:hover{
color:yellow;
}
}
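When compiled, each & is replaced with the parent selector, producing:

```css
a { color: blue; }
a:hover { color: yellow; }
```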
A collection of CSS properties that is re-used throughout the stylesheet can be defined once as a mixin, then applied whenever needed.
//define mixin
@mixin nice-styles{
box-shadow:5px 5px 10px #5d5d5d;
border-radius:30px;
padding:10px;
}
//apply mixin
section{
@include nice-styles;
}
//define mixin with parameters
@mixin shaded-box($xoffset, $yoffset, $rounded-corners){
box-shadow:$xoffset $yoffset 10px #5d5d5d;
border-radius:$rounded-corners;
}
//apply mixin with arguments
aside{
@include shaded-box(5px, 5px, 20px);
}
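Compiled, each @include is replaced by the mixin's properties, with any arguments substituted in (note that box-shadow values in CSS are space separated):

```css
section {
  box-shadow: 5px 5px 10px #5d5d5d;
  border-radius: 30px;
  padding: 10px;
}
aside {
  box-shadow: 5px 5px 10px #5d5d5d;
  border-radius: 20px;
}
```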
A .scss file can import other .scss files; all of them will be compiled when translated into CSS. The result is a single CSS file even if there are several source files.
//import css reset
@import "cssreset.scss";
//import mixins
@import "my-mixin.scss";
The SASS preprocessor provides useful features for reducing repetitive code bloat and increasing the ease of maintenance. The larger the web project, the more useful SASS capabilities become.
The session 10 quiz will be closed book, written.