Top Website Design Research

Key Points

  • Use F-Shaped Pattern
  • Use Z-Shaped Pattern
  • Don't make users click more than three times to find their answer
  • Too many options often mean NONE will be chosen (choice paralysis)
  • Visitors read longer lines of text faster, but prefer shorter line lengths
  • Your headlines draw even more eyes than images!
  • Image captions are the most consistently read in-post content
  • People follow the “line of sight” of other people
  • Don’t Make Users Wait: Speed Up Your Website
  • Make Your Content Easily Readable
  • Don’t Worry About "The Fold" and Vertical Scrolling
  • Place Important Content on the Left of a Web Page
  • Small Details Make a Huge Difference
  • Don’t Rely on Search as a Crutch for Bad Navigation
  • Your Home Page Isn’t As Important as You Think
  • Golden Ratio in Web Design
    • Considering a page with a 900px total width
      • Content area + sidebar
      • Content area = 900px / 1.62 ≈ 556px
      • Sidebar = 900px - 556px ≈ 344px
    • Considering a rectangular block with a 300px width
      • What is the height?
      • Height = 300px / 1.62 ≈ 185px
  • Mathematics and Design: Golden Ratio + The Rule of Thirds + Grid Systems
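
The arithmetic above as a small reusable helper; a minimal JavaScript sketch (the function name is ours, and 1.618 is the usual value of the golden ratio, which the figures above round to 1.62):

<script>
// Golden ratio, commonly rounded to 1.62 in design articles
var PHI = 1.618;

// Split a total width into a golden-ratio pair: the larger part (e.g. content)
// and the remainder (e.g. sidebar)
function goldenSplit(totalWidth) {
  var major = Math.round(totalWidth / PHI);
  return { major: major, minor: totalWidth - major };
}

goldenSplit(900);        // { major: 556, minor: 344 } -> content area + sidebar
Math.round(300 / PHI);   // 185 -> height of a 300px-wide golden rectangle
</script>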


Click Tracking [Google Analytics]

Event Tracking

<script>
/**
 * Function that tracks a click on a link in Google Analytics.
 * This function takes a valid URL string as an argument, and uses that URL
 * string as the event label. Setting the transport method to 'beacon' lets
 * the hit be sent using 'navigator.sendBeacon' in browsers that support it.
 */
var trackLink = function(url) {
  ga('send', 'event', 'link', 'click', url, {
    'transport': 'beacon',
    'hitCallback': function() { document.location = url; }
  });
};
</script>

On Page Implementation

<a href="http://www.example.com" onclick="trackLink('http://www.example.com'); return false;">Visit example.com</a>
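
As a variant, the handler can be attached unobtrusively instead of inline; a minimal sketch, assuming the trackLink function above and a hypothetical "track-outbound" class on the links to be tracked:

<script>
// Attach the tracker to every link carrying the (hypothetical) class "track-outbound"
var links = document.querySelectorAll('a.track-outbound');
for (var i = 0; i < links.length; i++) {
  links[i].addEventListener('click', function(event) {
    event.preventDefault();  // stop immediate navigation; hitCallback navigates instead
    trackLink(this.href);
  });
}
</script>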

 

Custom Fields in bbPress Topic Form

Creating Fields

add_action( 'bbp_theme_before_topic_form_content', 'bbp_extra_fields' );

function bbp_extra_fields() {
    // Field 1 (the stored value is escaped before being echoed into the input)
    $value = get_post_meta( bbp_get_topic_id(), 'bbp_extra_field1', true );
    echo '<label for="bbp_extra_field1">Extra Field 1</label><br>';
    echo "<input type='text' name='bbp_extra_field1' id='bbp_extra_field1' value='" . esc_attr( $value ) . "'>";

    // Field 2
    $value = get_post_meta( bbp_get_topic_id(), 'bbp_extra_field2', true );
    echo '<label for="bbp_extra_field2">Extra Field 2</label><br>';
    echo "<input type='text' name='bbp_extra_field2' id='bbp_extra_field2' value='" . esc_attr( $value ) . "'>";
}

Saving Custom Fields

add_action( 'bbp_new_topic',  'bbp_save_extra_fields', 10, 1 );
add_action( 'bbp_edit_topic', 'bbp_save_extra_fields', 10, 1 );

function bbp_save_extra_fields( $topic_id = 0 ) {

    if ( isset( $_POST['bbp_extra_field1'] ) && $_POST['bbp_extra_field1'] != '' )
        update_post_meta( $topic_id, 'bbp_extra_field1', sanitize_text_field( $_POST['bbp_extra_field1'] ) );

    if ( isset( $_POST['bbp_extra_field2'] ) && $_POST['bbp_extra_field2'] != '' )
        update_post_meta( $topic_id, 'bbp_extra_field2', sanitize_text_field( $_POST['bbp_extra_field2'] ) );

}

Displaying Fields on the Topic Page

add_action('bbp_template_before_replies_loop', 'bbp_show_extra_fields');

function bbp_show_extra_fields() {

$topic_id = bbp_get_topic_id();

$value1 = get_post_meta( $topic_id, 'bbp_extra_field1', true); 
$value2 = get_post_meta( $topic_id, 'bbp_extra_field2', true);

echo "Field 1: ".$value1."<br>"; 
echo "Field 2: ".$value2."<br>";

}

Permanent 301 Redirect Setup in .htaccess

Redirect old domain to new domain

RewriteEngine on
RewriteCond %{HTTP_HOST} ^example\.com [NC,OR]
RewriteCond %{HTTP_HOST} ^www\.example\.com [NC]
RewriteRule ^(.*)$ http://example.org/$1 [L,R=301,NC]

Force Redirect to NON-www version

RewriteEngine on
RewriteCond %{HTTP_HOST} ^www\.example\.com [NC]
RewriteRule ^(.*)$ http://example.com/$1 [L,R=301,NC]

Force Redirect to WWW version

RewriteEngine on
RewriteCond %{HTTP_HOST} ^example\.com [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301,NC]

Redirect from HTTPS to HTTP

RewriteEngine on
RewriteCond %{HTTPS} on
RewriteRule (.*) http://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

Redirect from HTTP to HTTPS

RewriteEngine on
RewriteCond %{HTTPS} off
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

Redirect files with certain extension

RewriteEngine On
RewriteCond %{REQUEST_URI} \.aspx$
RewriteRule ^(.*)\.aspx$ /$1.php [R=301,L]

Redirect Individual Files

Considering Two Situations

  1. Redirecting a File within the Same Domain
  2. Redirecting a File to Another Domain

Redirect File within the Same Domain

Redirect 301 /file1.htm /file2.htm

Redirect File to Another Domain

Redirect 301 /file1.php http://example.org/file2.php

WordPress Robots.txt Sample

WordPress Robots.txt

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

Adding Sitemaps to WordPress Robots.txt

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

Sitemap: http://www.example.com/post-sitemap.xml

Explanation

Allowing all Bots

  • Allowing any Bots to Crawl
User-agent: *
Disallow:

Not Allowing any Bots

  • Not Allowing any Bots to Crawl
User-agent: *
Disallow: /

Block a Folder

User-agent: *
Disallow: /Folder/

Block a File

User-agent: *
Disallow: /file.html

Block a Page and/or a Directory Named private

User-agent: *
Disallow: /private

Block All Sub Folders starting with private

User-agent: *
Disallow: /private*/

Block URLs Ending with a Certain Extension

User-agent: *
Disallow: /*.asp$

Block URLs That Include a Question Mark (?)

User-agent: *
Disallow: /*?*

Block a File Type

User-agent: *
Disallow: /*.jpeg$

Block All Paginated Pages That Don't Have "?" at the End

  • http://www.example.com/blog/? ( allowed )
  • http://www.example.com/blog/?page=2 ( blocked )

This helps us block paginated pages from being crawled:

User-agent: *
Disallow: /*? # block URL that includes ?
Allow: /*?$ # allow URL that ends in ?

Using Hash

# A hash (#) marks the rest of the line as a comment

Bots / User Agents

Top 10 Bots

  • bingbot
  • Googlebot
  • Googlebot Mobile
  • AhrefsBot
  • Baidu
  • MJ12bot
  • proximic
  • A6
  • ADmantX
  • msnbot/2.0b

Individual Crawl Rules for Each Bot

User-Agent: Googlebot
Allow: /

User-Agent: Googlebot-Mobile
Allow: /

User-Agent: msnbot
Allow: /

User-Agent: bingbot
Allow: /

# Adsense
User-Agent: Mediapartners-Google
Disallow: / 

# Blekko
User-Agent: ScoutJet
Allow: / 

User-Agent: Yandex
Allow: / 

# CommonCrawl
User-agent: ccbot
Allow: / 

User-agent: baiduspider
Allow: / 

User-agent: DuckDuckBot
Allow: / 

# All other bots
User-Agent: *
Disallow: /

Cross Domain Tracking [Google Analytics]

Cross-domain tracking in Google Analytics lets you track the same visitor across different domains.

Usage

Tracking the same visitor navigating across

  1. Various Domain Names
  2. Various Subdomains
  3. HTTP and HTTPS versions of the same or a different domain
  4. 3rd-Party Shopping Carts
  5. iFrames

Linking

( Example )

Consider two domains :

  1. example-1.com : Primary Domain
  2. example-2.com : Secondary Domain

Primary Domain

<script>

(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');

ga('create', 'UA-XXXXXXX-Y', 'auto', {'allowLinker': true});
ga('require', 'linker');
ga('linker:autoLink', ['example-2.com'] );
ga('send', 'pageview');

</script>

Secondary Domain

<script>

(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');

ga('create', 'UA-XXXXXXX-Y', 'auto', {'allowLinker': true});
ga('require', 'linker');
ga('linker:autoLink', ['example-1.com'] );
ga('send', 'pageview');

</script>

Three or More Domains

On each domain, list all of the other domains in the autoLink call:

ga('linker:autoLink', ['example-2.com', 'example-3.com']);
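
If visitors also cross domains through form submissions, the analytics.js linker plugin takes two optional flags; a minimal sketch (the flag values here are illustrative):

ga('linker:autoLink', ['example-2.com', 'example-3.com'], false, true);
// 3rd argument (false): keep the linker parameter in the query string, not the # fragment
// 4th argument (true): also decorate forms that submit to the listed domains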

 

SEO Cannibalisation

SEO Cannibalisation, as used here, is the process of consolidating many similar pages, which dilute SEO value, into a single page. (The related term "keyword cannibalisation" usually names the underlying problem: multiple pages competing for the same keyword.)

Similar Pages

  • Many Pages with nearly identical Page Titles
  • Many Pages with content targeting the same keyword

Reasons for SEO Cannibalisation

Have too many similar pages?

  • It could confuse Google
  • It could dilute the SEO Value
  • It could decrease the overall Conversion Rate
  • It could confuse Website visitors
  • It lacks focus

So we need to do keyword cannibalisation, i.e. consolidation

SEO Dilution

Issues on having too many similar Pages: 

  • Internal Linking - Linking to various pages with the same Anchor Text dilutes SEO value
  • Backlinks - Backlinks spread across various similar pages are worth less overall than if they all pointed to a single page
  • Content Quality - Ideas and research about a single topic, divided across many different pages, dilute content quality
  • Conversion Rate - Various similar pages convert at various rates, so the overall conversion rate will be lower

Consolidation

Considering a set of similar pages:

Choosing THE Best Page to Cannibalise Into

A page with:

  • A higher Conversion Rate
  • High-Quality Content
  • High-Quality Backlinks

Consolidate SEO Value

  • Consolidate content from the other similar pages into The Single Page
  • Update internal links and their anchor text to point to The Single Page
  • Update backlinks from other sites to The Single Page where possible
  • Use 301 redirects from the other pages to The Single Page (a minimal sketch follows)
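
A minimal .htaccess sketch of the 301 step, with hypothetical paths:

# Send each duplicate page (hypothetical paths) to the consolidated page
Redirect 301 /old-duplicate-page-1/ http://www.example.com/consolidated-page/
Redirect 301 /old-duplicate-page-2/ http://www.example.com/consolidated-page/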

Advantages of SEO Cannibalisation

  • Google is no longer confused
  • Solves the issue of SEO Dilution
  • Website visitors are no longer confused
  • Improves the overall Conversion Rate

It's good to FOCUS!

SEO for a Historical Website

SEO (Search Engine Optimisation) for a website with a big history behind it: a website that has been active, with regular updates, over a long period of time.

Refining the Website Structure

  • Proper Internal Linking

Improving the User Interface of the Website

  • Making it easy for users to navigate the historical website and find relevant information
  • Allowing Social Sharing
  • Making it more interactive

Fixing the 404 Error Backlinks

  • 404 is an HTTP status code meaning the page was not found; a backlink that lands on a 404 is broken, so its link juice is wasted
  • Redirecting the broken URL to an appropriate live page passes that value back to the page and the website, as sketched below
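
A minimal .htaccess sketch, with hypothetical paths:

# Point a broken inbound URL (hypothetical) at the closest relevant live page
Redirect 301 /old-deleted-article.html /articles/replacement-article/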

Off-Site Backlink Cleanup

  • Research the website's old backlinks
  • Remove unwanted backlinks (see the disavow sketch below)
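
For unwanted links that the linking sites won't remove, Google's Disavow Tool accepts a plain-text file; a minimal sketch of its format, with hypothetical domains:

# Hypothetical disavow file, uploaded via Google's Disavow Tool
# Lines starting with # are comments
domain:spammy-directory.example.com
http://low-quality-site.example.com/page-linking-to-us.html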

Improving the Title and Meta Description

  • The Title contributes the most towards increasing Click-Through Rate (CTR)
  • The Meta Description is the second most important contributor to CTR; a sketch of both follows
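
A minimal HTML sketch, with placeholder text:

<title>Primary Keyword - Brand Name</title>
<meta name="description" content="A compelling one-or-two-sentence summary that earns the click.">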

Implementing Rich Snippets

This is also called "adding semantics to the website"

  • Making Google understand each part of the website
  • Highlighting to Google the meaning of each section of the website
    • Use Google Webmaster Tools to highlight this information
    • Or implement it via markup in the backend code, as sketched below
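
A minimal sketch of schema.org markup using JSON-LD, with placeholder values:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Article",
  "headline": "Placeholder headline",
  "datePublished": "2015-01-01",
  "author": {
    "@type": "Person",
    "name": "Placeholder Author"
  }
}
</script>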

Other ideas for the Historical Website

  • Setting up an XML Sitemap for Google
    • for normal pages
    • for Images
    • for News
    • and setting each URL's priority and change frequency (a sketch follows this list)
    • Finally, submit the XML Sitemap to Google via Webmaster Tools
  • Check for duplicate Titles, duplicate content, and short meta descriptions, and fix them
  • Check if Canonicalisation is necessary
  • Check if Cannibalisation is necessary
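
A minimal XML Sitemap sketch, with placeholder values:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2015-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>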