It’s possible that you did not build your website yourself.  You may have trusted a webmaster to do it, or you may be using a Content Management System (CMS) or an online instant website-building tool with many pre-built elements that you do not control.  As a result, you have no idea whether your website is optimized for search engines like Google and Bing to crawl and index properly.  You are also unsure whether your website gives your audience a good user experience.

Below are questions you may want to ask about your website to see if it is optimized, not only for search engines but also for your users, along with tools to help you answer them.

What do you do if many elements of your website are not optimized?  Ask your webmaster to fix them, or, in the worst case, when a fix is difficult or impossible, rebuild your website with search engine optimization in mind.  It will probably be worth it if you want traffic to your site to increase substantially, or if you want your website to effectively serve the purpose you built it for in the first place.

How many pages on my site are indexed by search engines?

Do you know how many pages are on your site?  Are they all indexed by Google and other search engines?

To find out:

Go to Google and enter in the search box:  site:yourwebsite.com (replace “yourwebsite.com” with your own domain).

Google will tell you how many results (pages) it finds.

If Google reports fewer pages than you expect, you may have a crawler problem.  If it reports more, Google might be indexing a lot of unimportant content, which might hinder it from crawling your site effectively.

This site search will also show whether you are using a subdomain.  Remember that Google treats subdomains as different websites.  If you have a powerful site and you need the content on your subdomain to be visible in the search engines, use a subfolder instead of a subdomain so your important pages can inherit the authority of your main domain.

Only one page, or very few pages, on my site are being indexed.  What’s wrong?

After doing a site search in Google or Bing, you might find that the number of pages indexed is strikingly low compared to the actual number of pages on your site.

Look at your robots.txt to see if any of the pages you want indexed are “disallowed”.

Your robots.txt is usually found at yourdomain.com/robots.txt.
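As an illustration, here is what an accidental site-wide block looks like in robots.txt, next to a safer setup (the paths are hypothetical):

```
# BAD: blocks ALL compliant crawlers from the ENTIRE site -- a common accident
User-agent: *
Disallow: /
```

```
# BETTER: block only administrative or technical areas
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/
```

If a page you want indexed matches a Disallow rule, search engines will not crawl it.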

Search Status:

What does the search engine spider see on my site?

Search engine bots or spiders crawl millions of web pages per day.  They “read” the content of web pages so that the search engine can index them and rank them in search results pages.  Search engine bots are machines, and the way they read a web page is not necessarily the way humans see it.

Certain elements of websites that humans can see may not be seen, or may be seen only in a limited way, by search engines.  Examples are Flash files, graphics, videos, JavaScript links and more.  A search engine cannot consider content that it cannot see.  Global navigation made with JavaScript, for example, may not be seen and crawled by search engines.

Just as importantly, hidden, spammy content that humans cannot see can get your site penalized or banned.

Below are tools you can use to see what the search engines see on your website:


Search Engine Spider Simulator

Internet Marketing Ninjas On-page Optimization Tool

Go to Google, search for your page, click on the preview arrow > cached > text only version

Web Developer add-on – disable JavaScript, disable Meta Redirects, disable Cookies (search engines don’t crawl with cookies enabled), disable CSS (to reveal links hidden inside a hidden div)




What HTML code or CSS property is causing a certain appearance on my page?  What do sections of my page look like in code?


When looking at your website, there might be certain elements in its design that you would like to change, such as font sizes, color and background, alignment, border color, etc.

Firebug addon


How does my site look in Mac Safari, on mobile phones, or in other browsers?

The browser you are using may not be the browser your audience is using.  Different browsers may display your content differently.  For example, if your audience primarily accesses your site via smartphones, you should know how it looks on those devices by using the User Agent Switcher addon.






Am I targeting the right keyword?

Keyword research and keyword targeting are the foundation of search engine optimization.  You have to optimize every page of your website to target keywords that are popular and actually being used by searchers.

Here are some of the best keyword tools to help you determine whether you are targeting the keywords that people actually use to look for the information on your site:

Google keyword tool:

Keyword Discovery:


With Google Insights for Search, you can compare search volume patterns across specific regions, categories, time frames and properties:


Is my site relevant to a keyword?

Find out how strong a keyword message your content is sending to the search engines.  Using your target keyword in your content increases its relevancy signal for that keyword.

Note, though, that the search engines are sophisticated enough to know if you are intentionally stuffing your keyword into your content; they will see such pages as spammy attempts to game their index.  This will almost certainly lower your page’s ranking, if your page is shown at all.

There is no magic percentage to aspire to in terms of keyword density.  Write for people, not search engines, by making sure that your keywords appear naturally in your content and are not being “stuffed”.

Use the following tools to determine your keyword density:

Keyword Density Analysis Tool

Dave Naylor’s Keyword Density Tool

SEOmoz Term Target Tool (members only)

Keyword Density & Prominence Analyzer
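If you prefer a quick check of your own, keyword density for a single-word keyword can be approximated with a short script like this (a rough sketch with made-up sample text; the tools above also handle multi-word phrases and prominence):

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` that exactly match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# Hypothetical page copy used for illustration
page = "SEO tips: good seo starts with content, and SEO tools help."
print(round(keyword_density(page, "seo"), 1))  # 3 of 11 words -> 27.3
```

If the number comes out high, that usually reads as “stuffed” to both users and search engines.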


How can I easily combine keywords to come up with phrases relevant to my site?

You might want to broaden your keyword horizons by generating more keyword targets for your site.  That way, you get more keywords driving traffic to your site.  One way to do this is to cross-combine single keywords and search phrases into new keyword phrases.

Search Combination
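Cross-combining keyword lists can also be done in a few lines of code; this sketch uses Python’s itertools (the seed word lists are invented for illustration):

```python
from itertools import product

# Hypothetical seed lists -- replace with your own modifiers and head terms
modifiers = ["cheap", "best"]
head_terms = ["running shoes", "trail shoes"]

# Every modifier paired with every head term
phrases = [" ".join(combo) for combo in product(modifiers, head_terms)]
print(phrases)
# ['cheap running shoes', 'cheap trail shoes', 'best running shoes', 'best trail shoes']
```

With longer seed lists, the list of candidate phrases grows multiplicatively, which is exactly what you want when hunting for long-tail keywords.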


Do my pages have Page Title and Description Tags?  Do I have duplicate tags on my pages?

The Page Title is the most important tag from which search engines get a signal about what your web pages are about (i.e., what keywords are relevant to them).  The Description Tag helps make your pages enticing to click when they appear in search engine results pages.  It is important for every page of your website to have a different Page Title and Description.
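In the HTML, both tags live in the page’s head; a minimal example (the wording is illustrative):

```html
<head>
  <!-- Unique, keyword-relevant title for this page -->
  <title>Blue Widgets - Example Store</title>
  <!-- Unique description shown as the snippet in search results -->
  <meta name="description" content="Hand-made blue widgets with free shipping on orders over $50.">
</head>
```

Reusing the same title or description across many pages throws away this per-page signal.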

With Screaming Frog, you can have every page of your website crawled and see whether any of your pages have Titles or Meta Descriptions that are missing, duplicated, or longer than the optimal number of characters.

Screaming Frog

Do I have a lot of links to my website?

Having external links, or links from other websites, is a major ranking factor for search engines.  For search engines, links are votes.  The more links you have to your page, especially from quality sites, the higher it will rank in relevant keyword searches.

Open Site Explorer:

Majestic SEO:

Link Diagnosis:

Blekko: (click on the SEO link)

If you’ve signed up, Google Webmaster Tools:

Browser plugins:

SEO for Firefox :

mozBar  for Firefox and Chrome:



Are there broken links on my website?

Screaming Frog crawls your site and reveals any links that are broken:


Xenu’s Link Sleuth:


What text link do other websites use to link to my site?

Text links give the search engines a signal about what your website is about (what keywords it should rank for).  Do not overdo it, though: having the same anchor text in all your links signals to the search engines that your site is over-optimized, and it may suffer suppressed rankings.

Open site explorer:

How do I rank, and how do I track the changes in my ranking?

SEOBook rank checker

Does my page load fast?

Page load time is now a minor ranking factor for (at least) Google.

PageSpeed Online analyzes the content of a web page, then generates suggestions to make that page faster. Aside from a minor ranking boost, reducing page load times can reduce bounce rates and increase conversion rates.

Load Time Speed Test Tool

Why is it taking so long for my page to load?

HTTP header analyzer shows number of files being downloaded, size of files being downloaded, location of server, and more:

Also see recommendations for your site at

Is my website penalized?

Having a PageRank of 0 even though your website is relatively old and has external links pointing to it indicates that your website may be penalized.

Among other tools, you can see your PageRank by using Search Status:

Live HTTP header Firefox Addon

Is my redirect SEO friendly?

When you have web pages taken down and being replaced by a new page containing the same information on a new URL, you can tell the search engines that the old page is permanently redirected by giving it a 301 redirect to the new page.  This way, most of the page equity of the old page is transferred to the new page.

A 302 (temporary) redirect is not search engine optimization friendly because the page equity of the old page is not transferred to the new page.

Use the Live HTTP Headers analyzer to show whether a redirect is a 301 or a 302.  You will also see all the nodes in the redirect chain.

Live HTTP header Firefox Addon:

Other tools:

Redirect Checker

HTTP Response code checker
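On an Apache server, a permanent redirect is a one-line directive in .htaccess (the paths and domain here are hypothetical):

```apache
# .htaccess: permanently (301) redirect the old URL to its replacement,
# passing most of the old page's equity to the new page
Redirect 301 /old-page.html http://www.example.com/new-page.html
```

The same directive with `Redirect 302` would create a temporary redirect, which, as noted above, does not transfer the old page’s equity.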

Is my website cloaking any content?

Use the Firefox User Agent Switcher, switch to a search engine user agent, then refresh.  If the results are different, then yes, you have cloaked content.

Do I have duplicate pages on my site?

You can check manually by going to Google and searching for: “an exact quote from one of your pages”

Are other sites duplicating my content (stealing my content)?

Other sites stealing your content may result in your site being identified by the search engines as the duplicate, and thus not being shown in the index.  And of course, you would not want content thieves to benefit from your hard work.

Copyscape checks if another website is plagiarizing your content.

If you catch a site plagiarizing or stealing your content or images, you can file a DMCA takedown notice.  Details here:

Are my images optimized?

Search engines currently cannot read what the images on web pages are about.  One way to indicate what an image is about is to use the alt attribute in the HTML image code.

These tools reveal the alt text of your website’s images, if any:

Web developer toolbar:




You can also inspect the code with Firebug.

Firefox: Tells you the file name of an image, its alt text, dimensions, and file size.  This tool also reveals broken links.
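The alt-text check can also be scripted with Python’s built-in HTML parser; a minimal sketch (the sample markup is made up):

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collects the src of every <img> that has no alt text."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # Flag images whose alt attribute is absent or empty
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "?"))

# Hypothetical page fragment: one image with alt text, one without
html = '<img src="logo.png" alt="Company logo"><img src="banner.jpg">'
checker = AltChecker()
checker.feed(html)
print(checker.missing)  # ['banner.jpg']
```

In practice you would feed it the downloaded HTML of each page rather than a literal string.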

Do I use stylesheets properly?

A minor issue, but it is advisable to use an external style sheet (e.g., style.css) on your website, as it removes code from the page that may hamper the search engine spider’s ability to find the “content” of the page.
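Linking an external stylesheet is a single line in the page head (the file name is illustrative):

```html
<head>
  <!-- Styles live in a separate file, keeping the page markup lean -->
  <link rel="stylesheet" type="text/css" href="style.css">
</head>
```

All the style rules then move out of the page into style.css, where they are downloaded once and cached.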



Are all my HTML codes valid?

A lot of errors in your pages’ HTML code may signal to the search engines that your site is low-quality and less than trustworthy.  The errors may also hinder the search engines from properly finding your web pages’ content.

Are my CSS codes valid?

Is my URL using parameters?

This applies to sites that create dynamically generated pages.  Having more than two parameters in a URL may result in search engines having a hard time crawling the page correctly, limiting the amount of content they crawl.

Check your URLs manually.

Example of a URL with multiple parameters (illustrative): http://www.example.com/products?cat=12&id=345&sort=price

Have I enabled compression?

Compressing resources with gzip or deflate can reduce the number of bytes sent over the network, resulting in faster page load.
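The savings from compression are easy to demonstrate; this sketch gzips a repetitive fake HTML payload with Python’s standard library:

```python
import gzip

# A repetitive (and therefore highly compressible) made-up HTML payload
html = b"<html><body>" + b"<p>Lorem ipsum dolor sit amet.</p>" * 100 + b"</body></html>"
compressed = gzip.compress(html)

print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes")
```

Real web servers do this transparently when the browser sends an `Accept-Encoding: gzip` header, so the browser decompresses the response before rendering.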

See Google’s recommendations for more details:


Is my site sending data to Google Analytics correctly?

Google Analytics Debugger

Its messages include error messages and warnings that can tell you when your analytics tracking code is set up incorrectly.  In addition, it provides a detailed breakdown of each tracking beacon sent to Google Analytics.

How do I compare with the competition in terms of PageRank, links, etc.?

The SEO Toolbar for Firefox makes it easy to get a holistic view of the competitive landscape of a market directly in the search results.

How old is my domain name, and how long before it expires?

With a Whois lookup, you will also see what people see as your contact info, as well as other top-level domains (.com, .org, .net) that, if you don’t own them yet, you should probably claim to protect your brand.

Is it good practice to use subdomains on my site?

Use of subdomains is acceptable if the content located on the subdomain is substantial enough to exist as a completely separate website. Subdomains used for administrative or technical purposes should not be indexed in the search engines.

To look for subdomains, go to Google and do a site search.  In the search box, type: site:yourdomain.com -www

Do I have an xml sitemap?

XML sitemaps are used to pass the URLs of your site’s pages directly to the search engines. Especially useful for large sites with complex navigation, they are a best practice for all web sites from an SEO perspective.

Check.  It’s usually at yourdomain.com/sitemap.xml

Among other information, revealing your XML sitemap is one of the functions of the Search Status extension for Firefox:

Am I using Iframes or Framesets?

iFrames load content into a web page from an external source.  When content is loaded into a web page this way, it is not visible to search engine spiders and will not be indexed or associated with the page it is loaded into. Do not use iFrames to load relevant content into a web page.

Firebug for Firefox:


Is my 404 (Page not Found) error configured properly?

When a page is not found on your site, the server must respond with a 404 error in the header response code.  Do not use dynamic or custom 404 pages that do not return a 404 code, as this might cause crawling issues for search engine spiders.  (Note: custom 404 pages that do return a 404 code are not only okay but advisable.)
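On Apache, a custom 404 page that still returns the correct response code is one directive in .htaccess (the page path is hypothetical):

```apache
# .htaccess: show a friendly page while the server still sends HTTP 404
ErrorDocument 404 /custom-404.html
```

The visitor sees your custom page, but crawlers still receive the 404 status code and know the URL does not exist.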

Is my URL human-readable?

A human-readable URL is search engine friendly, and is easier to remember than a bunch of characters.

Dynamic web sites often have page URLs which contain multiple parameters and values. These should be modified to be human readable when possible. The most common method to rewrite these URLs is through a mod-rewrite function.

Inspect visually.  Example of a human-readable URL (illustrative): http://www.example.com/blue-running-shoes
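A mod_rewrite rule that maps a readable URL onto a parameter-based one might look like this (the URL scheme is invented for illustration):

```apache
# .htaccess: serve product.php?id=123 when /product/123 is requested
RewriteEngine On
RewriteRule ^product/([0-9]+)/?$ product.php?id=$1 [L]
```

Visitors and search engines see the clean /product/123 URL, while the server internally runs the same dynamic script as before.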

Are there all-in-one tools I can use to see if my website is optimized?

Google Webmaster Tools – shows what Google finds about your top search queries, crawl error types and their counts, links to your site, keywords, sitemaps, malware, crawl errors, etc.

Web developer toolbar



Is My Website Optimized? Tools To Answer Your Questions
