Tactics for On-Page SEO You Can't Succeed Without

Aug 1, 2017 // Philip Westfall

Trying to figure out the right keywords to get your content found online can seem like you're desperately looking for a needle in an especially large haystack.

Ranking a page for a keyword is like running a contest where the search engine is the judge. So, understanding the judge’s criteria has everything to do with how you will perform. 

To quote Google:


“Today Google’s algorithms rely on more than 200 unique signals or “clues” that make it possible to guess what you might really be looking for.” - Google 2012

Unless you want to become a Search Engineer at Google, you won’t need to learn how these 200 signals work. However, you DO need to understand the most important ones and how you can apply them to your pages. 

If you’re still curious about taking a DEEP dive into these 200 factors, check out Brian Dean’s famous post, “Google’s 200 Ranking Factors”.

Today, we're going to focus on the 12 best On-Page SEO practices.

As the terms imply, on-page SEO has everything to do with what is ON the page you are trying to rank. Off-Page SEO is a whole other animal!

Fun fact:

On-Page SEO accounts for about 30-40% of your ranking score!

We're going to dive into the top 12 tactics for On-Page SEO improvement, and how to use them properly. You want to be sending the right signals to Google to help you improve your rankings.

Although each point could have its own individual blog post, I’ll try to be concise and highlight only the main points.

#1. Domain Names and URL Structure

Your URL sends signals to Google regarding what your pages are about. If your brand name aligns with your target keyword (e.g., theshoecompany.com) that’s great, but don’t change your brand name or buy a new domain for this use alone.

When you buy or renew your domain name, renew it for 10 years (if you can). The longer you commit to your domain name, the more legitimate you look in the eyes of Google. 

When deciding on your URL structure, the clearer it is to a human, the clearer it will be for a Google Crawler. Here are a few examples of what to do and what not to do:

[Image: examples of good and bad URL structures]

The first URL is the best, because it is a clear and quick path to what we are looking for. The second URL requires too many clicks to get to the information we are looking for. The third is difficult for humans to understand, so it won’t send clear signals to crawlers either.

The final link is similar to the first URL, but it has unnecessary “keyword stuffing”. Having too many of the exact same keywords in your URL is frowned upon by Google, so it’s best to avoid this tactic.
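To make the comparison concrete, here’s a sketch of the four patterns with hypothetical domains and paths (none of these are real URLs):

```text
Good:     https://theshoecompany.com/mens/running-shoes
Too deep: https://theshoecompany.com/store/categories/footwear/mens/shoes/running-shoes
Unclear:  https://theshoecompany.com/index.php?id=1234&cat=7&ref=a8b2
Stuffed:  https://theshoecompany.com/running-shoes/shoes-running/best-running-shoes
```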

#2. Title Tags and Meta Descriptions 

This is probably one of the easiest and most impactful strategies for On-Page SEO. 

Since your Title Tag and Meta Description are part of your rich snippet in a SERP, they directly influence your CTR and, as we’ve seen above, are essential for high rankings.

Here is how to write proper Title Tags and Meta Descriptions:

*2 quick notes:

1. Google doesn’t use keywords in your meta description as a ranking signal.
2. Whenever you place keywords, put them as close to the beginning (top and left) as possible. Make them show up first!

[Image: anatomy of a title tag and meta description in a search result]

As for your meta description, keep it creative, but truthful. Ask yourself the following questions: Would I click on this link? Does the description properly match what is on the page?
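Here’s what these two tags look like in your page’s HTML. The brand and copy are made-up examples; as a common rule of thumb, keep titles around 50–60 characters and descriptions around 150–160 so they don’t get truncated in the SERP:

```html
<head>
  <!-- Target keyword placed first, brand name last -->
  <title>Running Shoes for Men | The Shoe Company</title>
  <!-- Not a ranking signal, but it drives clicks -->
  <meta name="description" content="Shop men's running shoes for road and trail. Free shipping and returns on all orders.">
</head>
```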

#3. Heading Tags (H1 – H6)

Having distinct Titles and Subtitles on your web pages will give your users a better experience for navigating your content. Crawlers, on the other hand, use heading and subheading tags to gauge the relative importance of your Titles and Subtitles.

This is how a basic page would look to a visitor and to a crawler (I’ve isolated the titles and subtitles):

[Image: a page’s titles and subtitles as seen by a visitor vs. the heading tags seen by a crawler]

Remember, the most important tag is the H1 tag, and the least important is H6.

There should only be one H1 tag per page, and most pages won’t go beyond an H3 tag. Heading and subheading tags should be relevant to your target keywords, make sense if you were to lay them out as above, and be as short as possible.
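In the page’s HTML, that hierarchy would look something like this (the topic is a made-up example; note the single H1 and the nesting):

```html
<h1>Men's Running Shoes</h1>   <!-- one H1: the page's main topic -->
  <h2>Road Running</h2>
  <h2>Trail Running</h2>
    <h3>Waterproof Models</h3> <!-- subtopic of Trail Running -->
```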

 

#4. Images: Title and Alt Tags 

Unfortunately, Google Crawlers are not the best at figuring out what your images are. This is where Image Titles and Alt Tags come into play. 

Different browsers and screen readers will use this information for different purposes, for example, when you hover over an image or if the image doesn’t load. 

Here is the basic structure for deciding on Title and Alt Tags:

[Image: where image title and alt tags appear in the markup]

For the most part, you want your title to be very short and concise (2–3 words). The “alt” or alternative text can be slightly longer, and should describe what is in the picture in greater detail. If done correctly, you can add in some of your target keywords, as long as they are relevant to the picture.

*Quick tip: Ask yourself: if you did a Google image search for your alt text, would you expect to see this image? The answer should be YES!
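In the HTML, the title and alt attributes sit directly on the image tag. A sketch with a hypothetical file and product:

```html
<img src="blue-mens-running-shoe.jpg"
     title="Blue running shoe"
     alt="Side view of a blue men's running shoe with white laces">
```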

 

#5. HTTPS

When you navigate online, you want to feel secure! This is even more true when you are sharing personal information. Google tends to agree with this line of thinking, and it ranks secure sites higher than non-secure ones.

Hypertext Transfer Protocol Secure, or HTTPS, is an internet communication protocol that ensures the integrity and confidentiality of data exchanged between a computer and a website.

If you haven’t done this yet, Google shares some more information on how to secure your site with HTTPS.
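Once your certificate is installed, you’ll also want old HTTP links to forward to the secure version. As one example, assuming an nginx server (other web servers have equivalent directives), a permanent redirect looks like this:

```nginx
server {
    listen 80;
    server_name yourwebsite.com www.yourwebsite.com;
    # Permanently redirect every plain-HTTP request to HTTPS
    return 301 https://$host$request_uri;
}
```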

 

#6. Page Speed and Experience

Google ranks the best content first. This means slow pages with a bad user experience will not rank well.

From a speed perspective, you want to optimize your images (shrink their size) so your pages load as quickly as possible. Some website builders or platforms will do this automatically for you.
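Some of that optimization can also happen in your markup. The snippet below (filenames are hypothetical) uses responsive `srcset` images, a technique not covered in this article, to let the browser pick the smallest adequate file, and declares dimensions so the page doesn’t jump around while loading:

```html
<!-- Small screens download the 600px file instead of the full-size one -->
<img src="hero-1200w.jpg"
     srcset="hero-600w.jpg 600w, hero-1200w.jpg 1200w"
     sizes="(max-width: 600px) 100vw, 1200px"
     width="1200" height="675"
     alt="Product hero image">
```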

The same is true for any other content or code slowing down your page load. Google’s AMP project stems from this principle and from the rise of mobile web traffic.

On the user experience side, this basically boils down to common sense. Is your page easy to navigate? Is the font big enough to read? Is there confusing information on the page? The rule of thumb is always ‘the simpler, the better’.

Keep in mind that if your pages are slow or offer a bad experience, your bounce rate will be high and your odds of ranking will be low.

#7. Mobile Optimization

Is your website mobile-friendly? Is it offering your visitors a good experience? Hopefully the answer is YES. If you aren’t sure, run Google’s Mobile-Friendly Test to find out. 

As of April 21st, 2015, Google Search started to expand its use of mobile-friendliness as a ranking signal. In their own words:

“This change will affect mobile searches in all languages worldwide and will have a significant impact in Google Search results. Users will find it easier to get relevant, high quality search results optimized for their devices.” 

 
In simple terms, if your website isn’t optimized for mobile: do it now!

 

#8. Length, Quality and Freshness of Content

The only thing better than quality content is LOTS of quality content. Want to make it even better? Make sure that it’s updated and fresh! So here are some quick tips:

  • Instead of writing 10 small articles on the same subject, try writing one in-depth, easy-to-navigate article.
  • When you get a page that is ranking high on Google, try to update it every now and then to keep the content fresh. For example, replacing old statistics with new ones.

If you already have multiple articles on the same topic, combine them to create one “better” piece of content. As you adjust your content, be sure you don’t create any broken links. Set up 301 (permanent) redirects from all the old URLs to your new article so you maintain all the precious “Google Juice” accumulated by the original links.
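If your site runs on Apache, for example, those redirects can be as simple as a couple of lines in your .htaccess file (the paths here are hypothetical):

```apache
# Permanently forward both old articles to the new, combined one
Redirect 301 /blog/seo-tips-part-1 /blog/complete-seo-guide
Redirect 301 /blog/seo-tips-part-2 /blog/complete-seo-guide
```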

#9. Duplicate Content

Duplicate content refers to identical or very similar blocks of content across different web pages. Google doesn’t like this, so you want to avoid this as much as possible.

In some cases, you might have duplicate content without even realizing it.

  • Web Store items shown or linked to via multiple distinct URLs.

  • www vs non-www or HTTP vs HTTPS versions of your website (these are considered separate pages of your website, although they serve the same content).

  • Very similar blog posts covering the exact same topic. Although this is not technically duplicate content, you want to avoid having several pages competing for the exact same keyword.

 

To fix these problems, you have two main choices. You can use 301 redirects or canonicalization. Generally speaking, if you have dynamically created or syndicated content you will want to look at using canonical URLs. Otherwise, use 301 redirects. 
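Canonicalization is a single line in the head of each duplicate page, pointing at the version you want to rank (the URL below is a made-up example):

```html
<!-- Tells search engines: treat this page as a copy of the URL below -->
<link rel="canonical" href="https://www.yourwebsite.com/products/running-shoes">
```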

Remember, we only want 1 competitor (page) in each contest (keyword or topic). We don’t want to confuse our judges (Google, Bing, Yahoo).

 

#10. Sitemap.XML

Sitemaps allow you to tell search engines about the organization of your content (hierarchy and importance of each page). This is how a basic site map would look to a user and to a crawler:

[Image: a basic site map as seen by a user vs. by a crawler]

If your website is very basic and well structured, you don’t necessarily need a sitemap, but it’s always recommended. Google has multiple resources to help you learn about sitemaps.

There are also several tools online that will help you create a sitemap automatically.

Don’t forget to reference your sitemap in your robots.txt file and to submit it to Google’s Search Console.
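For reference, a minimal sitemap.xml looks like this (URLs and dates are made-up examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yourwebsite.com/</loc>
    <lastmod>2017-08-01</lastmod>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.yourwebsite.com/blog/on-page-seo</loc>
    <lastmod>2017-08-01</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```

In robots.txt, the reference is a single line: `Sitemap: https://www.yourwebsite.com/sitemap.xml`.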

 

#11. Robots.txt

The purpose of this simple text file is to tell crawlers what sections of your site you DON’T want crawled.

By default, all your pages and content could get crawled. If there is any reason you’d want to stop a specific bot from accessing a section of your site, you can add it there. To check whether you have a robots.txt file on your website, simply add “/robots.txt” to the end of your domain name. For example:

www.yourwebsite.com/robots.txt

If you already have a robots.txt file on your website, check to see how it looks. Here are the two basic configurations:

[Image: the two basic robots.txt configurations]

In both cases, the “user-agent” refers to bots, and the * symbol means “all”. A disallow value of “/” means that every page on your website will be blocked. To block a specific folder, you’d use “/folder-name/”; to block a single file, you’d use “/file-name.html”.
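In plain text, the two configurations described above are:

```text
# Configuration 1: allow all bots to crawl the entire site
User-agent: *
Disallow:

# Configuration 2: block all bots from the entire site
User-agent: *
Disallow: /
```

Swap the Disallow value for “/folder-name/” or “/file-name.html” to block just that folder or file.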

Google has great resources to help you learn about robots.txt files.

 

#12. Various Technical SEO

If you want to take your SEO a step further, there are several ways we haven’t explored together to give even more specific clues to search engines about what your pages and content are about.

For example, Structured Data (typically using schema.org vocabulary) will make it easier for search engines to organize and display your content in creative ways. This can drastically improve CTR in a SERP. Here are a few examples that you’ve probably seen in the past (Local Pack and Knowledge Panel).
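As an illustration, here’s what Local Pack-style structured data can look like as JSON-LD in your page’s head. The business details are invented; LocalBusiness is a real schema.org type:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "The Shoe Company",
  "telephone": "+1-555-555-0199",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Ottawa",
    "addressRegion": "ON"
  },
  "openingHours": "Mo-Fr 09:00-18:00"
}
</script>
```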

[Image: SERP features such as the Local Pack and Knowledge Panel]

Moz offers a detailed description of SERP features and how to get them. 

Although there are other technical SEO tactics that can be used, if you follow everything we just listed, you should start seeing progress and start ranking for your target keywords.
