Date of Publication: 2 May, 2020

SEO Strategy for improving your Search Results

• Are you looking to get higher Google rankings for your website?

This blog could prove really handy. In it we will discuss how to get good SERP results using some easy, guided steps that even beginners can follow.

Anirban Guha

Digital Marketing Expert | Media Strategist | Entrepreneur

Many digital marketers have come to believe that SEO (Search Engine Optimization) is no longer what it used to be – rather, that it is a declining facet of digital marketing.

But the truth is that the strategies that used to work previously no longer work. In this blog, “SEO Strategy for Search Results”, we will explain the fundamentals and strategies one can use to make a website rank higher on the SERP (Search Engine Results Page).

So let’s begin…

Before you create the first page of your website, be sure of the keyword that best represents the content of that page.

Keyword research is one of the most important, valuable, and high-return activities in the search engine marketing field. Through the detective work of dissecting your market’s keyword demand, you learn not only which terms and phrases to target with SEO, but also more about your customers as a whole.

Whenever someone types a search phrase into a search engine, it gets recorded. There are keyword research tools that allow you to retrieve this information. However, those tools cannot show you (directly) how valuable or important it might be to rank for and receive traffic from those searches.

If you are looking for information on the internet, you go through a progression where you try certain searches, check out some sites, refine your searches, and repeat this process until you are fully satisfied with the results. Taking the time to understand typical search sequences is one aspect that impacts your keyword strategy.

Other elements that influence search behavior include searcher demographics (male/female, age, income, etc.), geographical location, and time of year. Seasonal products such as winter garments, for example, go through sharp peaks in volume from late November to early February, and then decline rapidly once the season is past.

Keyword research helps you investigate all of these factors. Take the time to go beyond the surface and use the tools to learn how your customer thinks, get your thinking in alignment with theirs, and then build your web page around this powerful information.

It is important to choose keywords that have relatively high search volume but with low to moderate competition in order to rank your page faster.

Understanding the Long Tail of the Keyword Demand Curve

Fig. 1.1. Importance of longtail keyword - SEO Strategy for Search Results in 2020

It is wonderful to deal with keywords that have 5,000 searches per day, or even 500 searches per day, but in reality these “popular” search terms may actually comprise less than 30% of the overall searches performed on the Web. The remaining 70% lie in what’s commonly called the “long tail” of search. The tail
contains hundreds of millions of unique searches that might be conducted a few times in any given day, or even only once ever, but when assessed in aggregate they comprise the majority of the world’s demand for information through search engines.

Long-tail keywords can benefit your search engine results when you are looking not just to rank high but to appear for relevant, more precise queries. A long-tail keyword that is more specific to your particular business will be far less competitive, and the traffic it generates is more likely to lead to conversions. For example, targeting ‘Beginners guide to on-page SEO’ is far more specific than targeting ‘SEO’.

Tools for Keyword Research

  1. Google Ads: Keyword Planner (Recommended)
  2. Keyword Magic (Recommended)
  3. Keywordin
  4. Google Trends
  5. Ubersuggest
  6. Ahrefs Keywords Explorer
  7. Moz Keyword Explorer
Remember, your focus keyword should appear in the first 10% of your content.

You would be amazed to know that LSI keywords still give wonderful results. E.g., LSI keywords for Interior Designing could be deco, patio, work, furniture, interior designer, etc. You can use tools like Keys4up, Twinword Ideas and LSI Graph to get ideas.

A search-engine-friendly web page, at the most basic level, is one that allows search engine crawlers to access its content. This is the first step toward creating a good chance of ranking higher in search results over time. Once your site’s content is accessed by a search engine, it can then be considered for relevant positioning within search results pages.

Indexable Content

It is advisable to index and optimize blog pages rather than your business pages. The blog pages can then carry relevant anchors that route traffic to the business pages.

Images are a file type that search engines have challenges “identifying” from a relevance perspective, as there are only minimal text-input fields for image files in GIF, JPEG, or PNG format (namely the filename, title, and alt attribute). We strongly recommend accurate labeling of images in these fields; however, images alone are usually not enough to earn a page top rankings for relevant queries. Moreover, images that are not optimized may adversely affect your search ranking. Image size is an important factor in minimizing the loading time of the web page.

If you are using Adobe Photoshop, generate web images by saving in a web-friendly format (Ctrl+Alt+Shift+S). JPEG or PNG-24 formats are recommended. With the advent of next-gen image formats, try using JPEG XR, JPEG 2000 or WebP.

However, some browsers do not recognize these next-gen formats. We advise uploading images in the existing web-optimized formats and adding a WebP converter plugin to the website (e.g., WebP Converter for Media).

Spiderable Link Structures

Fig. 1.2. Providing search engines with crawl-able link structures - SEO Strategy for Search Results in 2020

Search engines, be it Google or Bing, use links on web pages to help them discover other web pages and websites. For this reason, we strongly recommend taking the time to build an internal linking structure that helps the spiders crawl easily. As mentioned before, when you index a blog page, have the business page linked from it.

There are a few common reasons why some pages may not be reachable –

  1. Links in submission-required forms
  2. Links in hard-to-parse JavaScript
  3. Links in Java or other plug-ins
  4. Links in Flash (remember, Flash will not work with Chrome after 2020, so it is a good idea to do away with it)
  5. Links in PowerPoint and PDF files
  6. Links pointing to pages blocked by the meta Robots tag, rel=”NoFollow”, or robots.txt
  7. Links on pages with many hundreds or thousands of links
  8. Links in frames or iframes

XML Sitemaps

Think of your website as a housing complex and each page of your site as an apartment. When a visitor arrives, security uses a blueprint to guide him to the exact tower, floor and apartment number. Your XML Sitemap is that blueprint for the bots and crawlers, helping them navigate through your pages. XML Sitemaps are important for SEO because they make it easier for Google to find your site’s pages—this is important because Google ranks web PAGES, not just websites.

Adding a URL to a Sitemap file does not guarantee that the URL will be crawled or indexed. However, it can result in pages that are not otherwise discovered by the search engine getting crawled and indexed. Sitemaps are a complement to, not a replacement for, the search engines’ normal, link-based crawl.

The benefits of Sitemaps include the following:

  1. For the pages the search engines already know about through their regular spidering, they use the metadata you supply, such as the last date the content was modified (lastmod date) and the frequency at which the page is changed (changefreq), to improve how they crawl your site.
  2. For the pages they don’t know about, they use the additional URLs you supply to increase their crawl coverage.
  3. For URLs that may have duplicates, the engines can use the XML Sitemaps data to help choose a canonical version.
  4. Verification/registration of XML Sitemaps may indicate positive trust/authority signals.
  5. The crawling/inclusion benefits of Sitemaps may have second-order positive effects, such as improved rankings or greater internal link popularity.
  6. Having a site map registered with Google Search Console can give you an extra analytical insight into your site.

XML Sitemaps are especially important if:

  • You have pages on your site created dynamically (e.g. some e-commerce sites)
  • Your site is not well-structured or well-linked (internal links)
  • Your site has few external links or is new (a newly developed site just set “live”)
  • Your site is large and/or has lots of archived content that may not be well-linked
N.B.: XML sitemaps don’t have to be static files. In fact, they don’t even need a .xml extension to be submitted in Google Search Console. Instead, set up rule logic for whether a page gets included in the XML sitemap or not, and use that same logic in the page itself to set meta robots to index or noindex.
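For reference, a minimal sitemap for a hypothetical page looks like the sketch below (the URL and date are placeholders); loc is required, while lastmod, changefreq and priority are the optional hints mentioned above:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="">
  <!-- One <url> entry per page of the site -->
  <url>
    <loc></loc>
    <lastmod>2020-04-28</lastmod>      <!-- last date the content was modified -->
    <changefreq>monthly</changefreq>   <!-- how often the page tends to change -->
    <priority>0.8</priority>           <!-- relative importance within this site -->
  </url>
</urlset>
```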
Fig. 1.3. Responsive Web Design - SEO Strategy for Search Results in 2020

Did you know that about 70% of web browsing happens on mobile devices? Yes, the same one that you take along everywhere you go.

Responsive web design means that your web page adapts to different screen sizes as users switch between desktops, laptops, tablets and mobile phones.

Your website needs to adapt immediately to different users. When your web page displays effectively on every size and quality of screen, the crawlers and bots find it easier to index your web content.

Did you know that Google boosts mobile-friendly websites?

On April 21, 2015, Google released a significant new mobile-friendly ranking algorithm, Mobilegeddon,  that’s designed to give a boost to mobile-friendly pages in Google’s mobile search results.

Google’s post started out by stating the reasoning behind the change –

When it comes to search on mobile devices, users should get the most relevant and timely results, no matter if the information lives on mobile-friendly web pages or apps.

In the April 21 post, Google gave a quick three bullet list of what this update would impact –

  • Affects only search rankings on mobile devices.

  • Affects search results in all languages globally.

  • Applies to individual pages, not entire websites.

This update wasn’t really about organic search. It was about responding to consumer behavior, which was trending in the direction of mobile.
Responsive pages give a better user experience, and Google gives you points for that.
A responsive page is lightweight and so loads quickly.
A mobile-responsive design enables you not only to put up relevant content for users but also to display it in a way that is friendly to mobile devices. It provides a conducive environment for users to browse your website, decreasing your bounce rate when the site is accessed from screens of different sizes.

Avoid Duplicate Content

If you have different websites for users on different devices, you have to duplicate the same content.

That could cost you if you are looking for great search results. By making sure all of your web content is on a single domain, Google can crawl, index and organize your content effectively.

After you are done designing your web page it becomes important to check its speed. Google PageSpeed Insights (PSI) tells you the loading speed of the website. Google ranks quicker-loading websites over slower ones, so having a fast-loading page is a very important SEO activity.

Try scoring close to 100 for both the Mobile and Desktop versions.

Eliminate Render Blocking Resources

You need to eliminate the resources that are blocking the first paint of your page. Google suggests to “consider delivering critical JS/CSS inline and deferring all non-critical JS/styles”.

– We recommend the Autoptimize + Async JavaScript plugins


  1. Install the plugins
  2. WP Dashboard > Settings > Async JavaScript > Checkbox to enable Async JavaScript > Quick Settings: Apply Async (N.B. If the Async option causes problems on the website, try Defer or excluding jQuery)
  3. WP Dashboard > Settings > Autoptimize > Checkbox to optimize JavaScript code > Checkbox to optimize CSS code
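As a rough sketch of what these settings produce (the file paths are hypothetical), the idea is to inline the critical CSS and defer everything else:

```html
<head>
  <!-- Critical CSS inlined so the first paint is not blocked -->
  <style>
    body { margin: 0; font-family: sans-serif; }
  </style>
  <!-- Full stylesheet loaded without blocking rendering -->
  <link rel="preload" href="/css/main.css" as="style"
        onload="this.onload=null;this.rel='stylesheet'">
  <!-- defer: download in parallel, execute only after the HTML is parsed -->
  <script defer src="/js/app.js"></script>
</head>
```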

Reduce Server Response Time (TTFB)

TTFB stands for Time To First Byte. It is basically a measurement of how long the browser has to wait before it receives the first byte of data from the server. The longer it takes to get the data, the longer it takes to display your page.

A common misconception is that it is calculated after DNS lookup time. The original calculation of TTFB in networking always includes network latency.

3 Steps of Calculation:

  1. Request to Server
  2. Server Processing
  3. Response to Client

Google PageSpeed Insights recommends a TTFB under 200 ms; 300–500 ms is, however, acceptable, but you need to work on your server settings if it crosses the 600 ms mark.

How to measure TTFB – You can measure the TTFB of a web page with the Geekflare TTFB tool.
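To make the three steps above concrete, here is a rough, self-contained Python sketch that measures TTFB against a throwaway local server; the numbers are illustrative only, since real-world TTFB also includes the network latency to your actual host:

```python
import http.server
import socket
import threading
import time

# Throwaway local server so the example is self-contained.
server = http.server.HTTPServer(("127.0.0.1", 0),
                                http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
host, port = server.server_address

def ttfb(host, port, path="/"):
    """Rough TTFB: time from sending the request until the first response byte."""
    start = time.perf_counter()                      # 1. request to server
    with socket.create_connection((host, port)) as s:
        s.sendall(f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n".encode())
        s.recv(1)                                    # 2+3. processing + first byte back
    return time.perf_counter() - start

elapsed = ttfb(host, port)
print(f"TTFB: {elapsed * 1000:.1f} ms")
server.shutdown()
```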

4 Ways to Reduce TTFB on your WordPress Site:

  1. Utilize a fast WordPress host
  2. Implement a CDN – Cloudflare
  3. Use a WordPress cache – Cache Enabler or W3 Total Cache
  4. Get a premium DNS –
    1. Amazon Route 53
    2. GoDaddy Premium DNS
    3. Google Cloud DNS
    4. Azure DNS

Use Images in Next-Gen Format

As discussed earlier in the Indexable Content section, use optimized images in next-gen formats, viz., JPEG 2000, JPEG XR and WebP.

Remove Unused CSS Code

No matter how well you have developed a website, there’s a good chance that it contains CSS that has no effect on the current page elements. For example, frameworks like Bootstrap come with dozens of CSS styles that are probably not needed. If you added a plugin during development but later chose to remove it, there is a high probability that rules related to the plugin linger in your stylesheet. Unused CSS just adds dead weight to your application and increases your web page size. You most definitely want to make sure that you ship as little unused code as possible.

It is generally seen that about 35% of all CSS is unnecessary.

How to remove them manually?

  1. Open Chrome DevTools
  2. Open the command menu with: Ctrl + Shift + P
  3. Type in “Coverage” and click on the “Show Coverage” option
  4. Select a CSS file from the Coverage tab which will open the file up in the Sources tab

Any CSS that is next to a solid green line means that the code was executed. Solid red means it did not execute. A line of code that is both red and green, means that only some code on that line executed.

Just because a style isn’t used on one page doesn’t mean that it’s not used elsewhere. Ideally you should use a spreadsheet to audit several pages on your site and keep track of which rules keep appearing on the unused list. The ones that appear the most can probably be safely removed.

There are tools available if you find the above process risky. Some of the available tools are listed below:

  1. UnusedCSS
  2. PurifyCSS
  3. PurgeCSS
  4. UnCSS

NB: These tools can have an adverse effect on your website. Be cautious and read about their features and functionality thoroughly before using them.

Ensure Text Remains Visible During Web-Font Load

Fonts are often large files that take time to load. Some browsers hide text until the font loads, causing a flash of invisible text (FOIT).

Avoid invisible text during font loading

  1. Use font-display – It is a CSS property available in Chrome, Chrome for Android, Opera, Firefox and Safari. The swap value tells the browser that text using this font should be displayed immediately using a system font; once the custom font is ready, it replaces the system font. If a browser does not support font-display, it continues to follow its default behavior for loading fonts.
  2. Wait to use custom fonts until they are loaded – Implementing this across browsers takes a little more time. There are three parts to this approach:

    1. Avoid the use of custom font when your page is loading. This enables the browser to display text immediately using a system font.
    2. Detect when your custom font is loaded. This can be accomplished with a couple lines of JavaScript code, thanks to the FontFaceObserver library.
    3. Update page styling to use the custom font.
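A minimal sketch of the first approach (the font name and file path here are placeholders):

```css
/* "MyFont" and the path are hypothetical */
@font-face {
  font-family: "MyFont";
  src: url("/fonts/myfont.woff2") format("woff2");
  font-display: swap; /* show fallback text immediately, swap in the web font when ready */
}

body {
  font-family: "MyFont", Arial, sans-serif; /* system fallback used until the swap */
}
```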

Serve Static Assets with an Efficient Cache Policy

A static cache policy has the potential to make your website faster. When a browser requests a resource, the server providing the resource can tell the browser how long it should temporarily store or cache the resource. For any subsequent request for that resource, the browser uses its local copy rather than getting it from the network.

A good static cache policy differs from website to website; what works for one could have a negative effect on another. Google Lighthouse considers a resource cacheable when the conditions below are met:

  1. It’s a font, image, media file, script, or stylesheet.
  2. It has a 200, 203, or 206 HTTP status code.
  3. It doesn’t have an explicit no-cache policy.
Here are some ideas about how you can increase caching efficiency:
  1. Use consistent URLs
  2. Ensure that the server provides a validation token (ETag)
  3. Identify which resources can be cached by intermediaries (like a CDN)
  4. Estimate the optimal lifetime for each resource
  5. Determine the best cache hierarchy for your site
  6. Minimize churn
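As a sketch, on an Apache server a long-lived policy for static assets might look like the snippet below; this assumes you version your filenames (e.g. main.a1b2c3.css) so that a new release busts the cache:

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Static assets: safe to cache aggressively when filenames are versioned
  ExpiresByType image/webp             "access plus 1 year"
  ExpiresByType font/woff2             "access plus 1 year"
  ExpiresByType text/css               "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```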

Avoid an Excessive DOM Size

As covered by Google, an excessive DOM (Document Object Model, i.e. the web page’s node tree) can harm the performance of your web page.

The recommendation is:

  1. Fewer than 1,500 nodes in total
  2. A maximum depth of fewer than 32 nested levels
  3. No parent node with more than 60 child nodes

When your web page has a large DOM size it takes much longer to render the page and run the JavaScript.

Unfortunately, you need to redo the entire design to resolve DOM size issues. Hence, this is something you should always keep in mind before you plan to develop a web page.

Understand that this warning is significant and if you get it for more than one or two pages in your site, you should consider the following:

  1. Reduce the number of widgets and sections on your web page layout
  2. Use a basic page builder, as many available page builders bloat the code.
  3. Use a simpler theme, viz., Twenty Nineteen
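As an illustrative sketch (not how Lighthouse itself measures it), you can get a rough node count and maximum nesting depth for any HTML snippet with Python’s standard html.parser:

```python
from html.parser import HTMLParser

class DOMStats(HTMLParser):
    """Count element nodes and track maximum nesting depth."""
    def __init__(self):
        super().__init__()
        self.nodes = 0
        self.depth = 0
        self.max_depth = 0

    def handle_starttag(self, tag, attrs):
        self.nodes += 1
        self.depth += 1
        self.max_depth = max(self.max_depth, self.depth)

    def handle_endtag(self, tag):
        self.depth -= 1

stats = DOMStats()
stats.feed("<html><body><div><p>hi</p><p>there</p></div></body></html>")
print(stats.nodes, stats.max_depth)  # prints: 5 4
```

Run this over your rendered page source and compare the numbers against Google’s thresholds above.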

Avoid Chaining Critical Requests

Today most available browsers parse HTML using a streaming parser—assets are discovered within the markup before it has been fully delivered. As assets are found, they’re added to a network queue along with a predetermined priority.

There are a number of asset prioritization levels ranging from the lowest to the highest.

You can find out how your site is prioritizing requests using the Priority column in the Chrome DevTools network request table.

There are really only 5 things necessary to display a blog like this one:

  1. Most importantly, the HTML. If everything fails the user can still read the page.
  2. CSS
  3. The logo (A PNG placed by CSS. This could probably be an inline SVG).
  4. Web font weights.
  5. The featured image.
These assets (note the lack of JavaScript) are essential to the visuals that make up the main viewport of the page. These assets are on the top priority list.
If you study the performance panel in Chrome you will notice that multiple requests are made before the fonts and featured image are requested. The mismatch can easily be seen.

Now that you have defined the critical requests, you can prioritize them very easily using a few simple yet powerful tweaks.

Preload (<link rel=”preload” href=”font.woff” as=”font”>) instructs the browser to add font.woff to its download queue at a “High” priority. NB: as=”font” is the reason font.woff is downloaded at High priority – it’s a font, so it follows the browser’s priority plan.

Essentially, you’re telling the browser: you may not have discovered this yet, but you should fetch it now.

In most cases, fonts are delayed, just because you haven’t instructed the browser to download them in a timely fashion.
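In context, such a preload hint sits in the document head (the paths here are placeholders; note that fonts need crossorigin even when served from the same origin):

```html
<head>
  <!-- Fetch the font at high priority, before the CSS would discover it -->
  <link rel="preload" href="/fonts/font.woff" as="font" type="font/woff" crossorigin>
  <link rel="stylesheet" href="/css/main.css">
</head>
```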

Content is The King

The development of highly engaging, trusted and  shareable content, and promotion of that content via various channels is an important tool to rank up your website. Content can be published on your own site, other people’s sites, or in social media, but in all cases acts to build visibility for your brand online. The most valuable content is usually highly relevant to what you do, solves problems for others or stirs their emotions, and is often non-commercial in nature.

Search engines may use shared content on social media platforms as a way of discovering new content, in particular, news related content.

In 2015, Brian Dean revealed a link building strategy he used that had an 11% success rate. This tactic even doubled his organic traffic in 14 days.

He called it the Skyscraper Technique.

These are the basic steps associated with this technique:

  1. Find a relevant piece of content with lots of backlinks.
  2. Create something better than the existing ones.
  3. Ask those linking to the original piece to link to your superior content instead.

Have you ever walked by a really tall building and said to yourself: “Wow, that’s amazing! I wonder how big the 8th tallest building in the world is.” Of course not. It’s human nature to be attracted to the best. And what you’re doing here is finding the tallest “skyscraper” in your space…and slapping 20 stories to the top of it.

– Brian Dean, Founder, Backlinko

To improve the content over the existing available best, Brian recommends you improve all four of these aspects:

  • Length – If the post lists 10 tips, beat it by listing 15 tips.
  • Freshness – Check if the article is outdated, e.g., an article on social sharing that points out the advantages of Google Plus. Update it with newer information and very good packaging.
  • Design – Content isn’t just about the words; its visual appeal matters too. Design it so that it stands out.
  • Depth –  Don’t just list things out. Fill in the details and make it actionable.

Old pages and posts that are outdated and do not add value adversely affect your website’s performance in the SERP. You need to delete them as part of your regular content maintenance activity. There are several ways to go about this; let’s find out…

  1. Update old content that is still valid

  2. Delete irrelevant posts or pages

  3. “301 Redirect” the old post to a related one – especially if the old post has quality backlinks that you do not want to let go of. A 301 redirect tells search engines and visitors that there’s a better or newer version of this content elsewhere on your site.

  4. Add noindex tags to these pages.
Fig. 2.1. Setting up 301 Redirection with Redirection WordPress plugin - SEO Strategy for Search Results in 2020
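If you prefer not to use a plugin, the same redirect can be sketched in the site’s .htaccess on Apache (the paths here are hypothetical):

```apache
# Permanently redirect a retired post to its updated replacement,
# preserving the backlinks pointing at the old URL
Redirect 301 /old-seo-guide/ /seo-strategy-2020/
```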

Dwell time is the amount of time that passes between the moment you click a search result and the moment you go back to the search results page. If visitors find your content useful they will spend more time on your page, and may even navigate to other pages. Search engines duly note this when ranking your page.

Following are a few techniques that can effectively increase dwell time.

Optimize the First Impression

  1. Page loading speed – About 40% of all visitors bounce off a page if the loading time is more than 3 seconds. That’s why speeding up your website is very important; we have already discussed techniques in #Step 4.
  2. Design – Web design has a lot of impact on whether the visitor trusts your website or not. You need to make sure it’s crisp and in sync with your target audience. Simple backgrounds, plenty of white space, and clear, easy-to-read fonts help you go a long way.
  3. Layout – Design and layout should work together to engage the visitor. Make sure your content takes center stage and is easily understood.
  4. Mobile optimization – Don’t forget to make your website responsive. See #Step 3.
  5. Ads and popups – They can be an important tool for getting leads but can get really annoying for the user. Google has started penalizing pages whose ads and popups are too intrusive. The best way to use them is with exit-intent technology or some other subtle approach.

Concentrate on User Needs

  1. Understand your users – The most important thing when setting up an SEO strategy for search results in 2020 is to understand your users. You have to get into the heads of your visitors: understand what they are looking for and how relevant you are to them.
  2. Write longer content – Longer content scores over shorter content. The longer and more exciting your content is, the longer the visitor is likely to stay on the page.
  3. Target the right keywords – The keywords in your title and description make a promise about what your piece is going to be about. Do not use clickbait, as visitors will leave your site in a jiffy. Also see #Step 1.
  4. Keep your content up-to-date – Updated content helps to retain visitors. See #Step 5 & #Step 6.
  5. Respond to questions and comments – The most important conversations happen in the comment section. They help you build relationships with your audience, gradually turning them into your advocates.

Make Your Content Highly Readable

Another very important factor that has gained the limelight in recent years is readability. Team Yoast, in a WordCamp Europe 2016 talk, explained why they added a readability tool to their Yoast SEO plugin, making a few recommendations as below:

  1. Focus on the structure of the text
  2. Use of clear headings
  3.  Easy to read text
  4. Ensure that the text is nice to read
Not to forget: be relatable and use media like images, infographics, videos, etc. There is a huge stockpile of media resources available.
We recommend:
  1. All-free-download
  2. Pixabay
  3. Freepik
  4. Pexels

Include Internal Links

Dwell time is not just about the landing page. It’s more about the total time they spend on your website. To keep them around longer, give them somewhere to go next.

Linking related posts on your site (like I am doing here) is one way to achieve that. Another is to include a widget in your sidebar or at the end of your post that lists your latest, popular or related posts.

According to a report recently published by Google, click-through data has proven to be a critical resource for improving search ranking quality.

How to optimize CTR?

There is an easy technique that I follow and will share with you. Do let me know if it adds any value for you.

The Steps are:

  1. First look at the ads that appear at the top for certain targeted keywords
  2. Find what is common in those ads (the trick is to read the descriptions very carefully)
  3. Write the best alternative: crisp, attractive copy that is bound to catch attention (please note: do not make false claims, it could backfire)

Recently you might have been hearing a lot about schema markup for your website.

Fancy search results with star ratings get all the hype when people talk schema, and for good reason: who doesn’t like having stars in their search listings? However, schema markup has many more benefits.

Fig. 2.2. Star Rating in Search result - SEO Strategy for Search Results in 2020

Benefits of Schema Markup

  1. It helps search engines to better understand the content of your  site

  2. It helps to improve brand presence with a full Knowledge Graph

  3. Get attention-grabbing rich results to increase CTR

  4. Get more real estate in SERPs

  5. Get a CTA right in the Google Search Results

  6. Position your site for voice search, Alexa, and Google Assistant

  7. Non-search sites may use them
  8. Provide a sneak preview to your layout and content in search result
  9. You can get more job applicants

  10. Receive more plays of your movie or song

  11. You can get more video views (YouTube, or native video)

  12. Display Social Profiles in search results

  13. Increase your social media presence

Fig. 2.3. Knowledge Graph appears on right side of the search result page - SEO Strategy for Search Results in 2020

How to do Schema Mark Up?

Schema markup uses a unique semantic vocabulary in microdata format.

Are you afraid because you have no coding expertise? Don’t worry!

You just need to add bits of schema vocabulary to your HTML as microdata., the website for schema markup, was a collaborative effort from Google, Bing, and Yahoo.


  1. Visit Google’s Structured Data Markup Helper.

  2. Select your data type for the markup.
  3. Paste in the full URL of the page or blog post that you want to mark up.

  4. Highlight and select the elements you want to mark up

  5. Create HTML after you finish the selection process
  6. Download HTML file from the right hand top side, and copy/paste it into your CMS or source code
  7. To check how your page will look with the markup added, use the Structured Data Testing Tool or the Google Rich Results Test. As Google’s instructions explain, “the more content you mark up, the better”.
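The generated markup is ordinary HTML with extra attributes. A hand-written sketch for an article (using this blog’s own details) might look like:

```html
<article itemscope itemtype="">
  <h1 itemprop="headline">SEO Strategy for Improving Your Search Results</h1>
  <span itemprop="author" itemscope itemtype="">
    by <span itemprop="name">Anirban Guha</span>
  </span>
  <meta itemprop="datePublished" content="2020-05-02">
</article>
```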

Though you need to focus on user issues, as discussed earlier, you also need to make your page conducive to crawlers and bots. There are some common issues on a CMS like WordPress that could adversely affect results in spite of a great user experience. In the following step we will understand how to fix these issues.

Robots.txt File Blocking Access to Googlebot

This is not something out of the ordinary, rather a matter of oversight during technical SEO audits or regular checks. It can be identified using the robots.txt Tester tool, which shows you whether your robots.txt file blocks Google web crawlers from specific URLs on your site. Export the crawl results and compare them against a known list of pages on your site to check that there are no crawler blind spots. This can happen to even the biggest and most efficiently maintained websites.
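For reference, a typical WordPress robots.txt looks like the sketch below (example.com is a placeholder); the Tester tool checks your URLs against exactly these rules:

```text
User-agent: *
Disallow: /wp-admin/                 # keep crawlers out of the admin area
Allow: /wp-admin/admin-ajax.php      # but allow the AJAX endpoint themes rely on

Sitemap:
```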

Note that in some cases you can access historic robots.txt file changes using the Internet Wayback Machine.

Fig. 3.1. The Way Back Machine - SEO Strategy for Search Results in 2020

Domain Configuration Issues at the DNS Level

This is yet another impactful issue that can be fixed very easily. All you need to do is ensure the DNS zone settings are correct, with an “A” record in place for the “www” version of the domain pointing to the correct IP address (a CNAME would also work). This prevents the domain from failing to resolve. The only complication that could consume your energy is that it can be tricky to get access to a site’s domain management panel; in many cases the password is lost, or it’s not seen as a high priority.
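In zone-file terms, the records described above look like this sketch (the domain and IP are placeholders; 203.0.113.10 is from the reserved documentation address range):

```text
; "A" record for the bare domain, CNAME for the www version
example.com.      3600  IN  A      203.0.113.10
www.example.com.  3600  IN  CNAME  example.com.
```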

Rogue Pages within the XML Sitemap

This is strangely very common. Within the XML Sitemap there could be indexed pages that do not add value but pull down the value proposition of your website for bots. Typical examples are thank-you pages from form submissions, PPC landing pages that lead to content duplication, or pages/posts/taxonomies that you’ve already noindexed somewhere else.

A small hack: You can catch these kinds of pages by doing a site: search in Google to return everything that’s been indexed. Once you have the list, all you have to do is fix them.

Hacked Sites and Spammy Backlinks

This is something you can find on sites running older versions of WordPress or other CMS platforms that require regular security updates. While working with a client we found about 8,000 such links on their website, and they were indexed and clickable. The exercise of separating the real pages from the fake ones can be very tedious, but the process of finding these pages and backlinks starts with a site: search in Google. If you’re really concerned about indexed content in cases like this, you could also serve a 410 status code to really clarify things with the search crawlers.

Duplicate Content

This is a very unfortunate thing that happens predominantly to domains with low authority. Say, for example, you publish great content and re-publish it on an authoritative industry platform like Medium, and it turns out to be a success. It may also get replicated on websites with a higher DA. Chances are, in spite of your canonical tag, your article does not get associated with you. This has happened to me many times. If your domain is quite small in terms of authority, Google may not even have had the chance to crawl and index the originally published content – and it could even be the case that the rendering stage of the crawl/index hasn’t yet been completed, or there’s heavy JavaScript causing a big time lag between crawling, rendering and indexing of that content.

The only word of advice I can give you is this: you have probably worked hard to create and craft the content – don’t lose it just by re-publishing it elsewhere. Let the article settle in for some time before you share or re-publish it.

Bad AMP Configuration

AMP stands for Accelerated Mobile Pages. However, if you are not telling search engines that an AMP page exists at a particular URL, there is absolutely no point in having an AMP setup – the whole point is that it gets indexed and returned in the SERPs for mobile users.

An important way of telling search engines about an AMP page is by adding a reference to it on your non-AMP page. Also note that canonical tags on AMP pages shouldn’t be self-referencing – they must link back to the non-AMP page.
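In markup, the pairing looks like this: the non-AMP page declares its AMP counterpart with `rel="amphtml"`, and the AMP page points its canonical back to the non-AMP original (the URLs are placeholders):

```html
<!-- On the non-AMP page, e.g. https://example.com/post/ -->
<link rel="amphtml" href="https://example.com/post/amp/">

<!-- On the AMP page -->
<link rel="canonical" href="https://example.com/post/">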

Legacy Domains that 302 Redirect or form a Chain of Redirects

A simple, semi-regular check is to crawl your old site – using a Google `site:` search, or a third-party tool that checks status codes and redirects. I bet you’ll find the domain 302-redirects to the final destination (a 301 is always the better bet here), or it 302s to a non-www version of the URL and then jumps through a few more redirects before landing on the final URL. According to John Mueller of Google, Googlebot only follows five redirects before giving up, and it is also noted that some value is lost with every redirect. For this reason we recommend sticking to 301 redirects in a chain that is as clean as possible.
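As a rough sketch of this check, the function below walks a redirect map (URL → status and target – here hand-built for illustration rather than fetched live) and reports the final URL, the hop count, and any non-301 hops:

```python
def walk_redirects(start: str, redirect_map: dict, max_hops: int = 5):
    """Follow redirects in a {url: (status, target)} map.

    Returns (final_url, hops, non_301_hops). Googlebot is said to give up
    after about five redirects, so we stop there too.
    """
    url, hops, non_301 = start, 0, []
    while url in redirect_map and hops < max_hops:
        status, target = redirect_map[url]
        if status != 301:
            non_301.append((url, status))  # flag hops that leak value
        url, hops = target, hops + 1
    return url, hops, non_301
```

Anything in the `non_301` list is a candidate to be rewritten as a direct 301 to the final destination.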

Redirect Path by Ayima is a Chrome extension that shows you redirect statuses as you browse the web. Backlink tools like Majestic and Ahrefs may also show you old links from your previous domains.

You could also use the site audit features of Raven Tools and Semrush to find issues in your technical audit.

But remember, having fewer pages always helps:

Fewer Pages = Fewer Problems

In addition to creating high-quality content and a sound technical framework, as discussed earlier, it is imperative to spend some time and energy on a savvy link-building strategy – what we commonly call off-page SEO.

Whether you’re guest posting on other websites or stealing your competitors’ backlinks, you need a game plan. You might think that the more content you create, the greater the chance you will naturally receive incoming links. In reality, there is no guarantee this will generate the results you are chasing – it could even be counter-productive, because these days the quality of both content and backlinks matters, not just the numbers.

Gone are the days when you could build low-quality backlinks to your website as a means of boosting your rankings. Do this today and you’ll end up causing more harm than good to your SEO strategy.

Since incoming links remain a top search engine ranking factor, now’s the time to implement a link-building strategy that you can trust.

What’s the best approach for this link building strategy? This depends on the type of website you have, the strategy that you are using and your overall view on SEO.

Regardless of where you stand, it’s important to spend time learning about the many backlink tools that are helping other marketing professionals generate positive results for their search engine rankings. Let’s discuss some of them.

Building Quality Links with Industry-Specific Tools

It is common practice to stick to infographics, “ultimate guides”, and other similar kinds of content when trying to build backlinks. There are tools and methods available, both free and paid, that you can use:


    1. Target resource/links pages
    2. Target how-to guides
    3. Run “shotgun skyscraper” outreach for your tool

Resource pages: Here are the steps:

    1. Use Google search operators to find a number of resource pages (e.g. “SEO intitle:resources”, “SEO inurl:resources”, etc.)
    2. Scrape the results (using Linkclump)
    3. Find contact information and pitch your content
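A tiny helper can generate these operator queries for any topic – the templates below simply mirror the example operators above:

```python
def resource_page_queries(topic: str) -> list:
    """Build Google search-operator queries for finding resource pages."""
    templates = [
        '{t} intitle:resources',
        '{t} inurl:resources',
        '{t} intitle:"useful links"',
    ]
    return [tpl.format(t=topic) for tpl in templates]
```

Feed each query to Google, collect the result URLs, and you have your outreach list.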

How-to Guides: Here are the steps:

    1. Find “how-to” articles related to the issue your tool solves (e.g. if you have a calorie counter tool, find articles about “how to count calories” or “how to lose weight”)
    2. See if they’re linking out to any similar tools
    3. If not, reach out and pitch your tool for inclusion

Shotgun Outreach: Here are the steps:

    1. Find tools that are similar to yours, yet not quite as good
    2. Reach out to anyone linking to those tools and explain WHY they should link to your tool instead (i.e. because it’s better!)

The 3-step R-L-R Framework (Research-Learn-Replicate)

Always remember that you are not the only person in the niche looking for backlinks. There are other players, and there may be more in the future. Believe me, this is a good thing: it can save you the hours of research your competitors have already done to earn their tons of useful backlinks.


  1. Research the competition (i.e. find similar websites and/or pages in your niche with a bunch of links)
  2. Figure out how competitors are building links
  3. Replicate/steal their tactics

Tools: Backlink Checker

Deep Broken Link Building (at scale)

This tactic has been around for years and is a tried and tested way of creating backlinks.


    1. Find broken links
    2. Replicate the content (that used to exist at that link) on your website
    3. Reach out to the person/website linking to the broken resource and suggest that they change the broken link to the working link (i.e. to the replicated content on your website)


  1. Use the LinkMiner extension in Google Chrome
  2. Use the WayBack Machine

Are you confused?

Then follow these easy steps:

    1. Gather a BIG list of resource/links pages in your industry (using Google advanced search operators)
    2. Find all broken links on those pages in bulk (using Screaming Frog)
    3. Find the broken links with the most inbound links (using Ahrefs Batch Analysis tool)
    4. Recreate the content (or create something similar) on your website
    5. Reach out to all the sites linking to the broken link.
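The steps above can be sketched as a simple filter: given crawl results (URL → HTTP status, as collected by a tool like Screaming Frog) and inbound-link counts (from a batch backlink analysis), keep the broken URLs with the most links – those are the ones worth recreating first. The data structures here are illustrative:

```python
def prioritize_broken_links(statuses: dict, inbound_links: dict, top_n: int = 10) -> list:
    """Return broken URLs (status >= 400) sorted by inbound-link count, highest first."""
    broken = [url for url, code in statuses.items() if code >= 400]
    return sorted(broken, key=lambda u: inbound_links.get(u, 0), reverse=True)[:top_n]
```

The output is your outreach priority list: recreate the content for the top entries, then contact every site still linking to the dead URL.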

Build High-Probability Link Channels with Custom Search Engines

Create links with websites that have already linked to you in the past. Make a list of all those websites. Now you have a fully searchable database of all the websites that have already linked to you and, therefore, are likely to be interested in your new content.


  1. Ahrefs Site Explorer
  2. Google’s CSE.


    1. Find all websites already linking to you (with ‘natural’ editorial links).
    2. Load them into a custom search engine.
    3. Use the search engine to find likely link prospects for future content.

Convert Homepage Links to Deep Page Links

Using Ahrefs Site Explorer, paste in your root domain and click the “Best by links” filter in the left-hand menu. In most cases it is the home page that has the most links. But you want your visitors to go to a specific page that fulfils your website’s objective.

As explained before, find the dofollow backlinks pointing at your home page and convert them to deep links by requesting the change from the website owner and showing them the value.

Build Links with Blog Comments

Build links by writing comments on popular blogs.


    1. Scrape the websites of everyone who left a blog comment in the last 30 days.
    2. Check if they have any content on their website related to your niche (e.g. in my case, this would be SEO/marketing-related content).
    3. If so, reach out, thank them for the comment and ask if they’d consider linking to your post.


You can effectively scale up this strategy by creating a custom Google search engine, similar to the one discussed earlier.

Reclaim Links from Stolen Images

If your website contains high-quality images – e.g. infographics, photography (that you own the copyright to), diagrams, screenshots, etc. – then you can reclaim your links.


    1. Find and list down all high-quality images on your website
    2. Find websites using these images without permission. Tool: TinEye
    3. Make sure these websites are giving you credit for those images (if not, reach out and reclaim the link)

Strategic Guest Blogging (Tenant SEO)

Guest blogging may sound cliché and has attracted plenty of criticism of late, but done properly it can help you:

    1. Generate backlinks from high authority sites in your industry
    2. Build exposure and credibility for your brand
    3. Deliver targeted traffic to your site
    4. Create a powerful relationship-building platform
    5. Provide a vehicle to rank for insanely competitive keywords (Tenant SEO)

Here is how it is done:

    1. Use Google search
    2. Reverse engineer prolific guest bloggers (in your industry)

However, the target website should meet the following criteria:

    1. High domain authority
    2. Related to your niche
    3. Post high quality content
    4. Receives lots of traffic (use Alexa)
    5. Has an engaged audience
    6. Provides contextual links
    7. Active social presence

Link Reclamation (Google Alerts)

Most of us seldom use Google Alerts. You would be surprised to know that it is an effective tool for link building, as it helps you keep track of your brand mentions across the web.

So set up your Google Alert and monitor them regularly.

Landing authority links with "Alternate Content Creation"

Sometimes websites have excellent content and tons of links but no infographics or videos. You can add value to them here, and create effective backlinks for yourself.

This is how:

    1. Identify informational “how-to” articles and “ultimate guides” in your industry with a ton of links
    2. Create a video or audio version of that post (or at least part of it)
    3. Contact the website owner and give them the content for free

Expert Roundups

This is one of the easiest ways of creating backlinks:


    1. Find a relevant question to ask the influencers/experts in your niche
    2. Make a comprehensive list of influencers/experts in your niche
    3. Consolidate responses into a blog post

Inform the influencers about your latest post and request them to link to it.

Video Transcription

If a picture can speak a thousand words then a minute of video speaks 1.8 million words.

Most experts prefer videos as a medium to express themselves.

Different formats of videos:

    1. Tutorials
    2. Presentations
    3. Webinars, Hangouts, Q&A’s
    4. Vlogs (video blogs)
    5. Industry updates


    1. It is a fast way to build linkable content as it comes directly from influencer/ expert in your industry.
    2. You can get links through attribution.
    3. It helps you get in front of influencers and stand out. This makes it easier to build relationships.
    4. You can leverage their large social audience. If they share your transcription, you get more (targeted) eyeballs on your site.

Create your own YouTube channel to upload videos.


Social Bookmarking

It is one of the most cost-effective tools for creating backlinks. Social bookmarking (SB) is a way to bookmark your favourite web pages online so you can read them anytime, anywhere, as long as you are connected to the internet.

Web pages bookmarked at social bookmarking sites are considered quality backlinks by search engines. Of course, those pages should be targeted with keywords.

While working on social bookmarking sites, keep the following in mind:

    1. Use only a single keyword in the title.
    2. Use other keywords in the tags.
    3. Use a different description for each submission.
    4. Don’t submit more than one link to a single bookmarking site.
    5. Don’t submit links to sites that share the same IP.
    6. Don’t do SB continuously. Use it wisely: sometimes do SB and sometimes do directory submission (explained below) for the same keywords, to gain ranking for those keywords.

Don’t do more than 25–30 social bookmarks on the same day.

Directory Submission

Directory submission is the process of submitting your site to web directories; it is best done manually. Directory submission is done to earn permanent, relevant, one-way links, and it helps you get backlinks fast.

Directory submissions are certainly boring, but they deliver.

PR Sites

In my opinion this might not be as effective as the previous tactics, as PR sites are huge and your links could be lost in minutes. However, they can still be effective depending on the nature of your business or niche.

It is important to post absolutely original (not spun) and interesting content to attract visitors’ attention. Remember, you are competing with a load of important content ranging from political and economic affairs to war and human-rights violations.

The main factor that can get you the numbers is whether your content is newsworthy.

Mutual Link Exchange

Don’ts of link exchange:

    1. The website is unrelated to your site’s niche
    2. The Page is of no value to you
    3. Their website has lots of outbound links
    4. The website is new
    5. They offer nofollow links only
    6. It doesn’t really score high on UI/UX
    7. The link exchange is spammy and impersonal.

Do’s of link exchange:

    1. The website has a potential to deliver loads of traffic
    2. They have a good collection of information relevant to you
    3. The website is not your competition
    4. The website has high PA & DA scores
    5. The website appears in search results for your keywords (or similar ones).
    6. When you make a request, write a meaningful, personal email that adds value for the recipient.

Article Submission Websites

Content is a vital component of digital marketing. Rich content with relevant keywords, anchor text and backlinks helps you score high with search engines. But you can only expect results if you are careful to choose precise submission sites that match your niche:


    1. Create keyword-rich articles for submission
    2. Find the top article submission websites matching your niche
    3. Submit your article link to as many relevant websites as possible
    4. Use the resource box to provide a link to your website or landing page.

Out Bound Links

In an article about backlinks you may be surprised to see outbound links discussed. But in the post-Panda world, the importance of outbound links (external links) lies in the fact that they never let your blog get devalued.

This is how it works:

    1. They give search engines a clear idea about the blog
    2. They foster interaction and relationship-building amongst bloggers, letting you cross-leverage within the niche

Here are a few suggestions for creating outbound links:

    1. Link to pages which actually add value to your topic
    2. Link to articles with good PA & DA
    3. Link to articles with a high number of social media shares
    4. Link to bloggers in your network or your niche, and build a network

If you have found this blog helpful please like & share

We hope the article helped you understand the topic, but you might still have questions about how it can work for your business / brand. At Tejom Digital, we have built a reputation for answering those tricky questions.

Are you looking to take your business to the next level? 

Tejom Digital

is here to help! 

Our award-winning digital marketing agency has the expertise and data-driven solutions to ensure optimal reach, engagement, traffic and conversion. With our proven strategies, you can rest assured that your business will be seen by the right people and achieve your desired outcomes. Plus, we offer comprehensive and personalized packages tailored to your specific needs! Let us help you reach your goals today.

Subscribe to our Blogs
Stay on top of the latest digital marketing tips, trends & best practices.