LS Staging

Tag: SEO Tools

  • SEO Technical Audit Checklist 2021: How to make your website crawler friendly

    SEO Technical Audit Checklist 2021: How to make your website crawler friendly

    SEO, as a process, consists of three major parts: Technical SEO, Content, and Link Building.

    What is a Technical Audit?

    A technical audit is a process to check the crucial technical aspects of your website. Understanding and maintaining your website by identifying these technical aspects can help you rank prominently, drive organic traffic, and help achieve your goals.

    1. Content Visibility 

    Since content plays a huge role in targeting our keywords and providing relevant information for our product offering, it is imperative to ensure that the content is visible and available to the user and the web crawlers. This makes it a good first step for a technical audit.

    Ensure that the homepage, category pages, product pages, and all other high priority pages are visible to the user and available to the crawler.

    Given below are a few methods to check if the content is readable to the crawlers

    • Disabling JavaScript

    JavaScript can be disabled in Chrome to see whether any critical elements of the webpage fail to render. Content, links or navigation elements that do not appear without JavaScript could indicate that the crawler cannot see them either.

    Although Google has stated that it can now render JavaScript websites, a few additional checks should still be performed to ensure that the website’s content renders correctly.

    • Cached version of the website


    One way to check how the Googlebot views your page is to check the cached version. Checking the cached version of the webpage can help you identify if any essential content elements of the website are missing or not being rendered correctly.

    Given below are methods to check the same

    • Use the “site:” search operator in Google to find your page in the index (e.g., site:example.com/abc)
    • Click on the three dots next to the page’s URL and click on ‘Cached’

    This will show you Google’s cached version of the page, as stored in the index the last time it was crawled.

    • The text-only version will help you understand if the content present in a webpage is rendered properly. 

    Alternatively, you can check the cached version of a specific page using the “cache:” operator (e.g., cache:example.com/example-url)

    Content Rendering

    To analyse if the content on the webpage is visible, we also need to understand how content is rendered. Rendering is the process where Googlebot retrieves your pages, runs your code, and assesses your content to understand the layout or structure of your site.

    There are three different types of rendering for web pages that can be leveraged to present content to the user and bot

    • Client-Side Rendering
    • Server-Side Rendering
    • Dynamic Rendering

    The debate between client-side rendering, server-side rendering, or dynamic rendering is only relevant for websites that utilise JavaScript. If your website is purely HTML, there’s nothing that human users or search engine bots need to render.
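As a quick heuristic, you can compare the raw, unrendered HTML (what a crawler that does not execute JavaScript would see) against a phrase you expect on the rendered page. A minimal sketch in Python; the two HTML snippets are illustrative stand-ins for a fetched page source:

```python
# Sketch: if a phrase that should appear on the rendered page is missing
# from the raw HTML, the content is likely injected client-side by JavaScript.

def likely_client_side_rendered(raw_html: str, expected_phrase: str) -> bool:
    """Return True when the phrase is absent from the unrendered HTML."""
    return expected_phrase.lower() not in raw_html.lower()

# A server-rendered page ships its content in the initial HTML...
ssr_html = "<html><body><h1>Blue Widgets</h1></body></html>"
# ...while a client-rendered page may ship only an empty mount point.
csr_html = '<html><body><div id="root"></div><script src="app.js"></script></body></html>'

print(likely_client_side_rendered(ssr_html, "Blue Widgets"))  # False
print(likely_client_side_rendered(csr_html, "Blue Widgets"))  # True
```

In practice you would fetch the raw HTML with a plain HTTP client (no browser) before running this check.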

    Refer to the below-mentioned link for more information about Content Rendering

    (Reference Link)

    2. Indexation Checks

    To rank for relevant terms, increase visibility, and improve organic traffic, one must ensure that all of the website’s important pages are indexed in Google.

    Given below are a few checks that you must do to check indexation

    • Google site: query

    In the Google search field or the Chrome address bar, type in the following:

    site:www.abc.com

    This will return a list of all the indexed pages of your website, helping you identify any indexation issues that may be present.

    The site: query can also be filtered using different options for specific indexation checks

    site:www.abc.com/subdirectory/

    Only displays the indexed pages in the directory /subdirectory/

    site:www.abc.com Phrase of choice

    Only displays those indexed pages which contain the phrase of your choice

    site:www.abc.com inurl:phrase

    Only displays those indexed pages which contain the word “phrase” in the URL

    site:www.abc.com intitle:phrase

    Only displays those indexed pages which contain the word “phrase” in the title

    site:www.abc.com filetype:pdf

    Only displays those indexed files with the chosen filetype
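The operator combinations above can be assembled programmatically, which is handy when running the same indexation checks across many sections of a site. A small sketch; the helper name and parameters are illustrative:

```python
def site_query(domain: str, path: str = "", phrase: str = "",
               inurl: str = "", intitle: str = "", filetype: str = "") -> str:
    """Build a Google 'site:' query string from the optional operators."""
    parts = [f"site:{domain}{path}"]
    if phrase:
        parts.append(phrase)
    if inurl:
        parts.append(f"inurl:{inurl}")
    if intitle:
        parts.append(f"intitle:{intitle}")
    if filetype:
        parts.append(f"filetype:{filetype}")
    return " ".join(parts)

print(site_query("www.abc.com", path="/subdirectory/"))
# site:www.abc.com/subdirectory/
print(site_query("www.abc.com", filetype="pdf"))
# site:www.abc.com filetype:pdf
```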

    • Google Search Console

    Google Search Console provides an overview of the indexed pages. Indexation can be checked by clicking on the option ‘Coverage’ under the ‘Index’ section.

    Google Search Console also provides valuable insights concerning indexed pages, including the presence of ‘noindex’ tags, 404 pages, and server errors.

    3. Duplicate Content Checks

    Duplicate content makes it harder for Google to choose which of the identical pages it should rank in the top results. This can harm website rankings, leading to a drop in organic traffic.

    Given below are a few checks that you must do to prevent content duplication

    • On-Page Elements

    Make sure that all the pages have a unique page title and meta description in the HTML code of the pages. Also, headings (H1, H2, H3 etc.) should be unique for every page. 
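One way to audit this at scale is to parse each page’s title and meta description and flag values shared by more than one URL. A sketch using only the Python standard library; the sample pages are hypothetical:

```python
from collections import defaultdict
from html.parser import HTMLParser

class TitleMetaParser(HTMLParser):
    """Collects the <title> text and meta description of a single page."""
    def __init__(self):
        super().__init__()
        self.title, self.description, self._in_title = "", "", False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def find_duplicates(pages):
    """Return ({duplicated title: urls}, {duplicated description: urls})."""
    titles, descriptions = defaultdict(list), defaultdict(list)
    for url, source in pages.items():
        parser = TitleMetaParser()
        parser.feed(source)
        titles[parser.title.strip()].append(url)
        descriptions[parser.description.strip()].append(url)
    dupes = lambda d: {k: v for k, v in d.items() if k and len(v) > 1}
    return dupes(titles), dupes(descriptions)

pages = {  # hypothetical page sources
    "/a": '<html><head><title>Widgets</title><meta name="description" content="Buy widgets"></head></html>',
    "/b": '<html><head><title>Widgets</title><meta name="description" content="Widget specs"></head></html>',
    "/c": '<html><head><title>Gadgets</title><meta name="description" content="Buy gadgets"></head></html>',
}
dup_titles, dup_descriptions = find_duplicates(pages)
print(dup_titles)        # {'Widgets': ['/a', '/b']}
print(dup_descriptions)  # {}
```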

    • Use of Proper Canonical URLs

    The canonical tag is an integral part of letting Google know which page is the original that should be ranked and which pages should not be ranked. If there are multiple similar pages, Google may struggle to decide which one to rank. As a result, it is vital to ensure that a proper canonical tag is defined pointing to the master page.

    • HTTP vs HTTPS or WWW vs non-WWW pages

    If your site has separate versions at abc.com and www.abc.com (the non-WWW and WWW versions of the website), Google can treat the two as duplicate content, negatively affecting indexation and rankings. Similarly, separate HTTP and HTTPS versions can also create duplicate content issues. A proper 301 (permanent) redirect should be employed to consolidate everything onto a single version.
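The consolidation target can be expressed as a small normalisation function: every variant of a URL should 301-redirect to the single canonical form it returns. A sketch, assuming HTTPS plus the WWW hostname is the preferred form:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str, prefer_www: bool = True) -> str:
    """Normalise a URL to a single HTTPS + www (or non-www) form — the
    target that all HTTP/non-www variants should 301-redirect to."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    netloc = netloc.lower()
    if prefer_www and not netloc.startswith("www."):
        netloc = "www." + netloc
    if not prefer_www and netloc.startswith("www."):
        netloc = netloc[4:]
    return urlunsplit(("https", netloc, path, query, fragment))

print(canonical_url("http://abc.com/page"))       # https://www.abc.com/page
print(canonical_url("https://www.abc.com/page"))  # https://www.abc.com/page
```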

    • Use of Hreflang tags

    Hreflang tags are code snippets that tell search engines what language the content on a page of your site is written in, what geographical region it’s intended for and its relation to other pages of the website.

    Example

    <link rel="alternate" hreflang="en-in" href="http://www.abc.com/in/" />

    For multilingual websites, it is imperative to implement hreflang tags correctly.
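Since every language variant must list all variants (itself included), generating the tag set from a single mapping helps keep them consistent. A sketch; the locale-to-URL mapping and the x-default entry follow the example above:

```python
def hreflang_tags(locale_urls: dict, default: str = "") -> list:
    """Build the <link rel="alternate" hreflang=...> tags that each locale
    page should carry; every variant lists all variants, including itself."""
    tags = [f'<link rel="alternate" hreflang="{code}" href="{url}" />'
            for code, url in locale_urls.items()]
    if default:
        # x-default marks the fallback page for unmatched languages/regions.
        tags.append(f'<link rel="alternate" hreflang="x-default" href="{default}" />')
    return tags

for tag in hreflang_tags(
        {"en-in": "http://www.abc.com/in/", "en-us": "http://www.abc.com/us/"},
        default="http://www.abc.com/"):
    print(tag)
```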

    Refer to the below-mentioned link for more information on Hreflang Tags

    (Reference Link)

    4. Secure Protocols (HTTPS)

    According to Google, HTTPS is a ranking signal. To capitalise on this, ensure that non-secure (HTTP) versions of pages redirect to HTTPS.

    Given below are a few checks you must do to get a sense of the overall security protocol of the website.

    • Check the website’s homepage, category pages and product pages for HTTPS. Try accessing the website over the non-secure protocol (HTTP) and check whether it redirects to the corresponding HTTPS version. Leaving the HTTP version of the website accessible can create duplicate content issues.
    • You can use Google’s Lighthouse (built into Chrome DevTools, and also available as a browser plugin) to verify that HTTPS is used across the website. Click ‘Generate Report’ and, once the report is generated, check under Best Practices > Passed Audits.


    Refer the below mentioned link for more information about HTTPS

    (Reference Link)

    5. Scanning for Broken Links

    To ensure that we serve relevant pages to both the crawler and the user, we need to check for pages returning a 404 error status code (Page Not Found). Broken links can lead users to an error page, which has a negative impact on user experience.

    The Search Console ‘Coverage’ report can help you identify any error pages present on the website. On selecting ‘Errors’, one can scroll down and check for pages returning a 404 status code.


    These pages can be redirected to other relevant pages within the website to ensure that we do not lose out on potential traffic.

    6. Auditing Robots.txt file

    A robots.txt file tells search engine crawlers which URLs they can access on your site. It is mainly used to ensure that crawl effort is spent on relevant pages and that irrelevant pages are kept away from crawlers. (Note that robots.txt controls crawling rather than indexing: a page blocked by robots.txt can still be indexed if other sites link to it.)

    The robots.txt file can be accessed by appending /robots.txt to the root domain, for example:

    https://www.abc.com/robots.txt

    The most common directive to ensure that all the pages are accessible to the crawler is as given below.

    User-Agent: *

    Disallow:

    Sitemap: https://abc.com/sitemap.xml

    The asterisk symbol (*) denotes all user agents. Pages that need to be blocked can be listed under the Disallow directive, while the Sitemap line points the crawler at all the website’s relevant pages, facilitating smooth crawling.
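You can verify what a directive set actually allows by feeding it to Python’s built-in robots.txt parser, no live fetch required. In this sketch, a Disallow: /private/ rule is added purely for illustration:

```python
from urllib.robotparser import RobotFileParser

# Parse the directives directly (no network fetch) and ask whether a
# given user agent may crawl specific URLs.
robots_txt = """\
User-Agent: *
Disallow: /private/
Sitemap: https://abc.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "https://abc.com/products/"))   # True
print(rp.can_fetch("*", "https://abc.com/private/x"))   # False
```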

    Refer to the below-mentioned link for more information about how robots.txt works

    (Reference Link)

    7. Auditing XML Sitemap

    An XML Sitemap is a file that contains a list of all the crucial pages of the website to help Google find and index them. The XML Sitemap can usually be accessed by appending /sitemap.xml to the root domain, for example:

    https://abc.com/sitemap.xml

    The exact URL of the XML sitemap can also be located in the robots.txt file.

    One must ensure that the XML Sitemap only contains relevant URLs that you want to get indexed.
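A quick way to review what the sitemap actually lists is to parse out its <loc> entries. A sketch using the standard library XML parser; the sample sitemap mirrors the abc.com examples used above:

```python
import xml.etree.ElementTree as ET

# The sitemap protocol namespace used by <urlset> documents.
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text: str) -> list:
    """Extract the <loc> URLs listed in an XML sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(f"{NS}loc")]

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://abc.com/</loc></url>
  <url><loc>https://abc.com/products/</loc></url>
</urlset>"""

print(sitemap_urls(sitemap))
# ['https://abc.com/', 'https://abc.com/products/']
```

Reviewing this list against the pages you actually want indexed catches stray URLs before Google crawls them.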

    The XML Sitemap can also be submitted in Google Search Console via the ‘Sitemaps’ option under the ‘Index’ section, to analyse whether all the URLs present are valid and to improve discoverability.

    Refer the below mentioned link for more information about XML Sitemaps

    (Reference Link)

    8. Auditing HTML Sitemap

    An HTML Sitemap is an HTML page that lists all the website’s important pages, creating a general picture of the entire website. In simpler terms, the XML Sitemap helps crawlers understand the site structure, whereas the HTML Sitemap serves users with the website’s relevant pages.

    The HTML Sitemap is usually located in the website’s footer, helping users understand the site hierarchy in detail and facilitating smooth navigation.

    As with the XML Sitemap, one must ensure that only relevant pages are added to the HTML Sitemap.

    9. Page Speed Checks

    Page speed is a measurement of how fast the content on your page loads.

    Page speed is often confused with “site speed,” which is actually the average page speed across a sample of page views on a site. Page speed can be measured as “page load time” (the time it takes to fully display the content on a specific page) or “time to first byte” (how long it takes for your browser to receive the first byte of information from the web server).

    Google has announced that page speed is a ranking factor for both desktop and mobile search. Poor page speed also hurts user experience and increases exits and bounce rate.

    Go to the Google PageSpeed Insights tool and enter your site’s URL

    The tool, in turn, will provide a score for your desktop and mobile version with specific recommendations to help improve the page score and eventually improve the website experience as well.

    Core Web Vitals

    Core Web Vitals are a set of specific factors that Google considers important in a webpage’s overall user experience. They are made up of three page speed and user interaction measurements: LCP (Largest Contentful Paint), FID (First Input Delay), and CLS (Cumulative Layout Shift).

    Since Google has made Core Web Vitals an important ranking factor, ensuring that the scores meet the guidelines set is vital.
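Google publishes “good” and “needs improvement” thresholds for each metric (for “good”: LCP ≤ 2.5 s, FID ≤ 100 ms, CLS ≤ 0.1). A small classifier sketch for bucketing reported values against those guidelines:

```python
# Google's published "good" / "needs improvement" boundaries per metric.
THRESHOLDS = {
    "LCP": (2.5, 4.0),   # seconds
    "FID": (100, 300),   # milliseconds
    "CLS": (0.1, 0.25),  # unitless layout-shift score
}

def rate(metric: str, value: float) -> str:
    """Bucket a Core Web Vitals value as good / needs improvement / poor."""
    good, needs_improvement = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= needs_improvement:
        return "needs improvement"
    return "poor"

print(rate("LCP", 2.1))  # good
print(rate("FID", 180))  # needs improvement
print(rate("CLS", 0.3))  # poor
```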

    Core Web Vitals can be checked using Google Search Console in the ‘Experience’ section.

    The dashboard lists Core Web Vitals scores for all URLs on mobile and desktop, making it easier to identify and optimise pages against these three vital parameters.

    Refer to the below-mentioned link for more information about Google Search Console

    (Reference Link)

    10. Mobile Checks

    With the advent of mobile-first indexing and the increasing number of mobile phone users, it is imperative to ensure that the website is mobile-friendly for user experience, indexation and ranking purpose.

    Mobile Optimization ensures that mobile visitors have an experience optimised for their device and considers factors such as website design, site structure, page speed and more to ensure a seamless and quality mobile experience.

    Given below are a few vital mobile checks that you must do

    • Use the Mobile-Friendly Test Tool

    The Google Mobile-Friendly Test is a free tool that allows you to type in a website URL and find out how mobile-friendly Google considers the page to be. It also gives specific recommendations, helping to surface any mobile issues.

    • Content Mismatch

    It is vital to ensure that the content served on mobile devices and desktop is the same. Given Google’s mobile-first approach, if your mobile site has less content than your desktop version, Google may miss information about your website, hampering its indexation and ranking.

    Although different designs can be adopted for the mobile version to maximise user experience (such as accordions and tabs to show content), the content itself should match the desktop version exactly.

    • Use Large Buttons

    Mobile-friendly websites need to have call to action buttons that are large enough to induce users to tap them rather than zoom. This, in turn, makes it easier for the user to navigate and increases user-friendliness in the process.

    Additionally, there needs to be spacing between hyperlinks. If links are placed too close to each other, users might attempt to click on a link and accidentally click on a different one, thereby hampering user experience in the process.

    • Make the text large enough to read

    The text on the website needs to be legible and large enough, thereby making it comfortable to read. Users should not have to zoom in and scroll left or right to read something as this might negatively affect user experience.

    • Navigation Mismatch

    The options present under the navigation in the mobile version should be similar to the desktop version to ensure accessibility. Missing options in the navigation menu can affect the website’s visibility, affect ranking, and hurt user-friendliness.

    • Simplification of Navigation Menus

    Since desktop website menus have a lot of space, they can take up the entire bar at the top of the screen with additional drop-down options without negatively impacting the user experience.

    There isn’t much space to incorporate the menu like the desktop version with respect to mobile devices.

    Mobile-friendly websites must use simple menus that are easy to use and present an accurate overview of the website. Users can then use categories, filters, or the search functionality to navigate further.

    Most mobile-friendly websites use the hamburger menu. A hamburger menu, when clicked, opens to reveal a navigation menu, thereby not utilising a large amount of space.

    • Avoid Pop-Ups and Intrusive Interstitials

    In 2017, Google rolled out an algorithm change that penalizes websites that include specific types of pop-ups on mobile devices.

    As stated by Google, the below mentioned pop-ups can have a negative effect on the website by making content less accessible.


    Any pop-up or interstitial that makes it difficult to access the content is termed intrusive and can lead to ranking penalties.

    Given below are the pop-ups that are not deemed intrusive and therefore have no negative effect on the website


    Pop-ups that take up little space, that add value to the user journey, or that handle legal obligations such as cookie usage notices are not deemed intrusive.

    11. Auditing Backlinks

    A backlink audit is the process of analyzing the links pointing to your website. It involves identifying good and bad backlinks and developing a robust link strategy from the findings.

    Backlinks for your website can be downloaded from Google Search Console by choosing the ‘Links’ option.

    Backlinks can also be extracted by third party tools like Ahrefs and Majestic SEO.

    Given below are a few steps to understand how backlinks should be audited

    • Benchmark Your Backlink Profile Against Competitors

    With the help of third-party tools, we can analyze our competitors’ backlinks to understand their quality. This can help you create a robust link building strategy for your website.

    • Find and Evaluate Each Backlink You Have

    The website’s backlink portfolio needs to be analyzed in order to identify and remove irrelevant and spammy links pointing to your website.

    Since disavowing legitimate links can cost you their ranking benefit, it is important to ensure that only genuinely spammy links are disavowed.

    Irrelevant backlinks can be divided into two separate categories: Low Quality Links and Spammy Links.

    In the case of low-quality links, you can reach out to the site owner to get the link removed at their end. Spammy links (porn sites, casino sites, etc.) can be neutralised by submitting a list of URLs to Google’s Disavow Links tool (Reference Link)
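The Disavow Links tool expects a plain text file with one domain: entry per site (or one URL per line) and optional # comments. A sketch that renders such a file from an audit’s findings; the example domains are made up:

```python
def build_disavow_file(spam_domains, spam_urls=()):
    """Render the text file expected by Google's Disavow Links tool:
    one 'domain:' line per site, plus one line per individual URL."""
    lines = ["# Spammy links identified during the backlink audit"]
    lines += [f"domain:{domain}" for domain in spam_domains]
    lines += list(spam_urls)
    return "\n".join(lines)

print(build_disavow_file(["spam-casino.example"],
                         ["http://bad.example/link-farm.html"]))
```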

    • Domain Authority (DA) Bifurcation of website

    The Domain Authority (DA) of a website is a third-party metric (popularised by Moz) that predicts how likely a website is to rank in search results. The score helps us gauge the authority of a linking website within its industry.

    Getting backlinks from high domain authority websites will not only solidify our backlink portfolio but also help increase our own domain authority in the process.

    • Anchor text diversification

    Anchor texts are the visible, clickable words used to link one page to another. Anchor text can have a major influence on rankings in the search engine result pages, since the crawler gets more information about your brand and product offering through it. As a result, it is important not to overuse generic terms like “click here” or “learn more”. Anchor texts should have a healthy mix of brand terms and relevant non-brand terms to ensure relevancy and visibility.

    12. Analytics Set Up

    If Google Analytics is already set up, verify that it is configured correctly and collecting accurate data.

    Checks that can be done to ensure proper set up

    • Source Code

    Check your source code (Ctrl+U) of the Home Page, main category pages, and product pages for the presence of the tracking code associated with your Analytics account and property. Since some Content Management Systems handle page templates differently, it’s a good idea to check each kind to make sure tracking is consistent.

    The code should be installed in the head tags (between <head> and </head>) and usually includes an ID that you can match with your Analytics property/view.
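A lightweight way to run this check across page templates is to scan the <head> of each page’s source for an Analytics measurement or property ID (G-… or UA-…). A sketch; the sample page and the ID in it are hypothetical:

```python
import re

def find_tracking_ids(html: str) -> list:
    """Return Analytics IDs (UA-… or G-…) found inside the <head> section."""
    match = re.search(r"<head.*?>(.*?)</head>", html, re.S | re.I)
    head = match.group(1) if match else ""
    return sorted(set(re.findall(r"\b(?:UA-\d+-\d+|G-[A-Z0-9]+)\b", head)))

page = """<html><head>
<script async src="https://www.googletagmanager.com/gtag/js?id=G-ABC123XYZ"></script>
<script>gtag('config', 'G-ABC123XYZ');</script>
</head><body></body></html>"""

print(find_tracking_ids(page))  # ['G-ABC123XYZ']
```

Comparing the IDs found on each template against your Analytics property quickly reveals pages where the snippet is missing or mismatched.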

    • Google Tag Assistant

    Google’s Tag Assistant browser plugin can help verify if the tracking code is installed correctly.

    • Analyze the Google Analytics Traffic

    If you see extreme shifts in the data, it could be an indication that the tracking code is not firing properly when pages load.

    In your Google Analytics data, look for periods with no sessions, sudden changes in traffic, and very high or low bounce rates. These can be symptoms of tracking code issues.

    13. Accessing Google Search Console

    Search Console is a free tool from Google that helps developers, website owners, and SEO professionals understand how their site is performing on Google Search.

    Search Console allows you to check indexing status and optimise the website’s visibility. It also lets you check for pages returning server errors, manual actions, page experience, and Core Web Vitals, helping to identify issues within the website.

    Refer the below mentioned link for more information about Google Search Console

    (Reference Link)

  • Local SEO Tips for 2017

    Local SEO Tips for 2017

    The New Year has kicked off and it is time for digital marketers to start off on the right foot by rolling out their digital marketing strategies. In today’s post, we will provide you with simple yet effective local SEO tips so that your business can dominate search rankings in 2017 and beyond.

    The concept of Local SEO has gained traction for the following two reasons:

    • The majority of search queries come from mobile devices, which makes location a crucial factor; and
    • Google has been giving priority to organic local results for desktop searches as well.

    Local SEO campaigns are quite different from general SEO campaigns. Though implementing a local SEO strategy is a challenging task, it has the potential to provide your business with exceptional benefits. Local SEO strategies can give your brand better online visibility besides maximizing your online presence.

    Here are few guidelines to help you execute local SEO strategies so that you can get the most out of it.

    • Find the Right Keywords: Once you launch a campaign using Google AdWords, run the keyword tool against your website so that it can extract the countless keywords relevant to your business. Once the tool presents the relevant keywords, delete the Ad Groups in the left column that are not relevant to your business. You will be left with five to fifteen keyword phrase groupings that your business can rank for. After highlighting the different groups, optimize a separate page for each group of keyword phrases.
    • Utilize Google My Business (GMB): This easy-to-use tool launched by Google helps brands get free business listings on Google. With the help of GMB, businesses can surface their information on Google Search, Google Plus, and Google Maps once they add their business details. GMB benefits not only businesses but also customers, as the tool helps your customers identify your business easily. When filling in your GMB page, make sure you provide complete, consistent, and accurate information. Also, ensure that the GMB location listed is verified by the owner. Last but not least, do not forget to mention the specific city you are targeting in the title of the GMB landing page.
    • Maintain Consistency in NAP (Name, Address & Phone): In order to gain greater online visibility, ensure that your contact details are accurate. At the same time, use a uniform template for your NAP on your website, Google Plus account, GMB page, online business directories, and other sites where these particulars appear online. This will help your business build a solid association with the location you are targeting.
    • Utilize Positive Reviews: Positive feedback helps businesses significantly by establishing credibility. Google gives precedence to business reviews and considers them while ranking your website. The reason you should capitalise on business reviews is that people seeking information about your business rely on reviews to judge its quality and whether they will benefit from your offerings. Thus, you should motivate your customers to write reviews about your business. By providing top-notch service and going beyond customers’ expectations, you can easily generate positive reviews.
    • Make Sure You Publish Unique, Enlightening and Comprehensive Content: Ensure that your web pages contain unique and engaging content so as to keep your online visitors engrossed. Moreover, by focusing on regular organic SEO, you can significantly improve your local ranking on Google. At the same time, to protect your Google ranking, make sure you follow SEO guidelines recommended by Google, instead of employing unethical SEO tactics.
    • Utilize Citations Properly: Citations lend credibility to your website besides building authority and driving traffic to your site. While considering citations, make sure your citations are uniform. At the same time, to gain a competitive edge, ensure that the citations used by you are from superior websites. Another indispensable requirement is to have several industry-relevant or locally-relevant domains that talk about your business in a good light.
    • Set up Different Web Pages for Different Store Locations/Products: If your business has stores in different cities or in different localities within a city, it is advisable to have separate web pages for each store location. You can also incorporate separate links for each store location on a single page. This will help your customers to easily find information about specific stores.

    Similarly, if you offer a wide array of products or services, make sure that each product/ service you offer has its own webpage with specific product/ service description.

    So these are some local SEO tips that you can incorporate into your local SEO campaign. Though many webmasters find it difficult to implement local SEO, a good understanding of Google’s directives and adherence to fair practices can help you considerably in rolling out your local SEO campaign.

     

  • Data Driven Attribution: A Google Analytics Premium Feature


    How many times have your marketing campaigns failed simply because, regardless of the effort you put in, you just couldn’t track when and where your customers clicked the exit button? Now you finally have a respite, as digital marketing might well be taking a new turn.

    What Is Data Driven Attribution?

    You may be wondering what this term is and why it has generated so much hype. We are here to talk about it.

    Data driven attribution is a model that charts the whole journey of a customer and helps you trace the end point: whether the customer ended up making a purchase or failed to convert.

    This modelling method takes into account the many parameters that influence a customer’s buying decision. We are all aware that more than one factor comes into play when you are looking to generate the right leads for your business and convert them into sales.

    With this model, you will be able to get a much better and clearer insight into which keywords are performing better, which channels are offering better dividends, and which other sales points have been giving you the best output.


    Now that you have some idea of what the data driven attribution model is all about, we will talk about the reasons you should use it too.

    Why Data Driven Attribution?


    There are plenty of reasons that offer the right incentives for using this model. We are going to talk about those most likely to get you interested.

    • Increased transparency: One of the key reasons is that this model offers much greater transparency, so you can get a better idea of which methods are producing results. Often, when you carry out digital marketing, you use a blend of multiple methods and fail to see which are good enough and which are useless. With this model and better data output, you get a clear picture of the channels you should tap more than others.
    • Improved decision making capability: When you are making marketing decisions, it is important to have the right information at hand. With this tool, you will have a much better picture of how your conversion rate is behaving, which makes it easier to come to the right decision.
    • Easy to integrate: Marketing and business require more than one tool to operate. With the data driven attribution model, integration with other leading Google tools like YouTube and AdWords is not going to be difficult; the model has been designed to offer easy integration.

    Aren’t these enough incentives to try out this model? You will be able to get the actual numbers, percentages, and the sequential flow of the precise touchpoints customers follow from the start to the end of their navigation cycle. It might end up making all the difference to your business!

  • What is Google Webmaster Tools API

    What is Google Webmaster Tools API


    If you are looking for attention for your site from Google, the daddy of search engines, through free organic traffic, then you know you have to optimize your site. The first step is signing up for Google Webmaster Tools, or GWT.

    GWT is free and has been designed to help webmasters like you know what is happening with your site. This enables you to make decisions based on hard data rather than guesswork.

    So what are the objectives you can achieve through GWT?

    Once you have logged in to the GWT, you need to verify you are the owner of the site, which can be done through:

    1. Adding the DNS record to the domain configuration.
    2. Adding a meta tag to the homepage.
    3. Uploading an HTML verification file to the server.
    4. Linking the Google Analytics account you have created to GWT.

    You will begin to see data pertaining to your site within a few hours. The dashboard presents an overview of the keywords you are beginning to rank for, along with their traffic. You will also see the pages Google has already indexed, sites that are beginning to link to your site, and whether the Googlebot is finding it difficult to crawl your site.

    There are other things you can do to help Google notice and rank your site using GWT. They are:

    Configuration of your site


    This is done by submitting a sitemap, in XML format, to let Google know the pages of your site; Google will then start indexing them. Go to the XML Sitemaps section, enter the sitemap URL and click the “Start” button. Once the sitemaps are loaded, Google will let you know which URLs are indexed. Do not worry if not all pages are indexed, as you will not want RSS feeds, internal data pages or private login pages to be indexed anyway.

    As your site gains in popularity, Google adds Sitelinks to it; these connect to the interior pages of the site, presenting visitors with information they may want to peruse. You are free to remove them or keep the links you want people to access. It is also possible to inform GWT about your target audience and the specific countries they belong to. You can also pick your preferred domain form, either https://yourdomain.com or https://www.yourdomain.com; either will work and help in site rankings.

    Crawl Rate


    You can instruct the Googlebot to crawl your site more often, and faster, if you wish to. Typically, though, you should let Google decide the frequency and pace at which it crawls your site. Too-frequent crawling sends a lot of bot traffic to your server, which would increase hosting costs.

    Crawl Stats

    Thanks to GWT, you are able to view how Google is looking at your site through the crawl stats it provides, presented as graphs and data tables. If the numbers look off, you can consider changing the crawl rate in the settings.

    Monitoring of Links

    One of the ways of getting ranked higher by Google is to have more sites linking to your site. This should ideally happen naturally, as your site is expected to provide valuable information to visitors. However, you can still monitor link growth through GWT, where you will also be able to see which pages people have liked and are linking to.

    If you find it difficult to get links, you may try writing content for social web directories like Digg.com. Link bait content can earn you a lot of links.

    LogicSpeak:

    The above are just some of the useful ways in which you can make use of GWT. It is a free, useful tool that will set you on the right track to making sure your site is noticed by Google and that you get the free organic traffic you are looking for.

     
