How to Get Your Site Back on Track After It Lost Some Traffic


Digital marketing is constantly changing, improving its techniques to meet consumer and business requirements. Over the last four years, the most noticeable changes have occurred in SEO. For users, these changes almost always slip through the cracks; for businesses, they are extremely important as they influence brand awareness; for SEO professionals, they always bring a new challenge and an opportunity to assess the quality of their previous work.

The changes in SEO are often observed through traffic fluctuations. Remember, it is important to differentiate between traffic sources. Your site's traffic statistics will show where you started to lose visitors.

A decline in organic traffic is a common issue for the majority of websites. We have been there ourselves, so we understand the anxiety it causes. The drop can have logical, natural causes that will not affect your conversions in any way; such cases most probably require no action from you except for a little waiting. Or it can be more severe and demand quick, decisive action – such a case calls for an experienced SEO team.

The main objectives of this article are to cover the important questions of what to check first, how to quickly determine the reasons behind a traffic decrease, and what steps to take in order to sort the issue out.

The First Thing to Check

The first thing you want to look into is your recent activity, as it might be the reason for the traffic decrease. If you know that your site underwent some changes during the last couple of weeks, they are probably what caused the drop. Knowing this will save you plenty of time and help you understand what works best for your site. Oftentimes the issue that lies on the surface is the main one – the same as with electrical devices, where the most obvious (yet often skipped) thing to do is to check whether the device is plugged in instead of running for a screwdriver and messing it up even more.

Major changes are more likely to cause the problem. However, even a minor fix or the implementation of a new back-end feature can lead to technical issues and conflicts with already existing services.

Now, let us dive into the main levels at which a sudden traffic drop can occur.

Where Troubles with Your Traffic Can Occur

The levels we are about to examine can be quite complex and thus require in-depth understanding. If you are a beginner in SEO analytics, you will certainly feel the need to learn the basics of the numerous SEO tools. That is why we recommend passing this duty to a competent and experienced professional instead of dealing with the issue on your own. Nevertheless, it is quite possible to pinpoint the source of an issue yourself, especially if you have the necessary knowledge and instruments. Here we have divided these sources into four levels so you can paint a big picture of what is going on with your traffic. So, let us move on to these levels and study each one in more detail.

Technical Issues

If the reason behind the low visitor rate lies on the technical side of your site, it can mean several things. It could simply be low site quality, which, for some reason, Google had not noticed until now. Remember, your website is for people, not for machines. Status codes will help you determine whether there are issues with server maintenance. Indexation issues can also be the cause of a traffic fall. The tools described below will help you check your site for possible technical errors.

On-Page Optimization

The on-page optimization level covers Meta data, content, and links. Meta data is an important factor that determines the quality of your website; search engines use it to assess the relevance of your pages. Every year, Google adds functionality for layout, presentation, and microformat tools. Over the last five years, many instruments marked as recommended have become mandatory, and sometimes they even play a critical role in site ranking. Also, make sure your internal links are neat and tidy so they do not spoil interlinking between the pages of your site, and that all pages can be reached by users in approximately three or four clicks. If a page is not easily discovered by users, it will see little to no traffic.

Niche and Competitor Changes

In layman's terms, a niche is your field of expertise – yours and that of billions of others. Unless, of course, your site is devoted to something so specific and unique that you have no competitors, which is good and bad at the same time. Be that as it may, competitors are everywhere, so if you lost some traffic, someone somewhere gained it. After you figure out which competitor won your users over, the only thing left is to work out how they managed to pull it off. Perhaps there have been some niche or keyword changes. Or you and your competitors have all lost some traffic, which can be a seasonal change or due to holidays.

Search Engine Algorithms Update

If your site is airtight, your on-page optimization is mind-blowing, and your popularity makes competitors weep like babies, the last thing to check is recent search engine algorithm updates. Generally speaking, search engines are constantly working on their algorithms and actively test them. From 2013 up to now, Google has officially released four major updates and multiple add-ons, as stated by the corporation. Usually, the changes brought by an algorithm update influence 20-40% of a niche, reshuffling businesses' sites in the rankings.

Useful Tools to Check Your Site

Among all the necessary and useful SEO instruments that exist, in this article we mostly cover Google services as well as web crawling applications for collecting data from the web. Let us look at the main SEO instruments that will provide you with the most information about your website.

Google Analytics


Google Analytics helps you track traffic flow and conversions, and provides you with statistics based on the collected and analyzed data. People especially like this tool because it is free, simple, and easy to use.

Screaming Frog


This tool has a wide set of features, but the main thing to know is that it collects data on all the necessary on-site elements. It analyzes literally every page and gathers data such as the overall number of website pages, broken links, keyword elements, and much more.

PageSpeed Insights


This instrument will test your site and suggest tips to improve its performance and speed. What is more, it is good for testing not only the desktop version of your site but also its mobile version.

Google Search Console


Yet another tool from Google! It also provides free analysis of your site and allows you to report data about it to Google. It will show your keyword rankings on the search engine, information about links, index status, and much more. Created as a tool for webmasters, it collects statistical data on website visibility, indexation, crawl statistics, and more.

Ahrefs


Ahrefs is a web service that collects external information about a website to form statistics on backlinks, positions, anchors, and other factors used in external optimization research. This tool can also be used for research on your competitors, which will help you make better marketing decisions.

Analyzing Your Site's Technical Problems

Server accessibility plays an important role in how search engines assess site quality. Inconsistent hosting accompanied by internal technical issues can block further optimization, while sudden connection disruptions will not let clients visit your site.

To check the basic technical aspects of a site, use Screaming Frog, PageSpeed Insights, Google Analytics, or Google Search Console.

Screaming Frog

After Screaming Frog has finished the site analysis, you will see a table of all pages of your site with an Overview section on the right. Here you can see the number of pages blocked by the robots.txt file. If you have found landing pages that are important for your business among them, this is most likely one of the reasons you are having a traffic drop.

Another indicator of your site's health is status codes. A perfectly maintained site has over 90% of its pages returning a 200 status code. The 404 and all 5xx status codes pose the biggest issues to a website, so you should check them first. A 404 status code means the corresponding page is absent from the server; if this is your case, check all inbound links and determine how they happened to be on your site. The 5xx errors are very often associated with server issues, so if your site returns a lot of them, check your hosting settings and accessibility. Also, do not forget to check how many pages with 301 and 302 redirects you have.
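If you want to re-check a handful of suspicious URLs outside of Screaming Frog, a short script is enough. Below is a minimal sketch in Python using the requests library; the URL list is hypothetical, and in practice you would export it from your crawler.

    import requests

    # Hypothetical list of URLs to check; in practice, export it from your crawler.
    urls = [
        "https://example.com/",
        "https://example.com/old-landing-page",
        "https://example.com/blog/some-post",
    ]

    for url in urls:
        try:
            # allow_redirects=False so 301/302 responses are reported as such,
            # not silently followed to their final destination.
            response = requests.head(url, allow_redirects=False, timeout=10)
            code = response.status_code
        except requests.RequestException as error:
            code = f"error: {error}"
        print(f"{url} -> {code}")

Pages answering 200 are fine; anything in the 4xx or 5xx range, or an unexpectedly large pile of redirects, deserves a closer look.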

PageSpeed Insights

The PageSpeed Insights tool gives you an overall page score for mobile and desktop devices to determine your site's optimization level. Analyzing the homepage will give you a picture of the overall optimization quality; if you have noticed that the rating of one of the traffic channels has decreased, the corresponding pages require analysis too. The rating of a well-optimized page is 80 or more, so if your rating is above that, you have nothing to worry about. Together with the analysis results, the application will give you some tips on how to improve your pages' optimization.
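The same scores can also be pulled programmatically, which is handy when you need to check many landing pages at once. Here is a minimal sketch against the PageSpeed Insights v5 API; the example URL is hypothetical, and the exact shape of the JSON response may vary, so treat the field names below as an assumption to verify.

    import requests

    API_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

    def page_score(url, strategy="mobile", api_key=None):
        """Return the performance score (0-100) reported for a URL."""
        params = {"url": url, "strategy": strategy}
        if api_key:
            params["key"] = api_key
        data = requests.get(API_ENDPOINT, params=params, timeout=60).json()
        # The performance score is returned as a fraction between 0 and 1.
        score = data["lighthouseResult"]["categories"]["performance"]["score"]
        return round(score * 100)

    # Hypothetical page to check; flag anything below the 80-point threshold.
    print(page_score("https://example.com/", strategy="mobile"))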

Google Analytics

You can find all the technical information about your website in the Site Speed section. The average page load time should be no longer than 3 seconds. Major fluctuations or, for example, an apparent correlation between page load time and traffic can be a sign of server malfunction or incorrect setup. Other tabs in this section contain reports on the load speed of all pages or, to be more precise, a graph displaying deviations from the site's averages. Pay attention to troublesome pages – they might be the cause of the traffic loss. But let us not rush things and move on to the next tool.

Google Search Console

Google Search Console allows you to check the crawl statistics and index status of your site. Check the Google Index -> Index Status and Crawl -> Crawl Errors tabs for statistical deviations or critical errors.

Hosting (tech skills required)

Another way to check the accessibility and uptime of your website is to check the server log files. However, you will probably need a helping hand from a technically skilled professional to do this. Alternatively, you can contact your provider's support service and ask them to check the site's condition and error logs.
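If you do have access to the raw access log, even a rough count of 5xx responses per day can reveal downtime. Below is a minimal sketch, assuming the combined log format used by Apache and Nginx; the log path is hypothetical.

    import re
    from collections import Counter

    LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path; adjust for your server

    # Combined log format, e.g.: ... [10/Jan/2020:13:55:36 +0000] "GET /page HTTP/1.1" 502 ...
    line_pattern = re.compile(r'\[(\d{2}/\w{3}/\d{4}):[^\]]+\] "[^"]*" (\d{3}) ')

    errors_per_day = Counter()
    with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = line_pattern.search(line)
            if match and match.group(2).startswith("5"):
                errors_per_day[match.group(1)] += 1

    for day, count in sorted(errors_per_day.items()):
        print(day, count)

Days with a spike of 5xx responses are the first candidates to compare against your traffic graph.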

On-Page Optimization

On-page optimization is the main quality assessment criterion for a site. When the traffic of a site suddenly drops, it often means search engines have started to question its quality. In such a case, the first thing you should do is try to localize the issue. For this purpose, use Google Analytics to get a report of page visits. The Landing Pages report contains Sessions and Bounce Rate for your top pages, and comparing these two will save you a whole lot of time in the future. Collect only the pages that are losing traffic and analyze them further instead of the whole site.

Meta Tags

A page's structure contains special markup called Meta tags. Meta tags carry the main information about a page; the most important ones are Title, Description, and H1. The latter is not a Meta tag as such, but it is very important for site optimization. Use Screaming Frog to assess your Meta tags. The first tabs you should check are "Missing" and "Duplicate".

If more than 40% of your pages have poor Titles, it could be the source of the issue, especially if the situation was different before. This means that the errors and changes to the Titles have not gone unnoticed and have most certainly impacted your traffic. In order to fix the Meta data by yourself, learn these fundamentals (a small checking script follows the list):

  1. First of all, the Titles of your site should be unique. They should also not exceed 75 characters. If you find this impossible, make sure all important words and terms are at least close to the beginning of the Title. Search results do not fully display long Titles, so it is important to let users know what your page is about.
  2. Add a Description. In the past, when search engines were only starting to learn how to assess the content of a page, this tag was of secondary importance – in fact, it was almost as important as the Keywords Meta tag. Today, however, the Keywords Meta tag is unnecessary and even ignored by some search engines. If your page lacks a Description, the search bot will borrow some of the page's content and put it in the search results snippet. If your site is small, a good Description will raise its value and relevance from the perspective of a search engine. Just like the Title tag, the Description should be unique. The recommended size of a Description is 200 to 255 characters.
  3. H1 differs from the Title tag. Since it is practically a page heading, it should be unique. The same headings on different pages confuse the search bot and complicate page relevance assessment.
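To spot-check these fundamentals on a handful of pages without a crawler, a short script is enough. Below is a minimal sketch using requests and BeautifulSoup; the URLs are hypothetical, and the length threshold simply follows the guidelines above.

    import requests
    from bs4 import BeautifulSoup

    # Hypothetical pages to check; in practice, use the list of pages losing traffic.
    urls = ["https://example.com/", "https://example.com/services"]

    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        title = soup.title.get_text(strip=True) if soup.title else ""
        description_tag = soup.find("meta", attrs={"name": "description"})
        description = description_tag.get("content", "") if description_tag else ""
        h1_texts = [h1.get_text(strip=True) for h1 in soup.find_all("h1")]

        print(url)
        print(f"  Title ({len(title)} chars): {title or 'MISSING'}")
        if len(title) > 75:
            print("  Warning: Title exceeds 75 characters")
        print(f"  Description ({len(description)} chars): {'present' if description else 'MISSING'}")
        print(f"  H1 headings: {h1_texts or 'MISSING'}")

Duplicate Titles or H1s across pages are then easy to catch by comparing the printed values.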

Mobile Front-end

The traffic coming from mobile devices has been on the rise for the past 10 years. According to research, traffic from mobile devices surpassed desktop traffic for the first time in late 2016 and has stayed ahead since. Google actively promotes and develops this tendency by constantly creating new services and tools for enhancing the quality and usability of search results on mobile devices. According to studies, the difference between the mobile and desktop top 10 search results has become striking over the last couple of years. To check the number of clients visiting your site from their smartphones, look at your device statistics. The simplest way to do so is by using Google Analytics:

A report like this can clearly show the decline in traffic for one of the site's pages. If you have found pages in a similar situation, check them using PageSpeed Insights and the Mobile-Friendly Test tools.

If the rating of a mobile page is lower than 50, it might have caused the traffic drop. Read through the recommendations by PageSpeed Insights and try to increase page score to 80+ points.

The second tool has similar features, but it also allows you to submit a scanned page of your site to Google so it can be indexed quicker.

Do not forget to open the page that caused the traffic drop on a mobile device yourself to make sure it works fine from a user's perspective.

Duplicate or Poor Content

Search engines have never approved of duplicate content. Pages with the same content are confusing for search engines, so search bots simply exclude these pages from the index until webmasters deal with them. Borrowing content from other sites puts a site at risk of penalties. These days, search bots can identify the original author and content thieves quite effectively. However, searching for duplicate content is not the simplest task, especially if you have to compare not 10 or 15 pages but hundreds or even thousands. The best free tool for this is siteliner.com.

This tool scans your site and then shows summary statistics for all scanned pages. Check your site for duplicate content and get rid of it as quickly as possible.

Speaking of external duplicates: if you have noticed that specific pages are losing traffic, check their content for duplication immediately. For this purpose, simply choose any sentence that contains keywords important for your niche, enclose it in quotation marks, and paste it into the Google search field. The search results should show only your page. If there are more pages and some of them are not on your domain, make sure your page is at the top of the search results – this means that, according to the search engine, you are the original source.

Also, after identifying the pages that caused the issue, you should definitely check who took your place on the first page. Pay attention to the way your competitors present their content, such as the word count and the use of headings. It is quite possible that Google finds their content more relevant compared to yours.

Moved Content

Another issue can be a change of landing pages' final URLs. Imagine that your site had a page that brought organic traffic and sat at the top of search results for a long time. For some reason, you decide to move it, for example, to another domain. Google will perceive the moved page as a new one. As a result, the search engine will revoke all the previous advantages the page had, namely its organic traffic and top place in search results. The perfect solution would be to return the page to its old URL. If that is impossible, make sure the old address redirects to the new page and that all internal links point to it.

It is important to remember that after these changes the new page loses about 50% of its advantage and will not appear on the first page of Google search results at first. However, if you have moved the page the proper way, it will take its rightful place back in approximately 2-3 weeks. Without redirection, all backlinks to the old landing page can become useless for the new one, since the search bot will no longer count them when assessing the new landing page.
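A quick way to confirm that moved pages keep their redirects is to request each old URL and check that it answers with a 301 pointing at the new address. Below is a minimal sketch, assuming a hypothetical mapping of old to new URLs.

    import requests

    # Hypothetical mapping of old URLs to their new locations.
    moved_pages = {
        "https://old-domain.example/landing": "https://new-domain.example/landing",
    }

    for old_url, new_url in moved_pages.items():
        response = requests.get(old_url, allow_redirects=False, timeout=10)
        location = response.headers.get("Location", "")
        if response.status_code == 301 and location.rstrip("/") == new_url.rstrip("/"):
            print(f"OK   {old_url} -> {location}")
        else:
            print(f"FIX  {old_url}: status {response.status_code}, Location: {location or 'none'}")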

Off-Page Activities

If the on-page research did not bring results and your traffic is still dropping, it is high time to check how your site and its pages look in the search engine results.

Google uses many external factors to rank pages. We, however, would like to describe the four general ones, which are:

  • Backlinks
  • Brand authority
  • Social signals
  • Relevance to query and user’s intentions

Searching for the reasons behind a traffic drop involves many nuances, which are difficult to cover fully in this section. That being said, let us focus on the main tools and signs of potential issues with a site.

Backlinks

We are starting with this factor for a good reason. However hard search engines try to minimize the influence of inbound links on a site's position in search results, they are still the main external factor in competitive niches. The thing the search engines have really improved, though, is the assessment of referring domains' quality. In 2012, Google launched the Penguin algorithm, which forever changed the strategies of sites' off-page optimization. Thanks to this algorithm, Google can now detect pages and sites that use manipulative tactics to inflate their backlink counts. The purpose of Penguin is to detect:

  • Paid links;
  • Link exchange;
  • Backlinks from irrelevant sites;
  • Backlinks from satellite sites;
  • Participation in backlink schemes;
  • Other manipulations.

If you are involved in one or several of these activities, especially when all pages of your site are losing traffic, there is a high chance Penguin got to you. Also, do not forget that your competitors can apply the same tactics against you. That is why, even if you have never bought backlinks or hardly ever bothered with them, you need to check your site from the off-page optimization perspective anyway. The best tool for this would be Ahrefs. All you need to do is enter the domain or page address of interest in the field and get summary statistics on its positions and external links.

Abrupt fluctuations on the Referring Domains graph indicate a sudden backlink rise or drop over a short period. If you notice a similar situation happening to your site, take it into consideration.

Ahrefs is a paid service, so as a free alternative you can use Google Search Console. This tool allows for in-depth analysis of your backlink profile and lets you check whether any penalties have been imposed on your site. With the help of the Manual Actions section, you can check whether Google has filtered your site. Remember, however, that if you did not find any mention of sanctions imposed on your site, it does not mean there are none: Google informs owners about bans only selectively.

In the Manual Actions section of Google Search Console, you can find the information about the following filters:

  • Manual Action: user-generated spam
  • Manual Action: pure spam
  • Manual Action: thin content with little or no added value
  • Manual Action: unnatural links to your site
  • Manual Action: unnatural links from your site
  • Manual Action: hidden text and/or keyword stuffing
  • Manual Action: unnatural links to site – impact links
  • Manual Action: spammy structured markup

If this section shows messages about manual actions, we recommend submitting a reconsideration request to Google's assessors immediately. After that, try to find the reasons behind the manual actions.

Now let us return to the external links analysis.

The main page of the section shows a list of the domains that link to your site the most, along with their anchors. You can also find the pages of your site that have the most backlinks.

To see and download a full report on data from any section, click on “More>>”. Next, use Screaming Frog to determine which pages from the downloaded report still link to your site. For this purpose, navigate to Configuration -> Custom -> Search and enter your site's domain in any field. Then switch to List mode (Mode -> List) and upload the file with the list of pages linking to your site, or simply copy and paste them. After the program has performed the analysis, go to the Custom tab and search for your domain. The program will create a list of all pages still linking to your domain, that is, all pages with live links.
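If you prefer to script this step instead of using Screaming Frog's List mode, the idea is the same: fetch every referring page from the exported report and check whether it still mentions your domain. Below is a minimal sketch; the domain and the report file name are hypothetical.

    import csv
    import requests

    MY_DOMAIN = "example.com"               # hypothetical: your own domain
    REPORT_FILE = "links_to_your_site.csv"  # hypothetical: report exported from Search Console

    live_links = []
    with open(REPORT_FILE, newline="", encoding="utf-8") as report:
        reader = csv.reader(report)
        next(reader, None)  # skip the header row, if the export has one
        for row in reader:
            referring_page = row[0]
            try:
                html = requests.get(referring_page, timeout=10).text
            except requests.RequestException:
                continue  # unreachable page – no live link to count
            if MY_DOMAIN in html:
                live_links.append(referring_page)

    print(f"{len(live_links)} pages still link to {MY_DOMAIN}")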

Examine all links from the report and list all domains or pages you consider to be of low quality or spammy. After that, you can add these links to the Disavow Tool to tell the search engine not to consider certain links when ranking your site. Following the Google guidelines, create a file and upload it to the Disavow Tool. Do not forget to select the right domain, especially if you own several.
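The disavow file itself is a plain text file: one entry per line, full URLs for individual pages, a domain: prefix to exclude an entire domain, and lines starting with # treated as comments. A short example with hypothetical domains:

    # Spammy domains found during the backlink audit (hypothetical examples)
    domain:cheap-links-farm.example
    domain:irrelevant-directory.example

    # Individual pages we could not get removed
    https://some-blog.example/old-post-with-paid-link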

Of course, you should try to delete as many spammy links as possible by yourself. The disavow tool is just a way to point out the remaining ones – at least, that is how the Search Giant planned it.

Most importantly, do not run amok during link assessment. You should not delete all the links you did not build yourself: the link mass that comes naturally is the most important ranking factor for any site. Only spammy and low-quality links have to be taken out. The simplest way to pick them out is by anchors and domains: select off-topic and “weak” sites as well as links with irrelevant anchors. Try to remove links from such sites and pages, and if you cannot delete them, put the domains linking to you into the disavow file.

If external links were the actual reason behind your website traffic drop and Google has filtered your site, be prepared for a long road to recovery. If you do everything right, your site will recover in 2-3 months, and you will not have to trouble yourself with assessing links all this time. The search engine will give your site's positions in the SERP back only when it understands that you have not only deleted all the poor links but also stopped building new ones of the same kind.

Niche and Competitors

Finally, if none of the above methods helped identify the reason for the traffic drop, it is the right time to check who took your place in the search results. Niche and competitor analysis is a very broad topic. Essentially, you will have to analyze your competitors' domains the same way you analyzed yours and select among them the ones whose traffic grew as yours dropped. Try to focus on figuring out why these changes are happening. From this moment on, the search for the factors behind the traffic decline slowly transforms into the search for ways to promote your site. In other words, this is where the SEO process begins.

Conclusion

Now, whenever you notice changes in your traffic, you are equipped with the basics to deal with them. You can check your website on the several levels we covered above, the most important ones being technical, on-page, and off-page. We described each one in order of severity, since occasional technicalities are far less serious than off-page issues, which imply a lot more responsibility and risk. Then again, a decline in traffic should not cause panic. Jumping straight to off-page is unnecessary if you do not know what exactly caused the drop. Start small: do everything step by step and move to a higher level if the research on the current level did not bring you any results. Sometimes a traffic drop is the result of small issues on all levels, which require a complex approach. And sometimes a decline in traffic can be so challenging that basic knowledge turns out to be insufficient. Cases like these require professional skills and experience. Our team can provide you with all the help needed to recover your site's traffic rate and make it even better. Feel free to contact us.
