Today’s tip is about correctly diagnosing your website problems.
Have you noticed a drop in traffic to your website? Before you jump to conclusions, take the time to diagnose the problem correctly. Scary as it sounds, you'll have to approach your website the way a doctor approaches a patient. Granted, there's a reason you're interested in SEO and not in medicine, but you've got to make a correct diagnosis before you can begin treatment. Keep in mind that a traffic problem is not always a ranking problem: a drop in traffic isn't necessarily the result of an issue with search rank at all. Explore every possibility, including crawling issues. Maybe your site is being crawled but not indexed, or the problem could even lie in how your content is extracted. It's up to you to figure it out. How? Start here:
* Develop an infrastructure to diagnose your website dilemmas. Start with ranking report benchmarks so you can get a better sense of where you stand for some of your top queries. Once you know how you compare with your competitors for those queries in the search engine results pages (SERPs), you'll be able to stay up to date on any significant ranking movement.
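The benchmark idea above can be sketched in a few lines: record a baseline position for each query, then flag any query whose position has moved by more than a chosen threshold. The queries, positions, and threshold below are hypothetical placeholders; in practice you would pull the numbers from whatever rank-tracking tool you use.

```python
# Minimal sketch: compare current rankings against a saved benchmark
# and flag significant movement. Queries and positions are made up.

# Benchmark rankings recorded earlier (query -> position in the SERPs)
benchmark = {"blue widgets": 4, "widget repair": 9, "buy widgets online": 15}

# Rankings from the latest check
current = {"blue widgets": 6, "widget repair": 3, "buy widgets online": 31}

def ranking_movement(benchmark, current, threshold=5):
    """Return queries whose position changed by at least `threshold`."""
    alerts = {}
    for query, old_pos in benchmark.items():
        new_pos = current.get(query)
        if new_pos is None:
            continue  # query not tracked in the latest check
        change = new_pos - old_pos  # positive = dropped in the rankings
        if abs(change) >= threshold:
            alerts[query] = change
    return alerts

print(ranking_movement(benchmark, current))
# {'widget repair': -6, 'buy widgets online': 16}
```

Rerun the comparison on a schedule and you have the "significant ranking movement" alert described above, without waiting for a traffic drop to tell you something changed.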
* Organize your pages into categories. Look at your server logs for search engine bot activity on a per-category basis. Once you implement this, you'll better understand how well the bots are spidering and indexing your content. You might see that certain categories are being actively crawled by the search bots while others are getting very little attention. Different category pages might also be crawled at different rates, which is important to remember; rates can range from 10 to 100 pages per day. Seeing how many pages the crawlers pick up from each category will tell you a lot about how long it takes the bots to get through your whole site.
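The per-category log analysis above can be sketched as a small script: filter log lines to those from a search bot, take the first path segment of each request as the category, and count hits. The sample log lines and the `/products/`, `/blog/` category layout are hypothetical; adapt the regex and bot token to your own access-log format.

```python
import re
from collections import Counter

# Minimal sketch: count search-bot requests per top-level URL category.
# Sample lines use the common access-log layout; yours may differ.
SAMPLE_LOG = """\
66.249.66.1 - - [10/May/2025:06:25:01 +0000] "GET /products/blue-widget HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [10/May/2025:06:25:05 +0000] "GET /blog/widget-care HTTP/1.1" 200 734 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.9 - - [10/May/2025:06:26:12 +0000] "GET /products/red-widget HTTP/1.1" 200 498 "-" "Mozilla/5.0 (Windows NT 10.0)"
66.249.66.1 - - [10/May/2025:06:27:44 +0000] "GET /products/green-widget HTTP/1.1" 200 505 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
"""

# Capture the first path segment of the requested URL (the "category")
REQUEST_RE = re.compile(r'"GET (/[^/\s"]+)[^"]*"')

def bot_hits_by_category(log_text, bot_token="Googlebot"):
    """Count bot requests per top-level URL category."""
    counts = Counter()
    for line in log_text.splitlines():
        if bot_token not in line:
            continue  # skip ordinary visitors
        match = REQUEST_RE.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts

print(bot_hits_by_category(SAMPLE_LOG))
# Counter({'/products': 2, '/blog': 1})
```

Run this over a day's worth of logs for each category and you get exactly the crawl-rate picture described above: which sections the bots visit heavily and which they barely touch. (Note that user-agent strings can be spoofed; for anything beyond a rough picture, verify bot IPs as well.)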