Every day, SEO specialists are called upon to solve website optimization problems. To help with these tasks, a number of tools exist that can support analysis and guide decisions about the actions needed to improve a site's ranking and increase traffic from search engines.
Here, then, is a list of common problems webmasters face, along with some tools on the web that can help with them.
XML Sitemap Generator
The problem of generating an XML sitemap:
When your website contains thousands of pages, generating an XML sitemap to submit to search engines becomes a real problem, and doing it by hand is very time consuming. You need a tool for the job.
Tools available for generating an XML sitemap:
GsiteCrawler | Xenu | Google Sitemap Generator
There are several tools for the job. First, most website crawlers offer XML sitemap export, so feel free to use them. Among these crawlers you will find Xenu and GsiteCrawler (keep in mind that these tools can do much more than generate an XML sitemap).
Google Sitemap Generator is a tool that comes straight from the Google labs and makes it easy to generate an XML sitemap.
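If you are curious what these tools produce, here is a minimal sketch of a sitemap builder using only Python's standard library. The URL list is a hypothetical example; in practice the URLs would come from a crawl of your site (for example by Xenu or GsiteCrawler).

```python
# Minimal XML sitemap builder (illustrative sketch, not a crawler).
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap XML string for the given list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        loc = ET.SubElement(entry, "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    # Hypothetical page list standing in for a real crawl result.
    pages = ["https://example.com/", "https://example.com/about"]
    print(build_sitemap(pages))
```

A dedicated tool is still preferable for large sites, since it also discovers the URLs for you and can set priorities and change frequencies per page.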
Track the Viral Spread of your Posts on Social Networks
The problem of tracking posts:
The blogging community has a hard time predicting which posts will take off and which will go unnoticed. The quality of a post certainly counts for something, but in some cases, even when an article is high quality, it is not picked up by the various social networks such as Twitter, Facebook, Digg, or Delicious. What is needed is a tool that collects data about your posts over time and lets you see which kinds of posts perform, when, and through which channels.
Tool to solve the problem of tracking blog posts:
This tool sends you regular reports (the frequency is configurable) on the performance of your posts and articles on social networks (with figures drawn from Digg, Twitter, Facebook, and others).
Compare Traffic from Multiple Websites
In our work as SEOs, it is very important to look at the competition in order to properly gauge our SEO results and compare our traffic against that of our major competitors. Free tools like Compete.com and Alexa have serious accuracy problems. As for paid tools, they remain very expensive; some are user-centric and not very relevant when it comes to estimating the traffic of a site with fewer than a thousand hits per month.
The tools available to compare traffic between competitors:
Quantcast | Google Trends for Websites
Google Trends is one of the tools that gives a traffic estimate and lets you compare multiple sites. It is a good indicator when you want to size up the competition. The figures are sometimes odd, but in most cases they reflect the real trends. You can also now get more detail on a site's traffic and audience using Google Ad Planner.
See a Page as a Robot
Developing websites can quickly turn into a nightmare when you realize that what the user sees is not identical to what is served to search engines. It is therefore very important to check the search-engine version of the site and make sure it is exactly identical to what visitors see, at the risk of being heavily penalized if you have resorted to techniques the engines frown upon (cloaking by IP or user agent).
Tools to solve the problem:
This tool is very powerful and gives you, at a glance, the search engine's view of your site.
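One rough do-it-yourself check along the same lines: fetch the same URL twice, once with a normal browser User-Agent and once pretending to be Googlebot, and compare the response sizes. A large difference hints at cloaking. This is a sketch using only the standard library; the comparison heuristic and the size threshold are my own assumptions, not part of any official tool.

```python
# Compare what a browser and what "Googlebot" receive for the same URL.
import urllib.request

GOOGLEBOT = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch_as(url, user_agent):
    """Download a page while sending the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()

def size_difference_ratio(a, b):
    """Size difference between two responses, as a fraction of the larger one."""
    larger = max(len(a), len(b)) or 1
    return abs(len(a) - len(b)) / larger

def looks_cloaked(url, threshold=0.2):
    """Crude heuristic: flag the page if the two versions differ a lot in size."""
    bot = fetch_as(url, GOOGLEBOT)
    human = fetch_as(url, BROWSER)
    return size_difference_ratio(bot, human) > threshold
```

This only catches gross differences; a proper tool also renders the page and compares the actual content, not just its size.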
Identify the Crawl Errors
The problem of crawl errors:
To make search engine crawling effective, you must remove everything that stands in the way of good SEO: 302 redirects (instead of 301s), 4xx and 5xx errors, missing titles, duplicate pages, and so on. These errors cannot be detected manually, especially if the site has several thousand pages. Fortunately, there are tools that will help us discover them.
Tools to identify crawl errors:
GSiteCrawler | Xenu | Google Webmaster
Xenu is truly a do-it-all crawler! Once you know how to use it, you can extract a wealth of information from it, especially about the structure of the site. Use it without moderation!
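The kinds of checks such crawlers automate can be sketched in a few lines. Here is a hypothetical illustration of two of them: classifying HTTP status codes and flagging pages with a missing title. A real crawler would of course fetch the pages itself; the inputs here are invented.

```python
# Two of the per-page checks a crawler like Xenu automates (sketch).
import re

def classify_status(code):
    """Map an HTTP status code to the SEO issue it represents, if any."""
    if code == 302:
        return "temporary redirect (consider a 301)"
    if 400 <= code < 500:
        return "client error (broken link)"
    if 500 <= code < 600:
        return "server error"
    return "ok"

def missing_title(html):
    """True if the page has no <title> tag or an empty one."""
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    return match is None or not match.group(1).strip()
```

Run over a few thousand pages, checks like these surface exactly the 302s, 4xx/5xx errors, and missing titles mentioned above.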
Determine How Fast you Lose your Backlinks
The problem of backlinks:
According to an SEOmoz study, 75% of the web's links are lost every 6 months! This means that your link-building campaigns older than 6 months may be totally wasted. There is a handy tool that calculates an atrophy rate for your links: the higher this rate, the more volatile your links are, and the more your rankings will suffer if your position is based on backlinks.
Tool for calculating the backlink atrophy rate:
Virante’s Link Atrophy Diagnosis
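The underlying calculation is simple enough to sketch. Assuming (my interpretation, not Virante's published method) that the atrophy rate is the fraction of previously known backlinks that have since disappeared:

```python
# Link atrophy rate: share of old backlinks no longer found today (sketch).
def atrophy_rate(old_links, current_links):
    """Fraction of `old_links` that are absent from `current_links`."""
    old = set(old_links)
    if not old:
        return 0.0
    lost = old - set(current_links)
    return len(lost) / len(old)
```

For example, if two of the four backlinks you had six months ago are gone, the rate is 0.5, which in light of the figure above would be about average for the web.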
Find 404 Errors without Webmaster Tools and Create 301s
The problem of 404 errors:
In some cases, Webmaster Tools can report thousands of 404 pages, but very few of them are actually problematic. There is an analysis tool that finds the external pages linking to your site and identifies which of them link heavily to 404 pages. You can immediately see the benefit: it lets you identify which pages to redirect first using 301 redirects.
Tool to solve the problem:
Virante’s PageRank Recovery Tool
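The prioritization idea behind this kind of tool can be sketched as follows: given the pages returning 404 and a count of external links pointing at each, redirect the most-linked pages first. The data here is hypothetical; the real tool gathers the link counts itself.

```python
# Rank 404 pages by how many external links point at them (sketch).
def redirect_priority(not_found_pages, inbound_link_counts):
    """Sort 404 URLs by inbound link count, most-linked first."""
    return sorted(not_found_pages,
                  key=lambda url: inbound_link_counts.get(url, 0),
                  reverse=True)
```

Pages at the top of this list are the ones whose 301 redirects will recover the most link value; pages nobody links to can safely be left as 404s.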
Follow the Evolution and Trends of Keywords Over Time
It is very important to follow the traffic trend of the keywords you want to rank for. These tools let you estimate the potential traffic of a keyword by analyzing its trend over time or in the region (or location) you are targeting.
Tools to follow trends over time or by geolocation:
Google Insights | Trendistic
Determine the Semantic Proximity of an Expression
Search engines use many semantic signals around a page to determine its relevance for a given query. They perform a kind of semantic analysis of a query to determine which other expressions can be related to it. It is therefore very interesting to find the semantic variations of an expression and add them to your landing page to enrich it semantically; your page is then found for these related queries as well.
Tools to find related words:
Google External Tool
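A toy sketch of the co-occurrence idea behind semantic proximity: terms that frequently appear in the same documents as your query term are candidates for enriching a landing page. The tiny corpus and the whitespace tokenization here are invented for illustration; real engines use far richer models.

```python
# Find terms that co-occur with a query term across documents (toy sketch).
from collections import Counter

def related_terms(term, documents, top=3):
    """Count words co-occurring with `term`, return the `top` most frequent."""
    counts = Counter()
    for doc in documents:
        words = doc.lower().split()
        if term in words:
            counts.update(w for w in words if w != term)
    return [w for w, _ in counts.most_common(top)]
```

A dedicated keyword tool does the same job against its own query logs, which is why its suggestions are far more reliable than anything you can compute from a small sample.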
Do you have other SEO problems? Let's discuss them here.