Too many webmasters believe that optimization is a set of actions taken after the website has been built, whether that is tuning specific parameters of individual pages to increase the chances of getting top rankings for specific keywords, or acquiring backlinks from authoritative sources to improve off-page optimization.
However, ignoring the important role your website's code can play in overall optimization is akin to building a house on an unstable foundation. Consider the following tips, which show how coding can help search engine optimization, so you do not miss the chance to take advantage of these important benefits.
Tip number 1 - Check that your code is accessible to crawlers
Remember that search engine crawlers have some serious limitations when it comes to crawling and indexing sites. They can really only read text effectively; other elements of your site, including images, audio and video files, and scripts, may prevent the crawler from reading the information on the site properly.
To see your site exactly as a search robot sees it, use the "search robot simulator" tool from Webconfs. If you notice that part of the text on your pages is missing, check your code for errors and correct them so that search engine spiders can read all the information they need.
Tip number 2 - Build search-engine-friendly URLs
Creating search-engine-friendly, human-readable URLs is useful both from an optimization point of view and in terms of convenience for the user. The time and effort needed to minimize the number of extra characters in your URLs depends on the platform your site runs on. If you use WordPress, Joomla, or another powerful content management system, you will need to go to plug-ins or the control panel settings to make the desired changes. In other cases, especially with open-source e-commerce platforms, you will have to edit your permalink structure manually in the .htaccess file.
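As a rough sketch only, assuming an Apache server with mod_rewrite enabled, a clean URL can be mapped back to the real script; the file and parameter names (product.php, slug) here are purely hypothetical:

    # Map a clean URL such as /products/blue-widget to the real script
    # product.php?slug=blue-widget (illustrative names only).
    RewriteEngine On
    RewriteRule ^products/([a-z0-9-]+)/?$ product.php?slug=$1 [L,QSA]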
Tip number 3 - Clean up your code to help speed up your site
Even if your site's code was clean at first, as the site accumulates modifications over time it is likely to develop errors that can slow it down. So it makes sense to carry out regular checks and pay attention to these points:
· Unnecessary whitespace - remove it while keeping the code readable
· Use an HTML validator to fix broken and mismatched tags
· Use a broken-link checker to remove invalid URLs.
Tip number 4 - Provide a text version of non-text content
As stated in the first tip, search engines usually cannot access information that is located inside video, image, or script files. However, since these elements greatly improve the usability of a site, removing them completely is not a good idea.
Instead, the best coding solution is to provide an alternative, text version of the information that you think should be indexed by search engines. For example, if you embed Flash files, include the SWFObject2 library, which automatically leaves alternative text content in place when the page is visited by a search robot (or any visitor) that cannot handle these file types.
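A minimal sketch of how this might look with SWFObject 2; the file names (swfobject.js, movie.swf) and the placeholder text are illustrative assumptions:

    <script src="swfobject.js"></script>
    <div id="movie">
      <p>Text description of the movie that crawlers and visitors without Flash can read.</p>
    </div>
    <script>
      // Replaces the div with the Flash movie only when Flash 9 or later is available;
      // otherwise the text above stays in place and can be indexed.
      swfobject.embedSWF("movie.swf", "movie", "640", "480", "9.0.0");
    </script>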
Tip number 5 - Prevent indexing of unwanted pages with robots.txt and noindex
Since there is no way to control crawler behavior with 100 percent accuracy, explicitly telling search engines not to index certain pages of your site can be useful from an optimization point of view. Block those pages with Disallow rules in your robots.txt file, or add a "noindex" robots meta tag to every page that should not appear in search results (a short sketch follows the list), including:
· Shopping cart and order confirmation pages
· Member account pages
· Archive pages
· Contact pages
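As a sketch, with the directory names /cart/, /members/ and /archive/ chosen purely for illustration, the robots.txt rules might look like this:

    User-agent: *
    Disallow: /cart/
    Disallow: /members/
    Disallow: /archive/

Alternatively, an individual page can carry a robots meta tag such as <meta name="robots" content="noindex, follow"> in its <head> section.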
Tip number 6 - Use rel="canonical" to solve the problem of duplicate content
If you are using engines such as WordPress, Magento, or Joomla to build a website, you probably have a duplicate content problem caused by the specific ways these engines create URLs. Every time you create a new post on your site, these systems often generate several variant addresses automatically.
Since all these different URLs lead to the same page, you risk falling under search engine sanctions for duplicate content if you do not specify how the robot should deal with each of these pages.
The best way to tell the search robot how to handle your URLs is to use the rel="canonical" tag. Add it to the <head> section manually or with a plug-in, thereby telling the spider which URL it should treat as the canonical version of the page and which duplicates it should fold into it.
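A minimal sketch, using example.com as a placeholder domain; each duplicate URL variant would carry the same tag pointing to the preferred address:

    <link rel="canonical" href="http://www.example.com/blog/my-post/">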
Tip number 7 - Set up 301 redirects to distribute PageRank properly
When it comes to setting up correct 301 redirects, there are two things you should consider in terms of search engine optimization. First, use the redirect to tell search engines that both versions of the site address (with www and without www) are the same.
Second, if you ever move content within your website (for example, if you change the title and permalink of a blog article), create a 301 redirect to tell the search robot about the change. This is likely to minimize the loss of PageRank that can occur when backlinks no longer point to the page that holds the content.
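A hedged .htaccess sketch covering both cases, again assuming Apache with mod_rewrite and using a placeholder domain and paths:

    # Send the non-www version of the domain to the www version.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
    # Permanently redirect an article whose permalink has changed.
    Redirect 301 /old-post-title/ http://www.example.com/new-post-title/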
Tip number 8 - Use microdata to create rich snippets
One of the latest additions to the SEO developer's toolkit is microdata - a new markup vocabulary that lets you add extra layers of specific information to the HTML tags of your website. These items not only help your site be indexed and ranked correctly, they can also increase click-through rates from the search results page by producing enhanced listings in the SERP, the so-called "rich snippets".
Since there are suggestions that overall SERP click-through rate is a factor in Google's ranking, adding these new features can both help search engine optimization and bring additional traffic from search.
To find out more about rich snippets and how to create them using microdata, visit Schema.org.
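A minimal sketch of schema.org microdata for a product rating; the product name and the numbers are invented for illustration:

    <div itemscope itemtype="http://schema.org/Product">
      <span itemprop="name">Blue Widget</span>
      <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
        Rated <span itemprop="ratingValue">4.5</span>/5
        based on <span itemprop="reviewCount">27</span> reviews
      </div>
    </div>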
Tip number 9 - Combine script files to reduce load time
Recently, loading speed has become particularly important as a ranking factor, thanks to Google's announcement that it prefers to show fast websites in its results. Unfortunately, if you have built your site with a ton of different scripts to provide extra features for its visitors, loading all of these separate files over and over can significantly slow down your website. By combining these separate pieces of code into a smaller number of files, you minimize the load time caused by an excessive number of script requests and improve the overall search engine optimization of the site.
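A sketch of the idea; the file names are illustrative, and the concatenation itself would be done with whatever build step or minifier you prefer:

    <!-- Before: every script is a separate HTTP request. -->
    <script src="menu.js"></script>
    <script src="slider.js"></script>
    <script src="gallery.js"></script>

    <!-- After: the same code concatenated into a single file, served with one request. -->
    <script src="site.min.js"></script>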
Tip number 10 - Use a CDN to minimize the resource consumption of your server
Finally, if you have done everything in your power to improve and streamline your website's code but still cannot achieve a tangible improvement in loading speed, try a Content Delivery Network (CDN) to serve your content from external servers. This will help minimize the total amount of resources required to run your site.