Technical SEO Issues & Best Practices
While the basics of technical SEO, much like the most effective ways to build links to drive search engine rankings, have changed in recent years (and content marketing has become increasingly important), what many people would consider more “traditional SEO” is still incredibly valuable in generating traffic from search engines. As we’ve already discussed, keyword research is still valuable, and technical SEO issues that keep Google and other search engines from understanding and ranking sites’ content are still prevalent.
Technical SEO for larger, more complicated sites is essentially its own discipline, but there are some common mistakes and issues that most websites face, and that even small to mid-sized businesses can benefit from being aware of:
Search engines are placing an increasing emphasis on having fast-loading sites – the good news is that this is beneficial not only for search engines, but also for your users and your site’s conversion rates. Google has actually created a useful tool here to give you specific suggestions on what to change on your site to address page speed issues.
If your site is driving (or could be driving) significant search engine traffic from mobile searches, how “mobile-friendly” your site is will impact your rankings on mobile devices, which is a fast-growing segment. In some niches, mobile traffic already outweighs desktop traffic.
Google recently announced an algorithm update focused on this specifically. You can find out more about how to see what kind of mobile search engine traffic is coming to your site, along with some specific recommendations for things to update, in my recent post, and here again Google offers a very helpful free tool to get recommendations on how to make your site more mobile-friendly.
Header response codes are an important technical SEO issue. If you’re not particularly technical, this can be a complicated topic (and again, more thorough resources are listed below), but you want to make sure that working pages are returning the correct code to search engines (200), and that pages that are not found are returning a code to represent that they are not present (a 404). Getting these codes wrong can indicate to Google and other search engines that a “Page Not Found” page is actually a functioning page, which makes it look like a thin or duplicated page – or even worse: you could indicate to Google that all of your site’s content is actually 404s (so that none of your pages are indexed and eligible to rank). You can use a server header checker to see the status codes that your pages return when search engines crawl them.
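The status-code check described above can be sketched with Python’s standard library alone. The example.com URLs are placeholders – substitute pages from your own site.

```python
from urllib.error import HTTPError
from urllib.request import Request, urlopen

def status_code(url: str) -> int:
    """Return the HTTP status code the server sends for a URL."""
    request = Request(url, method="HEAD")  # HEAD fetches headers only, no body
    try:
        with urlopen(request) as response:
            return response.status
    except HTTPError as error:
        # urllib raises on 4xx/5xx responses, but the code is on the exception
        return error.code

# Working pages should return 200; missing pages should return 404:
#   status_code("https://example.com/")             -> expect 200
#   status_code("https://example.com/no-such-page") -> expect 404
```

If a “missing” page comes back as 200 (a “soft 404”), that is exactly the thin/duplicate-page problem described above.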
Improperly implemented redirects on your site can have a big impact on search results. Whenever you can avoid it, you want to keep from moving your site’s content from one URL to another; in other words: if your content is on example.com/page, and that page is getting search engine traffic, you want to avoid moving all of the content to example.com/different-url/newpage.html, unless there is an extremely strong business reason that would outweigh a possible short-term or even long-term loss in search engine traffic. If you do need to move content, make sure that you implement permanent (or 301) redirects for content that is moving permanently, as temporary (or 302) redirects (which are frequently used by developers) indicate to Google that the move may not be permanent, and that it shouldn’t move all of the link equity and ranking power to the new URL. (Further, changing your URL structure could create broken links, hurting your referral traffic streams and making it difficult for visitors to navigate your site.)
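To make the 301-vs-302 distinction concrete, here is a minimal sketch of a redirecting server using only Python’s standard library; the paths echo the hypothetical example.com URLs above.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Old path -> (new path, status). 301 = permanent (passes link equity);
# 302 = temporary (tells search engines the move may be reversed).
REDIRECTS = {
    "/page": ("/different-url/newpage.html", 301),
    "/sale": ("/summer-sale", 302),
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REDIRECTS:
            target, status = REDIRECTS[self.path]
            self.send_response(status)
            self.send_header("Location", target)
        else:
            self.send_response(404)  # anything unmapped really is gone
        self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging

# To try it locally:
#   HTTPServer(("127.0.0.1", 8080), RedirectHandler).serve_forever()
```

In practice you would configure this in your web server (Apache, nginx) or CMS rather than in application code, but the status codes a crawler sees are the same.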
Thin and duplicated content is another area of emphasis in Google’s recent Panda updates. By duplicating content (putting the same or near-identical content on multiple pages), you’re diluting link equity between two pages rather than concentrating it on one page, giving you less of a chance of ranking for competitive phrases against sites that are consolidating their link equity into a single document. Having large quantities of duplicated content also makes your site look cluttered with lower-quality (and possibly manipulative) content in the eyes of search engines.
There are a number of things that can cause duplicate or thin content. These problems can be difficult to diagnose, but you can look in Webmaster Tools under Search Appearance > HTML Improvements to get a quick diagnosis.
And check out Google’s own breakdown on duplicate content. Many paid SEO tools also offer a means of locating duplicate content, such as Moz Analytics and Screaming Frog SEO Spider.
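As a rough illustration of what such tools do under the hood, here is a sketch that groups pages whose normalized body text is identical. The page contents below are hypothetical; in practice you would crawl your URLs and extract the main text first.

```python
import hashlib
import re
from collections import defaultdict

def fingerprint(text: str) -> str:
    """Hash page text after lowercasing and collapsing whitespace."""
    normalized = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def find_duplicates(pages: dict[str, str]) -> list[list[str]]:
    """Return groups of URLs whose content fingerprints match."""
    groups = defaultdict(list)
    for url, text in pages.items():
        groups[fingerprint(text)].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]

pages = {
    "/blue-widgets": "Blue widgets for sale.",
    "/widgets-blue": "Blue widgets  for sale.",  # same text, stray space
    "/about": "About our company.",
}
print(find_duplicates(pages))  # [['/blue-widgets', '/widgets-blue']]
```

Exact-match hashing only catches verbatim duplicates; commercial tools also flag near-duplicates, but the grouping idea is the same.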
XML sitemaps can help Google and Bing understand your site and find all of its content. Just be sure not to include pages that aren’t useful, and know that submitting a page to a search engine in a sitemap doesn’t ensure that the page will actually rank for anything. There are a number of free tools to generate XML sitemaps.
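A sitemap is simple enough to generate yourself; here is a minimal sketch using Python’s standard library. The URLs are placeholders, and real sitemaps often also carry optional fields like `<lastmod>`.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls: list[str]) -> str:
    """Build a minimal sitemap.xml document from a list of page URLs."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for url in urls:
        page = ET.SubElement(urlset, "url")
        ET.SubElement(page, "loc").text = url
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + ET.tostring(urlset, encoding="unicode"))

sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/blue-widgets",
])
print(sitemap)
```

Save the output as sitemap.xml at your site’s root and reference it from robots.txt or submit it in Webmaster Tools.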
Robots.txt, Meta No Index & Meta No Follow
Finally, you can tell search engines how you want them to handle certain content on your site (for instance, if you’d like them not to crawl a specific section of your site) in a robots.txt file. This file likely already exists for your site at your website’s root (e.g., example.com/robots.txt). You want to make sure this file isn’t currently blocking anything you’d want a search engine to find from being added to its index, and you can also use the robots file to keep things like staging servers, or swaths of thin or duplicate content that are valuable for internal use or customers, from being indexed by search engines. You can use the meta noindex and meta nofollow tags for similar purposes, though each functions differently from the others.
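For reference, a hypothetical robots.txt might look like this; the Disallow paths are placeholders for whatever sections of your own site you want to keep out of search engines.

```
# Applies to all crawlers
User-agent: *
Disallow: /staging/
Disallow: /internal-docs/

# Point crawlers at your sitemap
Sitemap: https://example.com/sitemap.xml
```

On individual pages, `<meta name="robots" content="noindex">` keeps a page out of the index while still letting crawlers follow its links, and `content="nofollow"` tells them not to follow (or pass equity through) the page’s links.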
Here is an excellent checklist of various technical SEO issues your site may be affected by
Gregory Ciotti offers tips to speed up WordPress sites
Richard Baxter offers a variety of tools to help you speed up your site
Several places offer in-depth duplicate content articles, including Moz, Yoast, and Hobo Web
Google gives some tips for creating your XML sitemap, as does Lunamatrix.
Technical SEO can be tough to do on your own, so if you think professional assistance is a worthwhile investment, check out this post on how to find the right SEO services for your small business
How to Track & Measure SEO Results
So once you start writing your awesome SEO content and putting all of these steps into motion, how do you actually track whether, and how well, it’s working?
On its face, this question has a fairly straightforward answer, with some key metrics to focus on, but with each metric there are some key factors to consider as you measure your site’s SEO performance.
Looking at where your site ranks for a list of keywords certainly isn’t a final destination – you can’t pay your staff in rankings, things like personalization in search results have made rankings variable across different locations and thus hard to track, and of course all they indicate is where you show up in search results. Some would even go so far as to declare them dead. But getting a rough idea of where your site ranks for core terms can be a useful leading indicator of your site’s health. This doesn’t mean you should get overly obsessed with rankings for any one term. Remember: your ultimate goal is to drive more relevant traffic that drives more business – if you sell blue widgets, is it more important that you rank for “blue widgets,” or that you execute an SEO strategy that helps you sell more blue widgets in the most cost-efficient way possible? Use rankings as a general day-to-day check-up, not a course-charting KPI.
A number of tools can help you check your rankings. Most offer fairly similar functionality, but features like local or mobile rankings are unique to some of the tools. If you’re a small or local business, or are just getting started with your SEO strategy, I’d recommend picking a free and easy-to-use tool and just keeping an eye on a handful of the core terms you want to track to help you gauge progress.
Organic traffic is a much better leading indicator of the health of your SEO efforts. By looking at the organic traffic to your site, you can get a gauge of the actual volume of visitors coming to your site, and where they’re going.
You can measure your organic traffic easily with most analytics tools – since it’s free and the most widely used, we’ll look at how to get this information in Google Analytics.
For a quick check, you can simply look at your site’s main reporting page and click on “All Sessions” to filter for organic traffic (traffic from search engines, excluding paid search traffic):