Millennials of the world might cite the most important event of 1994 as the birth of pop star Justin Bieber (I did say "might," ok?). But webmasters and coders might view 1994's highlights a bit differently: that year saw the founding of the World Wide Web Consortium (W3C), the organization that sets the Web standards coders still use today to build pages for the online environment. In my opinion, one of these has a slight edge in importance. You can probably figure out which.
What you see in your browser when you view this post is the end result of thousands of lines of HTML code and cascading style sheets (CSS) that determine what gets displayed on the Digital Book World (DBW) website. Without code that follows the standards set by the W3C, your pages might display blank or broken, and a bad link can land users on the dreaded and much-lampooned 404 error page.
Much has been debated about the importance (or lack thereof) of valid coding when it comes to SEO. However, I hope to show you that clean website code can 1) be an SEO factor worth reviewing, 2) influence click-through rates (CTR) and 3) prevent users from bouncing away from your webpages. In any case, what we are looking for in these analyses is a modern, optimized website. And if your code is full of errors compared to the set standards, it is neither modern nor optimized. In my humble opinion.
You should know that most webpages have an error or two when it comes to coding. A quick check of Google's homepage using the W3C code validator shows 32 errors. Our own independent book publisher participants averaged a 2.8 (a "D" grade) in terms of code validation. The highest number of homepage errors was 100+ (Oldcastle Books) and the largest number of book page errors was over 750 (Greystone Books).
The Truth About SEO and Code Validation
Until 2009, many coders, webmasters and early SEO experts believed that proper code validation brought them closer to higher search engine rankings. Then Google guru Matt Cutts set the record straight: code validation has little direct effect on search engine rankings. Cutts and Danny Sullivan of SearchEngineLand revisited the topic in 2011, explaining that search engines are less concerned with validation and more concerned with content. So does that mean you should not worry about code errors?
Not really. In my past posts analyzing the 12 independent book publishers that agreed to participate in this evaluation of SEO Factors, I have made it pretty clear that SEO is more than just ranking on a page. You need to consider:
• The search engine’s ability to crawl as many pages of your website as possible without encountering errors.
• The CTR and bounce rate for pages that do not render well in a user’s browser due to HTML and CSS errors.
Don’t Break Google’s Concentration
As I mentioned in past blog posts, Google indexes more than 50 billion webpages. It accomplishes this through frequent crawls through websites using its “spider” to:
• Maintain that current webpages still exist
• Confirm updated content on these webpages
• Add new webpages to the index
And while the spider, the index and the search engine algorithm are cutting edge, Google still needs help finding the pages to index (and to then present them in the search results). That’s why I have clued you in to a few important suggestions, including the schema.org common vocabulary and the importance of keywords, titles and tags. Search engines need your help to develop a consistent index in order to display your pages in appropriate search results.
Page code errors can hamper a search engine’s path and break its concentration. I’m not saying that a few code issues are going to prevent users from finding your site and your books. What I am saying, though, is that pages with large amounts of errors can:
• Prevent search engine spiders from reading the entire webpage
• Keep important page information out of the index
• Make your important content disappear or become inaccessible in SERPs
To prevent code errors from becoming an SEO factor for your online presence, it is important to validate your code using the W3C tools. This will confirm the number and location of errors on your site. I also recommend going one step further and evaluating your site with site crawl software. Both of these steps will help you figure out which HTML or CSS errors are keeping your pages from being indexed, and point you toward efficient and effective solutions.
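The W3C's checker can report its findings as machine-readable JSON (a "messages" array whose entries carry a "type" field), which makes it easy to triage a long list of issues. Below is a minimal sketch of tallying errors versus warnings from such a report; the sample report here is invented for illustration, not real validator output.

```python
import json

# Illustrative sample shaped like the W3C checker's JSON output:
# a "messages" array whose entries have a "type" field ("error", or
# "info" with "subType": "warning" for warnings). Entries are made up.
SAMPLE_REPORT = json.dumps({
    "messages": [
        {"type": "error", "lastLine": 12,
         "message": 'Element "div" not allowed as child of element "span".'},
        {"type": "info", "subType": "warning", "lastLine": 40,
         "message": "Section lacks heading."},
        {"type": "error", "lastLine": 77,
         "message": 'Duplicate ID "isbn".'},
    ]
})

def tally(report_json):
    """Count errors and warnings in a validator report."""
    messages = json.loads(report_json).get("messages", [])
    errors = sum(1 for m in messages if m.get("type") == "error")
    warnings = sum(1 for m in messages
                   if m.get("type") == "info"
                   and m.get("subType") == "warning")
    return errors, warnings

errors, warnings = tally(SAMPLE_REPORT)
print(f"{errors} errors, {warnings} warnings")
```

A tally like this helps you fix outright errors first and leave cosmetic warnings for later.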
Keep Your Customers on the Page
SEO is more than just page rankings. It also takes into account the millions of people who have their questions answered by search engines and choose the best link(s) available from the SERPs provided. I have a wish for each and every one of our participants and readers, that users will pick your site, choose your books, make a purchase and return with more customers. That being said, page code errors can really prevent this wish from coming true.
Per KissMetrics, 79 percent of shoppers who are dissatisfied with website performance will never come back again. Let me repeat that: they will NEVER come back again. This is where CTR and bounce rates come into play.
• Code errors decrease your CTR: CTR represents the number of times your entry in the SERPs is clicked on compared to the number of times it is shown to a searcher. The higher the CTR, the more people are potentially visiting and purchasing books from your site. Code errors can damage your CTR by:
o Preventing content from being indexed by a search engine spider and showing up in the SERPs.
o Creating a poor user experience on the page (broken links, inaccessibility, and e-commerce errors) and ensuring that users will not return to your site.
• Code errors increase your bounce rate: Your "bounce rate" is the share of visitors who click on your entry in the SERPs, land on your website and then a) leave quickly or b) never explore beyond the page they landed on. One reason they may do that is that code errors are creating a poor website and online purchasing experience for them. And while code errors, as Google states, are not a significant ranking factor, bounce rates very much are.
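To make these two metrics concrete, here is a minimal sketch of how they are computed from raw counts; the numbers are invented for illustration.

```python
def ctr(clicks, impressions):
    """Click-through rate: clicks divided by times shown in the SERPs."""
    return clicks / impressions

def bounce_rate(single_page_sessions, total_sessions):
    """Share of visits that ended after viewing only one page."""
    return single_page_sessions / total_sessions

# Hypothetical month for a book page: shown 5,000 times in the SERPs,
# clicked 150 times, and 90 of those 150 visits left after one page.
print(f"CTR: {ctr(150, 5_000):.1%}")               # 3.0%
print(f"Bounce rate: {bounce_rate(90, 150):.1%}")  # 60.0%
```

Tracking these two numbers over time tells you whether fixes to broken pages are actually paying off.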
Some great ways to prevent high bounce rates and poor CTR include:
• Validating your code and auditing your pages with site crawl software.
• Testing your website in multiple browsers to make sure that it displays correctly for all customers, including mobile users. At a minimum, you should understand which browsers and versions your users are using most frequently. This can be found in your analytics.
• Revisiting your pages frequently to confirm that the information you present is current, correct and easy for users to read/skim.
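On the browser-testing point above: your analytics can tell you which browsers to prioritize. The sketch below assumes you have exported per-session browser data from your analytics tool (the rows here are hypothetical) and simply ranks browser/version pairs by session count.

```python
from collections import Counter

# Hypothetical rows exported from analytics: (browser, version) per session.
sessions = [
    ("Chrome", "120"), ("Chrome", "119"), ("Safari", "17"),
    ("Chrome", "120"), ("Firefox", "121"), ("Safari", "17"),
    ("Chrome", "120"), ("Mobile Safari", "17"),
]

# Rank browser/version pairs by session count to pick testing targets.
for (browser, version), count in Counter(sessions).most_common(3):
    print(f"{browser} {version}: {count} sessions")
```

Testing first in the top two or three combinations covers most of your real visitors with the least effort.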
Fixing Code Errors Is Valid
With all the commentary surrounding code validation, you might think that pages with multiple HTML and CSS errors are no big deal. And up to a point, you are right. Google and other search engines are willing to overlook minor errors in order to get your pages out to your customers via SERPs. Still, as I mentioned, errors that lead to broken pages can damage your online reputation in more ways than one. Professional-looking pages are the stepping stones to accessible information on the Web, and that starts with coding that matches W3C standards.
Next up? Let’s talk about moving your website into the mobile environment for customers on the go. Mobile-friendly websites are an SEO Factor for the future.
How do your pages validate? Let me know in the comments below.