May 14, 2018
Your website has to be indexed (added to Google’s web search index) before it shows up in Google’s search results. Indexing happens only after Google’s spiders crawl your website. Spiders (not real spiders, just bits of computer code) crawl through (scan) your site, checking the validity of your links and looking for new and unique content. If Google’s spiders determine that your content is worthy of indexing, the webpage they crawled will be indexed and displayed in future Google searches. If your website contains duplicate or unoriginal content, broken links, structural errors, or renders poorly, spiders will have a difficult, if not impossible, time crawling it and may not index it at all, making it invisible to would-be customers. This is one of the main reasons why the code your website is built on matters, which we will talk about in the next section of this article.

Although it can take months for Google to crawl even the best websites, Google offers a free tool to speed things up. We employed Google Webmaster Tools to ensure Google crawled and indexed our website quickly. Google Webmaster Tools (GWT) is how Google communicates with webmasters (website owners), letting them know about issues or errors found in their websites (which affect Google’s ability to crawl and index them). GWT also lets webmasters submit XML sitemaps (a roadmap to all the important pages in a website) and manually request that newly added webpages be crawled and indexed, which is usually much faster than waiting for Google to do it on its own.

Crawling and indexing shouldn’t happen just once; they should be continual, occurring every time new content is added to your website. Requesting this manually can be tedious, so we added code to our website that does it for us automatically. When this blog post was added to our website, GWT was instantly notified that we added new content, making it easy for spiders to quickly crawl and index it.
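To make the idea of crawling concrete, here is a toy sketch of the two things a spider does with a page: extract its links and check that each one resolves. This is an illustration only (the sample page is invented, and this is nothing like Google's actual spider, which also handles robots.txt, politeness delays, deduplication, and much more):

```python
# Toy sketch of what a crawler ("spider") does: read a page's HTML,
# collect its links, and verify that each link resolves.
import urllib.error
import urllib.request
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links


def link_is_alive(url, timeout=5):
    """Return True if the URL responds with a 2xx/3xx status."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except (urllib.error.URLError, ValueError):
        return False


# An invented page fragment, just to show link extraction in action.
sample = '<p><a href="https://example.com/about">About</a> and <a href="/blog">our blog</a></p>'
print(extract_links(sample))  # ['https://example.com/about', '/blog']
```

A real spider would feed each discovered link back into the queue and keep going, which is why broken links and structural errors can stop it cold.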
After submitting our site map to GWT, manually asking GWT to crawl and index our current pages, and implementing code to automate this for future ones, most of our website’s pages were indexed in less than 24 hours.
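For readers who want a sense of the sitemap side of this, here is a minimal sketch, with made-up URLs, of building an XML sitemap in the sitemaps.org format and pinging the sitemap-submission endpoint Google offered at the time of writing. It illustrates the approach, not the exact code running on our site:

```python
# Build a sitemaps.org-style XML sitemap and notify Google that it
# changed. The domain and pages below are invented for illustration.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET


def build_sitemap(urls):
    """Return sitemap XML (as a string) for a list of (loc, lastmod) pairs."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")


def ping_google(sitemap_url):
    """Tell Google the sitemap changed (the ping endpoint Google supported in 2018)."""
    ping = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(
        sitemap_url, safe=""
    )
    with urllib.request.urlopen(ping) as resp:
        return resp.status


pages = [
    ("https://www.example.com/", "2018-05-14"),
    ("https://www.example.com/blog/seo-basics", "2018-05-14"),
]
print(build_sitemap(pages))
```

Hooking a call like `ping_google` into the publish step of a site is the kind of automation described above: every new post regenerates the sitemap and notifies Google immediately.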
Google’s spiders were not always as smart as they are now. In the past, there were many not-so-ethical ways to manipulate spiders into quickly indexing your page and ranking it high in search results based on nothing more than how frequently certain keywords appeared on your website. Google’s search algorithm today looks nothing like it did years ago, however, and it changes daily to favor ethical websites; cramming your site with keywords may actually result in penalties. Google now rewards websites that create an awesome user experience: ones that render quickly and beautifully on mobile devices and contain helpful, original content. Architecturally sound code is also rewarded, as it directly affects user experience by ensuring pages render quickly and correctly, with no broken links or images that fail to load.

Meta tags (snippets of text hidden in your website’s HTML code that won’t be seen by people visiting your website) still matter in that they provide Google with the summary you would like displayed under your website when people search for it, but they are no longer used by Google to rank your website in its search results. Therefore, any company whose SEO strategy for your website centers on creating great keyword meta tags should be avoided, as this may actually hurt your website and will have no impact on its ranking. The most effective SEO strategy nowadays is to ensure that your website’s content is original, useful, and current; that its code is sound and error-free; that it is mobile-friendly, easy to use, and fast (Google’s algorithm now penalizes slow-loading websites); and that it has good backlinks.
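To show what a meta tag actually looks like, and how a crawler reads one, here is a small sketch. The page snippet is invented; the description tag is the one that supplies the summary shown under a search result:

```python
# Extract the description meta tag from a page's HTML -- the snippet
# Google can display under a result. The sample page is invented.
from html.parser import HTMLParser


class MetaDescriptionParser(HTMLParser):
    """Capture the content of <meta name="description" content="...">."""

    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr = dict(attrs)
            if attr.get("name") == "description":
                self.description = attr.get("content")


page = """<html><head>
<meta name="description" content="Hand-coded websites and ethical SEO.">
<title>Example Co.</title>
</head><body>...</body></html>"""

parser = MetaDescriptionParser()
parser.feed(page)
print(parser.description)  # Hand-coded websites and ethical SEO.
```

Note that the tag lives in the page's head, invisible to visitors, which is exactly why stuffing it with keywords was once tempting and is now pointless.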
Google now uses backlinks (inbound links, created when another website includes your website’s URL on one of its pages) to rank websites, and having a lot of reputable ones can positively impact your website’s authority and position. Getting good backlinks is a marathon, not a sprint, and is something that VirtualZero, as a new company, is still working on. Yes, there are unethical ways to quickly gain large numbers of backlinks, but they always backfire. Google is smart enough to spot shady link-building schemes, as the sites providing those backlinks are usually unrelated to yours and contain an unusually high number of backlinks to other sites as well. Getting caught can result in your website being penalized or banned, so we recommend a steady, ethical approach to continually acquiring reputable backlinks. A great way to do this is to keep creating great, useful content so that websites in your industry will want to link to yours, often citing your website as an authority on the subject they are referencing or a useful place to learn about an industry-related topic. Blogs and social media accounts are great for this, so we immediately started a blog and created all the social media accounts we could think of to begin providing the internet with useful, current content. Having websites backlink to this content will take time, but all of the links will be legitimate, which will ultimately result in our website receiving a high authority ranking.
People rarely talk about this, but it really counts. If you do not provide the content for your website yourself and instead ask the web design firm working on it to create content for you, it’s crucial that you investigate their ability to write effectively. If someone visits your website and has difficulty understanding your company’s message or purpose, they may immediately leave, increasing your bounce rate (the rate at which visitors leave your website after viewing only one page), which decreases your website’s ranking. If they stay and read through your content only to find it’s filled with grammatical errors and passive sentences, their impression of your company’s credibility will quickly dwindle. Inexperienced web developers turned pseudo-writers may even resort to plagiarism (whether intentional or accidental) to create content for your site, increasing the percentage of duplicate content found on your website. Google often penalizes or refuses to index websites with a high percentage of duplicate content. VirtualZero’s team of developers has collectively taken hundreds of college and graduate-level liberal arts classes requiring research papers and theses, with some papers even being published in academic journals. This sets our team apart from others who may have gone straight into computer science. Our team is well-rounded and focused on writing engaging, error-free content. It absolutely matters, and it’s something we’ve leveraged to increase our visibility on the web.