All SEO updates for each CMS platform and website builder depend on the search engine giant. If Google rolls out a new algorithm update, it can undo months’ worth of optimization work. Search optimizers are always on the watch for the latest updates from the tech giant. There are at least a million articles and blogs online that harp on about Google’s SEO updates and their consequences.
Sustainability and SEO:
Now, the truth is, sometimes Google rolls out updates for an entirely different reason. They turn out to affect SEO massively, but that was never the initial intention. We know, that seems quite hard to believe. However, that is what happened after the “Fred” update on March 8, 2017. Although still unconfirmed, it has had a widespread effect on website SEO, e-commerce SEO, and social media’s take on SEO. SEO professionals are going crazy trying to figure out exactly which of the roughly 200 factors governing Google’s rankings has changed. It has been almost six months since the launch, and the best of the best have barely scratched the surface of the mystery.
So, let us leave the change and its cause alone for a while. All the new updates and patches have been making us feel rather shaky. Let us focus instead on the permanent principles of SEO. Yes! There are some things about SEO that simply do not change.
What are these “permanent” Principles?
Bots will get to the scene first: this has been true since time immemorial. Google cannot rank your site until its bots read what your site is about. Now that RankBrain is here, Google’s automated bots crawl websites and index their content with more context than ever. If you have strong content and good keyword use, you can sit back and relax. RankBrain acts like a human reader: it finds the “interesting” parts, phrases, and words, and even understands nuanced sentences and synonyms. Therefore, gaining high ranks by playing an honest game is much easier now that AI is in the picture.
From the technical vantage point, you need to make sure the bots can see your site. Your site needs to be fast, needs clean code, and needs a clear organizational hierarchy of information. Technical SEO is not going to take a backseat anytime soon just because AI and fancy updates are coming in. You will still need to work on your website’s visibility using plugins and extensions.
Bots will want to feel welcome: your site structure determines how the bots “feel” about your site. Pay extra attention when naming your posts and writing their descriptions. Structured markup also matters for ranking. The bots should see your site as a clear road map, so you will still need an organizational structure and an XML or HTML sitemap. Once you have the right sitemap, submit it to the search engine’s console (such as Google Search Console) for ideal crawling and ranking.
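The XML sitemap mentioned above is just a small file listing your pages. Here is a minimal sketch following the sitemaps.org protocol; the domain, paths, and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2017-09-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/blog/seo-basics</loc>
    <lastmod>2017-08-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Save a file like this at the root of your site (typically as /sitemap.xml) and submit its URL in your search console so the bots can find every page in one place.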
Bots will still be looking for quality content: you cannot replace good-quality content with re-spun content simply because you have a great collection of plugins. Bots are becoming scarily human. They will “read” through your content, check your keyword density, look for any duplication or plagiarism online, and then judge its quality.
What is “great” content as of the latest update?
According to Fred, great content is long-form content, ideally between 1,200 and 1,800 words. Keyword density should stay between 1% and 4%, and no more. Any heavier use of the keyword or its synonyms counts as “keyword stuffing.” The content itself has to be interesting and informative. Fred has no patience for fluff: if you say the same thing through paraphrasing, Fred will know! Google has been penalizing websites that use duplicate content, or old content they have rewritten somewhat amateurishly.
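Keyword density is simply the keyword’s share of the total word count. A quick sketch of how you might check your own draft against the 1%–4% guideline (the sample text is made up):

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of total words, as a percentage."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# 3 occurrences of "seo" out of 10 words = 30% -- far above the
# 1%-4% guideline, so this draft would read as keyword stuffing.
sample = "SEO tips: good SEO content avoids stuffing the SEO keyword."
print(round(keyword_density(sample, "seo"), 1))  # → 30.0
```

Note that this counts only exact matches; synonyms and close variants, which Fred reportedly also weighs, would need a more sophisticated check.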
To date, Google has never said what they consider “good” content. Matt Cutts has said a few words about it, though. Content was always the focus of his SEO work, even when he was at Google. His principle has been to produce original content and use reliable crawlers to check for mistakes, broken links, and more before Google gets to them.
Great content also includes great images and videos. Not all website owners and admins know this, but even media needs optimization. Google’s bots cannot “read” images and videos directly. If you are not including descriptions, alt tags, and titles with your images and videos, it is no surprise they do not show up in Google image searches. If you want to keep your on-site media visible, optimize your images and videos: keep images small but high quality, include thumbnail views for your videos, and add keywords to their meta descriptions and titles. That is the only way Google’s bots can “read” what the media are about.
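In HTML terms, the descriptions, alt tags, titles, and thumbnails above look something like this; the file names and descriptive text are invented for illustration:

```html
<!-- A descriptive alt attribute and title let crawlers "read" the image -->
<img src="/images/blue-suede-shoes-small.jpg"
     alt="Pair of blue suede men's shoes, side view"
     title="Blue suede shoes"
     width="400" height="300">

<!-- A video with a thumbnail (poster) and a descriptive title -->
<video src="/videos/shoe-care-guide.mp4"
       poster="/images/shoe-care-thumbnail.jpg"
       title="How to care for suede shoes"
       controls>
</video>
```

The alt text doubles as an accessibility aid for screen readers, so writing it for humans rather than for keyword counts serves both goals at once.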
Build trust: if you run a new site, it is challenging for both Google and your new visitors to trust you. That is why, even though a new site may offer something at a steal, Amazon and eBay remain the top seller sites in the USA. It is all about trustworthiness. Right now, that reliability is signaled through links. Link building takes time; we are talking about high-quality link building, not the kind of link farming second-rate sites engage in. Google actively penalizes sites that build links from questionable sources. Building your link authority is the only way to earn trust, both from your users and from Google.
No matter what you are trying to sell or which update you are trying to tackle, spam and black-hat tactics are not the way to crawl up the ranks. Google can quietly drop your site from the first page for a few days if it detects duplicate content, even without saying so! So get up, get your hands a little dirty, and work on building links and gathering great content that will help you in the long run.