Search Engine Optimization is an ongoing process: you have to keep refining your optimization strategy to keep your site working well and appearing at the desired position in search results. We make a lot of changes to our sites, from switching themes to adding new plugins, from updating plugins to refreshing old content. All of these changes ultimately affect your on-page SEO health, which is reflected in your site’s performance and rankings.
To make sure your site always performs at its best, you should carry out on-page optimization at regular intervals. Link building is the one part of SEO that is a steady, ongoing process; almost everything else on your site changes frequently. That’s why you should revisit the optimization process often, and carefully, to keep the site’s performance at its peak.
Site updates, theme updates/changes, plugin updates, adding a new plugin/functionality, and other changes can cause some accidental errors that could lead to on-page SEO issues. Unless you proactively look for these errors, they will go unnoticed and will negatively influence your rankings.
Keeping the importance of SEO in mind, here are the important checks you need to run on a weekly basis to ensure that your on-page SEO stays on point.
Check Robots.txt to see if you are blocking important resources
It is very important to check your Robots.txt file regularly. It is easy to unknowingly block important resources so that search engines cannot crawl them. In a CMS like WordPress or Drupal this is especially easy to do, because these platforms ship with many default settings.
For example, blocking out the wp-content folder in your Robots.txt would mean blocking out images from getting crawled. If the Google bots cannot access the images on your site, your potential to rank higher because of these images reduces. Similarly, your images will not be accessible through Google Image Search, further reducing your organic traffic.
To check your Robots.txt file, go to Google Search Console -> Google Index -> Blocked Resources.
If any of these resources are blocked, you can unblock them by adjusting the corresponding rules in your Robots.txt file. For example, suppose you are accidentally disallowing your theme and uploads folders: adding matching Allow directives (or simply removing the Disallow rules) makes those resources crawlable again.
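As a sketch, the before-and-after might look like this. The wp-content paths are hypothetical examples; your actual blocked resources will differ and should come from the Blocked Resources report.

```
# Before (hypothetical): these rules block theme assets and uploaded images
User-agent: *
Disallow: /wp-content/themes/
Disallow: /wp-content/uploads/

# After: Allow directives make the same resources crawlable again
User-agent: *
Allow: /wp-content/themes/
Allow: /wp-content/uploads/
```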
Once these rules are in place, crawlers will be able to reach your site’s important content. To verify that the resources are now crawlable, go to Crawl -> Robots.txt Tester in Google Search Console, enter the URL, and hit the Test button.
Check your site for broken links
Links are among the most important factors in SEO, and they break surprisingly often, so you should check them on a periodic basis. Broken links damage site performance badly. Internal broken links are easy to fix with the help of a broken link checker, but broken external links are much harder to manage.
That is because you have no control over the pages you link out to: a page you linked to may simply no longer exist. Such links return a 404 (Page Not Found) error, which affects your site’s ranking in a negative manner.
If you are using WordPress, you can also use a plugin like Broken Link Checker, which finds broken links across your site and lets you fix them in place.
Another way to check for broken links is through the Google Search Console. Log in and go to Crawl > Crawl Errors and check for “404” and “not found” errors under the URL Errors section.
If you do find 404 URLs, click on the URL and then go to the Linked From tab to see which page(s) contain this broken URL.
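If you want to script this check yourself instead of relying on the Search Console or a plugin, a minimal sketch in Python using only the standard library might look like the following. The function names (`extract_links`, `link_status`) are illustrative, not part of any library:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page URL
                    self.links.append(urljoin(self.base_url, value))

def extract_links(html, base_url):
    """Return all link targets found in an HTML document."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links

def link_status(url, timeout=10):
    """Return the HTTP status code for a URL; 404 means a broken link."""
    try:
        req = Request(url, method="HEAD", headers={"User-Agent": "link-check"})
        return urlopen(req, timeout=timeout).status
    except HTTPError as err:
        return err.code
    except URLError:
        return None  # DNS failure, timeout, etc.
```

You would fetch a page, run `extract_links` on its HTML, and report every URL for which `link_status` returns 404.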
Regularly check your indexed pages
It is very important to check that all your pages are properly indexed by search engines. To do this, simply type “site:sitename.com” into Google and hit Enter. The results show which of your pages are indexed, so you can confirm that all your quality pages are included.
By roughly scanning through these results, you should be able to check if all pages indexed are of good quality or if there are some low-value pages present.
Quick Tip: If your site has a lot of pages, change the Google Search settings to display 100 results at a time. This way you can easily scan through all results quickly.
An example of a low-value page would be the ‘search result’ page. You might have a search box on your site, and there is a possibility that all search result pages are being crawled and indexed. All these pages contain nothing but links and hence are of little to no value. It is best to keep these pages from getting indexed.
You can easily exclude such pages from being indexed by disallowing them in Robots.txt, or by using the Robots meta tag. You can also block certain URL parameters from getting crawled using the Google Search Console by going to Crawl > URL Parameters.
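As an illustration, assuming a WordPress site whose internal search results use the ?s= query parameter (other platforms use different URL patterns), the Robots.txt rules might look like:

```
# Keep crawlers out of internal search result pages
User-agent: *
Disallow: /?s=
Disallow: /search/
```

Alternatively, the Robots meta tag approach places this in the head of each search result page, which lets Google crawl the page but keeps it out of the index:

```html
<meta name="robots" content="noindex, follow">
```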
Check the HTML Source Code to ensure everything is right
It’s one thing to use SEO plugins to optimize your site, and it’s another thing to ensure they are working properly. The HTML source is the best way to ensure that all of your SEO-based meta tags are being added to the right pages. It’s also the best way to check for errors that need to be fixed.
Check the following things to make sure that everything is working fine.
- Check to see if the page has a meta robots tag, and ensure that it is set up properly and working well.
- Check whether the page has a rel=”canonical” tag and make sure it shows the proper canonical URL.
- For mobile responsiveness, check that the pages have a viewport meta tag.
- Check that pages have proper Open Graph (OG) tags (especially the og:image tag), Twitter cards, other social media meta tags, and other markup such as Schema.org tags.
To check the source, open the page in your browser window, then press CTRL + U (or right-click on the page and select View Source) to bring up the page source. Now inspect the content within the head tags ( <head> </head> ) to ensure everything is right.
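If you audit many pages, eyeballing the source gets tedious. A rough sketch of automating the checks above with Python’s standard library follows; `audit_head` is an illustrative name, and the list of tags it looks for mirrors the checklist above:

```python
from html.parser import HTMLParser

class HeadAuditor(HTMLParser):
    """Records the SEO-relevant tags found inside <head>."""
    def __init__(self):
        super().__init__()
        self.in_head = False
        self.found = {}

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "head":
            self.in_head = True
        elif self.in_head and tag == "link" and a.get("rel") == "canonical":
            self.found["canonical"] = a.get("href")
        elif self.in_head and tag == "meta":
            # meta tags use either name= (robots, viewport) or property= (OG tags)
            key = a.get("name") or a.get("property")
            if key in ("robots", "viewport", "og:image", "twitter:card"):
                self.found[key] = a.get("content")

    def handle_endtag(self, tag):
        if tag == "head":
            self.in_head = False

def audit_head(html):
    """Return a dict of the SEO tags present in the page's <head>."""
    auditor = HeadAuditor()
    auditor.feed(html)
    return auditor.found
```

Any key missing from the returned dict is a tag the page lacks, which tells you exactly what to fix.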
Mind Your Downtime
It is also necessary to watch your site’s downtime patterns. Downtime ruins your visitors’ experience, which hurts your on-page SEO health in the long term. Frequent downtime negatively affects your site’s ranking and undermines your Search Engine Optimization strategy, so you should audit your site’s availability regularly.
Ask your hosting provider to send regular performance reports so that you can easily monitor uptime. If you are experiencing too much downtime, you should switch to a faster, more reliable hosting service.
Plenty of hosting services offer very low prices, but their performance is not consistently good and they are not always reliable. Switching to better hosting gives you better performance and less downtime, which in turn means a lower bounce rate.
Check for render blocking scripts
Render-blocking scripts are JavaScript and CSS files that the browser must download and execute before it can paint the page, which delays the first render. Tools like Google PageSpeed Insights will list them for you so you can defer or remove them.
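A common fix is to add the defer or async attribute to script tags so the browser can keep parsing the page while the script downloads. The script path below is a hypothetical example:

```html
<!-- Render-blocking: HTML parsing pauses while this script downloads and runs -->
<script src="/js/analytics.js"></script>

<!-- Not render-blocking: downloads in parallel, runs after parsing finishes -->
<script src="/js/analytics.js" defer></script>
```

defer preserves script order and waits for parsing to finish; async runs the script as soon as it arrives, so it suits scripts with no dependencies.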
Check Site Loading Speed
The site’s loading time matters the most for keeping the bounce rate low. To maintain a good on-page score, the site must load fast, i.e. within 3 seconds. According to research, 75% of users will not revisit a site that takes more than 4 seconds to load. You can easily run a site speed audit with a free tool such as Google PageSpeed Insights.
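For a quick spot check from the command line, a rough Python sketch of the 3-second budget test might look like this. `measure_load_time` is an illustrative helper, not a standard function, and it measures only one request, so treat the number as a rough signal rather than a full audit:

```python
import time
from urllib.request import urlopen

def measure_load_time(url, fetch=None):
    """Return how many seconds one request/response cycle takes.

    fetch is injectable so the helper can be exercised without a live
    site; by default it downloads the page body with urllib.
    """
    if fetch is None:
        fetch = lambda u: urlopen(u, timeout=30).read()
    start = time.perf_counter()
    fetch(url)
    return time.perf_counter() - start

# Example: flag pages slower than the 3-second budget mentioned above
# if measure_load_time("https://example.com/") > 3.0: investigate
```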
Check for mobile usability errors
Sites that are not responsive do not rank well in Google’s mobile search results. And even if your site is responsive, there is no guarantee that Google’s bots will see it that way: something as small as blocking a resource can make your responsive site look unresponsive in Google’s view.
So even if you think your site is responsive, make it a practice to check if your pages are mobile friendly or if they have mobile usability errors.
To do this, log in to your Google Search Console and go to Search Traffic > Mobile Usability to check if any of these pages show mobile usability errors.
From the above, we can conclude that on-page optimization is a never-ending, time-consuming process with many aspects. But if you audit each of these areas carefully, your site will be better optimized without resorting to anything related to Black Hat SEO. Don’t wait for a perfect time: make a checklist and perform your on-page optimization now to ensure smooth and swift working. If you have a Drupal site, please follow "13 Things that Will help to Improve Your Site's Ranking".
Apart from on-page optimization, the other most important thing is off-page optimization. You could say that off-page optimization is the fuel that keeps your blog posts running effectively: from social sharing to link building, these aspects matter the most for keeping traffic coming to your site. Check out our blog on Off-Page SEO Techniques to Improve Online Reputation.