Netumo’s team is currently busy testing and finalizing the next major release of Netumo. In this release, we are going to launch what we call Verifications.
Following extensive feedback from our customers and the community, we have understood that there are certain checks our users need that cannot currently be done with Netumo. These checks are more complicated and take more time to run; however, they do not need to run as frequently as monitoring checks. You need to ensure that your robots.txt file is valid, but you don’t need to do that every minute or every hour — once every couple of days is enough. Similarly, you do not need to validate your sitemap as frequently as you check that your website is up.
With Netumo Verifications, all of the above will be possible. The initial version, launching over the next few weeks, is planned to include the following verifications:
- Keyword Check – Check for words or terms on a page at a URL; the terms do not need to be adjacent to each other.
- Advanced Check – Run a regular expression against the content of a URL.
- Indexability Check – Check the robots.txt file, the X-Robots-Tag header and the HTML meta tags for anything blocking bots from accessing your site.
- FTP Check – Verify that a username and password can log in to an FTP, FTPS or SFTP server.
- Sitemap Check – Check that the sitemap is valid, and that the URLs within it are too.
- Change Check – Check for changes in a section of a page at a URL.
- Links Check – Check all links on a page to ensure that they are all valid.
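To give a feel for what a verification like the Indexability Check involves, here is a minimal, illustrative sketch in Python of the three signals it looks at. This is not Netumo’s implementation — the function name and its inputs (the robots.txt text, the X-Robots-Tag header value and the page HTML) are assumptions made for the example:

```python
import re
from urllib.robotparser import RobotFileParser

def is_indexable(url, robots_txt, x_robots_tag="", html=""):
    """Illustrative indexability check: returns (indexable, reasons).

    Inspects the three common signals that can block indexing:
    robots.txt rules, the X-Robots-Tag response header, and the
    <meta name="robots"> tag in the HTML.
    """
    reasons = []

    # 1. robots.txt: is the URL disallowed for a generic crawler?
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    if not parser.can_fetch("*", url):
        reasons.append("blocked by robots.txt")

    # 2. X-Robots-Tag response header containing "noindex"
    if "noindex" in x_robots_tag.lower():
        reasons.append("noindex in X-Robots-Tag header")

    # 3. <meta name="robots" content="..."> tag containing "noindex"
    meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I)
    if meta and "noindex" in meta.group(0).lower():
        reasons.append("noindex in robots meta tag")

    return (not reasons, reasons)
```

A real check would also fetch these inputs over HTTP and handle per-bot user-agent rules; the sketch only shows why a single "indexable" verdict needs all three sources checked together.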
Stay tuned to our Twitter and Facebook pages to find out when this feature is rolled out. We expect more updates on it in the coming weeks.