These elements can be quite overwhelming for any small business owner or new marketer to understand fully… We get that! So today we are breaking these elements of your website down and telling you what they are, why they matter, and how you can easily make them search-engine friendly… No coding or technical knowledge required!
A Sitemap.xml file is what search engine crawlers use to learn about a website and its structure. It’s important to note that a Sitemap.xml is not the same thing as a “sitemap.” A Sitemap.xml file is an XML-encoded listing of the key content files within a website, built specifically for search engine crawlers. A traditional sitemap (lower-case s), by contrast, is an HTML page that lists a site’s content so human visitors can find what they’re looking for.
The key differences are the intended audience (search engine vs. human) and the code. Because search engines rely on Sitemap.xml to learn a website’s structure, a Sitemap.xml written in well-formed XML helps ensure your site is considered by search engines for future crawling activity. Sitemap.xml is simple to manage with Inbound Brew.
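For context, here is a minimal well-formed Sitemap.xml following the sitemaps.org protocol. The URLs and dates are placeholders, not output from Inbound Brew; `lastmod`, `changefreq`, and `priority` are optional tags crawlers may use as hints:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawlers to know about -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/blog/first-post/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

If the XML is malformed (an unclosed tag, a stray character), crawlers may reject the whole file, which is why letting a plugin generate it is safer than hand-editing.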
Manage Your Sitemap
To begin, click on Settings or Sitemap.xml from the main navigation bar.
From this screen you can adjust a variety of settings such as which search engines get blog update notifications and which types of page and post types you would like indexed. Once you are done selecting your options, be sure you hit “Save Settings” at the bottom of the screen.
Ping Your Website
When you download Inbound Brew, the plugin sitemap tool will automatically alert major search engines to re-index your website every time a page or post is published. This means you don’t have to wait for the crawler to discover your new content. However, you can also manually ping your website using the Sitemaps tool. Simply click the “Ping Now” button on the right.
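Under the hood, a sitemap “ping” is simply an HTTP GET request to a search engine’s ping endpoint with your sitemap’s URL passed as a parameter. These are the endpoints Google and Bing have historically accepted (example.com is a placeholder for your own domain):

```
https://www.google.com/ping?sitemap=https://example.com/sitemap.xml
https://www.bing.com/ping?sitemap=https://example.com/sitemap.xml
```

Clicking “Ping Now” sends requests of this form for you, so you never need to visit these URLs yourself.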
Once you do so, our plugin will notify Google and/or Bing about updates to your blog.
Download Your Sitemap.xml File
You can also download your Sitemap index file from the settings screen. Simply click on the blue “sitemap.xml” link and a pop-up window will open with your complete Sitemap.
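On larger sites, the downloaded file is often a Sitemap index: an XML file that points to several child sitemaps rather than listing every page directly. A sketch of what that looks like, again with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <sitemap> entry points to a child Sitemap.xml file -->
  <sitemap>
    <loc>https://example.com/sitemap-posts.xml</loc>
    <lastmod>2024-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-pages.xml</loc>
  </sitemap>
</sitemapindex>
```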
Manage Your Robots.txt
Lastly, you can also manage your robots.txt file from your settings window. For review, robots.txt is a plain text file that tells search engine robots which pages or directories they should not crawl, and it can also point crawlers to your Sitemap. The proper use of robots.txt ensures your website “works” with search engines just as well as it looks to your users.
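A robots.txt file lives at the root of your site and is only a few lines long. Here is a common WordPress-style example; the paths are illustrative, not something Inbound Brew writes for you:

```
# Rules for all crawlers
User-agent: *
# Keep crawlers out of the admin area...
Disallow: /wp-admin/
# ...but allow the AJAX endpoint some themes and plugins need
Allow: /wp-admin/admin-ajax.php

# Point crawlers at your Sitemap.xml
Sitemap: https://example.com/sitemap.xml
```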
You can also discourage search engines from indexing your website altogether by simply unchecking the box at the bottom of the page. When you’re done adjusting your robots and Sitemap settings, hit “Save Settings” and you are good to go!
Want to learn more about Robots.txt and Sitemaps.xml? Read this blog post!