Speaking to Search Engines in Their Language
How To Speak Their Language: Schema.org Structured Data, XML Sitemaps, and Robots.txt
By the end of this, you’re going to realize that you have far more control over your internet destiny than you might think.
Let’s start with Schema.org Structured Data first as it’s the biggest beast to conquer.
While we certainly can’t teach you how to implement Schema markup on your website today in this one email (much like Link Building), we can give you a healthy introduction to the topic to help you understand it a tad bit better.
So let’s start by explaining what Schema is!
Schema is essentially a language, similar to HTML, that is maintained and developed by an open community.
Think of it as a series of flags. 🚩 🚩 🚩
You can use these different flag colors to denote relationships between ‘entities’ on the internet. In this case, an entity could be a phone number, address, recipe, company, or review.
The Schema vocabulary is broken out into ‘types’ and ‘properties.’
For example, a ‘company’ is a type that has many properties. In this case, a property could be considered the geographic area where the company provides services, its email address, or the date that the company was founded.
People use these ‘properties’ as indicators to help search engines like Google better understand the information provided on a website.
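To make that concrete, here’s a minimal sketch of what Schema markup can look like in JSON-LD format, added inside a page’s HTML. Every value below (the company name, email, founding date, and service area) is a made-up placeholder, not real data:

```html
<!-- Hypothetical example: all values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Bakery",
  "email": "hello@example.com",
  "foundingDate": "2015-06-01",
  "areaServed": "Portland, OR"
}
</script>
```

Here, `Organization` is the type, and `name`, `email`, `foundingDate`, and `areaServed` are some of its properties.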
The full Schema vocabulary lives in one place: https://schema.org/. Once you find the types and properties you need, you add the markup to the relevant pages on your website.
If you’re hoping to try this yourself but you’re intimidated by the code, Google has a Structured Data Markup Helper that really breaks it down and makes it super easy to tag your pages.
You’re probably wondering why we’re droning on and on about something that barely makes sense.
Why is Schema important?
Not only are you contributing to making the internet a better place by helping search engines understand the content on your website, but this also helps users understand your content.
While Schema isn’t a direct ranking factor, it can greatly increase your click-through rate by triggering rich snippets in search engine results.
A rich snippet could be a company’s customer service number, a list of instructions, or some of your best 5-star reviews under a search result.
Here’s an example of a rich snippet that appears when searching for “sourdough bread recipe.”
The rich snippet is the 4.5-star review, the number of people who have voted on this particular recipe, how long it takes to make it, and how many calories it has.
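A recipe rich snippet like that one is typically powered by markup along these lines. This is a rough sketch with invented values, not the actual markup behind any real search result:

```html
<!-- Hypothetical example: rating, time, and calorie values are invented -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Sourdough Bread",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "ratingCount": "312"
  },
  "totalTime": "PT6H",
  "nutrition": {
    "@type": "NutritionInformation",
    "calories": "180 calories"
  }
}
</script>
```

Each property here maps directly to a piece of the snippet: the star rating, the vote count, the prep time, and the calories.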
Now that’s pretty neat!
Moving along, let’s next talk about XML Sitemaps.
What are they and what significance do they hold for your website?
Firstly, let’s make sure you understand what an XML Sitemap is.
We can all agree that we want search engines like Google to crawl all the essential pages of our websites, right?
The problem here is sometimes, pages can end up having no internal links pointing to them.
That makes them hard to find, even by search engines.
An XML Sitemap lists all the website’s pages, ensuring Google can find and crawl them while also understanding the structure of your website.
The best way to think of an XML Sitemap is as a roadmap. This roadmap shows Google where all the important pages on your website are.
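A bare-bones sitemap is just an XML file like the sketch below. The URLs and dates are placeholders for illustration only:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical example: example.com URLs and dates are placeholders -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

Each `<url>` entry lists a page you want crawled, and the optional `<lastmod>` date tells search engines when it was last updated.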
Additionally, XML Sitemaps can be useful for your SEO, too, since they allow Google to find those essential pages faster.
While we often think that Google is all-knowing, it probably doesn’t know that you just created an XML Sitemap. If you want Google to be able to find your XML Sitemap, you’ll need to submit it to your Google Search Console account.
Lastly, let’s cover Robots.txt and why it’s important.
Back in ancient times, when the internet was just a baby with its whole future in front of it, developers built programs to crawl and index new pages. These were aptly named ‘spiders’ or ‘robots.’
These little guys were then set out on their own to wander the vast lands of the internet, and occasionally they’d stumble onto websites that weren’t intended to be indexed or crawled (like sites under maintenance).
Aliweb, one of the world’s first search engines, recommended a solution – a road map that each robot could follow.
And just like that, Robots.txt was born, which is an execution of a protocol called the “Robots Exclusion Protocol.”
This protocol lays out a set of guidelines that every legitimate robot is expected to follow, including Google’s bots.
You’re probably wondering why you still need to use Robots.txt in the 21st Century.
Again, search engines don’t know everything!
Your Robots.txt file acts as an indicator to search engines, telling them which pages on your website you want crawled and which pages you don’t.
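A simple Robots.txt file might look like the sketch below. The blocked paths are hypothetical examples, not rules to copy as-is:

```text
# Hypothetical example: paths are placeholders
User-agent: *
Disallow: /admin/
Disallow: /cart/

# You can also point robots at your sitemap here
Sitemap: https://www.example.com/sitemap.xml
```

The `User-agent: *` line applies the rules to all robots, and each `Disallow` line names a path you don’t want crawled.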
See? All of this goes to show you that you have way more control over search engines than you may think.
If you’re already a client of ours, you can rest easy knowing that your site already has basic Schema Structured Data, an XML Sitemap, and Robots.txt within its framework. We include this on all the websites we build because we believe in how important it is.
Curious about what else there is to learn about Schema, XML Sitemaps, or Robots.txt?
Schedule Your Discovery Call
Give us 30 minutes to get to know each other, to learn about your business and objectives. We'll offer solutions, advice and an honest assessment of how we can help you.