Turn up the volume.
You've created content that people are searching for, that answers their questions, and that search engines can understand, but those qualities alone don't mean it'll rank. To outrank the rest of the sites with those qualities, you have to establish authority. That can be accomplished by earning links from authoritative websites, building your brand, and nurturing an audience who will help amplify your content.
Google has confirmed that links and quality content (which we covered back in Chapter 4) are two of the three most important ranking factors for SEO. Trustworthy sites tend to link to other trustworthy sites, and spammy sites tend to link to other spammy sites.
But what is a link, exactly? How do you go about earning them from other websites? Let's start with the basics.
There's a lot to remember when it comes to the wide world of link building. Check out more definitions for this section in the SEO glossary.
Since the late 1990s, search engines have treated links as votes for popularity and importance on the web.
Internal links, or links that connect pages within the same domain, work very similarly for your website. A high number of internal links pointing to a particular page signals to Google that the page is important, so long as those links are added naturally and not in a spammy way.
The engines themselves have refined the way they view links, now using algorithms to evaluate sites and pages based on the links they find. But what's in those algorithms? How do the engines evaluate all those links? It all starts with the concept of E-A-T.
Google's Search Quality Rater Guidelines put a great deal of importance on the concept of E-A-T — an acronym for expertise, authoritativeness, and trustworthiness. Sites that don't display these characteristics tend to be seen as lower-quality in the eyes of the engines, while those that do are subsequently rewarded. E-A-T is becoming more and more important as search evolves and increases the importance of solving for user intent.
Creating a site that's considered expert, authoritative, and trustworthy should be your guiding light as you practice SEO. Not only will it simply result in a better site, but it's future-proof. After all, providing great value to searchers is what Google itself is trying to do.
"User intent" refers to the driving reason behind a searcher's query. A search for "puppy" doesn't have a strong intent — are they looking for pictures? Facts about breeds? Care information? On the other hand, a search for "puppy training in Seattle, WA" has a very strong intent: this user wants to train their puppy, they're probably looking for help in Seattle, and they may wish to sign up for a class. Try to craft content that satisfies your searchers' intent.
The more popular and important a site is, the more weight the links from that site carry. A site like Wikipedia, for example, has thousands of diverse sites linking to it. This indicates it provides lots of expertise, has cultivated authority, and is trusted among those other sites.
To earn trust and authority with search engines, you'll need links from websites that display the qualities of E-A-T. These don't have to be Wikipedia-level sites, but they should provide searchers with credible, trustworthy content.
Remember how links act as votes? The rel=nofollow attribute (pronounced as two words, "no follow") allows you to link to a resource while removing your "vote" for search engine purposes.
Just like it sounds, "nofollow" tells search engines not to follow the link. Some engines still follow them simply to discover new pages, but these links don't pass link equity (the "votes of popularity" we talked about above), so they can be useful in situations where a page is either linking to an untrustworthy source or was paid for or created by the owner of the destination page (making it an unnatural link).
Say, for example, you write a post about link building practices, and want to call out an example of poor, spammy link building. You could link to the offending site without signaling to Google that you trust it.
Standard links (ones that haven't had nofollow added) look like this:
<a href="">I love Moz</a>
Nofollow link markup looks like this:
<a href="" rel="nofollow">I love Moz</a>
Not necessarily. Think about all the legitimate places you can create links to your own website: a Facebook profile, a Yelp page, a Twitter account, etc. These are all natural places to add links to your website, but they shouldn't count as votes for your website. (Setting up a Twitter profile with a link to your site isn't a vote from Twitter that they like your site.)
It's natural for your site to have a balance between nofollowed and followed backlinks in its link profile (more on link profiles below). A nofollow link might not pass authority, but it could send valuable traffic to your site and even lead to future followed links.
Use the MozBar extension for Google Chrome to highlight links on any page to find out whether they're nofollow or follow without ever having to view the source code!
Visit Moz Link Explorer and type in your site's URL. You'll be able to see how many and which websites are linking back to you.
When people began to learn about the power of links, they began manipulating them for their benefit. They'd find ways to gain artificial links just to increase their search engine rankings. While these dangerous tactics can sometimes work, they are against Google's terms of service and can get a website deindexed (removal of web pages or entire domains from search results). You should always try to maintain a healthy link profile.
A healthy link profile is one that indicates to search engines that you're earning your links and authority fairly. Just like you shouldn't lie, cheat, or steal, you should strive to ensure your link profile is honest and earned via good, old-fashioned hard work.
Editorial links are links added naturally by sites and pages that want to link to your website.
The foundation of acquiring earned links is almost always through creating high-quality content that people genuinely wish to reference. This is where creating 10X content (a way of describing extremely high-quality content) is essential! If you can provide the best and most interesting resource on the web, people will naturally link to it.
Naturally earned links require no specific action from you, other than the creation of worthy content and the ability to create awareness about it.
When websites are referring to your brand or a specific piece of content you've published, they will often mention it without linking to it. To find these earned mentions, use Moz's Fresh Web Explorer. You can then reach out to those publishers to see if they'll update those mentions with links.
Links from websites within a topic-specific community are generally better than links from websites that aren't relevant to your site. If your website sells dog houses, a link from the Society of Dog Breeders matters much more than one from the Roller Skating Association. Additionally, links from topically irrelevant sources can send confusing signals to search engines regarding what your page is about.
Anchor text helps tell Google what the topic of your page is about. If dozens of links point to a page with a variation of a word or phrase, the page has a higher likelihood of ranking well for those types of phrases. However, proceed with caution! Too many backlinks with the same anchor text could indicate to the search engines that you're trying to manipulate your site's ranking in search results.
Consider this: you ask ten separate friends, at separate times, how their day is going, and each one responds with the same phrase:
"Great! I started my day by walking my dog, Peanut, and then had a picante beef Top Ramen for lunch."
That's strange, and you'd be quite suspicious of your friends. The same goes for Google. Describing the content of the target page with the anchor text helps Google understand what the page is about, but the same description over and over from multiple sources starts to look suspicious. Aim for relevance; avoid spam.
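As a rough illustration (the URL and anchor phrases below are hypothetical), descriptive anchor text gives search engines far more context than a generic phrase:
<a href="https://example.com/dog-house-plans">insulated dog house plans</a>
<a href="https://example.com/dog-house-plans">click here</a>
The first tells Google what the destination page is about; the second tells it almost nothing. A natural link profile contains a varied mix of both, rather than dozens of identical keyword-rich phrases.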
Link building should never be solely about search engine rankings. Esteemed SEO and link building thought leader Eric Ward used to say that you should build your links as though Google might disappear tomorrow. In essence, you should focus on acquiring links that will bring qualified traffic to your website — another reason why it's important to acquire links from relevant websites whose audience would find value in your site, as well.
Spammy link profiles are just that: full of links built in unnatural, sneaky, or otherwise low-quality ways. Practices like buying links or engaging in a link exchange might seem like the easy way out, but doing so is dangerous and could put all of your hard work at risk. Google penalizes sites with spammy link profiles, so don't give in to temptation.
A guiding principle for your link building efforts is to never try to manipulate a site's ranking in search results.
But isn't that the entire goal of SEO? To increase a site's ranking in search results? And herein lies the confusion. Google wants you to earn links, not build them, but the line between the two is often blurry. To avoid penalties for unnatural links (known as "link spam"), Google has made clear what should be avoided.
Google and Bing both seek to discount the influence of paid links in their organic search results. While a search engine can't know which links were earned vs. paid for from viewing the link itself, there are clues it uses to detect patterns that indicate foul play. Websites caught buying or selling followed links risk severe penalties that can drop their rankings significantly. (By the way, exchanging goods or services for a link is also a form of payment and qualifies as buying links.)
If you've ever received a "you link to me and I'll link to you" email from someone you have no affiliation with, you've been targeted for a link exchange. Google's quality guidelines caution against "excessive" link exchange and similar partner programs conducted exclusively for the sake of cross-linking, so there is some indication that this type of exchange on a smaller scale might not trigger any link spam alarms.
It is acceptable, and even valuable, to link to people you work with, partner with, or have some other affiliation with and have them link back to you.
It's the exchange of links at mass scale with unaffiliated sites that can warrant penalties.
Low-quality directory and bookmark site links used to be a popular source of manipulation. A large number of pay-for-placement web directories exist to serve this market and pass themselves off as legitimate, with varying degrees of success. These sites tend to look very similar, with large lists of websites and their descriptions (typically, the site's critical keyword is used as the anchor text linking back to the submitter's site).
There are many more manipulative link building tactics that search engines have identified. In most cases, they have found algorithmic methods for reducing their impact. As new spam systems emerge, engineers will continue to fight them with targeted algorithms, human reviews, and the collection of spam reports from webmasters and SEOs. By and large, it isn't worth finding ways around them.
Link building comes in many shapes and sizes, but one thing is always true: link campaigns should always match your unique goals. With that said, there are some popular methods that tend to work well for most campaigns. This is not an exhaustive list, so visit Moz's blog posts on link building for more detail on this topic.
If you have partners you work with regularly, or loyal customers that love your brand, there are ways to earn links from them with relative ease. You might send out partnership badges (graphic icons that signify mutual respect), or offer to write up testimonials of their products. Both of those offer things they can display on their website along with links back to you.
Blogging is a content and link building strategy so popular and valuable that it's one of the few recommended personally by the engineers at Google. Blogs have the unique ability to contribute fresh material on a consistent basis, generate conversations across the web, and earn listings and links from other blogs.
Careful, though — you should avoid low-quality guest posting just for the sake of link building. Google has advised against this and your energy is better spent elsewhere.
Creating unique, high-quality resources is no easy task, but it's well worth the effort. High-quality content that is promoted in the right ways can be widely shared. It can help to create pieces that have the following traits:
Creating a resource like this is a great way to attract a lot of links with one page. You could also create a highly specific resource — without as broad an appeal — that targets a handful of websites. You might see a higher rate of success, but that approach isn't as scalable.
Users who see this kind of unique content often want to share it with friends, and bloggers/tech-savvy webmasters who see it will often do so through links. These high-quality, editorially earned votes are invaluable to building trust, authority, and rankings potential.
Resource pages are a great way to build links. However, finding them is easier if you know a few advanced Google search operators.
For example, if you were doing link building for a company that made pots and pans, you could search for:
cooking intitle:"resources"
...and see which pages might be good link targets.
This can also give you great ideas for content creation — just think about which types of resources you could create that these pages would all like to reference and link to.
For a local business (one that meets its customers in person), community outreach can result in some of the most valuable and influential links.
All of these smart and authentic strategies provide good local link opportunities.
From content research to plagiarism checks to technical audits and beyond, using advanced Google search operators can power up your SEO research.
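For example (the domain and phrases below are placeholders — swap in your own), site: restricts results to a single domain, quotation marks force an exact-match phrase, and a minus sign excludes results:
site:yoursite.com
"an exact sentence from your article" -site:yoursite.com
pets intitle:"resources" inurl:links
The first shows roughly which of your pages Google has indexed, the second is a quick plagiarism check, and the third surfaces potential resource pages for outreach.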
Building linked unstructured citations — references to a business' contact information on a non-directory platform, like a blog or a news site — is important for moving the ranking needle for local SEO. It's also a great way to earn valuable links when you're marketing a local business. Learn more in our guide.
You likely already know which of your site's content earns the most traffic, converts the most customers, or retains visitors for the longest amount of time.
Take that content and refurbish it for other platforms (Slideshare, YouTube, Instagram, Quora, etc.) to expand your acquisition funnel beyond Google.
You can also dust off, update, and simply republish older content on the same platform. If you discover that a few trusted industry websites all linked to a popular resource that's gone stale, update it and let those industry websites know — you may just earn a good link.
You can also do this with images. Reach out to websites that are using your images and not citing you or linking back to you and ask if they'd mind including a link.
Earning the attention of the press, bloggers, and news media is an effective, time-honored way to earn links. Sometimes this is as simple as giving something away for free, releasing a great new product, or stating something controversial. Since so much of SEO is about creating a digital representation of your brand in the real world, to succeed in SEO, you have to be a great brand.
The most common mistake new SEOs make when trying to build links is not taking the time to craft a custom, personal, and valuable initial outreach email. You know as well as anyone how annoying spammy emails can be, so make sure yours doesn't make people roll their eyes.
Your goal for an initial outreach email is simply to get a response. These tips can help:
Metrics for link building should match up with the site's overall KPIs. These might be sales, email subscriptions, page views, etc. You should also evaluate Domain Authority and/or Page Authority scores, the ranking of desired keywords, and the amount of traffic to your content. We'll talk more about measuring the success of your SEO campaigns in Chapter 7.
So far, we've gone over the importance of earning quality links to your site over time, as well as some common tactics for doing so. Now, we’ll cover ways to measure the returns on your link building investment and strategies for sustaining quality backlink growth over time.
The most direct way to measure your link building efforts is by tracking the growth of total links to your site or page. Moz’s Link Explorer is a great tool for doing that. For example, say you recently published a blog post that received a lot of attention and you want to track total links that resource earned.
Pop the URL into Link Explorer…
Some SEOs not only need to build good links, but need to get rid of bad ones as well. If you’re performing link cleanup while simultaneously building good links, just keep in mind that a stagnating or declining "linking domains over time" graph is completely normal. You might also want to check out Link Explorer’s "Discovered and Lost" tool to keep track of exactly which links you’ve gained and lost.
If you didn’t see the number of backlinks come in that you were aiming for, all hope is not lost! Each link building campaign is something you can learn from. If you want to improve the total links you earn for your next campaign, consider these questions:
It’s possible that the reason your link building efforts fell flat is that your content wasn’t substantially more valuable than anything else like it. Take a look back at the pages ranking for that term you’re targeting and see if there’s anything else you could do to improve.
Promotion is perhaps one of the most difficult aspects of link building, but letting people know about your content and convincing them to link to you is what’s really going to move the needle. For great tips on content promotion, visit Chapter 7 of our Beginner's Guide to Content Marketing.
Consider how many backlinks you might actually need to rank for the keyword you were targeting. In Keyword Explorer’s "SERP Analysis" report, you can view the pages that are ranking for the term you're targeting, as well as how many backlinks those URLs have. This will give you a good benchmark for determining how many links you actually need in order to compete and which websites might be a good link target.
One link from a very authoritative source is more valuable than ten from low-quality sites, so keep in mind that quantity isn’t everything. When targeting sites for backlinks, you can prioritize by how authoritative they are using Domain Authority and Page Authority metrics.
A lot of the methods you'd use to build links will also indirectly build your brand. In fact, you can view link building as a great way to increase awareness of your brand, the topics on which you're an authority, and the products or services you offer.
Once your target audience is familiar with you and you have valuable content to share, let your audience know about it! Sharing your content on social platforms will not only make your audience aware of your content, but it can also encourage them to amplify that awareness to their own networks, thereby extending your own reach.
Are social shares the same as links? No. But shares to the right people can result in links. Social shares can also promote an increase in traffic and new visitors to your website, which can grow brand awareness, and with a growth in brand awareness can come a growth in trust and links. The connection between social signals and rankings seems indirect, but even indirect correlations can be helpful for informing strategy.
For search engines, trust is largely determined by the quality and quantity of the links your domain has earned, but that's not to say that there aren't other factors at play that can influence your site's authority. Think about all the different ways you come to trust a brand:
That last point is what we're going to focus on here. Reviews of your brand, its products, or its services can make or break a business.
In your effort to establish authority from reviews, follow these review rules of thumb:
Be aware that review spam is a problem that's taken on global proportions, and that violation of governmental truth-in-advertising guidelines has led to legal prosecution and heavy fines. It's just too dangerous to be worth it. Playing by the rules and offering exceptional customer experiences is the winning combination for building both trust and authority over time.
Authority is built when brands are doing great things in the real-world, making customers happy, creating and sharing great content, and earning links from reputable sources.
Set yourself up for success.
They say if you can measure something, you can improve it.
In SEO, it’s no different. Professional SEOs track everything from rankings and conversions to lost links and more to help prove the value of SEO. Measuring the impact of your work and ongoing refinement is critical to your SEO success, client retention, and perceived value.
It also helps you pivot your priorities when something isn’t working.
While it’s common to have multiple goals (both macro and micro), establishing one specific primary end goal is essential.
The only way to know what a website’s primary end goal should be is to have a strong understanding of the website’s goals and/or client needs. Good client questions are not only helpful in strategically directing your efforts, but they also show that you care.
Client question examples:
Keep the following tips in mind while establishing a website’s primary goal, additional goals, and benchmarks:
Asking your client the right questions is key to understanding their website goals. We've prepared a list of questions you can use to start getting to know your clients below!
Speaking of industry marketing jargon, make sure you're on top of it with the SEO glossary for this chapter!
Now that you’ve set your primary goal, evaluate which additional metrics could help support your site in reaching its end goal. Measuring additional (applicable) benchmarks can help you keep a better pulse on current site health and progress.
How are people behaving once they reach your site? That’s the question that engagement metrics seek to answer. Some of the most popular metrics for measuring how people engage with your content include:
Conversion rate
The number of conversions (for a single desired action/goal) divided by the number of unique visits. A conversion rate can be applied to anything, from an email signup to a purchase to account creation. Knowing your conversion rate can help you gauge the return on investment (ROI) your website traffic might deliver.
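A quick worked example, using hypothetical numbers:
conversion rate = 50 email signups ÷ 2,000 unique visits = 0.025 = 2.5%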
Time on page
How long did people spend on your page? If you have a 2,000-word blog post that visitors are only spending an average of 10 seconds on, the chances are slim that this content is being consumed (unless they’re a mega-speed reader). However, if a URL has a low time on page, that’s not necessarily bad either. Consider the intent of the page. For example, it’s normal for “Contact Us” pages to have a low average time on page.
Pages per visit
Was the goal of your page to keep readers engaged and take them to a next step? If so, then pages per visit can be a valuable engagement metric. If the goal of your page is independent of other pages on your site (ex: visitor came, got what they needed, then left), then low pages per visit are okay.
"Bounced" sessions indicate that a searcher visited the page and left without browsing your site any further. Many people try to lower this metric because they believe it’s tied to website quality, but it actually tells us very little about a user’s experience. We’ve seen cases of bounce rate spiking for redesigned restaurant websites that are doing better than ever. Further investigation discovered that people were simply coming to find business hours, menus, or an address, then bouncing with the intention of visiting the restaurant in person. A better metric to gauge page/site quality is scroll depth.
Scroll depth
This measures how far visitors scroll down individual webpages. Are visitors reaching your important content? If not, test different ways of providing the most important content higher up on your page, such as multimedia, contact forms, and so on. Also consider the quality of your content. Are you omitting needless words? Is it enticing for the visitor to continue down the page? Scroll depth tracking can be set up in your Google Analytics.
Ranking is a valuable SEO metric, but measuring your site’s organic performance can’t stop there. The goal of showing up in search is to be chosen by searchers as the answer to their query. If you’re ranking but not getting any traffic, you have a problem.
But how do you even determine how much traffic your site is getting from search? One of the most precise ways to do this is with Google Analytics.
Google Analytics (GA) is bursting at the seams with data — so much so that it can be overwhelming if you don’t know where to look. This is not an exhaustive list, but rather a general guide to some of the traffic data you can glean from this free tool.
Isolate organic traffic
GA allows you to view traffic to your site by channel. This will mitigate any scares caused by changes to another channel (ex: total traffic dropped because a paid campaign was halted, but organic traffic remained steady).
Traffic to your site over time
GA allows you to view total sessions/users/pageviews to your site over a specified date range, as well as compare two separate ranges.
How many visits a particular page has received
Site Content reports in GA are great for evaluating the performance of a particular page — for example, how many unique visitors it received within a given date range.
Traffic from a specified campaign
You can use UTM (urchin tracking module) codes for better attribution. Designate the source, medium, and campaign, then append the codes to the end of your URLs. When people start clicking on your UTM-code links, that data will start to populate in GA’s "campaigns" report.
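A tagged URL might look something like this (the domain and parameter values here are placeholders):
https://www.example.com/spring-sale?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale
Visits from this link would then show up in GA with "newsletter" as the source, "email" as the medium, and "spring_sale" as the campaign.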
Click-through rate (CTR)
Your CTR from search results to a particular page (meaning the percent of people that clicked your page from search results) can provide insights on how well you’ve optimized your page title and meta description. You can find this data in Google Search Console, a free Google tool.
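For example (hypothetical numbers), a page that earned 30 clicks from 1,500 impressions in Search Console has a CTR of 2% (30 ÷ 1,500 = 0.02).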
In addition, Google Tag Manager is a free tool that allows you to manage and deploy tracking pixels to your website without having to modify the code. This makes it much easier to track specific triggers or activity on a website.
Moz’s proprietary authority metrics provide powerful insights at a glance and are best used as benchmarks relative to your competitors’ Domain Authority and Page Authority.
Keyword rankings
A website’s ranking position for desired keywords. This should also include SERP feature data, like featured snippets and People Also Ask boxes that you’re ranking for. Try to avoid vanity metrics, such as rankings for competitive keywords that are desirable but often too vague and don’t convert as well as longer-tail keywords.
Number of backlinks
Total number of links pointing to your website or the number of unique linking root domains (meaning one per unique website, as websites often link out to other websites multiple times). While these are both common link metrics, we encourage you to look more closely at the quality of the backlinks and linking root domains your site has.
There are lots of different tools available for keeping track of your site’s position in SERPs, site crawl health, SERP features, and link metrics, such as Moz Pro and STAT.
The Moz and STAT APIs (among other tools) can also be pulled into Google Sheets or other customizable dashboard platforms for clients and quick at-a-glance SEO check-ins. This also allows you to provide more refined views of only the metrics you care about.
Dashboard tools like Data Studio, Tableau, and PowerBI can also help to create interactive data visualizations.
By having an understanding of certain aspects of your website — its current position in search, how searchers are interacting with it, how it’s performing, the quality of its content, its overall structure, and so on — you’ll be able to better uncover SEO opportunities. Leveraging the search engines’ own tools can help surface those opportunities, as well as potential issues:
While we don’t have room to cover every SEO audit check you should perform in this guide, we do offer an in-depth Technical SEO Site Audit course for more info. When auditing your site, keep the following in mind:
Are your primary web pages crawlable by search engines, or are you accidentally blocking Googlebot or Bingbot via your robots.txt file? Does the website have an accurate sitemap.xml file in place to help direct crawlers to your primary pages?
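As a minimal sketch (the folder path and domain are placeholders), a robots.txt file that allows crawling of everything except one section and points crawlers to the sitemap could look like this:
User-agent: *
Disallow: /internal-search/
Sitemap: https://www.example.com/sitemap.xml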
Can your primary pages be found using Google? Doing a site:yoursite.com OR site:yoursite.com/specific-page check in Google can help answer this question. If you notice some are missing, check to make sure a meta robots=noindex tag isn’t excluding pages that should be indexed and found in search results.
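The tag in question sits in a page's <head> and looks like this:
<meta name="robots" content="noindex">
If you find it on a page you want indexed, removing it (and requesting re-indexing in Search Console) is usually the fix.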
Do your titles and meta descriptions do a good job of summarizing the content of each page? How are their CTRs in search results, according to Google Search Console? Are they written in a way that entices searchers to click your result over the other ranking URLs? Which pages could be improved? Site-wide crawls are essential for discovering on-page and technical SEO opportunities.
How does your website perform on mobile devices and in Lighthouse? Which images could be compressed to improve load time?
How well does the current content of the website meet the target market’s needs? Is the content 10X better than other ranking websites’ content? If not, what could you do better? Think about things like richer content, multimedia, PDFs, guides, audio content, and more.
Removing thin, old, low-quality, or rarely visited pages from your site can help improve your website’s perceived quality. Performing a content audit will help you discover these pruning opportunities.
Keyword research and competitive website analysis (performing audits on your competitors’ websites) can also provide rich insights on opportunities for your own website.
For example:
Discovering website content and performance opportunities will help devise a more data-driven SEO plan of attack! Keep an ongoing list in order to prioritize your tasks effectively.
In order to prioritize SEO fixes effectively, it’s essential to first have specific, agreed-upon goals established between you and your client.
While there are a million different ways you could prioritize SEO, we suggest you rank them in terms of importance and urgency. Which fixes could provide the most ROI for a website and help support your agreed-upon goals?
Stephen Covey, author of The 7 Habits of Highly Effective People, developed a handy time management grid that can ease the burden of prioritization:
| | Urgent | Not Urgent |
|---|---|---|
| Important | Quadrant I: Urgent & Important | Quadrant II: Not Urgent & Important |
| Not Important | Quadrant III: Urgent & Not Important | Quadrant IV: Not Urgent & Not Important |
Applied to SEO tasks, the same grid might look like this:

| | Urgent | Not Urgent |
|---|---|---|
| Important | Primary page issues, high-volume issues | Non-primary page issues, mid-volume issues |
| Not Important | Client reports (unrelated to goals), vanity keywords | Video sitemaps, meta keywords tag |
Much of your success depends on effectively mapping out and scheduling your SEO tasks. Free tools like Google Sheets can help plan out your SEO execution (we have a free template here), but you can use whatever method works best for you. Some people prefer to schedule out their SEO tasks in their Google Calendar, in a kanban or scrum board, or in a daily planner.
Use what works for you and stick to it.
Measuring your progress along the way via the metrics mentioned above will help you monitor your effectiveness and allow you to pivot your SEO efforts when something isn’t working. Say, for example, you changed a primary page’s title and meta description, only to notice that the CTR for that page decreased. Perhaps you changed it to something too vague or strayed too far from the on-page topic — it might be good to try a different approach. Keeping an eye on drops in rankings, CTRs, organic traffic, and conversions can help you manage hiccups like this early, before they become a bigger problem.
Many SEO fixes are implemented without being noticeable to a client (or user). This is why it’s essential to employ good communication skills around your SEO plan, the time frame in which you’re working, and your benchmark metrics, as well as frequent check-ins and reports.
Congratulations on making it through the entire Beginner’s Guide to SEO! Now it’s time for the fun part — applying it. As a next step, we recommend taking the initiative to start an SEO project of your own. Read on for our suggestions!
Get started on your SEO planning with the worksheet template we've provided below. Make a copy and edit it to match your needs!
The best thing you can do to build your confidence, skills, and abilities is to dive in and get your hands dirty. If you’re serious about SEO and hope to serve clients someday, there’s no better place to start than with your own website, whether there’s a hobby you’d like to blog about or you need to set up a personal freelancing page.
We've put together a quick to-do list you can use to guide your next steps in the wide, wonderful world of SEO:
When it comes to tracking your SEO progress, data is your best friend. You can use Moz Pro's suite of SEO analytics and research tools to keep a close eye on rankings, link building, technical site health, and more. Put your new SEO skills into action with a free 30-day trial of Moz Pro!
We know learning all the ins and outs of SEO vocabulary and jargon can feel like learning another language. To help you get a handle on all the new terms we're throwing at you, we've compiled a chapter-by-chapter SEO glossary with definitions and helpful links. You might want to bookmark this page for future reference!
10 blue links: The format search engines used to display search results; ten organic results all appearing in the same format.
Black hat: Search engine optimization practices that violate Google’s quality guidelines.
Crawling: The process by which search engines discover your web pages.
De-indexed: Refers to a page or group of pages being removed from Google’s index.
Featured snippets: Organic answer boxes that appear at the top of SERPs for certain queries.
Google My Business listing: A free listing available to local businesses.
Image carousels: Image results in some SERPs that are scrollable from left to right.
Indexing: The storing and organizing of content found during crawling.
Intent: In the context of SEO, intent refers to what users really want from the words they typed into the search bar.
KPI: A “key performance indicator” is a measurable value that indicates how well an activity is achieving a goal.
Local pack: A pack of typically three local business listings that appear for local-intent searches such as “oil change near me.”
Organic: Earned placement in search results, as opposed to paid advertisements.
People Also Ask boxes: A box in some SERPs featuring a list of questions related to the query and their answers.
Query: Words typed into the search bar.
Ranking: Ordering search results by relevance to the query.
Search engine: An information retrieval program that searches for items in a database that match the request input by the user. Examples: Google, Bing, and Yahoo.
SERP features: Results displayed in a non-standard format.
SERP: Stands for “search engine results page” — the page you see after conducting a search.
Traffic: Visits to a website.
URL: Uniform Resource Locators are the locations or addresses for individual pieces of content on the web.
Webmaster guidelines: Guidelines published by search engines like Google and Bing for the purpose of helping site owners create content that will be found, indexed, and perform well in search results.
White hat: Search engine optimization practices that comply with Google’s quality guidelines.
2xx status codes: A class of status codes that indicate the request for a page has succeeded.
4xx status codes: A class of status codes that indicate the request for a page resulted in error.
5xx status codes: A class of status codes that indicate the server’s inability to perform the request.
Advanced search operators: Special characters and commands you can type into the search bar to further specify your query.
Algorithms: Processes or formulas by which stored information is retrieved and ordered in meaningful ways.
Backlinks: Also known as "inbound links," these are links from other websites that point to your website.
Bots: Also known as “crawlers” or “spiders,” these are what scour the Internet to find content.
Caching: A saved version of your web page.
Caffeine: Google’s web indexing system. Caffeine is the index, or collection of web content, whereas Googlebot is the crawler that goes out and finds the content.
Citations: Also known as a “business listing,” a citation is a web-based reference to a local business' name, address, and phone number (NAP).
Cloaking: Showing different content to search engines than you show to human visitors.
Crawl budget: The average number of pages a search engine bot will crawl on your site.
Crawler directives: Instructions to the crawler regarding what you want it to crawl and index on your site.
Distance: In the context of the local pack, distance refers to proximity, or the location of the searcher and/or the location specified in the query.
Engagement: Data that represents how searchers interact with your site from search results.
Google Quality Guidelines: Published guidelines from Google detailing tactics that are forbidden because they are malicious and/or intended to manipulate search results.
Google Search Console: A free program provided by Google that allows site owners to monitor how their site is doing in search.
HTML: Hypertext markup language is the language used to create web pages.
Index Coverage report: A report in Google Search Console that shows you the indexation status of your site’s pages.
Index: A huge database of all the content search engine crawlers have discovered and deem good enough to serve up to searchers.
Internal links: Links on your own site that point to your other pages on the same site.
JavaScript: A programming language that adds dynamic elements to static web pages.
Login forms: Refers to pages that require login authentication before a visitor can access the content.
Manual penalty: Refers to a Google “Manual Action” where a human reviewer has determined certain pages on your site violate Google’s quality guidelines.
Meta robots tag: Pieces of code that provide crawlers instructions for how to crawl or index web page content.
Navigation: A list of links that help visitors navigate to other pages on your site. Often, these appear in a list at the top of your website (“top navigation”), on the side column of your website (“side navigation”), or at the bottom of your website (“footer navigation”).
NoIndex tag: A meta tag that instructs a search engine not to index the page it’s on.
PageRank: A component of Google's core algorithm. It is a link analysis program that estimates the importance of a web page by measuring the quality and quantity of links pointing to it.
Personalization: Refers to the way a search engine will modify a person’s results on factors unique to them, such as their location and search history.
Prominence: In the context of the local pack, prominence refers to businesses that are well-known and well-liked in the real world.
RankBrain: the machine learning component of Google’s core algorithm that adjusts ranking by promoting the most relevant, helpful results.
Relevance: In the context of the local pack, relevance is how well a local business matches what the searcher is looking for.
Robots.txt: Files that suggest which parts of your site search engines should and shouldn't crawl.
Search forms: Refers to search functions or search bars on a website that help users find pages on that website.
Search Quality Rater Guidelines: Guidelines for human raters that work for Google to determine the quality of real web pages.
Sitemap: A list of URLs on your site that crawlers can use to discover and index your content.
Spammy tactics: Like “black hat,” spammy tactics are those that violate search engine quality guidelines.
URL folders: Sections of a website occurring after the TLD (“.com”), separated by slashes (“/”). For example, in “moz.com/blog” we could say “/blog” is a folder.
URL parameters: Information following a question mark that is appended to a URL to change the page’s content (active parameter) or track information (passive parameter).
X-robots-tag: Like meta robots tags, this tag provides crawlers instructions for how to crawl or index web page content.
Ambiguous intent: Refers to a search phrase where the goal of the searcher is unclear and requires further specification.
Commercial investigation queries: A query in which the searcher wants to compare products to find the one that best suits them.
Informational queries: A query in which the searcher is looking for information, such as the answer to a question.
Keyword Difficulty: At Moz, Keyword Difficulty is an estimate, in the form of a numerical score, of how difficult it is for a site to outrank its competitors.
Keyword Explorer: A Moz tool for in-depth keyword research and discovery.
Local queries: A query in which the searcher is looking for something in a specific location, such as “coffee shops near me” or “gyms in Brooklyn.”
Long-tail keywords: Longer queries, typically those containing more than three words. Indicative of their length, they are often more specific than short-tail queries.
Navigational queries: A query in which the searcher is trying to get to a certain location, such as the Moz blog (query = “Moz blog”).
Regional keywords: Refers to keywords unique to a specific locale. Use Google Trends, for example, to see whether “pop” or “soda” is the more popular term in Kansas.
Search volume: The number of times a keyword was searched. Many keyword research tools show an estimated monthly search volume.
Seasonal trends: Refers to the popularity of keywords over time, such as “Halloween costumes” being most popular the week before October 31.
Seed keywords: The term we use to describe the primary words that describe the product or service you provide.
Transactional queries: The searcher wants to take an action, such as buy something. If keyword types sat in the marketing funnel, transactional queries would be at the bottom.
Alt text: Alternative text is the text in HTML code that describes the images on web pages.
Anchor text: The text with which you link to pages.
Auto-generated content: Content that is created programmatically, not written by humans.
Duplicate content: Content that is shared between domains or between multiple pages of a single domain.
Geographic modifiers: Terms that describe a physical location or service area. For example, “pizza” is not geo-modified, but “pizza in Seattle” is.
Header tags: An HTML element used to designate headings on your page.
Image compression: The act of speeding up web pages by making image file sizes smaller without degrading the image’s quality.
Image sitemap: A sitemap containing only the image URLs on a website.
Keyword stuffing: A spammy tactic involving the overuse of important keywords and their variants in your content and links.
Link accessibility: The ease with which a link can be found by human visitors or crawlers.
Link equity: The value or authority a link can pass to its destination.
Link volume: The quantity of links on a page.
Local business schema: Structured data markup placed on a web page that helps search engines understand information about a business.
Meta descriptions: HTML elements that describe the contents of the page that they’re on. Google sometimes uses these as the description line in search result snippets.
Panda: A Google algorithm update that targeted low-quality content.
Protocol: The “http” or “https” preceding your domain name. This governs how data is relayed between the server and browser.
Redirection: When a URL is moved from one location to another. Most often, redirection is permanent (301 redirect).
Rel=canonical: A tag that allows site owners to tell Google which version of a web page is the original and which are the duplicates.
Scraped content: Taking content from websites that you do not own and republishing it without permission on your own site.
SSL certificate: A “Secure Sockets Layer” certificate is used to encrypt data passed between the searcher’s browser and the web server.
Thin content: Content that adds little-to-no value to the visitor.
Thumbnails: Image thumbnails are a smaller version of a larger image.
Title tag: An HTML element that specifies the title of a web page.
AMP: Often described as “diet HTML,” accelerated mobile pages (AMP) are designed to make the viewing experience lightning fast for mobile visitors.
Async: Short for “asynchronous,” async means that the browser doesn’t have to wait for a task to finish before moving onto the next one while assembling your web page.
Browser: A web browser, like Chrome or Firefox, is software that allows you to access information on the web. When you make a request in your browser (ex: “google.com”), you’re instructing your browser to retrieve the resources necessary to render that page on your device.
Bundling: To combine multiple resources into a single resource.
ccTLD: Short for “country code top level domain,” ccTLD refers to domains associated with countries. For example, .ru is the recognized ccTLD for Russia.
Client-side & server-side rendering: Client-side and server-side rendering refer to where the code runs. Client-side means the file is executed in the browser. Server-side means the files are executed at the server and the server sends them to the browser in their fully rendered state.
Critical rendering path: The sequence of steps a browser goes through to convert HTML, CSS and JavaScript into a viewable web page.
CSS: A Cascading Style Sheet (CSS) is the code that makes a website look a certain way (ex: fonts and colors).
DNS: The Domain Name System (DNS) links domain names (ex: “moz.com”) to IP addresses (ex: “127.0.0.1”). DNS essentially translates domain names into IP addresses so that browsers can load the page’s resources.
DOM: The Document Object Model (DOM) is the structure of an HTML document — it defines how that document can be accessed and changed by things like JavaScript.
Domain name registrar: A company that manages the reservation of internet domain names. Example: GoDaddy.
Faceted navigation: Often used on e-commerce websites, faceted navigations offer a number of sorting and filtering options to help visitors more easily locate the URL they’re looking for out of a stack of thousands or even millions of URLs. For example, you could sort a clothing page by price: low to high, or filter the page to view only size: small.
Fetch and Render tool: A tool available in Google Search Console that allows you to see a web page how Google sees it.
File compression: The process of encoding information using fewer bits; reducing the size of the file. There are many different compression techniques.
Hreflang: A tag that indicates to Google which language the content is in. This helps Google serve the appropriate language version of your page to people searching in that language.
IP address: An internet protocol (IP) address is a string of numbers that identifies a specific server or device on the internet. We assign domain names to IP addresses because names are easier for humans to remember (ex: “moz.com”), but the internet needs these numbers to find websites.
JSON-LD: JavaScript Object Notation for Linked Data (JSON-LD) is a format for structuring your data. For example, schema.org can be implemented in a number of different formats, JSON-LD is just one of them, but it is the format preferred by Google.
Lazy loading: A way of deferring the loading of an object until it’s needed. This method is often used to improve page speed.
Minification: To minify something means to remove as many unnecessary characters from the source code as possible without altering functionality. Whereas compression makes something smaller, minification actually removes things.
Mobile-first indexing: Google began progressively moving websites over to mobile first indexing in 2018. This change means that Google crawls and indexes your pages based on their mobile version rather than their desktop version.
Pagination: A website owner can opt to split a page into multiple parts in a sequence, similar to pages in a book. This can be especially helpful on very large pages. The hallmarks of a paginated page are the rel=”next” and rel=”prev” tags, indicating where each page falls in the greater sequence. These tags help Google understand that the pages should have consolidated link properties and that searchers should be sent to the first page in the sequence.
Programming language: Writing instructions in a way a computer can understand. For example, JavaScript is a programming language that adds dynamic (not-static) elements to a web page.
Rendering: The process of a browser turning a website’s code into a viewable page.
Render-blocking scripts: A script that forces your browser to wait to be fetched before the page can be rendered. Render-blocking scripts can add extra round trips before your browser can fully render a page.
Responsive design: Google’s preferred design pattern for mobile-friendly websites, responsive design allows the website to adapt to fit whatever device it’s being viewed on.
Rich snippet: A snippet is the title and description preview that Google and other search engines show of URLs on its results page. A “rich” snippet, therefore, is an enhanced version of the standard snippet. Some rich snippets can be encouraged by the use of structured data markup, like review markup displaying as rating stars next to those URLs in the search results.
Schema.org: Code that “wraps around” elements of your web page to provide additional information about it to the search engine. Data using schema.org is referred to as “structured” as opposed to “unstructured” — in other words, organized rather than unorganized.
SRCSET: Like responsive design for images, SRCSET indicates which version of the image to show for different situations.
Structured Data: Another way to say “organized” data (as opposed to unorganized). Schema.org is a way to structure your data, for example, by labeling it with additional information that helps the search engine understand it.
10x content: Coined by Rand Fishkin to describe content that is “10x better” than anything else on the web for that same topic.
Amplification: Sharing or spreading the word about your brand; often used in the context of social media, paid advertisements, and influencer marketing.
DA: Domain Authority (DA) is a Moz metric used to predict a domain’s ranking ability; best used as a comparative metric (ex: comparing a website’s DA score to that of its direct competitors).
Deindexed: When a URL, section of URLs, or an entire domain has been removed from a search engine index. This can happen for a number of reasons, such as when a website receives a manual penalty for violating Google’s quality guidelines.
Directory links: “Directory” in the context of local SEO is an aggregate list of local businesses, usually including each business’s name, address, phone number (NAP) and other information like their website. “Directory” can also refer to a type of unnatural link that violates Google’s guidelines: “low-quality directory or bookmark site links.”
Editorial links: When links are earned naturally and given out of an author’s own volition (rather than paid for or coerced), they are considered editorial.
Fresh Web Explorer: A Moz tool that allows you to scan the web for mentions of a specific word or phrase, such as your brand name.
Follow: The default state of a link, “follow” links pass PageRank.
Google Analytics: A free (with an option to pay for upgraded features) tool that helps website owners get insight into how people are engaging with their website. Some examples of reports you can see in Google Analytics include acquisition reports that show what channels your visitors are coming from, and conversion reports that show the rate at which people are completing goals (ex: form fills) on your website.
Google search operators: Special text that can be appended to your query to further specify what types of results you’re looking for. For example, adding “site:” before a domain name can return a list of all (or many) indexed pages on said domain.
Guest blogging: Often used as a link building strategy, guest blogging involves pitching an article (or idea for an article) to a publication in the hopes that they will feature your content and allow you to include a link back to your website. Just be careful though. Large-scale guest posting campaigns with keyword-rich anchor text links are a violation of Google’s quality guidelines.
Link building: While “building” sounds like this activity involves creating links to your website yourself, link building actually describes the process of earning links to your site for the purpose of building your site’s authority in search engines.
Link exchange: Also known as reciprocal linking, link exchanges involve “you link to me and I’ll link to you” tactics. Excessive link exchanges are a violation of Google’s quality guidelines.
Link Explorer: Moz’s tool for link discovery and analysis.
Link profile: A term used to describe all the inbound links to a select domain, subdomain, or URL.
Linked unstructured citations: References to a business’ complete or partial contact information on a non-directory platform (like online news, blogs, best-of lists, etc.)
MozBar: A plugin available for the Chrome browser that allows you to easily view metrics for the selected page, like DA, PA, title tag, and more.
NoFollow: Links marked up with rel=”nofollow” do not pass PageRank. Google encourages the use of these in some situations, like when a link has been paid for.
PA: Similar to DA, Page Authority (PA) predicts an individual page’s ranking ability.
Purchased links: Exchanging money, or something else of value, for a link. If a link is purchased, it constitutes an advertisement and should be treated with a nofollow tag so that it does not pass PageRank.
Qualified traffic: When traffic is “qualified,” it usually means that the visit is relevant to the intended topic of the page, and therefore the visitor is more likely to find the content useful and convert.
Referral Traffic: Traffic sent to a website from another website. For example, if your website is receiving visits from people clicking on your site from a link on Facebook, Google Analytics will attribute that traffic as “facebook.com / referral” in the Source/Medium report.
Resource pages: Commonly used for the purpose of link building, resource pages typically contain a list of helpful links to other websites. If your business sells email marketing software, for example, you could look up marketing intitle:"resources" and reach out to the owners of said sites to see if they would include a link to your website on their page.
Sentiment: How people feel about your brand.
Spam Score: A Moz metric used to quantify a domain’s relative risk of penalization by using a series of flags that are highly correlated with penalized sites.
Unnatural links: Google describes unnatural links as “creating links that weren’t editorially placed or vouched for by the site’s owner on a page.” This is a violation of their guidelines and could warrant a penalty against the offending website.
API: An application programming interface (API) allows for the creation of applications by accessing the features or data of another service like an operating system or application.
Bounce rate: The percentage of total visits that did not result in a secondary action on your site. For example, if someone visited your home page and then left before viewing any other pages, that would be a bounced session.
Channel: The different vehicles by which you can get attention and acquire traffic, such as organic search and social media.
Click-through rate: The percentage of impressions that resulted in a click on your URLs (clicks divided by impressions).
Conversion rate: The ratio of conversions to visits. Conversion rate answers: how many of my website visitors are filling out my forms, calling, signing up for my newsletter, etc.?
Qualified lead: If you use your website to encourage potential customers to contact you via phone call or form, a “lead” is every contact you receive. Not all of those leads will become customers, but “qualified” leads are relevant prospects that have a high likelihood of becoming paying customers.
Google Analytics goals: What actions are you hoping people take on your website? Whatever your answer, you can set those up as goals in Google Analytics to track your conversion rate.
Google Tag Manager: A single hub for managing multiple website tracking codes.
Googlebot / Bingbot: How major search engines like Google and Bing crawl the web; their “crawlers” or “spiders.”
Kanban: A scheduling system.
Pages per session: Also referred to as “page depth,” pages per session describes the average number of pages people view of your website in a single session.
Page speed: Page speed is made up of a number of equally important qualities, such as first contentful/meaningful paint and time to interactive.
Pruning: In an SEO context, pruning typically refers to removing low-quality pages in order to increase the quality of the site overall.
Scroll depth: A method of tracking how far visitors are scrolling down your pages.
Scrum board: A method of keeping track of tasks that need to be completed to accomplish a larger goal.
Search traffic: Visits sent to your websites from search engines like Google.
Time on page: The amount of time someone spent on your page before clicking to the next page. Because Google Analytics tracks time on page by when someone clicks your next page, bounced sessions will clock a time on page of 0.
UTM code: An urchin tracking module (UTM) is a simple code that you can append to the end of your URL to track additional details about the click, such as its source, medium, and campaign name.