Semrush IPO Today on iloveseo.com by Brian Harnish

Following up on our earlier story regarding the Semrush IPO, today is the day that Semrush shares can be purchased on the stock market under the ticker symbol SEMR.

The offering was priced at $14 per share on Wednesday, at the lower end of the expected range of $14–$16 per share.

That pricing still values the company at approximately $2 billion. The firm sold 10 million shares, down from the 16.8 million it had originally planned to offer.

The IPO is expected to generate approximately $140 million for Semrush before expenses. If all overallotment options are exercised, however, that amount would rise to $161 million.

Semrush team celebrating Semrush IPO day

Semrush IPO: What’s Happening?

What’s currently happening with the Semrush IPO? The stock had a strong start, but has dropped significantly since.

Shares were priced at $14, but have since fallen to approximately $11.01 at the time of writing:

With this initial public offering, Semrush banked roughly $140 million. At the current share price, however, the market values those 10 million shares at closer to $110 million.

More About the Semrush IPO

From the Business Wire press release:

“Semrush Holdings, Inc. (“Semrush”), a leading online visibility management SaaS platform, today announced the pricing of its initial public offering of 10,000,000 shares of Class A common stock at an initial public offering price of $14.00 per share. In addition, Semrush has granted the underwriters a 30-day option to purchase up to an additional 1,500,000 Class A shares at the initial public offering price less the underwriting discount. The shares are expected to begin trading on the New York Stock Exchange on March 25, 2021 under the ticker symbol “SEMR” and the offering is expected to close on March 29, 2021 subject to customary closing conditions.

Goldman Sachs & Co. LLC, J.P. Morgan Securities LLC, and Jefferies LLC are acting as joint lead bookrunning managers. KeyBanc Capital Markets Inc. is also acting as a joint bookrunning manager for the proposed offering. Piper Sandler & Co. and Stifel, Nicolaus & Company, Incorporated are acting as co-managers.”

More from the Semrush IPO day event

Other Semrush IPO Facts

This $14-per-share pricing also values the company at around $1.95 billion to $1.99 billion.

The total proceeds also depend on whether or not underwriters exercise their overallotment options.

Semrush is scheduled to begin trading Thursday on the New York Stock Exchange (NYSE) under the ticker symbol SEMR.

The company wrote in its U.S. Securities and Exchange Commission S-1 filing that “we enable companies globally to identify and reach the right audience for their content, in the right context, and through the right channels.”

They also reported a 2020 net loss of $7 million on approximately $124.9 million in revenue, an improvement over 2019’s $10.2 million of red ink on $92.01 million in sales.

After the IPO, Semrush shares fell more than 20 percent. Judging by Chief Financial Officer Evgeny Fetisov’s comments to Seeking Alpha, the company is not worried. Fetisov explained:

“The market is soft these days [and it’s] not like we are desperate to get more capital,” he said. “We think [it’s] the right time for us to go public [and] the price is acceptable, so we’re happy with what we got, with the quality of the investment base that we got.”

Letter from the Founders of Semrush

The founders of Semrush published a letter today marking the initial public offering, in which they described their upcoming plans for the company:

“We dream that in the future, marketing is not an overly difficult profession. We believe it should be a skill that is accessible to everyone, much like the skill of using Microsoft Word or Excel. We envision that artificial intelligence and automation will empower people to innovate, which will lead to explosive growth in goods and services. Marketers will need to figure out how to talk about these innovations, through what channels and to which audiences. Semrush is here to help our users achieve those measurable results.

In years to come, our dream is about 1 billion marketers who will better understand: Who their audience is, their key interests, how to best interact, as well as analyse and predict consumer trends. Who are we if we stop dreaming?

So many dreams ahead.
We invite you to dream with us.

Oleg and Dmitry.”

Indeed, Oleg and Dmitry. We love your vision of the future of internet marketing as a more accessible profession.

We all want to better understand our audience and their key interests as well as how to precisely engineer an SEO campaign to help drive those all-important traffic numbers, rankings and conversions. Luckily, tools like Semrush can help us do just that.

Image credits:

Featured image: Monticello / March 2021
Image 1: Semrush Twitter Account / March 2021
Image 2: Semrush Twitter Account / March 2021

How to Amplify Your Content with Semrush’s Social Media Toolkit on iloveseo.com by Brian Harnish

If you’re embarking on a content promotion campaign, you may be wondering, “How do I amplify my content to increase its reach across social media platforms?” One of the best ways you can do this is by using Semrush’s content amplification tools, namely the Social Media Toolkit.

The Semrush Social Media Toolkit can make your life so much easier. It allows you to schedule updates to your social networks in advance and track your social media campaigns’ performance using UTM parameters. Plus, you can pursue aggressive social media strategies designed to amplify your content and increase its reach across your industry’s verticals. Pretty cool, huh?

To explain how, we’ve put together a quick primer on the basics of content amplification and how you can take advantage of it to truly engage your audience.

What Is Content Amplification?

Content amplification is the process of increasing your content’s reach across your entire market through social media channels. Whether you’re leveraging industry influencers or promoting your content to new contacts, you are amplifying its reach.

There are two reasons why this is an important activity:

  • It helps to legitimize your content, because great content is the content that gets shared, right?
  • It helps you earn links by leveraging influencers who share your content. While links directly on social media are nofollowed, the links on your influencers’ websites aren’t, so when they add your content to their site with a link, it directly benefits your SEO.

As a result of this amplification, you can acquire links and build a great reputation across your social networks. These two benefits reinforce each other when the work is done properly and accurately.

What Is Amplification in Social Media?

If content amplification is the end goal, then amplification in social media is the promotional work on social platforms that gets your content there.

It’s about connecting with influencers, passing your link to them, working on those relationships and getting them to add your link to their site.

When your content is exceptional, it exceeds expectations. When that happens, it is that much easier to get the link.

Amplification in social media helps you find the right audience to promote your content to. If your website is where you promote your products and services, then social media is where you enrich your online presence with your brand persona.

It also helps you gain higher-quality traffic: Driving readers directly to your blog through social media can help you reach people who are more likely to convert into paying customers.

Why are they more likely to convert? These customers know you, and people are more apt to buy from a brand or person they feel they know than from a stranger.

The more you engage with your audience, the more familiar you become. And the more familiar you become, the more trust you build. That trust leads to better authority. And the more authority you establish, the better people’s perception of your expertise.

What Is Influencer Amplification?

There are four different kinds of people on social media: your regular clients, people who follow you, people you want to follow you and people who are influencers.

Your regular clients are those who have bought from you before and are currently singing your praises. You may get the occasional complaint, but they’re few and far between and haven’t escalated into detractors who actively seek to destroy your social reputation. You still want to cultivate these relationships, because you never know when they will refer a new client through word of mouth.

The people who follow you are not necessarily your clients. They are people you hope to impress enough to buy. Luckily, social media can help in this regard. Every little interaction builds up that relationship and trust, which is why spending your time on social media is critical to building your online brand presence.

The people you want to follow you are all untapped markets just waiting to buy from the right person. If you come along and fulfill their objectives, you will again have an advantage in that you’ve already built up a brand and developed your reputation in the space, and they’re more likely to become followers (and customers) because of it.

People who are influencers are those you want to amplify your content for you. These are people who you want to keep in a separate database of contacts—think of it as your own virtual Rolodex of influencers. With such a virtual Rolodex, you could technically call on them any time you want to amplify a piece of content, and they may be more than willing to oblige. You can also create a mailing list from these contacts (just double-check with them and ask if they want to receive your newsletter first).

How Do You Amplify Your Content?

This may sound like a complex process, but it really isn’t. Just make sure you connect and maintain contact with the influencers you want to market that content to. As they share the content, they will drive more traffic to that post. In turn, this will (hopefully) result in more shares and views.

This is where the Semrush Social Media Toolkit comes in. It helps you automate some of the more complex parts of the process by letting you schedule posts up to a month or so in advance. Depending on how you construct these posts, you can include built-in influencer amplification too.

For example, you could include posts that create opportunities for amplification. One method you could use to do this includes creating posts around industry-wide topics that are interesting to influencers and social media readers alike. This is one of many ways to build a following and gain the respect of other influencers in your space.

Generate conversations with influencers by creating posts that help them in some way. But don’t be spammy or a jerk about it (no one likes either one). Be genuine and honest, make it fun and create outreach copy that speaks to them.

If you’re in an industry that permits it, you could even start entertaining brand battles, like Wendy’s often does on Twitter with Burger King and McDonald’s. Wendy’s employs a lot of snark in their tweets, which is part of their brand persona. This is another great way to round up influencers: Figure out what will make your brand account unique and go to town with it.

Let’s take a closer look at some of the strategies you can use to amplify your content.

Use Only Your Best Editorials

Let’s face reality for a second: Not every piece of content is going to be remarkable. But when a project is our baby, we tend to think it is. That isn’t a bad thing in and of itself, but it can create an issue when you want to amplify everything because you think it’s all your best work.

However, the truth is that the “best” content stands out on its own. It is a cut above the rest, and it includes points of interest along with in-depth discussions that you won’t find anywhere else. This is the content you want to share and amplify so that it stands out to influencers who may be faced with thousands of other people vying for their attention.

Influencers are far more willing to share content of that caliber. That is why a regular publishing schedule helps: It gets your top-tier content in front of as many influencers as possible.

Post Shout-Outs to Industry Influencers

If your post was particularly influenced by someone’s conversation or ideas, you could always post a shout-out to them on social media in an attempt to gain traction. But be careful with this one: You don’t want to tag 20 influencers and claim they all inspired your post. Something like this requires good timing and authenticity, so make sure they actually were an inspiration to begin with and be smart about reaching out.

Strategically targeting your post from the beginning with this in mind could be a great way to begin this strategy, as this will help reinforce the idea that your inspiration truly was in large part sparked by their work.

This is of course an ego bait of sorts, but we’re not doing SEO to speak to a vacuum, right?

Use Paid Social to Reach Multiple Channels

Most major social media platforms have paid ads available. These can help you get more followers through amplifying your content right on users’ social media feeds.

You can even use your email list to build remarketing ads on other platforms, targeting people who may have looked at your content before but perhaps have not followed you. This too can be a powerful amplification strategy when done right.

Create In-Depth Guides

Creating long-form guides is a great way to amplify your content’s reach through in-depth educational pieces about your target niche. If your niche is anything like SEO, your audience is always eager to learn. So, creating updated guides about every facet of your business is one way to build a following from your content.

You can also set this up as a lead generation tool. Every time you promote a guide is a potential lead generation opportunity. Including a contact form or CTA that pushes potential leads to sign up for your offering is a great way to grow that leads list. With the right contact form and CTA, you can create these opportunities almost on autopilot.

Include Social Share Buttons and Promotional Messaging

Creating social share buttons and promotional messaging as part of your content can help you amplify it. If you integrate these elements into your copy in a smart fashion, you can increase the rate at which your content is shared. You will be more likely to have shares and likes compared to sites that don’t include this type of interactivity.

Semrush Content Amplification Blueprint

Doing content amplification with your Semrush tools is a relatively simple process. The blueprint involves connecting your social media accounts and strategizing your posting schedule, topics and audience outreach tactics.

This process involves using the Social Media Toolkit in tandem with the following steps:

  • connect your social media accounts;
  • create a custom URL with UTM tags for proper attribution and reporting;
  • write 30 days’ worth of social media posts; and
  • schedule your posts using Semrush’s Social Media Toolkit.

The social media poster is one of the most powerful assets within Semrush for this reason:

A calendar within Semrush titled 'Social Media Poster.'

As you can see, when you have your social profiles connected, it is a robust social media post scheduling tool.

Assuming you have already completed the setup process, in order to schedule a post you can do the following:

Click on Calendar and then New post:

A calendar titled 'Social Media Poster' with the 'Calendar' tab and 'New post' button indicated by red arrows.

You will then be taken to the New post screen:

 

An interface titled 'New post' with a box on the left for composing a social media post and a box on the right for previewing it.

Here’s where you can create your social media post. When you type it out in the box on the left, it will show up in the box on the right in the form of a preview.

With this preview feature, you can see exactly how the post will look on both desktop and mobile devices, a crucial ability in the age of mobile-friendly everything.

Below is what the previews of Twitter and LinkedIn posts look like on desktop:

 

Preview of a tweet from @BrianHarnish linking to an iloveseo.com article.

 

Preview of a LinkedIn post from Brian Harnish linking to an iloveseo.com article.

And here is what they look like on mobile devices:

Preview of a mobile LinkedIn post from Brian Harnish linking to an iloveseo.com article.

The Anatomy of Semrush’s New Post Scheduler

The post scheduler in Semrush’s Social Media Toolkit is a powerful tool that will aid your amplification efforts. It allows you to quickly schedule social media posts, preview how they will look, add them to a queue, create a regular publishing schedule, publish posts immediately from the tool itself or even save them as drafts.

A draft of a tweet within Semrush's post scheduling tool.

Dissecting the tool even further, we can see several elements that enhance its functionality:

Three square icons, one with a Facebook logo, one with a Twitter logo and one with a LinkedIn logo.

This section allows you to select the specific social profiles you wish to post to. If you have to reconnect the profiles for whatever reason, that prompt will show up here.

Just click on the plus sign to the right so you can select each one of your profiles and connect them:

Drop-down menu with options to connect to various social media platforms.

Moving further into the top portion of the compose your post pane, we find these:

A tab at the top of the 'compose your post' pane labeled 'original.'

The above buttons allow you to control what appears on the right in the preview pane by prioritizing the social platform you wish to see. In this case, the Original display, LinkedIn or Twitter.

Other buttons offer similarly useful functions:

Three buttons, one with a smiley face, one labeled 'GIF' and one labeled 'UTM.'

The smiley face allows you to add emojis to your post, the GIF icon allows you to add a selection of your favorite GIFs to the post and the UTM icon allows you to customize your UTM tag.

Window titled 'UTM settings' with the fields 'campaign name,' 'campaign medium,' 'campaign source' and the button 'apply to post' highlighted.

To use it, enter your campaign name and the campaign medium (if you wish to track exactly where your traffic is coming from in Google Analytics).

If you want to select an automatic campaign source, the UTM tool will automatically select the source based on the medium that the post traffic is coming from.

Then, click on Apply to post.
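
If you want to sanity-check the URLs this produces, or build tagged links outside the tool, the underlying format is just three query parameters. Here’s a minimal Python sketch; the campaign values are hypothetical:

import urllib.parse

def add_utm(url, campaign, medium, source):
    """Append UTM query parameters to a URL for campaign attribution."""
    params = {
        "utm_campaign": campaign,
        "utm_medium": medium,
        "utm_source": source,
    }
    # Use "&" if the URL already carries a query string, "?" otherwise.
    separator = "&" if urllib.parse.urlparse(url).query else "?"
    return url + separator + urllib.parse.urlencode(params)

# Hypothetical example values; swap in your own campaign names.
print(add_utm("https://iloveseo.com/blog/my-post", "spring_launch", "social", "twitter"))
# https://iloveseo.com/blog/my-post?utm_campaign=spring_launch&utm_medium=social&utm_source=twitter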

If your post already includes an image, Semrush will pull it automatically and show you the image it plans to display with the post. If you don’t mind it, you don’t have to change it.

Portion of search engine results with the annotation 'we've found on this page...'

The next section of the post scheduler gives you several scheduling options that you can select accordingly:

Scheduling options within Semrush's post scheduler.

  • Add to queue will publish the post in the first available time slot for the profiles that have been selected.
  • Schedule lets you schedule the post for specific days and times.
  • Publish regularly will allow you to create a regular publishing cadence for your post.
  • Post now is pretty much what it says—it will post it to your social media platform and account immediately.
  • Save as draft will save your post as a draft for later scheduling and/or posting.

Social Media Tracker

Semrush’s Social Media Tracker allows you to track the performance of your social media campaigns.

Semrush tool titled 'Social Media Tracker' with a list of social media channels.

Here, if you haven’t done so yet, you can add in your competitors by clicking on the blue “Add competitor” button.

Once you have added your competitors, you can review the social media metrics as shown above.

These are comprehensive reports that let you track what you’re doing, how you’re improving and how you’re moving forward throughout the promotion process.

Drop down menu listing a number of competitor URLs.

If you click on the Company: dropdown, you will be able to select the competitors whose data you would like to review and compare. You’ll also be able to select the specific dates for this comparison.

This will help you benchmark exactly how you’re performing against your competitors.

The first section below is a comprehensive report showing all of your followers, subscribers, posts and/or videos, as well as engagement.

These metrics show you how your site is performing across these various social channels.

Chart titled 'Social channels' with columns for followers/subscribers, posts/videos and engagement.

The next section shows you the top performing content from your competition. You can view both total engagement and the engagement rate among these competitors:

Interface titled 'Top content' with the boxes displaying social media metrics for different accounts.

There is also a comparison chart showing how well you stack up against your competitors:

Interface titled 'Top content' with the boxes displaying social media metrics for different accounts.

If you click on the Facebook tab, then click on a competitor, you will see detailed statistics regarding posts, audience, activity, engagement and hashtags, as well as potential insights from Semrush:

Semrush tool titled 'Social Media Tracker' with various tabs for different social media platforms.

Under the posts tab, you can also filter by hashtags. This is especially useful if you know which hashtags you want to investigate. It will allow you to figure out exactly what has been published, what is currently trending and other popular hashtags among your competitors.

This is where you can really tailor your strategy to exceed them.

Published Posts of Competitors

If you click on the specific social media platform (such as Facebook) this will show you even more detailed insights based on which competitor you selected.

The report below compares current page likes with page likes during the previous period. It also shows the change and growth in page likes during that time. It’s a great way to measure how well your competitor’s Facebook page is performing.

By mining this data, you can develop wisdom plus data insights to help your overall strategy:

Semrush tool titled 'Social Media Tracker' displaying various graphs and metrics.

Moving down, the page likes trend graph shows how many likes the page has received over time (based on the time period you have selected):

Two graphs under the title 'Page likes trend.'

The next graph compares your page likes trend to your competitors’ over the same time period:

Line graph titled 'Page likes trend compared to competitors.'

The next section, total page likes compared to competitors, shows your total page likes versus the competitors you have selected.

Bar graph under the title 'Total page likes compared to competitors.'

If we click on the Activity tab next, we will see the following page and metrics:

Semrush tool titled 'Social Media Tracker' with the 'Activity' tab indicated by a red arrow.

Eight boxes, each for a different social media post metric.

Again, these metrics are based on the fact that we have chosen Moz as our competitor.

Let’s take a quick look at the included metrics:

  • posts are the total number of social media posts that have been created over the past 30 days;
  • vs posts is a comparison metric based on the selected time period; and
  • posts per day and vs posts per day follow the same pattern, measured on a per-day basis.

The green numbers on the right represent comparison metrics. Specifically, they show the difference between posts published during the current period and those published over the last 30 days.

They also show you the percentage of growth the account has seen during this time period, so you can gauge precisely how these metrics are actually performing for your competition’s brand.

Pie chart and list of post metrics under the title 'Published posts and their performance.'

The following section shows your competition’s published posts and their performance.

Based on the selected time period, you will be able to see exactly which posts performed well, how many posts were published per day, the amount of engagement per post and the overall engagement rate for your competitors.

Two line graphs under the title 'When posts are published and people engage with content.'

Metrics in the next section show exactly when posts are published and when users are engaging with them.

Many of these metrics will help you unearth even more content ideas for your posts, surfacing users and opportunities you may never have known you had.

In this case, we compared the Moz blog and Search Engine Journal.

Bar graph under the title 'Publishing trend.'

The next section shows you data related to publishing trends. These trends examine exactly what type of post was created and when, including links, photos and videos.

This can tell you exactly what another brand is focusing on as part of their social media promotion efforts, and how you can tailor your own social media efforts to match.

Line chart under the title 'Publishing trend compared to competitors.'

The next section shows you the specific publishing trends of your competitors. Here, Search Engine Journal tops the list at 13–20 posts per week. Everyone else is a cut below at seven, one, and even zero posts per week.

The insights in this section can help you tweak your social media strategy to match and exceed your competition’s publishing rate.

Bar graph under the title 'Number of posts published compared to competitors.'

In the number of posts published compared to competitors section, you’ll see detailed data about your competition’s social media activity. Again, you can continue tweaking your own social media strategy to match and/or exceed what your competitors are doing.

Achieving World Domination with Semrush Content Amplification

World domination? Is it possible? Perhaps, with the right content amplification strategy in your arsenal. Using Semrush for content amplification can make the process easier. It’s all about figuring out what your competitors are doing, and how you can do it better. Creating a content amplification strategy is not all that hard. Maintaining consistency and ongoing action, however, can be.

Developing a plan and sticking to it is one of the more difficult parts of content amplification. But when you have a regular schedule, set time aside each day to work on it and keep up your cadence on social media, you can make great things happen. So is world domination possible? Maybe. But if your goal is industry domination, the answer is a definite “yes.” It’s all about the strategy and approach you use to get there.

Image credits:
Screenshots by author / February 2021

Semrush Guide: Monitor Your Rankings and Online Visibility on iloveseo.com by Brian Harnish

Monitor your rankings with Semrush’s comprehensive reporting tools: Every SEO campaign benefits from close monitoring of the results of your ongoing efforts.

When it comes to keeping track of your competitive positioning, Semrush provides an all-in-one solution for monitoring your site and making sure that you remain competitive for your all-important keyword phrases.

In addition, the monitoring of your competition alone is worth the price of admission.

Let’s get started by setting up our example website for some comprehensive monitoring.

We will begin with competitor research and share insights into some of its benefits for SEO.

Table of Contents

  1. How Do You Do Competitor Research?
  2. What Are the Advantages of Competitor Research?
  3. It Helps to Confirm Your SEO Keyword Strategy
  4. It Gives You Insight into How Your Competition Is Targeting the Audience
  5. What Are the Three Types of Competitors?
  6. What Should Be Included in a Competitor Analysis?
  7. Key Insights about Your Competitor’s Content
  8. Key Insights about Your Competitor’s Overall SEO
  9. Key Insights about Your Competitor’s Social Media Presence
  10. Landscape
  11. Visibility
  12. Estimated Traffic
  13. Average Position
  14. Position Tracking Overview Tab
  15. Rankings Overview Section
  16. Comprehensive Rank Monitoring and Reporting Is Easier with Semrush

How Do You Do Competitor Research?

Research into your competition in any market can be challenging, and that’s especially true for SEO because of the many factors you must take into consideration. However, three matter to almost all SEO practitioners: technical SEO, content and links. Everything else is all but irrelevant.

But what about social media? Doesn’t that play a part in rankings? Somewhat, but indirectly. Because links on social media sites are nofollowed, they don’t provide much direct SEO value. Where social media comes in handy, however, is in verifying the authenticity of your content in the eyes of search engines. Articles with more shares tend to look more legitimate, right?

In this case, the SEO value you get from social media shares is that you may help convince the search engines that your site is more legitimate than others. Plus, you gain those all-important eyeballs on your content.

Competitive research comes in when looking at how things are done from a competitor perspective. To perform a sufficient competitor analysis, you want to answer the following questions:

  • How many pieces of content is your competitor publishing daily?
  • How are they writing their page titles, meta descriptions and header tags?
  • How are they including keywords in their content?
  • Is there sufficient linear distribution of keywords?
  • What kind of links are they getting?
  • How many links are they building to the page?
  • How long is their content?
  • How do they have their content structured in their overall website architecture?
  • What kinds of media do they include in their content?
  • Are they optimizing for any rich snippets?

This is where you can gain an edge over your competitors. Becoming better than your competitors doesn’t always mean doing more: It could also mean doing just enough to beat them. Where that threshold lies depends heavily on your niche and how competitive it is.

What Are the Advantages of Competitor Research?

There are several advantages to competitor research. First, you can dial in your SEO to attributes that Google is demonstrably rewarding, rather than blindly pursuing a strategy that may or may not work. After all, no one outside Google knows exactly what’s behind the algorithm or what truly makes it tick—only Google’s engineers do. Unless they become more vocal about the inner workings of Google’s algorithms, observing what actually ranks is the best evidence you have.

This is why there is such an advantage to doing competitor research in this fashion using Semrush. When you create something within your strategy that’s actually successful, you can review Semrush results and see exactly what kind of an impact it had. In turn, you can then identify other areas where you may be weak in your SEO, and tweak them accordingly.

It Helps to Confirm Your SEO Keyword Strategy

If you want quick confirmation that your SEO keyword strategy is on the right track, look at what your competition is doing. Finding that keyword sweet spot helps you turn competitor information into a strategy that keeps your site competitive.

It also helps you nail down exactly how you want to structure your site and tailor your SEO elements to exceed what those competitors are currently doing.

It Gives You Insight into How Your Competition Is Targeting the Audience

Are you at a loss as to how you should be going after your target audience? Research your competition. This will give you unparalleled data as to what currently works for your customer base.

If a competitor is successful and already where you want to be, they have likely done the audience research and implemented SEO based on it.

However, you don’t always want to copy their strategy exactly. The insight does, however, give you solid clues as to where you may want to go next with your overall strategy.

For example, if you find that your competition is ranking with roughly 1,580 words of content, you may want to tailor your content around that word count.

Here’s another example: If you find that your competition is getting links from a few high-quality websites, then you may want to go after those links as well, since this is something Google is rewarding. (Please note that Semrush’s content optimization tools suggest potential links you can build to your site in addition to possible on-page optimization strategies.)

What Are the Three Types of Competitors?

In a competitor analysis, it helps to identify the types of competitors you will want to go after as part of your strategy.

Take, for example, direct competitors. These are the sites offering the same products and services as yours. They compete in the market on a similar level, and studying them will help you identify specific elements of your strategy.

On the other hand, indirect competitors are those who may be in your industry but actually provide different products or services than you.

Identifying these competitors can help you figure out where you may want to go next (versus where you have been), so you can decide whether it’s worth expanding into similar sections or offerings on your own site.

What Should Be Included in a Competitor Analysis?

This one is a loaded question, and an article or two could easily be spent answering it in depth. But before we get ahead of ourselves, it’s important to understand that competition is heavily SERP-dependent, not necessarily competitor-dependent.

This means you have to look at the individual SERP itself for attributes that competitors have in common across their sites in order to rank. Semrush can help you do all of this and more using their comprehensive monitoring tools for many aspects of digital marketing.

Key Insights about Your Competitor’s Content

Looking at how your competitor optimizes content that’s appearing in the number one SERP position can offer you clues about how to optimize your own. For example, if they target people also ask queries, you may want to target similar ones. And if you find certain topics that the top three sites consistently write about, then you may want to target those topics too.

You can also figure out just how often your competition is publishing their content, along with identifying the topics they cover. You can also gauge, across multiple competitors, where they think their topic focus should be in relation to the audience they are trying to reach.

You will want to examine the following about your competitor’s overall content strategy:

  • How much content are they publishing per day (or week, or month)?
  • What kind of content are they writing?
  • Who is their content geared toward?
  • What topics are they actually writing about?
  • What other kind of content are they publishing (aside from text)? Videos, images, infographics, memes, or slide decks?
  • When are they publishing their content (i.e., at what time of day)?

This information will help you identify key tactics that you will want to implement across your overall content strategy.

Key Insights about Your Competitor’s Overall SEO

SEO is more than keywords and links. It’s also about technical SEO, as well as identifying content opportunities your competitors may be lacking. You shouldn’t just take what your competitors are doing and copy them, though—rather, you must identify new opportunities that they may not have thought of. Here is where you can become a truly unique resource that is a cut above everyone else.

On the other hand, you can also find clues about what your competitors are doing that’s getting them to the number one position for a query. While you won’t pin down their exact playbook, you will be able to gather clues as to what may be benefiting them.

To discover your competitor’s SEO strategy, be sure to look into the following:

  • Links: What kinds of high-quality links are they getting?
  • Content: What kind of content are they writing?
  • Technical SEO: What is the state of their website now?
  • Keywords: What types of keywords are they targeting?
  • On-page optimization: How are they currently optimizing their content?

Questions like these will help you identify key things you will need to do to your own site in order to propel yourself to the next level in the SERPs.

Key Insights about Your Competitor’s Social Media Presence

Your competitor’s social media presence is also a great place to unearth data about their audience. Chances are, their audience and yours overlap within the same industry, so their social media presence may offer clues about how they are achieving social media domination.

For example, they may be posting some content geared towards politics in order to entice buyers who have a political spin. Or, they may be posting humor on a more regular basis. Perhaps they are posting more thought-provoking TED-talk type content. Or, maybe they are posting funny memes all the time.

This is something you will want to know as you embark on your own social media strategy. Also investigate:

  • the day and time that their customers (and your customers) engage with content;
  • the kind of content your customers (and their customers) consume;
  • content that strikes a chord with the audience such as quotes, videos, thoughtful anecdotes and the like;
  • any frameworks that your competition may be using to create their social posting calendars; and
  • how much and how often your competition posts.

By identifying these key insights among your top competitors, you can get in front of their audience as well and really make your social media strategy take off.

Landscape

The landscape section of Semrush’s position tracking tool allows you to monitor and assess where your site currently is among the existing landscape of your competitors.

It’s useful for identifying your site’s visibility, its estimated traffic, average position, and many other metrics.

From here, you can make tweaks to your overall optimization strategy to improve your site’s positions.

It is important to note at the outset that Semrush uses estimates to determine a site’s overall visibility within the top 100 Google results. It doesn’t, however, take into account positions below 100.

Other tools, like Ahrefs, may take this into account.

Visibility

Portion of an Semrush report titled "Visibility"

This aspect of the report is actually based on CTR (click-through rate). You must input your own keywords for this to work, so it’s not all-encompassing when it comes to the results.

In other words, it tracks only the keywords you have entered. It’s also relative, based on how many keywords you are ranking for.

For example, if you held the top ranking for all of your tracked keywords, that would count as 100 percent visibility.

This visibility score does not reflect how you may be performing on the many thousands of other keywords in the index.

Estimated Traffic

Portion of an Semrush report titled "Estimated Traffic"

According to Semrush, this estimate takes the average click-through rate for each ranking position in Google’s results, multiplies it by the keyword’s search volume and divides by 30 (the number of days in a month).

This also shows the probability that a user will click on a particular domain’s search result, based on its position in the SERP.

If you find that you have a lower traffic estimation, this report gives you clues as to what part of your strategy you will want to change next.
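
As a rough illustration of that arithmetic, here is a small Python sketch. The CTR-by-position values are assumptions for the example only; Semrush’s actual click-through curve is proprietary:

# Assumed CTR values by ranking position (illustrative, not Semrush's curve).
ASSUMED_CTR = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def estimated_daily_traffic(keywords):
    """keywords: iterable of (monthly_search_volume, ranking_position) pairs."""
    monthly_clicks = 0.0
    for volume, position in keywords:
        ctr = ASSUMED_CTR.get(position, 0.01)  # small fallback CTR for deeper positions
        monthly_clicks += volume * ctr  # CTR multiplied by search volume
    return monthly_clicks / 30  # divided by 30 days in a month

# (12000*0.28 + 5400*0.10 + 900*0.01) / 30 = 130.3 estimated daily visits
print(round(estimated_daily_traffic([(12000, 1), (5400, 3), (900, 8)]), 1))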

Average Position

Portion of a Semrush report titled 'Average Position.'

This part of the report gives you the average ranking of all keywords you have entered into your tracking campaign. Here, a rank of 100 means that you are not ranking for that keyword.

If you click on the average position report link within Semrush’s report, you will get the following information:

Portion of a Semrush report displaying a chart of a site's average keyword rankings.

This report shows a graph of your average position performance over the past 30 days.

If you want to obtain more detailed insights from the report, you can add the domains of your competitors as shown:

Portion of a Semrush report where you can enter the domains you wish to analyze.

Just add your competitor’s domain(s) from the list that pops down when you click on Add domain, and Semrush will generate a component of the report that gives you more insight into these competitors.

A drop-down list within Semrush displaying the domains of several competitor sites.

The following section (shown below) provides general information regarding which keywords your site ranks for, whether or not it ranks in the top three, the top ten, the top 20 or even the top 100.

A portion of a Semrush report showing a site's rankings for various keywords.

All of these are important to understanding where your site ranks overall.

The rankings distribution section provides more information about how your rankings are distributed, including how many keywords rank in the top three, positions 4–10, positions 11–20 and positions 21–100.

A chart within a Semrush report titled 'Rankings Distribution.'

Rankings in positions 21–100 don’t matter as much as the first three, and page-two rankings don’t matter as much as those on page one. Still, this breakdown is useful for gauging your own rankings performance.
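
To make the banding concrete, here’s a toy Python sketch that sorts tracked positions into the same bands the report uses; the sample positions are made up:

from collections import Counter

def band(position):
    """Map a ranking position to a Semrush-style distribution band."""
    if position <= 3:
        return "top 3"
    if position <= 10:
        return "positions 4-10"
    if position <= 20:
        return "positions 11-20"
    if position <= 100:
        return "positions 21-100"
    return "not ranking"

positions = [1, 5, 12, 48, 101]  # made-up tracked-keyword positions
print(Counter(band(p) for p in positions))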

This section is relatively simple—it shows the top performing keywords for your domain.

Screenshot of a list within a Semrush report titled 'Top Keywords.'

Both of these sections include, when data is available, the positive and negative trends of rankings over the time period that you have selected.

Two sections of a Semrush report, one titled 'Positive Impact' and one titled 'Negative Impact.'

In this section, you can see your top competitors and how their online visibility continues to improve (or not).

A portion of a Semrush report showing a site's top competitors, both in graph and list form.

By rolling over the Top Competitors circles in the left-hand column, you can see their keyword ranking distribution, as well as other useful information.

Here, you will be able to identify the types of SERP features your site is ranking for. SERP features are things like rich snippets, reviews, featured snippets and other enhanced results, many of which are governed by Schema markup.

A bar graph within a Semrush report titled 'SERP features.'

The Pages section shows the landing pages that are ranking, how many keywords they are ranking for, their average position and the estimated traffic the landing page is receiving.

A list titled 'Pages' within a Semrush report.

Semrush landing page traffic estimates differ from those in Google Analytics and Google Search Console. Because Semrush doesn’t have access to the same data, take its numbers with a grain of salt and use them as general indicators rather than hard traffic figures.

The cannibalization health section checks for keyword cannibalization. This occurs when you have more than one page optimized for the same keyword phrase and they are vying for the same attention in Google’s SERPs.

A chart within a Semrush report titled 'Cannibalization Health.'

In a perfect world, everyone would use correct SEO best practices. However, we are not in a perfect world, and this happens more often than not.

Identifying keyword cannibalization issues can help prevent your site’s pages from competing for traffic that they both may deserve.
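
Conceptually, cannibalization detection boils down to spotting keywords with more than one ranking page. Here’s a minimal Python sketch of that idea, using made-up keyword and URL data:

from collections import defaultdict

# Hypothetical (keyword, ranking_page) pairs exported from a rank tracker.
rankings = [
    ("seo audit", "/blog/seo-audit-guide"),
    ("seo audit", "/services/seo-audit"),
    ("keyword research", "/blog/keyword-research"),
]

pages_by_keyword = defaultdict(set)
for keyword, page in rankings:
    pages_by_keyword[keyword].add(page)

# Keywords with more than one ranking page are cannibalization candidates.
for keyword, pages in pages_by_keyword.items():
    if len(pages) > 1:
        print(f"Possible cannibalization for '{keyword}': {sorted(pages)}")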

Position Tracking Overview Tab

In the position tracking overview tab, we find, yes, even more ranking position tracking.

Here, to get the most out of your data, you will want to continue adding in your competitors. You can add up to four more competitors for a total of five entries.

There are also additional filters you can apply to see just the data you want.

A list of competing domains within a Semrush report as indicated by a red arrow.

By filtering on top positions and changes, you can display exactly the keyword distribution you want to see, whether that’s keywords in the top three, top ten or top 100, and create a report from that data.

A list of filters available within Semrush.

Scrolling down to the next section, we can see all the data thus far that shows an overview of all competitor domains (including your own). If you roll over the lines, you will get a pop-up showing the overall online visibility percentage for all of your competitors.

This gives you insight into how your site is performing based on comparisons with your top chosen competitors.

Graph within a Semrush report titled 'Visibility.'

Rankings Overview Section

This section of the position tracking overview tab allows you to see your rankings at a glance for all of your keywords. It also allows you to examine all of the types of snippets your keywords actually rank for.

A section within a Semrush report titled 'Rankings Overview 1-9.'

Snippets will show up based on the types of SERP features that exist on that SERP. For example, if a keyword only has sitelinks or People Also Ask boxes, those will show up in the SERP features column.

If they don’t exist, they will simply not show up in this list.

If you click on the keyword section you’ll see rankings for that specific keyword, in addition to those of all the competitors you have selected.

Graph within a Semrush report titled 'Positions.'

If you scroll to the right using the scroll bar up top, you will be able to see your competitors as well.

A list within a Semrush report titled 'Rankings Overview' with a red arrow selecting one of many options visible on the scroll bar.

This will give you tremendous opportunity for insight into how you, along with your competitors, are currently performing in the SERPs.

But what do the icons mean?

This one means that you have a local pack on that SERP (not that you’re ranking for it, just that the SERP feature appears on that SERP).

A grey location pin icon.

The graduation cap means that you have a knowledge panel showing up on the SERP.

A grey graduation cap icon.

The star means you have review snippets that are present on the SERP.

A grey star icon.

This little news icon over here means that you will see top stories on the SERP.

A grey newspaper icon.

This little questions icon shows that you will have People Also Ask answer boxes on the SERP.

A grey question mark icon.

This icon that looks like a graphic means that you have images showing up on the SERP.

A grey graphic icon.

The little ads icon up above means that you have AdWords ads showing up at the top of that SERP.

A grey icon that reads 'Ad' on top.

Also, this icon shows that there are AdWords ads showing up at the bottom of that SERP.

A grey icon that reads 'Ad' on the bottom.

The icon that looks like a link is exactly that—this indicates that sitelinks features are showing up on the SERP.

A grey chain link icon.

For you and your tracking efforts, this means that you can filter each keyword by that specific SERP snippet and figure out exactly where your site is performing for that SERP snippet.

The estimated traffic tab shows you estimated traffic for the keyword, along with any SERP features you want to filter.

A portion of a Semrush report with a tab titled 'Estimated traffic' indicated by a red arrow.

The visibility tab also shows you how the visibility of your site is being influenced by your specific keywords.

A portion of a Semrush report with a tab titled 'Visibility' indicated by a red arrow.

If you click on Table settings to the right, you will be able to customize the view to your needs: choose which columns appear, change the row height and sort by any data point listed here.

A menu within a Semrush report, with a button titled 'Table settings' indicated by a red arrow.

This will allow you to customize your reports based only on the data points that you need.

The rankings distribution tab shows you, well, the distribution of your rankings. First, the overall rankings distribution section shows how your site is doing against your competition.

A list within a Semrush report titled 'Rankings Distribution.'

The accompanying columns are dedicated to:

  • estimated traffic;
  • traffic coming from keywords ranking in the top three;
  • traffic coming from keywords ranking in the top ten;
  • traffic coming from keywords ranking in the top 20; and
  • traffic coming from keywords ranking in the top 100.

All of the report’s sections below this are dedicated to keywords ranking in the top three, top ten, top 20 and top 100, respectively.

The next tab, the Pages tab, lists your landing pages along with their ranking keywords, estimated traffic, average position and total traffic volume.

A list within a Semrush report titled 'Pages.'

The competitors discovery tab is a fun twist on competitor analysis.

A graphic within a Semrush report titled 'Competition Map' with a tab titled 'Competitors Discovery' indicated with a red arrow.

This research will give you all of your organic competitors within the search results. You will be able to examine their visibility, estimated traffic, number of ranking keywords and overall average position.

A list within a Semrush report titled 'Competitors.'

The last tab is the featured snippets tab.

Graphics within a Semrush report titled 'Featured Snippets.'

While it’s not currently populated for the site used in this tutorial, this tab lets you see things like trends for featured snippets, which keywords have featured snippets on the SERP, available opportunities and much more. Keywords and their featured snippet attributes will show up below as well once they begin ranking.

Comprehensive Rank Monitoring and Reporting Is Easier with Semrush

Whenever you embark on any new SEO campaign, comprehensive monitoring of your site’s performance is key to impressing your client. From traffic to rankings to keywords, many elements of SEO intersect to provide a data-plus-wisdom reporting philosophy.

Data is all well and good, but the client is not going to understand data alone (unless they are also an SEO pro). That’s where wisdom comes in: It helps distill the data into manageable chunks so the client understands what you’re talking about, why you’re doing it and how it improves their bottom line.

If you don’t create comprehensive monitoring reports based on real-world data, you could end up with inaccurate interpretations. This wisdom plus data philosophy is critical to keeping your client on your side and singing your praises. As you put together comprehensive reports that help your client understand each piece of the puzzle, you will find that it really does beat creating reports that only take into account half of the picture.

How do you plan on using Semrush comprehensive monitoring and reporting to delight your clients?

Image credits:
Screenshots by author / February 2021

Site Audit Notices Explained on iloveseo.com by Brian Harnish

Semrush’s Site Audit tool is capable of uncovering many issues that may be affecting your site’s performance, all of which can be broken down into three categories:

  • Errors are the most urgent types of issues and should be resolved as soon as possible.
  • Warnings are less severe issues and can be addressed after errors.
  • Notices may not be actively harming performance at all, but should still be inspected when you get the chance.

Here, you’ll learn about the notices that can appear in your Site Audit report and what you can do about them. Although you may be tempted to leave your notices on the back burner indefinitely, addressing them can change your site’s performance and rankings for the better.

Table of Contents

  1. Site Audit Notices
  2. Pages Have More Than One H1 Tag
  3. Subdomains Don’t Support HSTS
  4. Pages Are Blocked From Crawling
  5. URLs Are Longer Than 200 Characters
  6. Outgoing External Links Contain Nofollow Attributes
  7. Robots.txt Not Found
  8. Hreflang Language Mismatch Issues
  9. Using Relative URLs in Place of Absolute URLs
  10. Your Site Has Orphaned Pages
  11. Pages Take More Than 1 Second to Become Interactive
  12. Pages Blocked by X-Robots-Tag: Noindex HTTP Header
  13. Blocked External Resources in Robots.txt
  14. Broken External JavaScript and CSS Files
  15. Pages Need More Than Three Clicks to Be Reached
  16. Pages Have Only One Incoming Internal Link
  17. URLs Have a Permanent Redirect
  18. Resources Are Formatted as a Page Link
  19. Links Have No Anchor Text
  20. Links to External Pages or Resources Returned a 403 HTTP Status Code

Site Audit Notices

Creating a website involves several technologies and processes, and none of them is always perfect. During development, errors can seep through. These errors, while not necessarily indicative of a site gone seriously wrong, can point to lazy development practices.

Thankfully, there is Semrush. With Semrush, there are a number of high-priority errors and less urgent warnings you can uncover using their Site Audit tool.

Here, we’ll go over Site Audit notices. These are the lowest-priority items to identify and repair in order to ensure that your site functions well from both a search and a development perspective.

Below is a screenshot showing the notices section of Semrush’s Site Audit tool. Click on it and you’ll see all the notices Site Audit has found for your site:

Semrush's Site Audit Tool Dashboard

While these notices aren’t necessarily critical issues that you have to address right away, you’ll still want to check on them whenever you get the chance. If left unchecked, some can even cause significant technical performance issues down the line. Ahead, we’ll examine some of the most important notices the Site Audit tool can serve.

Pages Have More Than One H1 Tag

When developing a site, developers may have one goal in mind: to ship whatever code gets the job done, even if it isn’t compact or 100 percent correct. For most sites, developers want to ensure that the final product is delivered to the client on time, and sometimes that leads to cutting corners. What happens in these instances is the addition of more than one H1 tag in ways that may not be semantically accurate.

Semrush does not recommend having more than one H1 tag. While Google is on record as saying that multiple H1 tags will not harm your SEO efforts, you’d still be wise to stick to a single H1 tag to keep things simple, accurate and confusion-free on the part of the search engine. This is because the H1 header tag establishes the main thematic topic of your page, the H2 header tags establish supporting topics and the H3 header tags establish further supporting subtopics (such as lists).

Another issue arises when H1 tags are not considered in their proper semantic order. This occurs when developers use H1 tags as decoration rather than as structural tags. Developmentally speaking, it’s not considered correct to use header tags as styling devices.

Subdomains Don’t Support HSTS

HSTS, or HTTP Strict Transport Security, is a critical component of a correct, secure HTTPS implementation. However, it’s not always enabled on a server by default, so to ensure the best possible security you must make sure that HSTS is part of your HTTPS setup. As Mozilla explains, the HSTS header “informs the browser that it should never load a site using HTTP and should automatically convert all attempts to access the site using HTTP to HTTPS requests instead.”

The best way to repair this issue is to add an HSTS directive to your site’s server configuration. On Apache servers, this is usually done in the .htaccess file. As GlobalSign elaborates, this can be accomplished simply by pasting the following code into your .htaccess file:

# Use HTTP Strict Transport Security to force clients to use secure connections only
Header always set Strict-Transport-Security "max-age=300; includeSubDomains; preload"

On NGINX, you may need to add a more advanced directive.
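For reference, the equivalent NGINX directive is commonly written as follows. This is a sketch only, placed inside the relevant server block, with a one-year max-age as a typical production value; adjust it to your own policy:

# Sketch of the equivalent NGINX directive, placed in the relevant server block
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" always;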

Either way, the end result is the same: HSTS will be implemented on the site.

Pages Are Blocked From Crawling

This is an issue that must be addressed at some point, although urgency depends on severity. If you have a significant portion of the site blocked, then you may want to consider unblocking it. For example, category pages shouldn’t always be blocked from crawling by Google. On the other hand, individual pages that are traditionally blocked, such as thank you pages and the like, are not a concern if they show up in this list.

Semrush will surface everything falling under this issue without discrimination. It’s up to you to analyze the blocked pages and determine which ones pose a critical issue that must be addressed. This is why blindly following issues and checking them off in Semrush is not something to be taken lightly.

For example, let’s say you find that approximately 15,000 pages are being blocked on an e-commerce site. The site only has 50 pages and 1,500 products, so there’s no way it should have that many URLs to begin with. As it turns out, its URL count once ballooned into the thousands because of parameter-related issues, and all of the parameter pages were simply dumped into the robots.txt file and blocked. In that case, you may want to leave the block intact. However, if pages were blocked unnecessarily and no longer exist, you can repair the issue by unblocking them; Google will automatically discount 404 pages anyway.

URLs Are Longer Than 200 Characters

John Mueller has stated in no uncertain terms that URL length doesn’t affect pages’ Google rankings:

John Mueller URL Length

URL length is not a ranking factor.

🍌 John 🍌 (@JohnMu) February 5, 2020

So while this notice is generally safe to deprioritize, shorter URLs are still easier for users to read, remember and share.

Outgoing External Links Contain Nofollow Attributes

There is an ongoing belief among SEO practitioners that external links require a nofollow attribute. However, this is simply a myth. Unless your page contains external links to disreputable sites that you do not endorse, such as sites you’ve paid for a backlink (please don’t do this!), you don’t have to add a nofollow attribute to external links at all. After all, you always want your links to be an editorial vote of confidence for the pages they point to. If they aren’t, you have a problem.

There are also new nofollow rules to consider. As of March 2020, nofollow is treated as a hint rather than a directive. This change brought with it additional attributes you can add to your links, such as rel=”ugc” (to indicate that the link points to user-generated content) and rel=”sponsored” (to indicate that the link is paid). You can see them all in Google’s guidelines for qualifying outbound links.

As a rule of thumb, just remove the nofollow attribute from ordinary external links. If the site you’re optimizing is fairly large, and you inherited it from a shady SEO practitioner who added a nofollow attribute to every external link no matter what, you can simply perform a find and replace in the WordPress database. Otherwise, you can edit the HTML itself, for instance with the JavaScript String replace() method.
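As a rough sketch of that last approach, assuming you have a page’s markup in a string (the variable names here are hypothetical), a single replace() call can strip the attribute:

// Hypothetical example: strip rel="nofollow" attributes from a string of markup.
// Real-world markup may use variants like rel="nofollow noopener", so treat this as a sketch.
const html = '<a href="https://www.domainname.com/" rel="nofollow">Example</a>';
const cleaned = html.replace(/\s*rel="nofollow"/g, "");
console.log(cleaned); // <a href="https://www.domainname.com/">Example</a>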

Robots.txt Not Found

The robots.txt file sits at the root of your web server and is responsible for telling bots like Googlebot how they are allowed to crawl your website. A missing robots.txt file isn’t an issue in and of itself. But if you typically have a robots.txt file on your website, and you’re also seeing significant issues from bots and spiders, then you may want to restore it. On the other hand, if there are no issues from bots and your site is being crawled and indexed just fine, then it’s probably not a critical issue. In the interest of being thorough, though, you will want to include a robots.txt file with the following lines at the root of your server. Just create a new text file and insert these lines:

User-agent: *
Disallow:

Sitemap: https://www.domainname.com/sitemap.xml

This allows all spiders to crawl your site without issue, and also lets them easily find your sitemap file. Be careful with your syntax: unless you really mean it, do not ever use Disallow: / with a forward slash. That directive tells all search engines to stay out of your entire site, which is almost certainly not what you want.

Hreflang Language Mismatch Issues

You would be surprised how often hreflang tag mismatch issues come up in audits. Whether they are caused by including the wrong language code in the header or by improper implementation of hreflang links, these can be disastrous for your hreflang implementation as a whole. Errors that can impact your hreflang implementation include the following:

Incorrect Language Code

Using the wrong language code is one of the most common (and easiest) mistakes to make when it comes to hreflang errors. The language code must be in ISO 639-1 format; if it is not, it will be considered an invalid language code.

Self-Referencing (or Return) Hreflang Tag is Missing

You must also make sure that you include a self-referencing hreflang tag when you create your hreflang implementation. Google discusses the issue in-depth here, but John Mueller summed it up best in 2018 when he stated that self-referencing hreflang tags aren’t necessarily required but are nonetheless a good idea to include.
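As an illustration using this document’s placeholder domain, the English-US version of a page would carry a tag pointing to itself alongside its alternates, and each alternate would carry the same set in return:

<!-- Hypothetical tags: both lines appear on BOTH language versions of the page -->
<link rel="alternate" hreflang="en-us" href="https://www.domainname.com/en-us/pagename/" />
<link rel="alternate" hreflang="de-de" href="https://www.domainname.com/de-de/pagename/" />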

Using Relative URLs in Place of Absolute URLs

As a rule, you must use absolute URLs rather than relative URLs in all links listed within hreflang tags. Relative URLs cause many issues because they don’t tell bots which URLs you actually want crawled, and they also put you at risk of code errors. When these problems get out of hand, they can seriously damage your hreflang implementation, which is where regularly checking your notices comes in handy. The following example shows the difference between correct and incorrect versions of these URLs:

https://www.domainname.com/en-us/pagename/ (correct)

/en-us/pagename/ (incorrect)

Your Site Has Orphaned Pages

By definition, orphaned pages are those that have no internal links pointing to them. Ideally, every page on your site should have at least one internal link. Orphaned pages are a problem because a page with valuable content but no internal links is a missed opportunity to drive link equity to that page. Semrush’s recommended fix for orphaned pages is essentially to remove them if they’re no longer needed. But if a page is valuable and is bringing in plenty of traffic, you’ll instead want to add an internal link to it elsewhere on the site. Of course, the reverse is also true: if the page is needed but doesn’t require any internal linking whatsoever (such as a terms of service or thank you page), you are free to leave it as-is.

Pages Take More Than 1 Second to Become Interactive

In general, pages should take no more than two to three seconds to load. Google’s Maile Ohye stated that although two seconds has been cited as an acceptable threshold for e-commerce websites, Google aims for a response time of half a second or less. While that quote dates back to 2010, it illustrates that even then, page speed was both a key performance metric and an important factor in the user experience.

Don’t just blindly view a loading time of two to three seconds (or a score of 70 or more on Google’s PageSpeed Insights tool) as the be-all and end-all of page performance. Instead, look at different tools and use them as benchmarks for how your site is really performing. One tool can differ greatly from another, so it’s important to understand the inner workings of each tool as you work on page speed.

To check whether you have issues here, you must connect your Google Analytics account with Semrush. This imports data into Semrush, allowing you to diagnose issues within the pages triggering this notice. Using PageSpeed Insights, here is an example of a website that has a 19.8-second time to interactive:

Screenshot of a report from PageSpeed Insights, with the 'time to interactive' field outlined in red

Pages Blocked by X-Robots-Tag: Noindex HTTP Header

When you have this issue, an X-Robots-Tag in your pages’ HTTP response headers is blocking them from being indexed via the noindex directive. Google offers the following code as an example of the X-Robots-Tag:

Portion of HTML code including the X-Robots-Tag and the 'noindex' directive

Google also lists several additional directives that you can use in your HTTP headers. They are all useful, and can be used in place of your standard noindex tag.
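If an unintended noindex is being served at the server level, it was likely configured with a line like the following. On Apache with mod_headers enabled, for example, such a header is typically set like this (a sketch; removing or adjusting the line lifts the block):

# Hypothetical .htaccess line (requires mod_headers); delete or adjust it to lift the noindex
Header set X-Robots-Tag "noindex"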

Blocked External Resources in Robots.txt

Believe it or not, there are situations that arise during audits where there are external resources being blocked in robots.txt. In most cases, there isn’t a valid reason for doing this. Usually, they are simply accidental blocks. However, there are a few reasons why you might want to intentionally do something like this. For instance, you may have:

  • files hosted on another server that are necessary for the design to function, but don’t necessarily affect how the site displays;
  • external font files that you don’t want to send a search engine to; or
  • other external plugin files that you don’t want to send a search engine to (such as critical files that may interfere with the operation of a plugin).

Depending on what your audit reveals about the severity of the issue, you may want to pass and focus on changes that will drive more impact. Semrush likewise recommends that if these resources are unnecessary, you can simply ignore them. In addition, there are certain situations where blocking interacts badly with noindex. For instance, say a page carries a noindex meta tag but is also blocked in robots.txt. The page could still show up in the SERPs, simply because the search engine never gets the chance to crawl the page and read the noindex meta tag in the first place.

Broken External JavaScript and CSS Files

Situations come up where your JavaScript (JS) and Cascading Style Sheets (CSS) files are entirely broken. This may be because updated plugins simply don’t play well together, or because the resource serving the external CSS file (such as a font file) is no longer active. In that case, you’ll need to remove the code responsible for calling these files. Other situations where broken external JS and CSS file repairs may be warranted include those where:

  • external JS files are being controlled by a third party;
  • updated plugins no longer work correctly with the old implementation; and
  • JS or CSS files hosted on a content delivery network have since broken.

In each of these cases, you must do a thorough audit and make sure that any files you want to be properly processed and rendered are repaired. Otherwise, these broken files can cause serious display issues with your site.

Pages Need More Than Three Clicks to Be Reached

This issue comes down to your website’s architecture. The theory goes that the shallower the crawl depth, the better (and easier) it will be for Google to crawl your site, while excessive crawl depths can pose significant issues for your SEO efforts. This is why Semrush recommends that pages with more important content should never be more than three clicks deep from your home page.

John Mueller has echoed this, mentioning that while Google doesn’t count the slashes in URLs, it does view pages as less important if they take many clicks to reach from the homepage. In other words, the fewer clicks it takes for a user to get to a page from your site’s home page, the better.

It’s up to you to determine how to keep crawl depth under control while maintaining a logically organized site structure. Also keep in mind that keeping your content within approximately three clicks of the home page does not preclude a siloing architecture; this is an important distinction. You can have content three clicks away from the home page and a siloing architecture at the same time.

Pages Have Only One Incoming Internal Link

When your pages have only one incoming internal link, you’re losing opportunities to capture traffic from more heavily trafficked pages. In other words, don’t bury important pages deep within your site. After all, the main goal of internal linking is to make sure that Google can find the pages you want it to find. You can’t expect Google to stumble upon a page without an adequate number of internal links (unless you regularly submit a sitemap file in Google Search Console, but that’s another issue altogether).

URLs Have a Permanent Redirect

Permanent redirects are fine. In fact, they’re a critical component of good SEO, especially during a site migration. Migrating a site may require several redirects at one time; the key is making sure you don’t have more than two or three hops, five at the most. Anything beyond that becomes excessive and can cause significant confusion in Google’s eyes. In January 2020, in response to a Reddit post from the user dozey, John Mueller advised site owners that fewer than five hops in a redirect chain is preferred.

Resources Are Formatted as a Page Link

This issue occurs when resources like images are added in WordPress with a link to another page attached to the resource itself. When links are used this way, link equity goes to waste. Why? Instead of powering another page, link equity is sent to a resource that may or may not even appear on Google. It’s better to plan where your link equity goes so that these resources don’t receive equity that would be better spent elsewhere. Here’s a brief example of this error:

Resources formatted as a page link

In about 99 percent of cases, it does not make sense to include a link to the image in such a fashion.

Links Have No Anchor Text

Links may be the foundation of the internet itself, but it’s anchor text that determines how users view them. This is what anchor text looks like to your site’s visitors, as seen in our article on Google’s domain information feature:

A screenshot of text from an iloveseo.com article, with two instances of anchor text outlined in red

Anchor text helps power the context of your links, so thinking about it strategically is critical. There is another point to consider here as well: using generic anchor text like “read more” is not optimal. Instead, it’s better to use the page’s title or a descriptive phrase that gives users a clear idea of what they’re clicking on. When you include useful anchor text that’s readable by both users and search engines, your links will benefit.

Links to External Pages or Resources Returned a 403 HTTP Status Code

This is considered a notice because it may not be an actual client or server error in the first place. Instead, what tends to trigger 403 status codes are cases where external third parties such as Cloudflare block Semrush. To fix this, you simply need to whitelist the bot’s user-agent name so it can crawl the site; the same holds for other crawlers like the Screaming Frog SEO Spider. You may also need to create a “crawl policy,” which lets you specify which bots can crawl the site and when they’re permitted to do so. Semrush’s recommended fix for this issue is ensuring that all pages on your site are properly crawlable and indexable by all users and search engines.

Addressing Semrush Notices Can Be Tedious and Complex, but It’s Worth It in the End

When working on Semrush notices, you may think: “I really don’t have time for this. Why do I have to go through and deal with things that may or may not generate results?” To that, we say: “Why wouldn’t you?” If you’ve already resolved your site’s errors and warnings, you likely won’t have too many notices to fix anyway. And while they may not be critical, don’t underestimate the cumulative impact that the changes you make can have on your overall website performance. In the end, your site’s performance (and rankings) may thank you.

Image credits

Screenshots by author / February 2021

]]>
https://iloveseo.com/tools/semrush/semrush-site-audit-notices-explained/feed/ 0
31 Things You Can Do With Semrush’s Site Audit Warnings Tool https://iloveseo.com/tools/semrush/31-things-you-can-do-with-semrush-site-audit-warnings/ https://iloveseo.com/tools/semrush/31-things-you-can-do-with-semrush-site-audit-warnings/#respond Mon, 08 Feb 2021 16:42:51 +0000 https://iloveseo.com/?p=1173 31 Things You Can Do With Semrush’s Site Audit Warnings Tool on iloveseo.com by Brian Harnish

 When you’re using Semrush’s Site Audit Warnings tool, you can be presented with three types of issues: Errors are the most severe and must be immediately addressed. Warnings are of...

]]>
31 Things You Can Do With Semrush’s Site Audit Warnings Tool on iloveseo.com by Brian Harnish

 When you’re using Semrush’s Site Audit Warnings tool, you can be presented with three types of issues:

  • Errors are the most severe and must be immediately addressed.
  • Warnings are of moderate severity and should be rectified after any errors.
  • Notices are issues that may not be creating problems but should still be further examined when possible.

Here, we’ll take a closer look at warning-level issues specifically. While not quite as urgent as their error-level counterparts, it’s still vital to promptly fix any warnings that may be affecting your site’s search engine results page (SERP) performance.

Semrush’s Site Audit Warnings Tool

As soon as you’ve addressed the errors uncovered by Semrush’s Site Audit, you’d be wise to start working on warnings. Though you don’t exactly need to fix them, you can absolutely benefit from doing so.

These are the most important warnings you may find in your Site Audit report.

Pages Don’t Have Meta Descriptions

It’s been known for years now that meta descriptions do not have a direct influence on rankings. They can, however, have an impact on conversions.

A well-written natural description will beat an auto-generated description every time, assuming that you tailor it to your audience and you are working to garner a positive response from them.

That’s because a customized description will tell users exactly what your page is about and entice them to click on it. But if a page doesn’t have a meta description, Google will automatically create one for it that may or may not be accurate.

Avoid turning away potential visitors with auto-generated descriptions by creating a unique one for every page. As you do, remember to keep your meta descriptions at 155 characters or less and always include the primary keyword.

The more effort you put into your meta descriptions while working with the preceding best practices, the better!
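For reference, a meta description lives in each page’s <head> section and looks like this (the content here is hypothetical):

<!-- Hypothetical content; write a unique description for every page -->
<meta name="description" content="Learn how to write unique, conversion-friendly meta descriptions for every page of your site.">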

Your Pages Are Not Compressed

Semrush says that this warning is triggered when the Content-Encoding entity is not present in the response header.

What, exactly, does this mean? As Mozilla explains, the Content-Encoding header tells the browser how the content has been compressed so that it can be properly decoded. So, this warning means that your site’s pages aren’t being compressed, which translates to longer loading times and a worse user experience.

To remedy uncompressed pages, Google recommends enabling GZIP text compression, in part because all modern browsers both support and automatically request it.
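On Apache servers, one common way to enable text compression is through mod_deflate in your .htaccess file. This is a minimal sketch, assuming the module is available on your server:

# Compress common text-based formats before sending them to the browser
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>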

Since proper page compression can have such a positive impact on the user experience, it’s an important optimization pain point to include in your best practices.

Links on Your HTTPS Pages Point to HTTP Pages

When links on newer HTTPS pages direct search engine crawlers to older HTTP pages, the search engines can find it difficult to determine whether they should rank the HTTPS or HTTP version.

Luckily, the fix for this problem is very simple: Make sure that all your site’s HTTP links are updated to HTTPS. That’s it!

If you’re unfamiliar with HTTPS and why you should use it for your site, Google has plenty of documentation on the topic.

Your Pages Have a Low Text-to-HTML Ratio

When we talk about text-to-HTML ratio, we’re talking about each page’s ratio of visible text to invisible HTML code. The former includes the text that’s actually shown to users, while the latter includes HTML elements that only website administrators can see.

A low text-to-HTML ratio, i.e. lots of HTML code but only a little text, is not a direct ranking factor—Google’s John Mueller confirmed as much in a 2016 Webmaster Central hangout.

However, a text-to-HTML ratio of ten percent or less could indicate possible HTML bloat, which in turn could be slowing down page speed and negatively affecting the user experience.

So if this warning is triggered in Semrush, be sure to test the affected pages’ speed and check for any extraneous or unminified HTML code.

It’s still a good idea to make your pages leaner in terms of code, especially when it concerns your users—faster pages will always make your users happier.  

Your Images Don’t Have Alt Attributes

Images’ alt attributes, or alt text, serve to help both search engines and users with visual impairments understand their contents.

As such, images without alt text are missing out on opportunities to rank better with search engines and be more accessible to more users.

Within a page’s HTML code, alt text looks like this:

<img src="IMAGE-FILE-NAME.JPG" alt="ALT TEXT HERE">

Always make your alt text as clear and descriptive as possible and keep it under 125 characters. By following these simple guidelines, you can improve your pages’ accessibility and rankings.

Pages Have a Low Word Count

As Semrush explains, this issue arises when any of your pages contain fewer than 200 words. This is because the amount of content on a page acts as a quality signal to search engines: it’s not necessary to have thousands of words on every page, but each page does need enough content to satisfy its main purpose.

As Google specifies in its Search Quality Rater guidelines, one of the attributes of a low-quality page is “an unsatisfying amount of main content for the purpose of the page.”

In other words, whether a page’s purpose is to report on a news story, describe a product or something else entirely, it needs to have enough content to achieve that purpose.

And while there’s no such thing as an ideal word count, a study from Backlinko and Buzzsumo found that on average, long-form content gets 77.2 percent more links than short-form content. So, focus on creating enough content for each page to fulfill its purpose, and aim for long-form content whenever possible and appropriate.

In the end, though, it all depends on the topic in question. While longer and more in-depth content may get more links, other intent-based search queries—such as those concerning money or transactions—may lend themselves better to short-form content.

Whether you create long- or short-form content, what’s most important is that it serves to achieve the page’s purpose in full.

Your Pages’ JavaScript and CSS Files Are Too Large

In Site Audit, this issue occurs when the total transfer size of the JavaScript and CSS files on a given page is larger than 2MB.

Like many warning-level issues, this doesn’t have a direct impact on your pages’ rankings. It can harm the user experience, though, which may lead to fewer satisfied visitors and lower SERP performance.

Such a scenario is possible because the larger a page’s JavaScript and CSS files, the longer it will take to load. Since page speed has a significant impact on the user experience, you should strive to keep each page’s files as lean as possible.

After all, as page load time moves from just one to three seconds, users’ probability of leaving the page altogether increases by 32 percent:

Bar graph showing how users' bounce probability increases with page load time

To avoid this, be sure to remove any unnecessary JavaScript and CSS files and keep your site’s plugins to a minimum.

You Have Unminified JavaScript and CSS Files

One way you can dramatically reduce the size of your pages’ JavaScript and CSS files is by minifying them. As such, Semrush will alert you if any of your site’s files haven’t been minified.

Not familiar with minification? As Mozilla puts it, “minification is the process of removing unnecessary or redundant data without affecting how a resource is processed by the browser.” This can mean removing things like white space, unused code and comments.

By removing nonessential elements such as those, you can help users’ browsers process and load each page’s JavaScript and CSS files more efficiently.
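A trivial before-and-after illustration of CSS minification makes the idea concrete:

/* Before minification */
body {
  margin: 0;
  padding: 0; /* reset default spacing */
}

/* After minification */
body{margin:0;padding:0}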

“But I’m an SEO practitioner,” you might be saying. “I have no idea what the heck all this means. Do I really have to learn to code?” No, you don’t have to learn to code to be an excellent SEO. In fact, issues like these can be remedied with little to no coding knowledge.

If your site uses JavaScript and CSS files hosted on an external site, Semrush recommends asking the site’s owner to minify them.

If they’re hosted on your own site, you can simply use a browser-based minifier that does all the heavy lifting for you.

Your Site Contains Broken External Links

While broken external links won’t make or break your site’s performance, they can become a pain for users. No one wants to waste their time clicking on a link that doesn’t work, and your site’s visitors are no exception.

As such, multiple broken external links can degrade the user experience and may cause your site’s SERP rankings to drop as a result.

If the Site Audit tool reports any external links as broken, follow them yourself to see if they lead to a nonexistent page. If they do, replace the faulty links with functional ones. If they don’t, you may want to make the other site’s webmaster aware of the problem.

Your Site Contains Broken External Images

If your site contains images of any kind, chances are one or more of them will fail to appear at some point in time. This can happen because the image no longer exists, has been relocated to another URL or was linked to via an incorrect URL.

Whatever the case, broken external images aren’t appealing to viewers and therefore won’t be any help to your SERP rankings, either.

To fix a broken image, you can replace it with a new one, remove it altogether or change its URL to a correct one. And to improve the user experience when images do inevitably break, always include alt text for each image and consider serving a placeholder or default image instead.

Your Pages Have Too Much Text in the Title Tags

Just as with meta descriptions, you want your pages’ title tags (i.e. meta titles) to clearly communicate each one’s topic and purpose. This way, users can tell what your pages are about right from the SERPs, as is beautifully demonstrated by an article from Taste of Home:

Search result for a 'Taste of Home' article about healthy dinner recipes, with the title highlighted.

But since Google tends to truncate title tags longer than 60 characters, some title tags may be too long for their own good.

So if your Site Audit report serves you this warning, be sure to trim each page’s title tag to an appropriate length while retaining as much detail and clarity as possible. And if you’re having trouble coming up with the perfect titles, refer to Google’s best page title practices for guidance.

Your Pages Don’t Have <h1> Headings

HTML heading tags, which move in decreasing order of importance from <h1> to <h6>, tell both readers and search engine bots how a page’s content is organized and what each section is about.

Since it signifies the greatest level of significance, each page’s <h1> heading should be intact and well thought-out. As Semrush warns, if a page has no h1 heading then search engines may rank it lower than they would have otherwise.

Be sure each page’s h1 tag is properly implemented, too. While developers may believe that heading tags can be used interchangeably, these tags lose their significance when they are implemented outside normal semantic meaning.

What does semantic meaning entail here? It means that these header tags are used in their proper order, as sketched just after this list:

  • <h1> for the page title;
  • <h2> for the first sub-headings throughout the document;
  • <h3> for certain relevant subtopics;
  • and so on through the <h6> tag.
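Rendered as a hypothetical page outline, that hierarchy looks like this:

<!-- Hypothetical outline; indentation added only for readability -->
<h1>A Complete Guide to Site Audits</h1>
  <h2>Fixing Errors</h2>
    <h3>4xx Errors</h3>
    <h3>5xx Errors</h3>
  <h2>Fixing Warnings</h2>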

Adhere to these guidelines and your site’s SERP rankings are sure to thank you.

Your Pages Have Duplicate <h1> and Title Tags

While using the same title in your <h1> tag and title tag may sound like a good idea, doing so can backfire when search engines view your page as being over-optimized or keyword-stuffed.

Plus, you’ll also be missing out on an opportunity to include a greater variety of keywords for search engines and users to see.

While some SEO practitioners think that it’s acceptable to use more than one h1 tag, we disagree. In our experience, the proper semantic, hierarchical structure should be used for content. That means <h1> tags for the page title and <h2> tags for sub-headings.

Just don’t duplicate your title tags and <h1> tags—make sure that you create unique content for both.

You Have Too Many On-Page Links

Here, Semrush will give you a warning if any pages have too many on-page links. Their magic number is 3,000.

We disagree. While it has been reported in the past that 100 links per page is the maximum you can have before causing problems with search engine crawlers, the reality is that there isn’t much of a hard limit at all.

In a Google Search Central YouTube video, Matt Cutts explained that there is no concrete maximum number of links, and that webmasters should simply stick to “a reasonable number.”

Just keep in mind that the more links a page has, the less link equity each linked page will receive. So if you have Semrush’s upper limit of 3,000 links on one page, each link will only get one 3,000th of the original page’s authority.

So, we think it’s best to err on the side of caution and stick to a couple hundred links or fewer. Provided those links make sense and provide value to users, this ensures there’s no chance of your site sending a spam signal to Google.

Your Pages Have Temporary Redirects

When a page has been temporarily moved to another location, it will return an HTTP status code indicating a temporary redirect, such as status 302.

Search engines will still index the redirected page, though, and no link equity will pass to the new page. In this way, unintentional temporary redirects can hurt your rankings and block link equity.

To ensure that all the pages you want to rank can do so, verify that any temporary redirects Semrush alerts you to are both intentional and appropriate. For instance, 302 redirects are useful when A/B testing, redesigning or updating the site or fixing a broken page.

And remember, 302 redirects are not a permanent solution. If you want to redirect users to a new page for good, simply use a 301 redirect instead.
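On Apache servers, a permanent redirect can be as simple as one mod_alias line in your .htaccess file (the paths here are hypothetical):

# Hypothetical mod_alias rule: permanently redirect the old path to the new URL
Redirect 301 /old-page/ https://www.domainname.com/new-page/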

Your URLs Have Too Many Parameters

This warning is triggered not just when your URLs have lots of parameters, but when they have any number more than one.

Semrush says this is because multiple parameters make URLs less appealing to visitors as well as more difficult for search engines to index.

But what are URL parameters to begin with? As Google puts it, they allow you to track information about a click. They’re composed of a key and value separated by an equal sign (=) and joined by an ampersand (&), and the first parameter always follows a question mark (?). For example:

URL typed in browser bar with the parameter '?fg=1' highlighted
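In plain text, a URL with two hypothetical tracking parameters follows that same pattern, with the first parameter after a question mark and subsequent ones joined by ampersands:

https://www.domainname.com/pagename/?utm_source=newsletter&utm_medium=email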

While such parameters can be genuinely useful (when keeping track of which platform users are coming from, for instance), too many can create problems.

Google states in its URL structure guidelines that excessively complex URLs, especially those containing several parameters, can hinder crawlers by “creating unnecessarily high numbers of URLs that point to identical or similar content on your site.”

The solution is to keep your URLs’ number of parameters to a minimum, and only use them if you know they’re necessary. If you do need to use multiple parameters, consider using Google’s URL Parameters tool to prevent its bots from crawling parameterized duplicate content.

You’re Missing Hreflang Tags and Lang Attributes

The Site Audit tool will automatically warn you if your pages don’t have either hreflang tags or lang attributes.

Both serve to tell search engines which language is being used on a given page, and both are essential for multilingual sites (if you don’t have a multilingual site, of course, there’s no need to implement either).

By implementing hreflang or lang, you’ll ensure that search engines are able to point users to the correct version of each page depending on their location and preferred language. As a bonus, you’ll also help search engines avoid categorizing your pages as duplicate content.

Want to create some hreflang tags as quickly as possible? Aleyda Solis’ handy hreflang generator is the perfect place to start.

There Is No Character Encoding Declared on Your Site

The purpose of character encoding is simple: to tell browsers how to display raw data as legible characters. This is typically done by mapping numbers to characters, as seen in this ASCII code chart:

Basic ASCII code chart with numbers, letters and symbols.

Having such encoding in place is important because without it, browsers may not display pages’ text correctly. This can harm the user experience and result in lower search engine rankings.

As per Semrush, you can fix this issue either by specifying a character encoding in the charset parameter of the HTTP Content-Type header or using a meta charset attribute in your site’s HTML code.
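In HTML5, for example, declaring UTF-8 encoding takes a single tag in the page’s <head>:

<meta charset="utf-8">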

Your Pages Don’t Have a Doctype Declared

When your page does not include a doctype, your browser enters what is called quirks mode. In quirks mode, the browser falls back on legacy rendering behavior, imitating what you would have gotten in the late 1990s using non-standard coding methods.

You can avoid this scenario entirely by simply declaring a doctype. As W3Schools explains, the <!doctype> declaration is not an HTML tag—rather, it serves to tell the user’s browser what type of document it’s dealing with.

To ensure all your site’s pages display correctly, all you need to do is add a <!doctype> element to the top of every page source.

If you’re using HTML5, that element will look like <!doctype html> (the declaration is not case sensitive, so capitalize it any way you’d like).

Your Pages Use Flash

Yes, Flash is the shunned development platform of the digital age. It’s horribly clunky, it presents issues with crawling when not properly implemented, and it’s just inconvenient to have to update a third-party plugin six hundred thousand times.

While once invaluable for adding interactivity and animations to a site, its functions can now be accomplished with HTML5, rendering it entirely obsolete. This is especially true now that Adobe officially stopped supporting Flash as of December 31, 2020.

If your pages still use Flash, they:

  • can’t be properly crawled or indexed;
  • don’t perform as well as they would otherwise;
  • don’t display properly on mobile devices; and
  • pose a security risk to your site.

All in all, it’s just not worth it to keep Flash around. Avoid using Flash in the future, and remove it from wherever it exists on your site.

Your Pages Contain Frames

HTML frames used to be the bane of an SEO pro’s existence before HTML5. It seemed like every single website was rendered in frames, making for a clunky navigation experience. But, because it was the new “in” thing, everyone used frames to put together their website. A 2001 tutorial from IBM shows how frames could be used to organize a site’s layout:

alt="Screenshot of an early 2000s-era Internet Explorer window displaying a site with multiple frames.

But there’s a problem with new “in” things: They are seldom the best way to do things, and they can negate everything that SEOs try to achieve when optimizing a website.

Semrush agrees, saying that “<frame> tags are considered to be one of the most significant search engine optimization issues.”

The fix is quite simple—just don’t use frames! HTML5 doesn’t support them anyway, so both your site’s performance and user experience will be better for it.

Your URLs Have Underscores

There has always been a long-running debate about underscores (_) versus hyphens (-) in URLs. Which is better?

As Semrush says, search engines may consider words separated by an underscore to be one long word. With hyphens, though, each word will be viewed as distinct. That explains why Google itself recommends using hyphens over underscores.

So if you see this issue appear in your Site Audit report, simply remove any underscores from your URLs and replace them with hyphens.
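Using this document’s placeholder domain, the difference looks like this:

https://www.domainname.com/site-audit-notices/ (preferred)
https://www.domainname.com/site_audit_notices/ (avoid)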

Your Internal Links Contain the Nofollow Attribute

In the SEO world, the nofollow attribute counts as a downvote of a link’s editorial value. In other words, it tells search engines that you don’t vouch for the link and don’t want to pass value through it.

Since in most cases you do want to pass value to all your internal links, Semrush will notify you if any of those links contain the nofollow attribute. If you see this warning, check to ensure that the attribute is there intentionally.

A good rule of thumb? If you don’t have a very good reason for using the nofollow attribute, don’t use it at all.

Your Sitemap Is Not Indicated in Your Robots.txt File

Semrush will display a warning if your robots.txt file doesn’t contain a link to your sitemap. This is because when you do include such a link, it’s easier for search engine bots to understand which pages they should crawl and how to navigate to them.

You can fix this issue by, you guessed it, adding a sitemap directive to your robots.txt file. When you have, it will look something like this example from Google’s documentation:

alt="Screenshot of a snippet of text from a robots.txt file with a link to the sitemap file highlighted.

As you can see, all it takes is a simple link to help make your site even more crawlable than it was before.

You Don’t Have a Sitemap

Even worse than not having a sitemap linked in your robots.txt file is not having a sitemap at all. That’s because a sitemap acts as a directory of all your site’s pages that search engine bots can use to achieve more efficient crawling and indexing.

So if Semrush alerts you that no sitemap file can be found, it would be worth your time to create one. To get started, read Google’s guidelines on building and submitting a sitemap. Note that while sitemap files are commonly referred to as sitemap.xml files, you don’t have to use the XML format. If you wish, you can instead use:

  • an RSS, mRSS and Atom 1.0 feed URL;
  • a basic text file; or
  • a sitemap that’s been automatically generated by Google Sites.

Your Homepage Lacks HTTPS Encryption

For quite a few years now, Hypertext Transfer Protocol Secure (HTTPS) has been a ranking signal in Google’s algorithms. It was in 2018, however, that HTTPS really became a significant attribute of Google’s Chrome browser. This was when Google began labelling all HTTP sites as not secure with the release of Chrome version 68:

Screenshot of how Chrome 64 treated HTTP pages as secure while Chrome 68 treats them as not secure

As a result, Semrush says that websites without HTTPS support may rank lower on Google’s SERPs, while those that do support HTTPS tend to rank higher.

So if you haven’t already switched your site from HTTP to HTTPS, take a look at Google’s guide on how to do so.

As with other SEO efforts, it’s important to assess your situation, test what’s right for your site and move forward from there if it makes the most sense for you.

Your Subdomains Don’t Support SNI

Server Name Indication, or SNI for short, is an extension for the widely-used Transport Layer Security (TLS) encryption protocol. With SNI, a server can securely host multiple TLS certificates for several different sites under just one IP address.  

This sounds awfully technical, but it all boils down to a simple idea: As Semrush puts it, using SNI “may improve security and trust.” What’s not to like about that?

To fix this issue, all you need to do is implement SNI support for your subdomains. If you don’t know how, contact your site’s security certificate provider or ask the nearest network security specialist.

Your Robots.txt File Contains Blocked Internal Resources

CSS, JavaScript, image and video files are all examples of internal resources that can be blocked within a site’s robots.txt file through the use of the disallow directive.

When that directive is in place, search engines are unable to access the affected files, and are therefore unable to crawl, index and rank them, too.  

Unless you have a very specific situation that requires it, there’s usually no reason to block internal resources in your robots.txt file.

If you receive this warning from Semrush, you may want to confirm that the resources in question are indeed blocked by using Google’s URL Inspection tool. If they are blocked and you don’t have a concrete reason for doing so, remove the disallow directive to re-enable crawling and indexing.

Your JavaScript and CSS Files Are Not Compressed

Just as page compression can improve page speed and your site’s performance, so too can the compression of JavaScript and CSS files.

As Semrush says, compressing JavaScript and CSS files “significantly reduces their size as well as the overall size of your webpage, thus improving your page load time.”

If your site contains JavaScript and CSS files hosted on another site, contact that site’s webmaster and request that the files be compressed. If you’re able to compress the files yourself, though, by all means do so.

Useful tools for quickly and easily enabling compression include JSCompress and CSS Compressor, both of which only require a simple copy and paste.

Your JavaScript and CSS Files Are Not Cached

We’re not done with JavaScript and CSS files just yet. If browser caching isn’t specified in your page’s response header, users’ browsers won’t be able to cache and reuse JavaScript and CSS files without having to reload them completely.

In other words, uncached JavaScript and CSS files mean slower load times and a worse user experience.

If your site uses JavaScript and CSS files hosted on an external site, then you’ll once again need to contact the site’s webmaster and request they be cached. If your site hosts its own files, though, simply enable caching in the response header.
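For self-hosted files on an Apache server, one common sketch (assuming mod_headers is enabled) tells browsers they may cache JavaScript and CSS for a year:

# Tell browsers they may cache JS and CSS files for up to one year
<IfModule mod_headers.c>
  <FilesMatch "\.(js|css)$">
    Header set Cache-Control "max-age=31536000, public"
  </FilesMatch>
</IfModule>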

Your Pages Have Too Many JavaScript and CSS Files

Even if all your site’s JavaScript and CSS files are both compressed and cached, your pages’ performance can be negatively affected if there are too many of those files to begin with.

While Semrush will display a warning if a page has more than 100 JavaScript and/or CSS files, you’d be wise to have far fewer than that. After all, for each file a page has the user’s browser has to send one more request to the server, thereby slowing down the page speed with each additional file.

So, pare down your pages’ JavaScript and CSS files wherever possible. For instance, you can start by removing any unnecessary WordPress plugins from your site.

Heed Site Audit’s Warnings for Better Rankings

When Semrush delivers warnings in its Site Audit report, it doesn’t mean you need to act fast—that’s only true of error-level issues. But if you take the time to dig into your site’s warnings and make prudent changes as needed, you stand to gain a significant amount of traffic and drastically improve the user experience.

For that reason, don’t view Site Audit warnings as urgent problems. Instead, view each one as an opportunity to get more (and happier) visitors.

Image credits

Think with Google / February 2018

Screenshots by iloveseo.com / January 2021

Scott Granneman / Retrieved January 2021

IBM / April 2001

Google Search Central / January 2021

Google Security Blog / February 2018

]]>
https://iloveseo.com/tools/semrush/31-things-you-can-do-with-semrush-site-audit-warnings/feed/ 0
33 Ways to Use Semrush’s Site Audit Tool to Solve Site Errors https://iloveseo.com/tools/semrush/33-ways-to-use-semrush-site-audit-tool-to-solve-site-errors/ https://iloveseo.com/tools/semrush/33-ways-to-use-semrush-site-audit-tool-to-solve-site-errors/#respond Mon, 11 Jan 2021 23:36:08 +0000 https://iloveseo.com/?p=1094 33 Ways to Use Semrush’s Site Audit Tool to Solve Site Errors on iloveseo.com by Brian Harnish

Semrush’s Site Audit tool is capable of uncovering a wide array of issues. These issues can be grouped into three categories: errors; warnings; and notices. To make the most of...

]]>
33 Ways to Use Semrush’s Site Audit Tool to Solve Site Errors on iloveseo.com by Brian Harnish

Semrush’s Site Audit tool is capable of uncovering a wide array of issues. These issues can be grouped into three categories:

  • errors;
  • warnings; and
  • notices.

To make the most of Site Audit, you should get familiar with the differences between the three:

  • Errors include technical issues that are of the highest severity on the site. These issues should be fixed and placed at the highest priority in SEO audit recommendations.
  • Warnings include issues that are of a medium severity on the site. These issues should be fixed and next in line for priority purposes.
  • Notices are not considered major issues, but they can present problems if not managed appropriately. These issues should only be addressed after all of a site’s errors and warnings have been resolved.

Let’s take a look at errors specifically so you can learn more about the most urgent issues that may be affecting your site.

Semrush Site Audit Errors

As mentioned above, errors found by the Site Audit tool should be set at the highest priority for repair. In other words, don’t ignore them, or you may pay the price with the performance of your site in the SERPs. Errors can reveal a vast array of issues, each one of which is critical to understand.

Here, we’ll look at some of the most important ones.

Pages Returning a 4xx Error Code

4xx errors (such as errors 400, 404 and the like) are standard client error responses. If you are not familiar with them, these types of errors limit crawling and can impair your site’s ability to be crawled and indexed.

If you impair the basic building blocks of search—crawling and indexing—you hurt your chances of ranking well. And if you have too many of these errors, they may result in what are called crawler traps, i.e. webs of errors that prevent search engine bots from properly crawling your site.

4xx errors are just one category of a family of HTTP request status codes:

  • 1xx (informational responses): The request was received and the information is being processed.
  • 2xx (successful response): The request was successfully received and accepted.
  • 3xx (redirection response): The request must be redirected in order to be completed.
  • 4xx (client error response): The request itself was flawed and cannot be understood or accepted.
  • 5xx (server error response): The request appeared valid, but the server failed to fulfill it.

But 4xx errors are particularly dreaded by SEO practitioners because of their ability to create a poor user experience. After all, no user wants to click on a link only to be shown an error message.

So to address any 4xx errors that Semrush’s Site Audit tool finds, you can try implementing 301 redirects, creating a custom 404 page to keep users on the site or updating the page’s URL if necessary.
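On Apache servers, pointing visitors to a custom 404 page can be a single line in your .htaccess file (the /404.html path is hypothetical):

# Hypothetical path: serve this page whenever a URL returns a 404
ErrorDocument 404 /404.html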

Internal Broken Links

When you have internal broken links, you can not only cause 4xx errors, but you can cause a variety of issues related to crawling. From crawler traps to broken pages to indexation problems, they can be the bane of any SEO pro’s existence.

Thankfully, with Semrush’s internal broken links report, you can catch them all and get your internal linking strategy back on track.

From the Site Audit tool’s report, navigate to the Errors section and click on the row pertaining to internal links:

Screenshot of a Semrush Site Audit page titled 'Errors' with the phrase '1 internal link is broken' highlighted

 

You’ll then arrive on the following page, where you’ll find all the details of the broken internal links that were found and guidelines on how to fix them:

Screenshot of a Semrush Site Audit page titled '1 page returned a 4xx status code'

As an added bonus, Semrush integrates with Trello, a task management system. If you use Trello, this makes it much easier to implement the proper fixes within your project management workflow.

But, how do you really repair internal broken links? There are several ways, with the first being the most obvious—just go to the page with the broken link on it and change it to the correct URL that returns a 2xx (i.e. successful) status code.

Pretty simple, right?

Your other options are to:

  • delete the page if the page no longer provides any value;
  • implement 301 redirects if the page may still be able to pass link equity; or
  • recreate the page to contain the same content as it did before.

As you address each broken internal link, be sure to keep track of your progress for future reference.

Once you successfully fix all your site’s broken internal links, you’ll improve the site’s user experience and thus its chances of ranking higher and gaining even more visitors.

Your Sitemap Has Formatting Errors

There are many potential reasons for an XML sitemap to have formatting errors. For instance, perhaps it:

  • doesn’t include the required XML declaration;
  • has a line break at the top of the file that’s preventing proper reading;
  • has been saved in a format that doesn’t match the proper text encoding declared at the beginning of the file; or
  • contains quotes surrounding URLs or other sitemap elements that were added in a program like Word or Excel.

Many of these issues are critical and can make or break your website’s performance in the SERPs.

As if this weren’t enough, there are different kinds of sitemaps that you must maintain, most of them with their own requirements in terms of formatting and creating error-free production files:

  • XML sitemaps: These sitemaps are today’s gold standard and are the format you’ll typically submit to Google Search Console (GSC).
  • RSS, mRSS and Atom 1.0: Blogs with a Really Simple Syndication (RSS) or Atom 1.0 feed can use its URL as a sitemap. A media RSS (mRSS) feed can be used for the site’s video content.
  • Text: Sitemaps containing only page URLs can be formatted as standard text documents.
  • Image sitemaps: These sitemaps are used to crawl all of the images on your site. New images should be added as they are added to your site.
  • Video sitemaps: Video sitemaps should include all of the videos that you want Google to crawl and index on your site. New videos should be added as they are added to your site.
  • News sitemaps: This sitemap is for the explicit inclusion of news articles for Google News. It should also be kept updated on a regular basis with new content.
  • Mobile sitemaps: These types of sitemaps are built specifically for mobile phones. However, since Google no longer recommends you have separate mobile URLs at all, there’s no need to have a separate mobile sitemap either.  

No matter which type of sitemap (or sitemaps) you have, it’s important to make sure that it’s properly formatted and maintained 100 percent of the time.

To help solve the sitemap-related issues that Semrush uncovers, you may want to try using a tool like the XML Sitemap Validator.
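For comparison while you debug, a minimal well-formed XML sitemap looks like this (the URL is a placeholder):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.domainname.com/pagename/</loc>
  </url>
</urlset>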

Pages Returning a 5xx Error Code

After a 4xx error, the worst kind of HTTP status code you can get is a 5xx error. That’s because they indicate serious bottleneck issues with your server, so they should also be rectified as quickly as possible.

There are many 5xx error codes that you must learn how to decipher, including those related to issues with your HTTP to HTTPS transition.

For example, if you perform a transition and you have 5xx errors, it could be due to your SSL certificate not being live yet. In this case, you may want to contact your server administrator to decipher exactly what’s causing these errors.

Other more serious issues can point to your server itself being misconfigured, and your server throwing errors due to scripts not playing well with each other. One such situation that can cause this is when a plugin is updated on a WordPress site, and that plugin conflicts with another.  

As with 4xx errors, 5xx errors can harm both the user experience and your site’s rankings. That’s because they make the site more difficult to navigate for users, as well as more difficult to crawl and index for search engine bots.

5xx errors come in a variety of shapes and sizes, including:

  • 500 (internal server error): This is a general error that is generated when the server is unable to fulfill the client request, either due to an unexpected issue or otherwise.
  • 501 (not implemented): If a request cannot be processed because the server doesn’t support the functionality required to fulfill it, or if the request method is not recognized, you will get a 501 error.
  • 502 (bad gateway): This error displays when the server acts as a gateway, and an invalid response from the upstream server is received when attempting the request.
  • 503 (service unavailable): This error happens when a server is temporarily unable to handle the request, especially if it is due to short-term overloading or maintenance.
  • 504 (gateway timeout): If a timely response has not been received by the server, this error will be displayed.
  • 505 (HTTP version not supported): If this response is served by a web server, this means that there is no support for the HTTP version used in the message request.
  • 506 (variant also negotiates): This status code means there is an internal configuration error within the server: the resource chosen as the end point of content negotiation is itself configured to engage in content negotiation, creating a circular reference.
  • 507 (insufficient storage): This means the request cannot be completed because the server did not have enough storage space to save the required information. The condition is considered temporary.

Other 5xx error codes include 508 (loop detected), 510 (not extended) and 511 (network authentication required).

Whichever 5xx errors you may have, add them to the top of your list of priorities to ensure a positive user experience and efficient crawling.

Missing or Duplicate Title Tags

HTML title tags are a fundamental SEO necessity. If your pages don’t have title tags, Google will be forced to automatically generate titles in the SERPs for them.

This is not an optimal situation because Google can (and does) get title tags wrong.

This can hurt your conversions in a big way, especially if searchers skip over your listing because the auto-generated title doesn’t look relevant to their query when your page actually is.

This is why you must have a unique, manually written page title. If you don’t, you increase the possibility that Google will auto-generate a subpar one for you.  

When writing your title tags, keep current best practices in mind (a sample tag follows this list):

  • Stay under 60 characters: Google typically displays the first 50 to 60 characters of a title tag, so you’d be wise to create ones that are 60 characters or less.
  • Write for the user experience: Each page’s title tag is the first thing users will see in the SERP results, so you want them to be well-written and intriguing.
  • Get creative: Each page should have its own unique title tag to differentiate it and its purpose from all your site’s other pages—no duplicate title tags should exist.  
  • Include primary keywords: This will help users understand what the page is about.
  • Include your brand name: If you want to build brand awareness and recognition, start putting your brand name in each title tag.
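To make these practices concrete, here’s what a title tag following them might look like (the keyword and brand are hypothetical). At 53 characters, it stays under the display limit while including the primary keyword and ending with the brand name:

<title>How to Choose a Personal Injury Lawyer | Example Firm</title>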

Duplicate Content Issues

Duplicate content is a serious but all-too-common issue. From things like using the same location information across all a site’s location pages to copying blog posts into an archive, it’s easy to inadvertently create duplicate content that may harm your rankings.

But because duplicate content is so common, there are many straightforward ways to fix it. Depending on the situation, you can:

  • use the canonical tag (see the sketch after this list);
  • use 301 redirects;
  • delete the duplicate content entirely;
  • avoid using separate mobile URLs; or
  • tweak the settings on your content management system (CMS).
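For the canonical tag approach, a minimal sketch looks like this (the URL is a placeholder). Placed in the <head> of each duplicate page, it tells search engines which URL is the preferred version:

<link rel="canonical" href="https://www.example.com/preferred-page/" />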

Also bear in mind that duplicate content is an ongoing concern for every website, so it’s important to keep track of it over time too.

Pages Could Not Be Crawled

This issue arises when search engine bots are unable to properly crawl your site’s pages. Since site crawling forms the very foundation of how search engines work, you can’t afford to let crawling issues go unaddressed.

To enhance your site’s crawlability, try:

  • ensuring your CMS isn’t disabling or discouraging crawling or indexing;
  • publishing new and unique content on a regular basis;
  • preventing duplicate content;
  • steering clear of Flash (hey, I’m not thrilled that a part of my college certificate is now invalid either);
  • checking to ensure that you haven’t blocked crawling in your robots.txt file;
  • keeping an eye out for index bloat, which can eat up your crawl budget;
  • improving your site’s page speed;
  • striving to achieve mobile-friendliness;
  • organizing your site’s pages in a logical manner;
  • optimizing your site’s JavaScript and CSS elements; or 
  • streamlining your sitemap.

Once your pages are perfectly crawlable, you’ll have one of the biggest SEO hurdles out of the way.

Broken Internal Images

Even broken internal images can be a big issue when not handled correctly.

Why do images tend to go missing on websites? The reasons vary, but several are common. Images can:

  • be deleted or moved from their initial location;
  • get renamed; 
  • experience permission changes.

If any of those things happen, then your website’s users will have problems any time they even attempt to view content with missing images.

There are multiple ways to ensure your site doesn’t suffer from the consequences of broken images:

  • If an image does not exist at its regular location, be sure to update the URL.
  • If an image was deleted, damaged or is otherwise irreparable, replace the image with a new one.
  • If you no longer need an image, just remove the image from the page’s code.

Integrating these best practices into your overall workflow will help make sure that your site doesn’t suffer from major issues that can occur when broken images get out of hand.

Pages Have Duplicate Meta Descriptions

Duplicate meta descriptions have effects similar to those of duplicate title tags, and they’re a definite no-no in terms of SEO best practices. You should always use completely unique page titles and meta descriptions for every page you create.

So what are the standard best practices for creating awesome meta descriptions?

Write a Unique Meta Description for Each Page

This practice is important enough that it bears repeating: No matter what, you should create a one-of-a-kind meta description for every page you publish.

If you have not yet integrated this essential best practice into your workflow, start now. If you have already published many hundreds of pages without doing this, start with your most important ones and work your way through.

With a site built on WordPress, you can install Yoast’s SEO plugin to help. It’s an easy, simplified way to make sure that you write meta descriptions with the right length. It will even show you a preview of your snippet in the SERPs, so you can figure out whether or not it will work for you ahead of time.

Don’t be afraid to experiment with different phrasing, keyword phrases, and calls to action. There are as many variations of the meta description out there as there are people, and you will only hurt yourself if you stick to creature-of-habit thinking.

Make Sure Your Meta Description Is Between 150–155 Characters

If your meta description is not the right length, it will be cut off with an ellipsis (…) in the SERPs. This is not a good thing, as you want your readers to see the full and complete meta description when they encounter your listing.

When the meta description is cut off in the SERPs, it creates a layer of mystery and intrigue that is not beneficial from a marketing perspective.

Also consider that the length of your meta description in the SERPs will change based on things like your user’s screen size, the type of device your audience is using and other technical factors.

Moz has seen Google cut off meta descriptions exceeding 155–160 characters in length, so we recommend staying in the 150–155 range to be safe.

Include the Page’s Targeted Keyword in Your Descriptions

While Google doesn’t use keywords in meta descriptions as a ranking factor, it does emphasize them in the SERPs:

Google search results with the phrase 'personal injury lawyer' highlighted in the snippets

This is especially true when the user’s phrasing is matched. This doesn’t prove anything conclusively in terms of ranking, but it does indicate that your users will notice your result more easily when they are scanning the page looking for results that are aligned with their search query.

So, including pages’ primary keyword in their meta descriptions can help create a better user experience.

Improper Robots.txt Formatting

Even formatting errors in your robots.txt file can cause major issues with your SERP performance.

From indexing problems to pages being excluded from the search results entirely, these errors can wreak havoc on your site if left unchecked.

You should review your robots.txt file on a regular basis in Semrush and make sure that you don’t have any major issues.

Aside from that, the most common robots.txt issues include the following:

Incorrect Syntax

It is important to note that a robots.txt file is not required to enable standard crawling behavior. However, normal crawling behavior can be inhibited and even stopped completely when the wrong robots.txt syntax is present.

For instance, let’s say you’re using the following code:

User-agent: Googlebot

Disallow: /example_directory/

In this example, you have disallowed Google’s crawler from crawling the entire /example_directory/. To apply this rule to all crawlers, you could use the following code:

User-agent: *

Disallow: /example_directory/

Basically, the asterisk is a wildcard that matches every crawler; it serves as a variable in the user-agent declaration that says “all spiders.” So if you wanted to stop Googlebot from crawling but didn’t want to stop Bingbot, just one small asterisk could land you in hot water.
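To make the wildcard’s reach concrete, here’s a sketch (with a placeholder directory) that blocks every crawler from a directory while explicitly allowing Googlebot back in. An empty Disallow line permits everything for that user-agent, and Googlebot obeys the most specific group that matches it:

User-agent: *
Disallow: /example_directory/

User-agent: Googlebot
Disallow: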

One other major issue that is commonly found in robots.txt files (at least, those that I run into regularly in website audits) involves code that looks something like this:

User-agent:  *

Disallow: /

Just the addition of the forward slash here blocks every crawler from the entire website.

This is why it is so important to know your syntax and make sure that what you are doing won’t have unintended consequences.

To make doubly sure your syntax is correct, you’d be wise to use Google Search Console’s robots.txt testing tool before making any changes.

The Robots.txt File Isn’t Saved to the Root Directory

One way you can truly mess up your site’s crawlability is by not saving its robots.txt file to the root directory.

On some servers, the root directory is public_html. On others, it can be /root. On still others, it can be /your_username/public_html/.

Whatever the case, you must understand your web server’s structure and make sure that you are saving your robots.txt file to the right place. Otherwise, even this simple error can cause major issues.

Your Sitemap Contains Incorrect Pages

We’ve already covered improper sitemap formatting, but it’s equally important to ensure your sitemap doesn’t contain incorrect page URLs.

Semrush says that “only good pages intended for your visitors should be included in your sitemap.xml file.”

Issues that trigger this error in Semrush’s Site Audit tool include:

  • URLs leading to web pages with the same content;
  • URLs redirecting to different pages; or
  • URLs returning a non-2xx HTTP status code.

When these types of URLs exist in your XML sitemap, Semrush says that they “can confuse search engines, cause unnecessary crawling or may even result in your sitemap being rejected.”

So, keeping your sitemaps clean should be fairly straightforward. Following these rules should lead to a nearly perfect XML sitemap almost every time:

  • Always make sure all URLs return 2xx status codes, and never include URLs that have 4xx errors or 5xx server errors.
  • Do not include URLs that are redirects.
  • Do not include URLs that are soft 404s (these are URLs that return 2xx codes but do not return any content).
  • Do not forget to check for any syntax and coding errors.

Your Pages Have a WWW Resolve Issue

Usually, you can successfully access a webpage whether you add WWW to the domain name or not.

Nevertheless, if both versions are crawled and you allow both to be displayed on your site, you leave open a major duplicate content issue that can in turn cause SERP performance issues. Doing so will also dilute link equity by splitting it between each version.

So, specify the version (either with the WWW prefix or without) that should be prioritized, so that only one version is crawled.
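On an Apache server, for instance, you can force the WWW version with a rewrite rule in your .htaccess file. This is only a sketch, assuming mod_rewrite is enabled and example.com stands in for your domain; other servers and hosts handle this differently:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]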

Your Pages Have No Viewport Tag

If you are not familiar with web development, the meta viewport tag is what allows you to control your webpage’s size and scale when using a mobile device.

This tag is a best practice for modern web development, and it’s required if you want to take advantage of a responsive site design.

So, be sure that each page has one and is displaying properly on mobile devices.
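The tag itself is a one-liner placed in each page’s <head>:

<meta name="viewport" content="width=device-width, initial-scale=1">

Here, width=device-width matches the page’s width to the device’s screen width, and initial-scale=1 sets the initial zoom level to 100 percent.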

W3Schools also recommends that you:

  • refrain from using large fixed-width elements;
  • don’t allow content to rely on a specific viewport width to render properly; and
  • use CSS media queries to apply appropriate styling for screens of varying sizes.

Your Pages’ HTML Is Too Large

The full HTML size of a page refers to all of the HTML code that is contained on that page.

If the page is too large (exceeding 2 MB, for example), this can drastically reduce page speed, resulting in a bad user experience and lower search engine rankings.

As someone who began my career in web development, I am of the opinion that any developer should get started by creating pages with tight requirements such as the following:

  • the page size is less than 32 KB;
  • the page only uses 256 colors;
  • all solid-color vector graphics should be coded where it makes sense in the design (no images);
  • the page loads in less than half a second on a 56K modem;
  • the page is only allowed to use OS-based fonts (no Google Fonts);
  • JavaScript is limited to one file; and
  • CSS is limited to one file.

For example, a three-row, two-column page layout structure should only have five <div> elements, entirely coded to be responsive with CSS in the linked stylesheet. The total lines of code should max out at 50–100.
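As a rough sketch of that exercise, the markup for such a layout could be as lean as the following (the class names are arbitrary, with the responsive rules living in the linked stylesheet, for instance a single display: flex declaration on the main container to create the two columns):

<div class="header">Header</div>
<div class="main">
  <div class="content">Main content</div>
  <div class="sidebar">Sidebar</div>
</div>
<div class="footer">Footer</div>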

Once this is accomplished, the developer can then move on to more complex layouts and sites, content in the knowledge that they can optimize said files down to that level if they need to.

There are other best practices you should follow if you want to create the cleanest site with the least amount of code possible, such as those outlined by WebFX. You can also further reduce the size of your pages’ code by using an HTML minifier.

Your AMP Pages Have No Canonical Tag

To prevent duplicate content issues with your Google AMP and non-AMP pages, it’s crucial to use the canonical tag to indicate which version should appear in search results and which should not.
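In practice, the pairing works in both directions (the URLs below are placeholders). The regular page references its AMP counterpart, and the AMP page’s canonical tag points back to the regular page:

On the regular page:
<link rel="amphtml" href="https://www.example.com/article/amp/">

On the AMP page:
<link rel="canonical" href="https://www.example.com/article/">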

If you’ve already adopted AMP without implementing the canonical tag, then you may have some work to do to get your site to a healthier place.

If you haven’t adopted AMP but are considering it, be sure to weigh the pros and cons first. For instance, while AMP is capable of significantly increasing page speed on mobile devices, it also comes with severe design restrictions.

Read through Google’s own AMP guidelines to fully understand all the other requirements that AMP entails.

There Are Issues with Hreflang Attributes

For sites that are going international, hreflang attributes are a must. When a site serves multiple countries or languages with multiple pages, hreflang is how it tells search engines which version of a page belongs to which audience.

This implementation helps differentiate language pages so that they are not seen as duplicate content.

When used correctly, hreflang can be a powerful addition to your site. But when errors come up, it can cause big issues.

Semrush will alert you if your hreflang attributes contain language or country codes that are incorrectly formatted. To solve those issues, ensure both your country and language codes are properly formatted according to the listed ISO standards.

Note that Semrush will also generate an error if your hreflang attributes conflict with your source code. To avoid such a situation, always avoid:

  • conflicting hreflang and canonical URLs;
  • conflicting hreflang URLs; and
  • non-self-referencing hreflang URLs.

Finally, also make sure that none of your site’s hreflang URLs point to broken pages.
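As a brief sketch, a correctly formatted hreflang set for a page with American English and German versions would look like this on every version of the page, self-referencing entry included (the URLs are placeholders):

<link rel="alternate" hreflang="en-us" href="https://www.example.com/page/">
<link rel="alternate" hreflang="de-de" href="https://www.example.com/de/page/">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/page/">

The x-default entry tells search engines which version to serve users whose language doesn’t match any listed alternative.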

There Are Non-Secure Pages

While it is considered a standard SEO practice nowadays to use the HTTPS protocol on websites, it doesn’t always happen. In fact, some webmasters still host pages that collect user information, like contact pages, on insecure sites!

This is the reason why this issue triggers an error from Semrush when it shows up in an audit. If a site’s pages are not secure but have elements that should be secured, they present big risks for users who submit their information or otherwise interact with them.

Google Chrome will even display a “Not Secure” warning in users’ address bars when they access a non-HTTPS site, as will Firefox and other popular browsers.

So, you need to secure your website with HTTPS if you want users to feel comfortable using it.

Your Security Certificate Is Expired or Expiring

This issue illustrates why it’s so important to keep up with maintenance on your SSL certificates.

If your security certificate is expired or is about to expire, users visiting your site will be issued a warning from their browser. On Chrome, for instance, users will see a page proclaiming your connection is not private when attempting to visit a site with an invalid security certificate:

Warning page from Google Chrome titled 'Your connection is not private'

The best way to fix this issue is to renew the expired or expiring certificate yourself, set your web server to renew it automatically or have your webmaster do it for you.

There Are Issues with an Incorrect Certificate Name

Yes, even seemingly insignificant details in registering your SSL certificate can cause issues with your website.

If even one letter is off, causing the certificate to not match your website, web browsers will not display your site to users. Instead, a name mismatch error will be displayed, which can turn away visitors and hurt your organic search traffic.

If your SSL certificate isn’t issued to your site’s exact URL, Semrush will display an error so you can address the problem accordingly.

There Are Issues with Mixed Content on Your Pages

As Google explains it, “mixed content occurs when initial HTML is loaded over a secure HTTPS connection, but other resources (such as images, videos, stylesheets, scripts) are loaded over an insecure HTTP connection.”

How do you fix this kind of issue? The typical way is to change all HTTP-based content to HTTPS-based content. You can also avoid this by using relative URLs when linking to your on-site assets rather than absolute URLs.
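As a simple before-and-after sketch (the image path is hypothetical), the fix is often a one-character change or a switch to a relative path:

Before: <img src="http://www.example.com/images/photo.jpg">
After: <img src="https://www.example.com/images/photo.jpg">
Or: <img src="/images/photo.jpg">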

To discover more techniques for fixing the mixed content Semrush brings to your attention, check out Google’s detailed guide on the topic.

Remember, most browsers will block mixed content altogether in the interest of users’ safety, so it’s vital for you to remedy it as soon as possible—your conversions may depend on it.

There Is No Redirect or Canonical Tag Pointing to the HTTPS Homepage from the HTTP Version

When you have more than one version of your homepage serving the same content, this creates duplicate content and confuses the search engines trying to index it. They are unable to decide which page to index and which one to prioritize in the SERPs.

This is a critical issue with any type of content, but especially with a site’s homepage. If search engines don’t know which version of the homepage to show users, the site may lose both rankings and traffic.

So, be sure to use a 301 redirect or canonical tag to instruct search engines to index the HTTPS version of your site’s homepage. This will help ensure that you have a high-performing site and don’t experience any major conflicts during your own HTTP to HTTPS migration.
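On Apache, for example, a sitewide HTTP to HTTPS 301 redirect can be handled with a few .htaccess lines. Again, this is only a sketch assuming mod_rewrite is available; adapt it to your own server setup:

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]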

Remember, even seemingly small details in technical implementations can have a major sitewide impact if you are not careful.

You Have Issues with Redirect Chains and Loops

Let’s not forget that performing 301 redirects—redirecting one URL to another—is the right thing to do in many situations.

However, if your redirects are implemented incorrectly, they can lead to serious, even disastrous, results. Examples of these kinds of errors include redirect chains and loops, also known as spider or crawler traps.

In other words, when a search engine bot gets stuck in a cycle of redirects, it can’t properly crawl the pages it needs to and as a result the site won’t be properly indexed.

Not sure why you’re getting this error in the first place? Semrush says that “if you can’t spot a redirect chain with your browser, but it is reported in your Site Audit report, your website probably responds to crawlers’ and browsers’ requests differently, and you still need to fix the issue.”

Your Pages Have Broken Canonical Links

You already know that the canonical tag is a fantastic tool for ensuring that search engines display the correct page in search results. But you also need to remember to check that the URLs you add to your canonical tags aren’t broken.

When canonicals don’t lead to 2xx-OK URLs, crawling gets complicated: once there are too many 4xx or 5xx errors among the canonicals, crawlers hit a brick wall and can go no further. You can only imagine what this does to your crawl budget.

Broken canonical links also present a problem in terms of link equity, since a page can’t pass authority to one that doesn’t exist.

Pages Have Multiple Canonical Tags

Yes, even pages with more than one canonical tag can be major issues for a site. Here’s why: Multiple canonicals make it nearly impossible for any search engine to identify which URL is the preferred one.

This leads to confusion and unnecessary loss of crawl budget and can result in the search engines either ignoring the tags or picking the wrong one.

In Google’s case, all the canonical tags will simply be ignored. As stated on the Search Central blog, “specify no more than one rel=canonical for a page. When more than one is specified, all rel=canonicals will be ignored.”

If Semrush alerts you that some of your site’s pages have multiple canonical tags, it’s imperative that you immediately eliminate all but one.

There Are Broken Internal JavaScript and CSS Files

While we have gone over other issues with JavaScript and CSS files already, it’s important to highlight the fact that these files are critical to making sure that your site is working and displaying properly.

If you don’t do proper maintenance and a script stops running on your site, you can run into major issues down the line unless it’s fixed immediately.

While broken files may not immediately cause a rankings drop, they can eventually impact search performance because search engines will not be able to properly render and index your pages.

Several issues can arise as a result of broken JavaScript and CSS files. For example, let’s say you’re using two JavaScript plugins for a WordPress site. One plugin is updated, while the other is not. Suddenly the two plugins don’t agree with each other and the worst happens: The site goes down until the problem is resolved.

That’s why I can’t stress enough the importance of making sure that your JavaScript and CSS files play nice with each other. If you don’t, you run the risk of this type of issue happening, and potentially taking down your site overnight.

Secure Encryption Algorithms Aren’t Supported by Your Subdomains

Semrush says that this issue is triggered by their software when they connect to the server and find that it’s not using updated encryption algorithms.

Some browsers will likely warn your users of the issue when they access insecure content. That can translate into public mistrust, with visitors not feeling safe enough to use your site.

So, it’s vital to not only implement but stay up to date with the best encryption methods. These include the Advanced Encryption Standard (AES), the Rivest-Shamir-Adleman (RSA) algorithm and the Twofish algorithm.

Your Sitemap Files Are Too Large

This issue comes from two limits that are in place for sitemap files. If your files exceed either of the following, they will show up on your Site Audit report as an issue:

  • Your sitemap should not contain more than 50,000 URLs.
  • Your sitemap file should not exceed 50MB.

Either issue can weigh down your XML sitemaps, therefore causing further issues with crawling and indexing.

If your site is simply too large to get your sitemap down to an acceptable size, take Google’s advice and use several smaller sitemaps instead of one giant one.
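The splitting is done with a sitemap index file, which simply lists your smaller sitemaps so search engines can find them all in one place. A minimal sketch, with hypothetical file names:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-pages.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-posts.xml</loc>
  </sitemap>
</sitemapindex>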

Your Pages Load Too Slowly

Google introduced page speed as a ranking factor for desktop searches in 2010, and did the same for mobile searches in 2018. If that doesn’t tell you that page speed matters a lot, we don’t know what will.

So if your Site Audit report returns an error about slow page speed, you need to find a solution as quickly as possible. You can start by compressing your pages, minifying your JavaScript and CSS files and reducing your overall HTML page size.

Fix Lots of Errors in Semrush’s Site Audit Tool to Get One Great Result

When you’re learning about and working to resolve the issues revealed by the Site Audit tool, it’s easy to get overwhelmed.

But rest assured that the fruits of your labor will be worth it: By remedying whatever errors are present, you’ll be able to enjoy better rankings, more satisfied visitors and even higher conversions.

Image credits

Screenshots by author / January 2021

Achieve Next-Level Technical SEO with Semrush’s Site Audit Tool
https://iloveseo.com/tools/semrush/achieve-next-level-technical-seo-with-semrushs-site-audit-tool/
Mon, 28 Dec 2020, by Brian Harnish

Semrush’s suite of tools is unrivaled in terms of keyword and competitor research, but did you know it can help you enhance your site’s technical SEO too?

One of its most powerful features for doing so is Site Audit, a tool that provides a comprehensive high-level overview of any site’s technical SEO health.

Site Audit in a Nutshell

Once you’ve entered your site’s URL into the Site Audit tool and given it a minute or two to complete a full crawl of your site, its control panel will look something like this:

Screenshot of the Site Audit tool's overview dashboard

Let’s look at each of these sections and what they mean:

  1. Total Score: This is the total combined score that Semrush assigns based on the issues that are correct vs. the issues that you need to fix. While somewhat ambiguous, it does give you a good bird’s-eye overview of the overall health of your site and where it stands in terms of fixes.
  2. Crawled Pages: Below the total score, we can see a section called crawled pages. This is the total number of pages crawled in the audit.
  3. Errors: This section will generate reports that show the amount and type of errors that Semrush was able to uncover. The number here indicates the number of issues of the highest severity that Semrush discovered when it crawled your site. These types of errors include redirects, 4xx errors, 5xx errors, internal broken links, formatting errors in your XML sitemap file, AMP-related issues (if you have the Business subscription tier), pages that don’t have title tags, pages that have duplicate titles, pages that couldn’t physically be crawled, internal broken images and much more.
  4. Warnings: These are not exactly errors, but they are issues that can impact your crawlability, indexability and rankings. They are issues of a medium severity, and as such they should be paid attention to. These types of issues include things like missing meta descriptions, pages that are not compressed properly, mixed protocol links (HTTPS links on HTTP pages and vice versa), missing image alt text, JS and CSS files that are too large, external broken links, short title tags, title tags that are too long, pages with duplicate H1 headings, pages that don’t have an H1 heading tag and more.
  5. Notices: While these are not actual errors, they are issues that Semrush recommends you fix. Depending on who you talk to in the SEO industry, some may or may not agree with the issues listed here. Don’t let that stop you from getting them fixed: while they may not be direct ranking factors, they can contribute to rankings indirectly. These notices include issues like weighty permanent redirects on-site, subdomains that don’t support HSTS (a critical component of a successful HTTPS transition), pages that are blocked from crawling, URLs that are too long, whether or not the site has a robots.txt file, hreflang language mismatch issues, orphaned pages within sitemaps, broken external JS and CSS files, pages needing more than three clicks to reach and many more.
  6. Crawlability: Whether or not your site is crawlable has very important implications to your rankings. In fact, Gary Illyes said in a Reddit post that, failing anything else, SEO practitioners should “MAKE THAT DAMN SITE CRAWLABLE.” His choice to use all caps speaks for itself.
    In this section, Semrush addresses several critical issues that can impact crawlability if they are present on a large enough scale, and gives you suggestions on how to fix them.
  7. HTTPS: This page will tell you how solid your HTTPS implementation is, as well as what you will need to fix to ensure that your security certificate, server and website architecture are up to snuff.
  8. Site Performance: This is a critical SEO factor to get right. When your site performs well, you can expect an increased ranking benefit as a result, especially when you have great content and the links to support it.
  9. Internal Linking: When you create an effective internal linking structure, you help both users and search engine bots navigate your site with ease. That means a better user experience for visitors and more efficient indexing for search engines.
  10. International SEO: If your site has visitors from outside your country of residence, it’s important to ensure that your international SEO is in tip-top shape. This section helps you do just that by providing an overview of your site’s hreflang tags, links, and issues.
  11. Top Issues: This section is fairly self-explanatory. Here, you’ll see a quick rundown of your site’s most pressing issues, whether they have to do with crawlability, HTTPS implementation, performance, or any other area covered in Semrush’s audit.
  12. Robots.txt Updates: Your site’s robots.txt file serves to tell search engine crawlers how to crawl its pages. So, it’s easy to see how it can significantly affect website indexation. Luckily, this section will notify you of any changes made to the file since the previous crawl.

To better understand the kinds of technical SEO insights the Site Audit tool can provide, we’ll take a closer look at the Crawlability, HTTPS, Site Performance, and Internal Linking sections.

Crawlability

Click on the Crawlability section and you’ll soon see a collection of essential crawl-related metrics:

Screenshot of the Site Audit tool's Crawlability report

As you can see, those metrics include:

  1. Site Indexability: This score gives you a ratio of your pages that can’t be indexed to the pages that can be indexed.
  2. Crawl Budget Waste: This graph gives you ten reasons that your crawl budget is currently being wasted, including temporary redirects, permanent redirects, duplicate content and more.
  3. Pages Crawled: This graph gives you an overall view of how many pages were crawled, along with the timeframe of the crawl.
  4. Incoming Internal Links: Not much to say here, except this graph shows you incoming links to the site. You can click on a bar within the graph to view more data, which includes unique page views of these pages (assuming you have your Google Analytics account linked with Semrush), their crawl depth, and the number of issues for the page, along with link data like incoming and outgoing internal and external links.
  5. Page Crawl Depth: This table will show you the total number of pages on the site and how many clicks it takes to get to each page, separated by number. This is useful for diagnosing crawl depth issues.
  6. Sitemap vs. Crawled Pages: This number gives you the pages in the sitemap compared to the pages that have actually been crawled. This will help you assess whether or not there are any issues with crawlability by way of the sitemap.
  7. HTTP Status Code: This section provides a graph with a total tally of all the possible errors on the site, complete with the following: 5xx errors, 4xx errors, 3xx redirects, 2xx OK status codes, 1xx status codes, and pages that return no status.

HTTPS

From the overview page, navigate to the HTTPS section to see a variety of security-, server- and architecture-related metrics:

Screenshot of the Site Audit tool's HTTPS report

Specifically, you’ll see whether:

  1. All certificates are up to date: With this issue, you want to make sure that your certificates are updated properly. Outdated certificates, especially those that have not been renewed, can result in a site not displaying or loading properly.
  2. All certificates are registered to correct names: This means all certificates must be registered to the same domain name that is being displayed in the address bar. If they’re not, then you risk the page not being displayed in the browser when it detects that the domain name is not correct in the Secure Sockets Layer (SSL) certificate.
  3. Security protocols are up to date: This issue refers to running outdated security protocols, like old SSL or old Transport Layer Security (TLS). These are big security risks for your site, because older protocols carry greater risks in terms of being hacked. It is considered a best practice to always implement the latest security protocol version.
  4. All subdomains support secure encryption algorithms: When secure encryption algorithms are not properly supported, subdomains can show up as insecure. This can usually be rectified by making sure that all subdomains are included in your SSL certificate when you purchase it.
  5. All subdomains support SNI: Server Name Indication (SNI) is an extension of TLS which enables users to reach the exact URL they’re trying to get to, even if it shares an IP address with other domains on the same web server.
  6. All subdomains support HSTS: HTTP Strict Transport Security (HSTS) works to let web browsers know that they are allowed to talk to servers only when HTTPS connections are in place. This way, you don’t inadvertently serve unsecured content to your audience.
  7. No pages have mixed content: This refers to any pages with content being served by mixed security protocols. An example of this error would be pages that have images being served by HTTP protocol instead of HTTPS protocol, despite belonging to an HTTPS site.
  8. Links on HTTPS pages lead to HTTP pages: This is another example of mixed content, specifically when links on HTTPS pages lead to those with HTTP protocols.
  9. No non-secure pages exist: Non-secure pages can cause problems on a site that is otherwise secure, again because of the mixed protocol issue mentioned above.
  10. The HTTP homepage is redirected to the HTTPS version: When implementing HTTPS, it is considered a standard best practice to redirect all HTTP pages to their HTTPS counterparts. You can do this by adding redirects via cPanel, your .htaccess file, a WordPress redirect plugin or a similar tool, depending on your preference and level of technical ability.
  11. No HTTP URL is in the sitemap.xml: You should not have any mixed URL protocols in your sitemap either, because this triggers duplicate content issues by showing two URLs for the same content (one URL in the sitemap, and another one on the site).

You’ll also confuse the search engine spiders by having your content set up this way. So it’s a good idea to always make sure that your site is 1:1, whether it is HTTP or HTTPS.

Site Performance

Site performance is a critical SEO factor to get right. When your site performs well, you can expect increased rankings as a result, especially when you have great content and the links to support it.

To that end, the Site Audit tool’s Site Performance section provides a wealth of critical performance-related information:

Screenshot of the Site Audit tool's Site Performance report

We’ve got lots of metrics to unpack here:

  1. Page (HTML) Load Speed: It’s considered an SEO best practice nowadays to improve page speed until each page loads in five seconds at the most. This item in the site audit report shows groupings of pages and their range of page speed, so you can diagnose and troubleshoot any page speed issues that may come up in an audit.
  2. Average Page (HTML) Load Speed: This section details the average page load speed of a group of 30 pages on your site. For this site, the average page load speed across 30 different pages is 0.5 seconds.
  3. Number of JavaScript and CSS: This section shows an average number of JS and CSS files spread across your site. As a rule, if you want an extremely lightweight site, you shouldn’t have more than three JavaScript (JS) or Cascading Style Sheets (CSS) requests combined. This is because server requests are a big deal in terms of site performance. If your server is from the stone age, you don’t want to overload it with empty requests. Keeping JS and CSS requests to a minimum can help avoid this.
  4. JavaScript and CSS size: This section gives you an idea of a group of pages on-site and what range the size of your JS and CSS files fall into. When it comes to JS and CSS file size, you can never be too careful. Maintaining the look of your page is important, but it is possible to do so while keeping things lean and mean enough for serious performance. To reduce the size of JS and CSS files, you can try using fewer fonts or more standard fonts, using CSS in place of images and following modern coding practices such as CSS Flexbox and CSS Grid.
  5. Large HTML Page Size: When you have a large HTML page size, you create bottlenecks in potential site performance. While Semrush’s page size threshold for reporting is approximately 2 MB, I recommend ensuring that your page size does not exceed 100–200 KB. The smaller, the better, depending on the type of site. Even at the upper end of averages, you should never have a page that exceeds 1 MB. If you do, then something is wrong. Whether there are many plugins being loaded at once that you don’t need or mismanaged database connections that need to be cleaned up, you must get to the bottom of the bottleneck that is causing your page size to exceed 1 MB. Trust me, your page speed will thank you.
  6. Redirect Chains and Loops: This section reports redirect chains and loops that are causing major issues. The standard feedback in most cases of redirects is to make the URL a 301 permanent redirect that redirects to another URL that is similar in context. In most cases, redirects are fine when they are done correctly. But, when they are done incorrectly, they can have disastrous results. Those results can manifest themselves in a way that is not so obvious. For example, if Screaming Frog shows redirect chains, but you can’t spot any in a browser, your site likely serves requests differently to crawlers and browsers. While it doesn’t look quite as obvious as other errors, you should still work to fix this issue as soon as humanly possible during audit implementation.
  7. Slow Page (HTML) Load Speed: Here is where you’ll see any alerts about pages that have been flagged as loading too slowly. Since the probability of users bouncing increases by more than 30 percent as a page’s load time rises from one to three seconds, it’s crucial to quickly sort out the issues shown here.
  8. Uncompressed Pages: This is triggered by Semrush, per their description, when the content-encoding entity is not present in the response header. Page compression is a method of shrinking the file size of your pages while serving the same data, plugins and the like. When pages are uncompressed, this leads to slower page load times. In addition, user experience tends to suffer as a result, which can eventually lead to lower search engine rankings. Page compression helps the user experience because it allows the browser to get these files quicker and reduces the time it takes to render the code. To start compressing pages, you can enable GNU zip (GZIP) compression or use a compression plugin for WordPress.
  9. Uncompressed JavaScript and CSS Files: These issues register in Semrush when compression is not present in HTTP response headers. The end result of compressing these JS and CSS files is that your overall page size will decrease, along with your page load speed. On the flip side, any uncompressed JS or CSS files will unnecessarily increase both page size and speed. One easy way to ensure all your JS and CSS files are compressed is to use an external application to upload them. That’s because many such applications will automatically compress your files for you.
  10. Uncached JavaScript and CSS Files: To avoid this issue appearing in Semrush, it’s necessary to make sure that caching is specified in the page’s response headers. The process of caching your JS and CSS files allows you to store and reuse these resources on the user’s machine, avoiding the step of having to download them yet again upon the reloading of your page. The browser uses less data, which in turn will help to make your page load faster.
    The simplest way to enable caching is to install a WordPress caching plugin.
  11. Too Large JavaScript and CSS Total Size: When Semrush reports this, they look at whether or not the file size of the JS and CSS files on your page exceeds 2 MB. When your page exceeds 2 MB overall, that’s when you begin to have problems. Remember earlier where the overall page size should not exceed 200 KB? If your JS and CSS files themselves exceed 2 MB, then that’s two megs beyond the initial file size of your page! Of course, as websites get larger and more dynamic, it is likely that a point will come where 2 MB is not out of the question. But, I don’t think we’re there yet. Not by a long shot. In my experience, most pages don’t need to exceed 200 KB to keep things running smoothly on today’s technology. And if more extreme optimization measures are taken, depending on the type of site you run, you can reach the elusive 32 KB average page size from a decade ago. As a much more conservative web developer in this area, I prefer five total files to 25 unnecessary JS and CSS files that aren’t ever used and perform mundane tasks.
  12. Too Many JavaScript and CSS Files: Semrush reports on this issue when there are more than 100 JS and CSS files on a page. I think this is far too liberal of a number to report on. In fact, CSS-Tricks recommends a maximum of three CSS files for any given site, and the same principle applies to JS files. If you have hundreds of plugins doing their thing, and only three or four are necessary, you can save a ton of overhead in the server requests department by combining these files. Combining your CSS into a single file is the preferred method because, as far as server requests are concerned, it reduces your CSS down to one request (CSS sprites apply the same idea to images). This reduces your server requests considerably, especially when you have many CSS files loading all at once. The same goes for JS files too.
  13. Unminified JavaScript and CSS Files: Minification is yet another process Semrush checks your files for. What minification does is remove things like blank lines and unnecessary white space from your JS and CSS files. The final minified file will provide exactly the same functionality as the original files and help to reduce the bandwidth of your service requests. This, in turn, will help improve site speed. Luckily, minification is anything but difficult. With tools like CSS Minifier, JavaScript Minifier and Minifier.org, it’s as easy as copy and paste (see the short before-and-after sketch after this list).
  14. Notices: In this section, Semrush will serve you any notices it sees fit. In our example, it’s encouraging us to connect our Google Analytics account to learn more about interactive pages.
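To picture what the minification from item 13 actually does, compare a few lines of ordinary CSS with their minified equivalent (the selector is arbitrary); both versions behave identically in the browser:

Before:
.article-header {
  margin: 0 auto;
  padding: 16px;
}

After:
.article-header{margin:0 auto;padding:16px}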

Internal Linking

In one click, the Internal Linking section of the Site Audit tool reveals a wealth of invaluable information about a site’s internal linking system:

Screenshot of the Site Audit tool's Internal Linking report

Each subsection holds a great deal of information in its own right:

  1. Pages Crawl Depth: This section shows you a range of the percentage of pages that have a certain crawl depth. Crawl depth refers to how many clicks it takes a person to get from the home page to a specific page. This gives you a good overall picture of how your site is currently structured. There are several best practices and opinions in SEO on arranging your site structure. Some SEOs believe that a siloed site structure is best, while others prefer dumping all of their content in the main directory and adhering to a flat architecture, making sure that most pages are within two to three clicks of the home page. So which is best? As Google’s John Mueller said in a Webmaster Central office hours hangout, what’s most important is that a site’s architecture makes it clear to crawlers how pages are related to one another.
  2. Internal Links: This section shows you a high-level overview of the quantities of pages on your site and how they are linked. You can use it to figure out whether or not you have orphaned pages, pages that have one incoming internal link, or if you have any pages where the quantity of the outgoing links is too high. If you click on the light blue bar within this section, you’ll get a more detailed look at your site’s internal linking structure:
    Screenshot of the internal links detail view
    Under the Pages tab, you can see each page’s unique views, crawl depth and issues. Under the Site Structure tab, you’ll be presented with an overview of each directory’s URLs and issues.
  3. Internal Link Distribution: In this section, you are able to examine the internal link distribution across your website. This will enable you to tweak link equity distribution and make sure that the pages you want to receive link equity are receiving it.
  4. Internal Link Issues: This section will show you all the issues that are impacting your internal links. Cleaning up your internal links in this fashion is important because it is a quality signal. If your links are outdated, broken, have mismatched content (like HTTP links on an HTTPS page), don’t link to the right pages (i.e. you link to one site when you mean another) or have any other issues, this screams “low quality” to Google. So, any internal link issues revealed here should be resolved as soon as possible. Common internal link issues include links that are wrapped in JavaScript, as well as those that point to pages that return 4xx or 5xx errors.
  5. Broken Internal Links: If you have broken internal links on your site, this is an issue that you should fix. When internal links are broken, they lead users to non-existent pages, and this is bad for users and search engines alike. For users, it’s a problem because they can’t find the information they’re looking for, and the user experience suffers as a result. Search engines don’t like broken links because they prevent them from properly crawling your site. When this happens enough, these dead ends act as crawler traps, trapping the search engine spiders within your site.
  6. Broken External Links: This issue is similar to broken internal links, except here it’s the external links that are broken.
    The reason why this can be an issue—especially if you have many of them on your site—is that when search engine spiders crawl your site, and they see so many broken links, whether internal or external, they may think that your site is not maintained much (if at all).
    And if it’s not maintained, why would it be worth ranking highly in the SERPs?
  7. Too Many on-Page Links: Here, Semrush will report if a page has too many on-page links. Their magic number is 3,000 links, but I disagree. As Google’s Matt Cutts said in a Search Central video, you should keep each page’s number of links “reasonable” and bear in mind that the more links there are, the less link equity each one will pass on. Also consider that unless a page is clearly intended to serve as a link directory, a large number of links can appear spammy in the eyes of users.
  8. Nofollow Attributes in Outgoing Internal Links: This section will display any Nofollow attributes found in outgoing internal links. Nofollow is an attribute that you can add to a link’s tag if you don’t want search spiders to follow through to the link. It also doesn’t pass link equity to other pages, so the best rule of thumb is to not use it unless you have a good reason to.
  9. Nofollow Attributes in Outgoing External Links: Nofollow outgoing external links are likely just as damaging as nofollow internal links when used incorrectly. Again, these don’t pass any link equity, and they tell crawlers not to follow the links. As such, they should only be used in very rare cases where you may have a link that you don’t want to pass a value to. In Google’s own words, only use nofollow “when other values don’t apply, and you’d rather Google not associate your site with, or crawl the linked page from, your site.”
  10. Orphaned Sitemap Pages: Any page appearing in your sitemap that is not linked to from another internal page on-site is considered an orphaned sitemap page. Semrush states that these can be a problem because crawling outdated orphaned pages too much will waste your crawl budget. Their recommendation is that, if any orphaned page exists in your sitemap, and if it has significant, valuable content, that page should be linked to immediately.
  11. Page Crawl Depth More Than Three Clicks: If it takes more than three clicks to reach any given page from the site’s homepage, the site will be harder to navigate for both users and search engine bots.
    So, any pages appearing in this section should be brought within two or three clicks of the homepage.
  12. Pages with Only One Internal Link: The prevailing view regarding this issue is that the fewer internal links point to a given page, the fewer visits that page is going to get.
    With that in mind, be sure to add more internal links to any pages Semrush includes in this section.
  13. Permanent Redirects: 301 redirects (a.k.a. permanent redirects) are used all the time, whether for redirecting users to a new page or implementing HTTP to HTTPS transitions.
    However, they can become major issues when an oversaturation of redirects causes bottlenecks on the server. They can also pose problems when redirect loops cause the page to not be displayed.
    To avoid either of those scenarios, be sure to only use permanent redirects when it’s appropriate and you intend to change the page’s URL as it appears in search engine results.
  14. Resources Formatted as Page Links: Pages are defined as physical pages on your site with resources on them, while resources are the individual assets (such as images or videos) linked to from those pages. When you format a resource as a page link, this has the potential to confuse crawlers. As a member of the Semrush team explained on Reddit, an alert in this section “is only a notice, so it is not currently negatively affecting the site, but fixing it could potentially help.”
  15. Pages Passing Most Internal LinkRank: Semrush’s metric of Internal LinkRank (ILR) measures “the importance of your website pages in terms of link architecture.” In this section, you can quickly see which pages are passing on the most ILR. In other words, you can see which pages are the most important when it comes to your site’s link architecture.

Site Audit: Your Secret Weapon in the Fight for Better Technical SEO

For those who aren’t familiar with technical SEO, it can present a number of tricky challenges that take lots of research and practice to overcome. In fact, even seasoned technical SEO practitioners can become bogged down by tedious tasks that take up precious time and energy.

That’s why Semrush’s Site Audit tool is so robust: Whether you’re a technical SEO pro or are still learning the ropes, Site Audit makes it easy to see where your site stands, how it can be improved, and what you should work on first. From security certificates to internal linking, you’ll be able to handle any technical SEO issues that come your way.

Streamlining Your Editorial Tasks and Entity Optimization with Semrush
https://iloveseo.com/tools/semrush/streamlining-your-editorial-tasks-and-entity-optimization-with-semrush/
Fri, 11 Dec 2020, by Brian Harnish

For those who are not aware, keyword research involves finding keywords relevant to your site that you can then create content for. But in recent years there has been a bit of a shift from keyword optimization and targeting to entity optimization and content topic research.

Should You Ignore Keywords and More Standard Methods of Keyword Research?

While topic and entity research are gaining popularity, you should absolutely not ignore more traditional keyword research tactics. They should still be a major part of your strategy, and they lay a strong foundation for finding keywords you may not have otherwise thought about.

So, the pragmatic thing to do is continue performing thorough keyword research while also linking and optimizing for entity-related keywords and phrases.

To do so effectively, think of your site as an entity database that supports a particular topic. This is how your entire optimization strategy will be born.

Ideally, your main topic should be covered on-site, with supporting pages for that main topic throughout which serve to cover it as comprehensively as possible.

This is how entity SEO becomes so powerful: with topic reinforcement and getting so in-depth with your topic content that you blow competitors out of the water.

What Is an Entity, Anyway?

According to a patent filed by Google, an entity is “a thing or concept that is singular, unique, well-defined and distinguishable.” It goes on to say that an entity can be a:

  • person;
  • place;
  • item;
  • idea;
  • abstract concept;
  • concrete element;
  • other suitable thing; or
  • any combination thereof.

Notice that list doesn’t include specific words or phrases; instead, it only includes the actual entities that words or phrases may be referring to.

Translation? The days of exclusively targeting specific keywords and phrases are over; topic targeting, topic research, topical focus reinforcement and entity optimization are the future of SEO.

So, topic and entity research, optimization and targeting should all be a part of your overarching strategy.

Let’s take a look at all three and how they play into the larger picture of modern SEO.

Initial Topic and Entity Research

Using Google Search, we can move forward with preliminary topic research and nail down topics and entities that will benefit your site in the SERPs.

If you have never performed entity research before, you may be glad to hear that it’s largely the same as keyword research. The only difference is that in place of keywords, you are using entities.

When shifting from keyword research to entity and topic research, ask yourself the following questions:

  • What does Google know about my site?
  • Which topics and entities does Google associate with my site?
  • How can I write content to optimize for those topics and entities?
  • Which topics should I be writing about?

When you want to find out what Google knows about your site and discover how it sees it for your particular topics and keywords, you can take one of our favorite Google search tips and use the following operator syntax:

site:example.com [topic]

It’s important to note that this is not accurate all of the time, but it is accurate often enough that you can use it to spot problems.
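If you want to run this check for several topics at once, a few lines of Python can generate the queries (and ready-to-paste search URLs) for you. This is a minimal sketch; the domain and topic list are placeholders to swap for your own:

```python
# Build "site:" search queries for a list of candidate topics.
# The domain and topics are placeholders -- swap in your own.
from urllib.parse import quote_plus

domain = "example.com"
topics = [
    "personal injury lawyers",
    "car accident claims",
    "slip and fall cases",
]

for topic in topics:
    query = f"site:{domain} {topic}"
    # Print the raw operator query and a ready-to-paste Google search URL.
    print(query)
    print(f"  https://www.google.com/search?q={quote_plus(query)}")
```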

Going back to the Denver attorney website competitors we used in our chapter on competitor research with Semrush, let’s take a look at how Google sees them. For this example, we’ll examine the Anderson Hemmat, LLC law firm using the query:

site:andersonhemmat.com personal injury lawyers

The resulting SERP will only display listings for the specified website that also include the specified topic:

Screenshot of the resulting SERP for the query site:andersonhemmat.com personal injury lawyers.

In this particular SERP, several of the website’s pages show up. The first page of results includes the law firm’s:

  1. about us > why choose our personal injury law firm page;
  2. what you should look for when choosing a personal injury lawyer page;
  3. complete beginner’s guide to personal injury cases page;
  4. personal injury practice areas page; and
  5. location-specific about us subpage.

Notice how the first result is not the firm’s homepage. In fact, the homepage doesn’t show up anywhere in the top five results. That’s because the firm has optimized its site for topics that people actually search for (such as what to look for when choosing a personal injury lawyer) rather than keywords alone.

In other words, Google doesn’t just associate the site with the personal injury lawyers key phrase—it also associates it with the personal injury lawyers entity, and therefore displays a wide range of results that are intuitively organized by topic.

Imagine the competitive edge that you could have if you also differentiated your site structure and content to support your main topic and provide the solutions people search for. It’s certainly worth a shot.

In-Depth Topic and Entity Targeting and Optimization

Using Semrush’s Topic Research tool, it is possible to plug in topics and entities to figure out exactly which ones will help you perform better in the SERPs:

Screenshot of the Topic Research tool's main input page.

The basic steps, as shown in the graphic above, are:

  1. Enter your keyword or entity.
  2. Select your location (including your target region and/or city, if desired).
  3. Click the green get content ideas button.

A screen resembling the following will load. For this example, we are using the topic content gap analysis:

Screenshot of the Topic Research tool's results for the topic 'content gap analysis.'

Here, you have different options that you can use to see how these ideas intersect and impact one another. For instance, you can view the ideas in the form of cards, or you can visualize them as a mind map.

This tool is a fantastic way to get a bird’s-eye view of your topic in your niche and see what’s already being written about. The information it provides will give you an idea of what you’re up against in terms of the competition for that topic.

If you click on the explorer button, for example, you will be able to see what kind of Facebook engagement each subtopic has, the top pages for the topic, the number of backlinks to those pages and the total overall social shares:

Screenshot of the Topic Research tool's explorer view.

The overview section shows the top ten headlines by backlinks, ten interesting questions that people ask about the topic, top subtopics and top ten related searches:

Screenshot of the Topic Research tool's overview section.

Finally, the mind map section has a content cluster diagram that will help you see how all of these different subtopics and related topics align with your main topic. It is invaluable for creating a content silo that reinforces topical focus while closing the content gap between you and your competition:

Screenshot of the Topic Research tool's mind map section.

As you figure out what type of content you want to write, and which subtopics you want to put in your articles about your main topic, you will likely identify gaps in your on-site content that you will want to cover.

This is how your content strategy is born, and how it will help you in your quest to achieve SERP dominance over your competition.

After completing your in-depth topic and entity research, compile the top content ideas that you want to write about, along with pertinent questions, headlines and other data, in one central document. This will help you write on-point and on-topic articles, and ensure that they are thorough and competitive enough to rank.
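A plain CSV works well as that central document. Here's a minimal sketch; the field names and sample rows are illustrative, not Semrush's export format:

```python
# Compile topic ideas, questions and working headlines into one CSV.
# The sample rows are illustrative -- paste in your own research.
import csv

ideas = [
    {
        "subtopic": "choosing a personal injury lawyer",
        "question": "What should I look for in a personal injury lawyer?",
        "working_headline": "10 Things to Check Before Hiring an Injury Lawyer",
    },
    {
        "subtopic": "personal injury case timeline",
        "question": "How long does a personal injury case take?",
        "working_headline": "The Personal Injury Case Timeline, Explained",
    },
]

with open("content_ideas.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["subtopic", "question", "working_headline"])
    writer.writeheader()
    writer.writerows(ideas)
```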

The Future of Editorial Tasks and SEO

In simpler times, SEO-optimized content creation was about creating as much content with as many keywords as possible. Now, though, Google’s incredibly advanced search algorithm and rapidly advancing artificial intelligence (AI was the star of the show at Search On 2020) require a more nuanced approach.

Luckily, Semrush’s editorial capabilities make it a snap to come up with relevant topic ideas and optimize for the entities that matter most to your site. With the Topic Research tool at your disposal, you’ll be ready for the future of SEO-friendly editorial tasks.

Image credits

Screenshots by iloveseo.com / December 2020

Screenshot by author / January 2020

]]>
https://iloveseo.com/tools/semrush/streamlining-your-editorial-tasks-and-entity-optimization-with-semrush/feed/ 0
How to Overtake Your SEO Competition with Semrush https://iloveseo.com/tools/semrush/how-to-overtake-your-seo-competition-with-semrush/ https://iloveseo.com/tools/semrush/how-to-overtake-your-seo-competition-with-semrush/#respond Mon, 07 Dec 2020 15:04:55 +0000 https://iloveseo.com/?p=892 How to Overtake Your SEO Competition with Semrush on iloveseo.com by Brian Harnish

When researching competitors, it’s important to make sure that your analysis tells you three things: what a competitor is doing; how they are doing it; and the factors they are...

]]>
How to Overtake Your SEO Competition with Semrush on iloveseo.com by Brian Harnish

When researching competitors, it’s important to make sure that your analysis tells you three things:

  • what a competitor is doing;
  • how they are doing it; and
  • the factors at which they excel most in their SEO strategy.

Using this information, you can create scalable strategies that will help you surpass your competitors and get higher rankings. Fortunately for Semrush users, the platform’s suite of competitor analysis tools can help you do just that in only a few steps:

  • Uncover Traffic Data
  • Investigate the Market
  • Identify Keyword and Content Gaps

Uncover Traffic Data

Your first order of business is to find out your competitors’ initial stats.

Start by analyzing the bigger picture of your competitors. This can be accomplished with the Traffic Analytics tool, located under the competitive research section of the drop-down SEO menu:

Screenshot of the 'competitive research' section of Semrush's SEO menu, with the 'Traffic Analytics' option highlighted.

This tool can help you compare up to five competitors at once, and put yourself in a position to implement strategies that will help you overtake their positions in the search engine results pages (SERPs).

For example, let’s look at the keyword phrase denver personal injury lawyers and analyze five competitors for that space.

To do so, we’ll enter the first URL in the box of the Traffic Analytics tool’s main page and click search:

Screenshot of the search box on the Traffic Analytics tool's main page.

On the resulting page, we’ll individually enter the other four URLs into the box labeled competitor and click compare:

Screenshot of the 'competitor' input box within the Traffic Analytics tool.

After doing this, you’ll get an overview comparing a variety of data points for each of the URLs you entered:

Screenshot of the Traffic Analytics overview comparing five domains.

Specifically, you’ll see information about each URL’s:

  • total number of visits;
  • number of unique visitors;
  • average number of pages viewed per visit;
  • average visit duration; and
  • average bounce rate.
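Since these reports can be exported, you can also crunch the comparison outside the browser. Here's a minimal sketch that ranks competitors by visits; the domains and figures are invented, and the column names are illustrative rather than Semrush's exact export headers:

```python
# Rank competitors by a chosen traffic metric.
# All domains and figures below are invented for illustration.
competitors = [
    {"domain": "firm-a.com", "visits": 12000, "bounce_rate": 0.55},
    {"domain": "firm-b.com", "visits": 34000, "bounce_rate": 0.62},
    {"domain": "firm-c.com", "visits": 8000, "bounce_rate": 0.48},
]

# Sort by total visits, highest first.
for row in sorted(competitors, key=lambda r: r["visits"], reverse=True):
    print(f"{row['domain']:<12} visits={row['visits']:>6} bounce={row['bounce_rate']:.0%}")
```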

So what does this information tell you about your competitors?

First of all, the main overview can help you discover at a glance which competitors are (or are not) performing in the SERPs. Then, you can move forward with examining the specific metrics mentioned above:

Screenshot of the main metrics in the Traffic Analytics overview.

Keep scrolling and you’ll find a box titled Traffic Sources. This section will tell you how much of each URL’s traffic is coming from various categories—direct, referral, search, social media and paid:

Screenshot of the 'Traffic Sources' section.

Then, the Traffic Journey graph displays traffic coming from different sources over time to each of the specified URLs:

Screenshot of the 'Traffic Journey' graph.

This next section, Traffic by Country, will allow you to see the traffic segmented by domain and country:

Screenshot of the 'Traffic by Country' section.

Finally, the Company Info section will display any available data about the domain owner’s company. In the cases of our five examples, no such data could be immediately found:

Screenshot of the 'Company Info' section.

In such a scenario, you can manually find out more information about a domain owner by entering the URL into a database like Who.is.

Want to see even more detailed data? At the top of the overview page, you’ll find six additional tabs labeled Audience Insights, Traffic Journey, Top Pages, Geo Distribution, Subdomains and Bulk Analysis:

Screenshot of the six additional tabs at the top of the overview page.

Click any of the first five categories and you’ll be presented with a wealth of in-depth metrics. Or, if you’d like to analyze more than five domains at once, click on Bulk Analysis to enter as many as 200.

Investigate the Market

If you want to get a better idea of the market you’re competing in, you’d do well to take a look at Semrush’s Market Explorer tool.

It can be found under the CI Add-On section of the drop-down Competitive Research menu:

Screenshot of the 'CI Add-On' section of the Competitive Research menu, with the 'Market Explorer' option highlighted.

To start, enter the URL of your choice into the field on Market Explorer’s main page. Select the organic competitor option and click research a market:

Screenshot of Market Explorer's main page.

On the resulting page, you’ll immediately see a page titled Market Overview with a growth quadrant chart at the top. This serves to help you visualize the current competitive landscape in terms of both traffic volume and traffic growth:

Screenshot of the growth quadrant chart on the 'Market Overview' page.

Scroll down and you’ll find a section titled Domain vs Market Dynamics. Here, you can see metrics such as total, direct and referral traffic, total traffic trends, growth by sources, traffic generation strategy and basic audience demographics:

Screenshot of the 'Domain vs Market Dynamics' section.

These metrics can not only help you understand how much traffic your competitors are getting and where it’s coming from, but can also assist in identifying your ideal audience and tailoring your content strategy accordingly.

Identify Keyword and Content Gaps

When performing a keyword gap analysis, you need to look at competitors’ domains along with the keywords they are ranking for.

Then, you identify gaps in the competitors’ keyword strategy, which act as windows of opportunity for you to fill with your own keyword targeting.

By finding those gaps, you’ll be able to formulate a clear strategy to go after your competitors at scale.
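Conceptually, a keyword gap is just set arithmetic: keywords a competitor ranks for that you don't are your missing opportunities. A minimal sketch with invented keyword lists:

```python
# A keyword gap expressed as set operations. Both lists are invented.
your_keywords = {
    "denver personal injury lawyer",
    "car accident attorney denver",
}
competitor_keywords = {
    "denver personal injury lawyer",
    "denver slip and fall lawyer",
    "motorcycle accident lawyer denver",
}

missing = competitor_keywords - your_keywords  # they rank, you don't
shared = competitor_keywords & your_keywords   # you both rank

print("Missing opportunities:", sorted(missing))
print("Shared keywords:", sorted(shared))
```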

You can find them with the help of Semrush’s Keyword Gap tool, located under the competitive research section of the SEO drop-down menu:

Screenshot of the 'competitive research' section of the SEO drop-down menu, with the 'Keyword Gap' option highlighted.

Just as with the Traffic Analytics tool, you’ll be able to compare five different domains at one time. To start, enter the five URLs of your choice, select the organic keywords option and click compare:

Screenshot of the Keyword Gap tool's main page.

On the subsequent page, you’ll be presented with a box titled Top Opportunities for [URL 1]. Here, Semrush will automatically identify missing and weak keywords you may be able to benefit from targeting:

Screenshot of the 'Top Opportunities' box.

Also at the top of the page will be a box titled Keyword Overlap. This section contains a Venn diagram showing which keywords your chosen URLs share with each other:

Screenshot of the 'Keyword Overlap' Venn diagram.

Venture a bit further down the page and you’ll see the meat and potatoes of Semrush’s Keyword Gap tool, a section titled all keyword details for [URL]. Here, use the drop-down menu at the top of the section to choose the URL you want to see information about:

Screenshot of the drop-down menu in the 'all keyword details' section.

Then, use the upper buttons to view either shared, missing, weak, strong, untapped, unique or all keywords:

Screenshot of the buttons for shared, missing, weak, strong, untapped, unique and all keywords.

No matter which option you choose, you’ll be able to view data about each keyword’s volume, difficulty percentage, cost-per-click, competitive density and number of results:

Screenshot of keyword data including volume, difficulty, cost-per-click, competitive density and number of results.

Once you’ve completed your keyword gap analysis and established a thorough understanding of the keywords missing from your (and your competitors’) strategies, you can move forward with completing a content gap analysis.

In short, a content gap is the difference between your content and someone else’s: the topics and questions their content covers that yours doesn’t yet address. Filling in those gaps in knowledge is how your content closes the distance.

Performing a content gap analysis will allow you to establish a baseline for your current content’s performance, along with the improvements you want to make to overtake the competition. Then, you can create topic ideas targeting the entities you want to go after in the SERPs.

As part of your analysis, you’ll need to answer the following questions:

  1. What is your competition ranking for (i.e. keywords) that you aren’t ranking for?
  2. Which keywords are you ranking for on pages two through ten that your competitors rank for on page one? (See the sketch below.)
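The second question boils down to comparing positions keyword by keyword. A minimal sketch, with invented position data (positions 11 through 100 correspond to pages two through ten):

```python
# Find keywords where you sit on pages 2-10 but a competitor is on page 1.
# All position data is invented for illustration.
your_positions = {
    "denver personal injury lawyer": 14,
    "car accident attorney denver": 3,
    "denver slip and fall lawyer": 27,
}
competitor_positions = {
    "denver personal injury lawyer": 2,
    "denver slip and fall lawyer": 5,
}

targets = [
    kw
    for kw, pos in your_positions.items()
    # You rank on pages 2-10; the competitor ranks in the top 10.
    if 11 <= pos <= 100 and competitor_positions.get(kw, 101) <= 10
]
print("Keywords to prioritize:", targets)
```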

Once you’ve done so, you can start creating content that will surpass theirs by targeting those keywords. Start by brainstorming topics with the help of Semrush’s Topic Research tool—you’ll find it under the Content Marketing drop-down menu:

Screenshot of the Content Marketing drop-down menu, with the 'Topic Research' option highlighted.

To use it, simply enter the keyword or key phrase of your choice and you’ll be given an assortment of potential topics:

Screenshot of topic cards generated by the Topic Research tool.

Not feeling the card-style layout? Click one of the top buttons to switch to explorer (a bare-bones list), overview (several lists grouped by category) or mind map (diagram) view:

Screenshot of the explorer, overview and mind map view buttons.

Don’t forget to give some attention to your site’s existing content too. Even though it’s already live, you can still leverage what you’ve learned from your content gap analysis by tweaking the material you already have.

An easy way to identify weak spots in your site’s existing content is to use Semrush’s Content Audit tool, also located under the Content Marketing drop-down menu:

Screenshot of the Content Marketing drop-down menu, with the 'Content Audit' option highlighted.

On the tool’s main page, click add new Content Audit:

Screenshot of the 'add new Content Audit' button.

Then enter the URL of the site you’re analyzing in the resulting pop-up box. Click create project and you’ll be taken to a page that allows you to select the subfolders you want to audit:

Screenshot of the subfolder selection page.

After selecting the subfolders you want Semrush to audit, click start audit and you’ll see a list of all the content that needs improvement. It will be broken down into handy categories like rewrite or remove, need to update and quick review:

Screenshot of audit results broken down into categories.

Don’t forget that lower-performing pages don’t always need a complete content overhaul: Even something as simple as a new, catchy meta description or page title can improve rankings.

This is why it’s important not to chase the forest while ignoring the trees: low-hanging fruit can produce significant results, and often more quickly.

By implementing a content gap analysis into your SEO workflow in this manner, it is possible to achieve a great competitive advantage.

Don’t Settle for 15 Minutes of Fame

The renowned basketball coach Pat Summitt once observed that “it’s harder to stay on top than it is to make the climb.” This is just as true in SEO as it is in sports: No matter how well you’re doing today, if you don’t keep up with what your competitors are doing then you’re bound to fall behind tomorrow.

That’s precisely why competitor analysis is so crucial. With Semrush’s tools in your arsenal, you’ll be able to gain invaluable insights into your competitors’ successes, learn from their failures and get the knowledge you need to come out on top.

Image credits

Screenshots by iloveseo.com / December 2020

Semrush / July 2019

]]>
https://iloveseo.com/tools/semrush/how-to-overtake-your-seo-competition-with-semrush/feed/ 0
Upgrade Your SEO Benchmarking and Research Skills with Semrush https://iloveseo.com/tools/semrush/upgrade-your-benchmarking-and-research-skills/ https://iloveseo.com/tools/semrush/upgrade-your-benchmarking-and-research-skills/#respond Wed, 02 Dec 2020 20:16:07 +0000 https://iloveseo.com/?p=857 Upgrade Your SEO Benchmarking and Research Skills with Semrush on iloveseo.com by Brian Harnish

Every successful SEO campaign must begin with comprehensive research. From analyzing initial domain benchmarks to conducting an in-depth investigation, you must make sure that your data is accurate and complete...

]]>
Upgrade Your SEO Benchmarking and Research Skills with Semrush on iloveseo.com by Brian Harnish

Every successful SEO campaign must begin with comprehensive research. From analyzing initial domain benchmarks to conducting an in-depth investigation, you must make sure that your data is accurate and complete from the start.

When performing your analysis, ask yourself a set of key questions:

  • How is your site currently doing?
  • What keywords are you ranking for, and what keywords are your competitors ranking for?
  • How many backlinks does your site currently have?
  • How many backlinks do your competitors have?
  • Does your site have any technical errors?

To answer those questions, you can use Semrush to establish a benchmark of where you stand and accurately evaluate your progress over time. Best of all, you’ll be able to do so in just a few simple steps:

Create a Domain Overview Report

Semrush’s Domain Overview feature displays a range of metrics about any site’s online presence and visibility. You’ll find it under competitive research in the SEO section:

Screenshot of the 'competitive research' section of Semrush's navigation bar, with the 'domain overview' option highlighted.

Click it and you’ll be presented with an easy-to-read report for the URL of your choice complete with color-coded charts and graphs. We’ll use reports for Amazon’s U.S. homepage as an example:

Screenshot of a report titled 'Domain Overview' within Semrush

The report’s data points include:

  • a site authority score;
  • organic search traffic;
  • paid search traffic;
  • backlinks, including referring domains;
  • keywords by country; and
  • various engagement metrics.

From the report’s main page, you can click any metric to instantly access more detailed information.

Perform Organic Research

Semrush’s Organic Research reports display detailed information about a site’s traffic, top organic search competitors and organic keyword performance.

You’ll find the Organic Research section under competitive research within the SEO menu. Click it and enter the domain you want to perform organic research on:

Screenshot of the data generated by an Organic Research report from Semrush.

You should add these metrics to a key performance indicator (KPI) tracking spreadsheet to set up your benchmarks.

Pay close attention to the numbers under Keywords, Traffic, Traffic Cost, Branded Traffic and Non-Branded Traffic. These are valuable metrics to track as your campaign progresses. Here’s what they mean:

  • Keywords: The number of keywords your site is ranking for in the top 100.
  • Traffic: Estimated traffic based on the number of keywords ranking, their positions and the estimated search volume.
  • Traffic Cost: The amount you’d have to pay Google AdWords to gain the visibility that you have organically. This number increases as you rank for keywords that are more transactional and lower in the funnel: the higher people are willing to bid for those keywords, the higher the value. This is a great metric for making sure the site isn’t just targeting keywords with no transactional value, and it can help anchor a monetary value to your organic SEO campaign.
  • Branded Traffic: Keywords that include your brand name. This is influenced by brand awareness, social media campaigns and traditional advertising methods.
  • Non-Branded Traffic: Keywords that do not include your brand name.
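For the KPI tracking spreadsheet mentioned above, one approach is to append a dated row every time you pull these numbers. A minimal sketch; the metric values are placeholders you’d copy from the report:

```python
# Append a dated benchmark row to a KPI tracking CSV.
# The metric values are placeholders copied from the report by hand.
import csv
import os
from datetime import date

FIELDS = ["date", "keywords", "traffic", "traffic_cost", "branded", "non_branded"]
row = {
    "date": date.today().isoformat(),
    "keywords": 1520,
    "traffic": 8400,
    "traffic_cost": 12600,
    "branded": 2100,
    "non_branded": 6300,
}

is_new_file = not os.path.exists("kpi_benchmarks.csv")
with open("kpi_benchmarks.csv", "a", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    if is_new_file:
        writer.writeheader()  # only write the header once
    writer.writerow(row)
```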

Next, you can modify specific graphs to display the exact data you’re after.

In the Organic Keywords Trend box, you can use the checkboxes to choose the number of organic keywords you want to view:

Screenshot of the checkboxes available to click over the 'Organic Keywords Trend' graph.

This will help you see if your organic traffic and keywords are trending up or down and if they’re changing over time.

You can also click on Traffic to see the Estimated Traffic Trend graph:

Screenshot of a line chart titled 'Estimated Traffic Trend.'

Notice that you can change the date range to all time, two years, one year, six months or one month. By viewing your estimated traffic trend across different timeframes, you can achieve a deeper understanding of your site’s current and past health.

You will also get a snapshot of the top organic keywords based on volume, as well as the keywords that have had the greatest change in ranking positions:

Screenshot of two lists, one titled 'Top Organic Keywords' and one titled 'Top Position Changes.'

Just below the Top Organic Keywords and Top Position Changes boxes, you’ll see an overview of SERP performance titled SERP Features. This section summarizes the different types of search results that pages from this domain appear in:

Screenshot of a cluster of topics under the title 'SERP Features.'

From the image, you can see that SERP results linking to the domain include 205.7K keywords from featured snippets, 18.4M from reviews, 433.4K from images and so on.

As you scroll down you’ll find the next sections, Top Pages and Top Subdomains. These sections tell us, at a glance, which pages have the most volume both in terms of traffic and in terms of keywords ranking.

You can use them to isolate the top performers as pages with the most visibility and the highest value:

Screenshot of two lists of URLs and data points, one titled 'Top Pages' and one titled 'Top Subdomains.'

Finally, you will see a breakdown of your top competitors and a map for how they stack up against you (and each other):

Screenshot of the top competitors breakdown and competitive positioning map.

By the end of this report, you will have a strong overview of how the site is performing, what the top pages are, what the top keywords are and who you’re competing with.

Explore Your Organic Metrics

At the top of your Organic Research report, there are five other tabs that you can take a look at to get more in-depth information on your domain. They include:

    • Positions;
    • Position Changes;
    • Competitors;
    • Pages; and
    • Subdomains.

Screenshot of highlighted tabs titled 'Positions,' 'Position Changes,' 'Competitors,' 'Pages' and 'Subdomains.'

Let’s take a closer look at each.

Positions

When you click on the Positions tab, you will see the following screen:

Screenshot of various graphs and data points under a tab titled 'Positions.'

This section lets you dig into your keyword rankings to find out how many of them rank for specific positions.

You can also export your keyword ranking report into a spreadsheet that will tell you metrics like each keyword’s position, previous position, traffic percentage, competition and number of results:

Screenshot of a keyword spreadsheet opened in Microsoft Excel
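Once you have that export, counting how many keywords fall into each ranking bucket takes only a few lines. A minimal sketch, assuming a CSV with a Position column; rename the column and file to match your actual export:

```python
# Count keywords per ranking bucket from an exported CSV.
# Assumes the export has a "Position" column; adjust to match your file.
import csv
from collections import Counter

def bucket(position: int) -> str:
    if position <= 3:
        return "top 3"
    if position <= 10:
        return "page 1"
    if position <= 20:
        return "page 2"
    return "page 3+"

counts = Counter()
with open("organic_positions.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        counts[bucket(int(row["Position"]))] += 1

for name in ("top 3", "page 1", "page 2", "page 3+"):
    print(f"{name:<8} {counts[name]}")
```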

Position Changes

Next, move on to the Position Changes tab. When you click on it, you should see the following:

Screenshot of the Position Changes report.

If you briefly scan the overview of the Position Changes report, you will see the following headings:

  • All Position Changes: This shows the total number of keyword changes that have occurred for the analyzed domain.
  • New: The total number of keywords for which the analyzed domain has newly entered the rankings.
  • Improved: This shows the number of keywords for which an analyzed domain has improved its ranking in the SERPs.
  • Declined: This shows the number of keywords which have declined in performance.
  • Lost: This one shows the number of keywords for which an analyzed domain has lost its position in the SERPs entirely.
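You can reproduce these categories yourself by diffing two ranking snapshots, which is a handy sanity check on any report. A minimal sketch with invented data (remember that a lower position number means a better rank):

```python
# Classify keywords as new / improved / declined / lost between snapshots.
# Both snapshots are invented for illustration.
last_month = {"keyword a": 8, "keyword b": 15, "keyword c": 4}
this_month = {"keyword b": 9, "keyword c": 6, "keyword d": 30}

new = set(this_month) - set(last_month)
lost = set(last_month) - set(this_month)
improved = {k for k in this_month if k in last_month and this_month[k] < last_month[k]}
declined = {k for k in this_month if k in last_month and this_month[k] > last_month[k]}

print("New:", sorted(new))
print("Improved:", sorted(improved))
print("Declined:", sorted(declined))
print("Lost:", sorted(lost))
```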

You can also download and export a spreadsheet or PDF of these results to put together in a nice visual report. Your format choices include the usual: Excel, CSV or CSV semicolon.

Competitors

Now, let’s go ahead and click on the next tab up top, Competitors. Here, you’ll see a detailed overview of the competitors of the domain you entered.

For our site example, you will see that Amazon’s top competitors include eBay, Pinterest and Walmart:

Screenshot of the Organic Competitors section listing Amazon's top competitors.

This tab’s Organic Competitors section features several additional metrics you can use in your reporting, too. These include:

  • Keywords: The number of keywords bringing a user in from the search results.
  • Traffic: An estimated traffic prediction; essentially, the traffic you can expect to keep seeing if your organic rankings stay the same.
  • Cost: The overall monthly cost it would take to rank for each site’s organic keywords in Google AdWords.

This list of competitors and their corresponding data points can also be exported to a spreadsheet. Pro tip: As you continue to save and export these reports, be sure to store them in the same folder. It will make things much easier when it comes to compiling these reports in their own spreadsheet deliverables.
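If you follow that tip, merging everything into a single combined file later becomes trivial. A minimal sketch, assuming all exports in the folder are CSVs that share the same header row (the folder and file names are placeholders):

```python
# Merge all CSV exports from one folder into a single combined file.
# Assumes every export shares the same header row.
import csv
import glob

paths = sorted(glob.glob("semrush_exports/*.csv"))
with open("combined_report.csv", "w", newline="", encoding="utf-8") as out:
    writer = None
    for path in paths:
        with open(path, newline="", encoding="utf-8") as f:
            reader = csv.DictReader(f)
            if writer is None:
                # Use the first file's header for the combined output.
                writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
                writer.writeheader()
            writer.writerows(reader)
```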

Pages

Next, let’s click on the tab up top labeled Pages. This one will allow you to view and export all of your site’s URLs:

Screenshot of the Pages tab's list of URLs and metrics.

You’ll also see specific metrics such as:

  • URL: The page the metrics are being compiled for.
  • Traffic: The amount of estimated organic traffic being driven to an analyzed page.
  • Traffic %: The percentage of traffic being driven to the page.
  • Keywords: The number of keywords that a given URL is ranking for in the top 100 Google search results.
  • Ads keywords: The number of keywords that are bringing users in via the paid ads being published in the Google SERPs.
  • Backlinks: The total number of backlinks pointing to a particular URL.

Subdomains

Finally, click on the Subdomains tab to view a chart like the following:

Screenshot of the Subdomains tab's chart.

The information provided in this tab is useful for making strategy decisions based on subdomain removal (if desired), 301 redirect removal (if needed) and other factors that may be weighing down the domain.

Dig into Your Data to Make Better Decisions Tomorrow

Practicing effective SEO isn’t just about implementing best practices or following rules set by experts: It’s also about using the data at your disposal to make informed decisions that are custom-tailored to suit your site and its individual needs.

With the benchmarking and research tools available in Semrush, you can easily access, understand and save the data you need to make those decisions and achieve the best rankings possible.

Image credits

Screenshots by iloveseo.com / November 2020

]]>
https://iloveseo.com/tools/semrush/upgrade-your-benchmarking-and-research-skills/feed/ 0