Technical SEO Best Practices: How To Improve Your Website’s Architecture And Performance on iloveseo.com by Steven Guberman

Maximize your website's performance with technical SEO best practices. Our expert guide will show you how to optimize your website's architecture and boost search visibility.


You need the right technical SEO best practices to improve your website’s architecture and performance. This isn’t rocket science; it’s more like an intricate dance of optimization that will help your site soar higher than ever – if done correctly! This article will give you the low-down on improving your website’s structure and performance with simple yet effective technical SEO strategies. So please put on your dancing shoes; let’s explore these best practices together.

Technical SEO is all about ensuring search engines can easily crawl, index, and rank your website so they can serve relevant search results to users. It involves optimizing a website from both a content and a code perspective. When implemented correctly, these strategies can increase organic traffic, improve user experience, and earn better rankings in SERPs (search engine results pages).

By following the steps outlined within this article, you can take advantage of the full potential of Technical SEO and ultimately increase ranking visibility while providing visitors with a seamless user experience across devices. Whether you’re just starting out or have been working on SEO for years, there’s something here for everyone – so read on!

Definition Of Technical SEO – Things You Need To Know

Technical SEO is a specialized branch of search engine optimization (SEO) that focuses on website architecture and performance. It’s all about ensuring that search engines can easily crawl, index, and understand the website to improve its visibility in organic search results. Technical SEO covers activities such as creating an XML sitemap, setting up redirects, URL structure optimization, and more.

The primary goal of technical SEO is to ensure that webpages are accessible and easy for crawlers to understand so they can accurately index them in SERPs. This helps increase organic rankings since it allows search engines to quickly identify content on the page and assess how relevant it is compared to other websites. Additionally, when implemented correctly, technical SEO provides users with a better experience; faster loading times, fewer errors, and easier navigation – all these factors contribute towards higher rankings in SERPs.

It is important for any business or individual who wants their site to rank well organically to carry out technical SEO best practices regularly. Without doing this, you could end up with a poorly performing website, resulting in lower visitor engagement levels than expected and, ultimately, less traffic from natural sources and referrals.

In short, technical SEO should always be part of your overall strategy if you want your website or online presence to drive maximum returns from organic traffic channels! Moving forward, we’ll discuss specific aspects related to crawling & indexing that play important roles in improving visibility within SERPs.

[Diagram: flowchart of the technical SEO audit process described below.]
This Mermaid diagram represents some steps to perform a technical audit of a website. The process includes analyzing the website structure, checking website performance, reviewing on-page SEO, examining indexing and crawlability, inspecting mobile friendliness, and assessing site security. Each step is further divided into sub-steps to provide more specific areas to check during the audit.

Google Website Crawling And Indexing

Ah, website crawling and indexing: the lifeblood of SEO. How well you plan and optimize for them can make or break your SEO efforts. But fear not! With a few simple tips, you’ll be well on your way to improved performance.

First off, let’s start with Google crawl optimization. This is essential for ensuring that search engines can find all pages within your site efficiently and index them properly in their databases. To achieve this, you want to minimize duplicate content issues and ensure there are no broken links on your pages. Use robots meta tags appropriately and create an XML sitemap file for submission when needed.

Next comes Google index optimization, which should include a good internal linking structure between pages and proper usage of heading tags (H1–H6) throughout each page. You must also pay attention to keyword density and ensure keywords are used naturally within titles and headings rather than stuffed into the body copy. Lastly, focus on optimizing image alt attributes, since these help search engine crawlers understand what images represent on your web pages.
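
To make this concrete, here is a minimal, hypothetical page outline showing a single H1, descriptive subheadings, and meaningful alt text; the topic, file name, and alt text are placeholders rather than a prescription:

<h1>Technical SEO Best Practices</h1>
<h2>Crawling and Indexing</h2>
<p>Search engines discover pages by following links and reading your sitemap...</p>
<h2>Site Speed</h2>
<!-- The alt attribute tells crawlers what the image represents -->
<img src="crawl-diagram.png" alt="Diagram of a search engine crawler following internal links">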

Regarding best practices for crawling and indexing, ensuring URL parameters don’t block resources from being crawled by default is very important. Also, consider using canonical URLs if multiple page versions exist and avoid creating too many redirects since this could hinder the crawling speed of certain parts of the website – especially relevant when dealing with mobile sites separately from desktop ones!
To move forward effectively in our technical SEO journey, keyword research and usage must come next…

Keyword Research And Usage For Google SERPs

When it comes to SEO, keyword research and usage optimization are essential. Successful optimization requires understanding the search terms your customers use when looking for information related to your industry. Identifying these keywords will enable you to target long-tail keywords and improve website visibility in Google organic searches.

The first step is to conduct a comprehensive audit of your existing content and look for opportunities to optimize using relevant keywords. This involves evaluating titles, meta descriptions, headlines, and other components of each page on your domain. You should also assess how closely aligned individual pages are with specific topics or themes, which can help inform future keyword selection.

Once you have identified potential keywords and evaluated their relevance, you must consider how they fit into the overall architecture of your website. Incorporating them into page content naturally while ensuring good readability is important; this includes optimizing images with alt tags containing targeted keywords where appropriate. Additionally, it’s beneficial to include internal links between the various pages so that users can navigate more easily through your site hierarchy, which helps promote better SEO performance over time.

Finally, tracking metrics such as click-through rate (CTR) across different web pages allows us to measure our progress against goals over time, providing insight into what works best regarding keyword usage optimization strategy. With this data, we can continually refine our approaches based on user behavior and market demand shifts to create an effective online presence for our brand. As we explore internal link structure next, remember the importance of continually monitoring results for maximum success.

Why Is Your Site’s Internal Link Structure So Important?

“A stitch in time saves nine.” The same can be applied to SEO: a well-structured internal link structure is essential for good website architecture and performance. Internal links are key to providing your users with an optimal experience and helping search engine bots crawl the site more effectively. Moreover, they help distribute page authority throughout your domain, thus boosting rankings.

When creating an effective internal linking strategy, it’s important to consider user intent first. Links should tell users what pages they will visit when clicked on, using descriptive anchor text that makes sense in the context of surrounding content. Additionally, don’t overdo it – too many links may overwhelm visitors or create confusion. You must also ensure you’re linking back to relevant pages within the same domain; avoid offsite links unless absolutely necessary.

All your internal links must also be functional – broken links frustrate users and negatively impact your SEO ranking. To ensure this isn’t an issue, use automated tools like Google Search Console or Screaming Frog regularly to audit your site for any errors or issues related to internal linking. Lastly, make sure you have proper tracking set up so you can monitor how people interact with your website after clicking through from one page to another.

By following these simple tips and tricks, you can easily optimize and improve your website’s internal link structure – leading to better overall performance! With a solid foundation in place, we now turn our attention toward URL optimization…

URL Optimization – Help Your Site Make Sense

Now that we’ve discussed internal link structure, let’s explore the topic of URL optimization and how it can help improve your website’s architecture and performance.

One way to properly optimize URLs for SEO is by ensuring they are simple and intuitively organized. This helps search engines more easily crawl through them, resulting in better indexation. Using descriptive keywords within the URLs also aids with SEO rankings. It also makes them easier to remember and share across channels, increasing traffic potential.

Moreover, optimizing addresses is a crucial part of successful SEO performance. By ensuring all addresses have unique page titles, meta descriptions, headings, images, and other important elements, you ensure users and search engine crawlers understand each one’s purpose without confusion.

Here are some key aspects to consider when optimizing URLs for Technical SEO:

  • Avoid having too many folders/subfolders in the same address, which makes navigation difficult.
  • Ensure each folder has its own descriptive name rather than just numbers or symbols.
  • Always use hyphens instead of underscores between words to increase readability for both humans and bots.
  • Use short but meaningful words instead of lengthy phrases to reduce character length (see the example below).
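
As a purely hypothetical illustration of these points (both URLs are made up), compare a parameter-heavy address with a short, hyphenated, keyword-rich one:

Harder to crawl and remember: https://www.example.com/cat/12/items.php?id=98765&sort=asc
Cleaner and more descriptive: https://www.example.com/running-shoes/blue-trail-trainer/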

Optimizing website structure thus plays an integral role in improving overall website performance – from boosting organic visibility to fostering user engagement rates. Therefore it pays off to invest time into getting these details right from the start, so you don’t encounter problems down the line! Let’s now move on to exploring site speed optimization strategies…

Site Speed Optimization – Faster is Better!

It’s essential to optimize your website for speed. Site speed optimization is an important part of technical SEO success and should be a priority when developing or updating a site. The faster the page loading time, the better the user experience resulting in higher rankings. This can be achieved through several methods, such as optimizing images and reducing redirects.

A starting point would be to perform a site speed analysis of the current performance of web pages on both desktop and mobile devices. Analyzing this data gives insight into where improvements are needed, allowing you to prioritize tasks accordingly. You may find that changes to HTML markup, caching strategies, minifying scripts, and external resources must be made to improve overall web performance optimization.

Furthermore, it’s also beneficial to use tools like Google PageSpeed Insights, which provide actionable recommendations on further enhancing the core elements impacting page load times, including server response time, memory consumption, and browser rendering issues. Additionally, leveraging AMP technology (Accelerated Mobile Pages) can help reduce download times significantly – thus improving user engagement with content hosted by the site directly or indirectly via social media channels, etc.
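
As a rough sketch of the kinds of changes such tools often suggest (the file names are placeholders, and the right mix depends on your site), a few lightweight HTML hints can preload critical resources, defer non-critical scripts, and lazy-load below-the-fold images:

<!-- Preload the main web font so text renders without delay -->
<link rel="preload" href="/fonts/main.woff2" as="font" type="font/woff2" crossorigin>
<!-- Defer non-critical JavaScript so it doesn't block rendering -->
<script src="/js/analytics.js" defer></script>
<!-- Lazy-load images that appear below the fold -->
<img src="/images/case-study.webp" alt="Page speed report before and after optimization" loading="lazy">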

Finally, employing the abovementioned techniques will improve search engine visibility and produce tangible benefits for users who depend on optimal experiences during their online journeys – leading them towards more conversions from organic sources rather than paid campaigns alone. To ensure your website provides maximum value for all visitors regardless of device type or connection quality, assessing mobile usability should come next.

Mobile Usability – Make It Fit Your Screen

A sleek website design is essential, but how does it hold up on mobile devices? Usability has become a key factor in optimizing any website’s architecture. Mobile usability is important when designing your site to ensure that all users have access to the same content and functionality regardless of their device.

The importance of considering mobile usability cannot be overstated. As more and more people use their phones or tablets to browse online, you must ensure that your website caters to these audiences too. It’s not just about user experience either; Google now considers the loading speed and overall performance of websites on mobile devices when ranking them in search results.

Optimizing a website for mobile devices requires different considerations than desktop versions, such as page size, font size, tap targets, navigation styles, and loading times. To get ahead of the competition in this area, consider implementing features like accelerated mobile pages (AMP), which help increase page load time by stripping out unnecessary code from webpages before serving them to visitors.

Mobile optimization should also ensure all images are correctly sized, so they don’t take too long to load. This can significantly improve the user experience while helping boost organic rankings simultaneously. With proper planning and implementation, optimizing your website’s mobile usability can help bring better visibility and engagement with potential customers without sacrificing quality or performance.
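
For illustration, two common building blocks of a mobile-friendly page are the viewport meta tag and responsively sized images; the image file names and breakpoints below are hypothetical:

<!-- Tell browsers to scale the layout to the device width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<!-- Serve an appropriately sized image to small and large screens -->
<img src="/images/hero-small.jpg"
     srcset="/images/hero-small.jpg 480w, /images/hero-large.jpg 1200w"
     sizes="(max-width: 600px) 480px, 1200px"
     alt="Responsive page layout shown on a phone and a desktop">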

Structured data markup provides context around certain elements on your webpage to help search engines understand what information they contain. Knowing how to use structured data properly is essential for maximum SEO impact across desktop and mobile platforms.

Structured Data Markup – Schema

Now that we have discussed mobile usability, it’s time to focus on Structured Data Markup. This type of SEO helps search engines understand the content and structure of your website better so they can serve up relevant results for users. Structured data markup is an important part of technical SEO best practices because it allows you to create rich snippets in SERPs which can help improve click-through rates from organic search results.

Benefits | Examples
Organize | Schema/JSON-LD markups
Enhance | Rich snippets such as star ratings & reviews
Optimize | Long-tail keywords & meta descriptions

Structured data markup facilitates better indexing of webpages by providing information about the page’s contents and their relatedness. It also provides more accurate descriptions when displaying a webpage in SERP (Search Engine Results Pages). Furthermore, it enables websites to appear in Google’s featured snippet section at the top of SERP, thus increasing visibility and click-through rate.

For example, implementing structured data markup with long-tail keywords will allow search engine crawlers to identify those terms easily when searching for them. Additionally, schema markups such as JSON-LD can make meta descriptions stand out more prominently than regular text versions, allowing potential customers to quickly get the gist of what the page is about before clicking through. Finally, including ratings or reviews within the code will enable stars and other visuals to appear beside website listings, making them much more appealing to visitors who scan down their options on SERPs.
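
Here is a minimal JSON-LD sketch of the kind of markup described above; the product name, description, and rating values are invented for illustration, and whether a rich result actually appears in SERPs is ultimately up to the search engine:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Trail Running Shoe",
  "description": "Lightweight trail running shoe with a cushioned sole.",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "120"
  }
}
</script>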

Using structured data markup can bring numerous advantages; however, one must consider several things before implementation. Such considerations range from ensuring accuracy regarding any facts mentioned in the metadata description to compliance with applicable rules and regulations like the GDPR (General Data Protection Regulation). Keeping these points in mind while creating a well-thought-out implementation plan will ensure maximum benefit from structured data markup without legal issues. With this knowledge, let us move on to image optimization – another key component of successful technical SEO strategies.

Image Optimization – Compress and Squeeze Those Pics

“A picture paints a thousand words.” This adage is especially true for SEO optimization. Image optimization is important in website performance and architecture, as it can improve user experience and search engine rankings. Here are some best practices for optimizing images:

  • Image Compression
    • Reduce file size by compressing the image file format (e.g., JPEG)
    • Utilize online tools such as TinyPNG or Kraken to compress large files
  • Image Sizes
    • Use standardized sizes for all images on your website.
    • If possible, use vector graphics instead of rasterized images so that they remain crisp across browser screens with different resolutions.
  • Alt Tags
    • Include relevant alt tags that accurately describe each image’s content and purpose (see the snippet after this list).
    • Avoid keyword stuffing; keep it concise but descriptive!
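
Pulling those points together, an optimized image tag might look like the following sketch; the file name, dimensions, and alt text are hypothetical:

<!-- Compressed format, explicit dimensions, lazy loading, and a concise descriptive alt attribute -->
<img src="/images/product-shoe.webp"
     width="600" height="400"
     alt="Blue trail running shoe, side view"
     loading="lazy">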

Image optimization should be thoughtfully implemented into any website design strategy to maximize SEO potential. Next up is canonicalization – this vital process ensures that URLs point to the right page without creating duplicate pages.

Canonicalization – Letting Google Know What Is Important

Canonicalization is an important SEO practice that helps manage duplicate content and prevent URL standardization issues. It’s a process of taking multiple URL versions and pointing them to the same page, thus consolidating any associated signals from search engines into one single version. This also prevents situations where different URLs are used for what should be seen by search engines as identical pages with identical content. You can ensure your website architecture is optimized for performance through canonicalization, resulting in better rankings and higher visibility in SERPs.

The main purpose of canonicalization is to help reduce or eliminate duplication on websites by normalizing all the URLs into one single version so that they can be accessed easily. When implemented correctly, it will improve user experience since visitors won’t find themselves stuck in endless redirection loops while navigating your site. Additionally, this technique assists with preventing penalties due to duplication issues when crawling and indexing your web pages.

When utilizing canonicalization techniques on your website, there are some best practices you’ll want to follow, such as avoiding abusive redirect chains and misconfigured canonical tags, which could result in poor performance within SERP results if not done properly. Also, remember that 301 permanent redirects should generally be preferred over 302 temporary redirects whenever possible, as they send stronger signals and pass link equity between two pages more effectively than their temporary counterparts do.
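
For instance, a duplicate URL can be permanently consolidated into its canonical counterpart with a 301 redirect. The following one-line sketch assumes an Apache server where .htaccess rules (mod_alias) are enabled, and both paths are placeholders:

# .htaccess: permanently redirect the duplicate URL to the canonical version
Redirect 301 /old-product-page/ https://www.example.com/product-page/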

By implementing proper canonicalization strategies on your website, you’ll have peace of mind knowing that search engine algorithms view each page uniquely without the risk of being penalized for duplication errors or getting lost among numerous competing URLs from other sites vying for attention online. With these considerations taken care of, we can move on to creating and submitting sitemaps – another vital step toward optimizing our website’s performance!

Sitemap Creation And Submission

The next technical SEO best practices step is sitemap creation and submission. A successful website architecture requires an up-to-date and well-structured sitemap that can be used as a guide for search engine crawlers, helping them to index your content more effectively. To ensure you completely understand this process, we will discuss how to create and submit an optimized sitemap, analyze its performance, and build it correctly.

Step | Description
1 | Create sitemaps in XML or HTML format based on the size of your site
2 | Submit your sitemaps to Google Search Console & Bing Webmaster Tools
3 | Monitor & optimize: analyze the traffic from organic sources & make changes if needed
4 | Build correctly: make sure URLs are valid & navigate properly between pages on your site

Creating accurately structured XML or HTML formatted sitemaps is essential for optimizing website structure for search engines. Using these files helps search engine crawlers find all the important resources within a website much faster than before. Once completed, submitting these documents to both Google Search Console and Bing Webmaster Tools should be done so they are available for crawling by each platform’s respective bots. This will also give you better insights into how the two platforms view your web pages’ optimization level. Afterward, monitoring and optimizing your submitted sitemaps should be taken seriously since analytics provided by these services can help identify any issues related to crawling ability, such as broken links or duplicate content problems. Lastly, building out carefully constructed redirects between different pages ensures that users don’t get stuck navigating through dead ends while browsing your site – something beneficial for SEO and the overall user experience.
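
For reference, a bare-bones XML sitemap follows the structure sketched below; the URLs and dates are placeholders, and a real file would list every indexable page on the site:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2023-03-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/technical-seo-guide/</loc>
    <lastmod>2023-03-15</lastmod>
  </url>
</urlset>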

To further enhance a website’s visibility online, robots exclusion protocols must also be configured accordingly, given their importance in controlling what crawlers may or may not index from specific areas of a page or site – making them another crucial piece of the puzzle when putting together effective SEO strategies.

Robots Exclusion Protocols

Robots Exclusion Protocols (REP) are essential tools in technical SEO. REP allows website owners to control which areas of their websites can be accessed by web crawlers, search engine spiders, and other automated agents. These protocols help keep a website secure from malicious content scraping or indexing that could harm its performance.

Here are three key points about robots exclusion protocols:

  1. Robots.txt is the main file for controlling access to a website via REP.
  2. Robot-exclusion directives must be clearly defined and communicated to crawlers and other automation agents with precision and accuracy.
  3. The robot protocols should be regularly monitored and updated when necessary due to changes in your site’s architecture or technology updates on the web infrastructure level.

For optimal security protection, it’s important to ensure that all exclusion protocols have been correctly implemented across every website page, including any subdomains or microsites you may have connected with the primary domain URL. Regularly checking all robot directives will help maintain overall security on your website and protect against potential cyber threats like malware injections or scrapers attempting to steal valuable data from your online presence. Additionally, ensure no broken links are caused by incorrectly written rulesets within the robots.txt file; these could lead to more serious issues if not addressed properly over time.
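
As a simple sketch (the disallowed paths and sitemap URL are hypothetical and should match your own site), a robots.txt file lives at the root of the domain and might look like this:

# https://www.example.com/robots.txt
User-agent: *
Disallow: /staging/
Disallow: /cart/
Sitemap: https://www.example.com/sitemap.xml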

It’s also worth noting that some search engines like Google offer various options for customizing how crawlers interact with certain sections of a website using their own proprietary parameters like “noindex” tags, allowing even greater levels of control over what gets indexed into search results pages without having to rely solely on REP alone. To maximize the efficiency and effectiveness of both REP and customized crawling instructions, reviewing them together periodically is highly recommended for best practices in protecting one’s digital assets while optimizing visibility through organic search traffic channels simultaneously. With this knowledge under our belt, we can now move on to evaluating and enhancing the security posture of our websites moving forward.

Security Evaluation And Enhancement – Keeping It Safe!

Securing a website is paramount for any technical SEO project. Security evaluation and enhancement should be the first step in improving website performance. Through security protocols, vulnerability scans, and other measures, it’s possible to identify potential risks and take action to mitigate them before they become an issue.

The best way to assess website security is by evaluating existing security policies and software configurations. This can give insight into areas needing improvement or further protection against malicious attacks. In addition, performing regular updates of current programs will help ensure that all data remains safe from exploitation.

Furthermore, it is important to stay up to date with industry-standard measures such as TLS/SSL certificates using modern cryptographic algorithms like SHA-256, and to implement firewalls and intrusion detection systems (IDS) to monitor suspicious activity. Additionally, websites should also have secure login methods such as two-factor authentication (2FA), CAPTCHAs, or biometric scanners set up to protect user information.
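
As one small illustration of this layer of hardening, a handful of HTTP response headers are commonly configured at the web server or CDN; the values below are illustrative defaults rather than recommendations for every site:

# Force browsers to use HTTPS for a year, including subdomains
Strict-Transport-Security: max-age=31536000; includeSubDomains
# Prevent MIME-type sniffing
X-Content-Type-Options: nosniff
# Restrict where scripts and other resources may be loaded from
Content-Security-Policy: default-src 'self'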

Finally, organizations should invest in continuous monitoring through automated scanners to identify emerging vulnerabilities quickly before hackers can exploit them. By proactively addressing these issues ahead of time, businesses can rest assured knowing their site is protected against malicious threats while maintaining optimal web performance standards. With this foundation of comprehensive security in place, we can move on to content optimization strategies in the next section.

Content Optimization – Natural Language Processing

Now that your website has been secured and evaluated, optimizing content for better search engine performance is the next step. Content optimization consists of various strategies and techniques designed to improve the visibility of web pages in organic search results. To ensure optimal results, it’s important to understand how these tactics can be used together to maximize SEO potential.

First, use keywords throughout the text on the page. This will help crawlers identify what topics are being discussed and rank those pages higher in search engine results pages (SERPs). Keywords should appear naturally within context so as not to disrupt readability; stuffing them into sentences won’t do any good! Additionally, make sure titles accurately reflect the page’s contents – this helps users and search engines find relevant information quickly.
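
For instance, a title tag and meta description that work a primary keyword in naturally might look like the following sketch; the wording and brand name are invented:

<title>Technical SEO Best Practices for Faster, Crawlable Sites | Example Agency</title>
<meta name="description" content="Learn how to improve site architecture, crawlability, and page speed with practical technical SEO steps.">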

Next up is optimizing images with appropriate file names, captions, and alternative texts – also known as ALT tags. These attributes provide helpful hints about an image’s context, allowing crawlers to index them correctly for SERP rankings. Furthermore, adding internal links between related articles will create a network of interconnected pieces that encourages repeat visitors by providing more material from your site. Lastly, take advantage of third-party tools like Google Search Console or Bing Webmaster Tools to learn more about how your content is performing online.

By following these steps, you’ll be able to craft content that meets user needs while still adhering to best practices for SEO success. You can immediately start seeing organic traffic improvements with accurate keyword usage, optimized images, and internal linking! Now that you have implemented some basic content optimization measures, it’s time to focus on monitoring SEO performance…

Monitoring SEO Performance – Getting The Fruits Of Your Labor

Monitoring SEO performance is key for any website to stay ahead of the competition and achieve organic ranking success. It’s important to track your website’s search engine visibility and measure its overall performance. Doing so will help you identify areas that need improvement or additional optimization. Let’s look at some popular methods for monitoring a website’s SEO performance:

Tool | Category | Purpose
Google Analytics | Website tracking | Measure traffic & performance
Bing Webmaster Tools | Search engine | Monitor indexed pages & crawl errors
Google Search Console | Website analytics | Track rankings & queries with keywords

Google Analytics can measure traffic levels, user engagement, and other important metrics related to site performance. Additionally, it provides insights into keywords driving users to your pages and how they interact with each page’s content. Meanwhile, Bing Webmaster Tools allows you to monitor indexed pages on the search engine and detect crawl errors that might affect rankings in their index. Last, Google Search Console gives you access to valuable data about queries people use when finding your website through keyword searches. This tool also helps you determine which query generated more clicks and analyze how much time visitors spend on a page before leaving the site.

With all these tools in hand, webmasters gain valuable insight into how their websites perform on different search engines such as Google or Bing, what type of content resonates best with users, which keywords drive the most traffic to their sites, and many other aspects affecting SEO success. Having this information makes it easier for them to spot opportunities for growth and make necessary improvements more efficiently. So if you’re serious about improving your website’s architecture and performance, tracking SEO progress should become essential to your day-to-day operations.


Frequently Asked Questions

What Is The Best Way To Handle Duplicate Content?

Did you know that the percentage of websites with duplicate content is estimated to be at least 30%? This begs the question, what is the best way to handle duplicate content? One of the most important elements in a successful SEO strategy is managing and avoiding duplicate content. Here we will discuss some effective strategies for tackling this issue.

When it comes to dealing with duplicate content, there are several solutions available. The key lies in developing a comprehensive plan and implementing it effectively. A good starting point would be to use 301 redirects or canonical tags when possible – they can help ensure search engines only index one version of your web page, thus preventing potential penalties due to duplication. Additionally, you should prevent any accidental creation of new versions of existing pages by ensuring your URLs are concise and consistent across all platforms. Furthermore, if needed, you could also block certain sections of your website from appearing in search engine results via a robots.txt file or meta robots tag.

The next step would be to create a detailed audit of your site’s architecture so that you have visibility into how each piece fits within its overall structure. You should then identify areas with overlapping information or unnecessary repetition that can lead to an unwieldy amount of duplicated material on your site and rank-reducing algorithm penalties from Google and other major search engines. To ensure every part works together efficiently, you may need to optimize internal linking structures and navigation menus, making them user-friendly while still conforming to SEO best practices such as keyword usage and proper anchor text formatting.

Here are some additional tips for handling duplicate content:

  • Utilize cross-domain canonicalization where applicable
  • Use unique titles & descriptions for every page
  • Monitor changes regularly using analytics tools like Google Search Console
  • Take advantage of social media sharing features (LinkedIn, Twitter, etc.)

By following these strategies and investing time in understanding how different pieces work together, businesses can avoid costly penalties while optimizing their sites for maximum performance benefits. If done properly, the result will be increased traffic and improved rankings – essential for success in today’s digital landscape!

How Do I Optimize Content For Voice Search?

Voice search optimization is a key element of online success. As technology evolves, companies must stay ahead of the trends to remain competitive. To optimize content effectively for voice search, businesses need to understand how it works and what techniques they can use to maximize their results.

When optimizing content for voice search, several best practices should be incorporated. First and foremost, website owners should focus on creating unique content tailored specifically for this type of search engine optimization (SEO). This includes considering query intent, natural language processing, and local listings management. Additionally, webmasters should ensure that their page titles include keywords relevant to the user’s query and that structured data markup is used if possible.

To improve performance with voice search optimization tips, website owners should also consider utilizing tools like Google Search Console or Bing Webmaster Tools. These platforms provide valuable insight into how users interact with your site through voice searches and allow you to make informed decisions about which pages or topics require more attention from an SEO perspective. Furthermore, these services will help identify potential issues with indexing speed or loading times that could negatively affect rankings.

Finally, businesses need to keep up with changes in the industry by staying abreast of new strategies related to voice search optimization. For example, some websites have begun implementing featured snippets – short paragraphs providing quick answers – to drive better click-through rates from organic traffic sources when users conduct queries via speech recognition software. By leveraging innovative tactics such as this one alongside other traditional SEO methods, webmasters can increase visibility for their brand in both organic and paid channels – ultimately resulting in increased revenue over time.

How Can I Measure The ROI Of The SEO Efforts Of My Site?

Have you ever wondered how to measure the return on investment (ROI) of your SEO efforts? When tracking and understanding the effectiveness of search engine optimization, several steps must be taken to ensure success.

The first step is to take a look at your website analytics. This will help you understand where organic traffic comes from and other metrics such as page views, bounce rate, time spent on site, etc. You can then use this information to identify areas for improvement or opportunities for further optimization. Additionally, by keeping track of these metrics over time, you’ll have a better idea of what strategies are working and which aren’t.

Next, if possible, set up an SEO tracking system so that you can monitor performance more closely.

To do this effectively, you should focus on key performance indicators (KPIs) such as:

  • Organic Traffic:
    • Number of visitors
    • Pages per visit
    • Average session duration
  • Conversion Rate:
    • Percent of visitors who complete goals/objectives
    • Shopping cart abandonment rate

By monitoring your KPIs regularly, you’ll be able to determine whether or not your efforts have the desired effect and make adjustments accordingly. This will also allow you to assess the ROI of each individual strategy so that you can optimize accordingly and maximize results.
Finally, it’s important to remember that SEO optimization isn’t just about increasing rankings; it’s also about providing value through content creation and link-building activities to drive organic growth. If done correctly, these tactics can provide long-term benefits for your business and improve overall ROI. Keep testing different approaches until you find the best combination for your needs!

What Are The Best Seo Tools To Use?

When it comes to SEO, a variety of tools are available for website owners and marketers. Knowing which are best suited for your needs is key when improving the performance of a website’s architecture and technical SEO efforts. This article will discuss the best SEO tools to optimize any website.

An SEO analysis tool is one of the most essential pieces of software an SEO specialist should have. With such a program, you can easily identify what areas need improvement on your site and where potential gains or losses could occur from certain changes. These results can provide valuable insight into what keywords would yield better visibility and higher rankings on search engines like Google.

Another important element of SEO success is keyword research. A good keyword research tool will help you uncover which terms users are searching for related to your niche and industry so that you can create content targeted toward them and gain organic traffic from those queries. Additionally, with such data, you’ll know exactly how competitive each term is – giving you invaluable insights into which topics may offer more opportunities than others!

Furthermore, understanding who is linking to your website (and why) is just as vital as knowing what keywords people are using to find it; after all, backlinks act as votes of confidence in the eyes of search engine algorithms! Access to an effective backlink analysis tool makes it much easier to understand where these links come from (and whether they’re beneficial or detrimental). It also allows one to monitor competitors’ link profiles, highlighting areas where improvements can be made over time.

Lastly, performing periodic site audits helps ensure that no issues go unnoticed when optimizing a website’s architecture and performance for search engine crawlers. This process involves checking for broken links/images; identifying pages with thin content; reviewing page titles & meta descriptions; scanning source code; etc. All these tasks become significantly simpler through high-quality site audit software – ensuring nothing slips past our attention during implementation!

In short, by leveraging comprehensive SEO tools like analysis programs, keyword research utilities, backlink analyzers, and site auditing software – any webmaster or marketer has everything they need to improve their website’s architecture and performance for maximum SEO effectiveness!

How Do I Use Seo To Drive Traffic To My Website?

Are you looking for the most effective way to drive traffic to your website? SEO is one of the best strategies that can help boost visibility and online presence. According to statistics, an estimated 92% of all web traffic comes from organic searches. Leveraging SEO techniques helps optimize websites by improving their architecture and performance, ultimately driving more visitors.

To use SEO effectively, it’s necessary to understand how it works so you can maximize its potential. Improving a website’s architecture involves optimizing content with targeted keywords relevant to what the page is about and adding meta descriptions and title tags with those same keywords. This allows search engines like Google or Bing to find pages on your site easily when people search for these topics. Ensuring each page loads quickly and efficiently also contributes to good SEO performance.

Furthermore, various tools are available to measure a website’s overall health, such as Screaming Frog, which allows users to audit both internal and external links and check for broken links or missing images/alt text, or Moz Pro, which provides insights into keyword rankings along with other helpful information, including analytics data on domain authority levels across multiple sites. Utilizing these resources correctly will give you a better understanding of where improvements need to be made, helping you avoid inflated bounce rates caused by slow loading times or poorly optimized titles and descriptions.

Taking advantage of this type of strategy ensures greater visibility in SERPs (search engine results pages) for target audiences while providing valuable content at the same time – something which has become increasingly important nowadays, given consumer trends have shifted towards wanting more innovative solutions than ever before. It’s, therefore, essential for businesses who want success online to invest in SEO if they haven’t done so already – doing so will allow them to reach goals faster while still maintaining high-quality standards throughout the process.

Adopting a technical approach using proven methods consistently over the long term should bring desired outcomes within a relatively short period, especially if combined with other marketing activities such as paid ad campaigns or social media outreach efforts. In short, SEO is, without a doubt, a powerful tool worth considering seriously if you’re seeking ways to improve a website’s visibility online over extended periods of time!

Conclusion

As a technical SEO specialist, I have seen countless websites suffer from poor architecture and performance due to inadequate SEO practices. However, any website can be optimized for search engine success with the right strategies and best practices.

Taking proactive steps such as dealing properly with duplicate content and optimizing for voice search can greatly improve your website’s visibility on SERPs. Additionally, using tools like Google Search Console and Ahrefs allows you to measure the ROI of your efforts while driving targeted traffic to your site.

Ultimately, by utilizing these techniques, we can ensure that your website is well-optimized and performing at its highest potential. Therefore, through proper implementation of technical SEO best practices, you should see an improvement in both organic ranking and conversions over time.

Use the Canonical Tag to Boost Traffic and Authority on iloveseo.com by Carrie Powers

Whether you’ve realized it or not, you’ve come across plenty of duplicate content just by browsing the web. Sometimes it’s obvious—ever seen the same news article published on two different sites? That’s duplicate content. Other times, it’s so subtle it’s barely noticeable. If you type a URL into your browser bar without the www prefix but are automatically sent to a URL that does have it, you’ve just been moved from one duplicate page to another.

Controlling which of these duplicate pages search engines treat as the original can be achieved with the canonical tag, a powerful tool that tells search engines which duplicate page is the original version (i.e. the canonical version) and which is the secondary version (i.e. the non-canonical version). By using this tag effectively and sidestepping common mistakes, you can avoid duplicate content, boost search result rankings and authority, and improve a site’s user experience.

How the Canonical Tag Helps You Control Duplicate Content

Duplicate content is to search engines what a wrench is to a spinning gear. When a site has two or more identical versions of the same piece of content, Google doesn’t know which version to index, which should rank for search results or what to do with link metrics. Worse still, its bots may spend valuable time crawling multiple copies of the same page rather than crawling the site’s new or updated content.

However, it’s important to note that in general, duplicate content is not inherently deceptive. A site might have duplicate content for a variety of valid technical reasons, including inadvertent URL variations, improperly configured content management systems (CMS), different language variants and printer-friendly page versions. Accordingly, Google doesn’t typically penalize any duplicate content it perceives as innocuous.

Even so, the less duplicate content a site has, the more likely it is to be efficiently crawled and achieve greater prominence in search results.

This begs the question, how can duplicate content be avoided in the first place? The answer lies in the canonical tag, an HTML element that prevents and eliminates duplicate content issues. When you tag a page in this way, its address becomes a canonical URL.

By learning how to use this tag properly, you can boost a site’s visibility, performance and user experience in one fell swoop.

Choose a Canonical Page

First, specify which version of a page you want Google to view as canonical—i.e., choose which page version you want people to see in search results. Your preferred version should be the one with the best performance. If all versions perform equally well, pick your favorite one.

The simplest way to indicate which page is canonical is to use the canonical link element. In the <head> (not the <body>) of a non-canonical site page, insert the tag to direct search engines to the canonical one. The tag itself is both straightforward and brief.
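
For instance, assuming https://www.example.com/canonical-page/ is the preferred URL (the address is a placeholder), the element would look like this:

<link rel="canonical" href="https://www.example.com/canonical-page/" />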

To be clear, the canonical tag is technically an HTML element, not a tag; that designation belongs to the <link> portion of the element. Nevertheless, it’s almost always colloquially referred to as such, so to avoid confusion we’ll call it the rel=canonical tag or canonical tag for short.

You can also use the canonical HTTP header instead of the canonical link element. Google added this option to give webmasters the ability to canonicalize non-HTML documents such as PDF versions without increasing page size.

By adding the canonical HTTP header to a non-HTML page, you can choose the HTML page you want to direct search engine crawlers to.

Please note that using the canonical HTTP header is considered to be an advanced technique. Google has marked it as such because the headers can be difficult to keep up with on large sites or sites with frequently changing URLs and URL parameters.

If you feel up to the challenge, you can insert a canonical HTTP header using the following snippet of code in your page’s source code:

Link: <http://www.example.com/resources/ebook.pdf>; rel="canonical"

Throughout the canonicalization process, remember self-referencing canonical tags are OK. For example, the homepage www.example.com can point to the same URL, www.example.com. This may seem unnecessary, but it can help further clarify to search engines which page you want to be indexed.

Identify Canonicalization Issues

If the canonical tag is already being used on a site, it can be difficult to find pre-existing issues.

To avoid tediously combing through your XML sitemap, try using a free tool like the Screaming Frog SEO Spider (the Yoast SEO WordPress plugin is a stellar option too).

Once you’ve downloaded the SEO Spider, enter the URL of the site you want to analyze.

Then, click on the “canonicals” tab. This will bring up a complete list of the site’s canonical URLs and show you which pages are indexable and which aren’t.

Screaming Frog Canonicals Tab Screenshot

When reviewing this list, keep an eye out for canonical tags that:

  • Point to the wrong page. For instance, one non-canonical page might point to another non-canonical page instead of the canonical one.
  • Use the wrong URL. For example, a tag might point to a URL that doesn’t include a trailing slash (www.example.com) when it should be pointing to a URL that does (www.example.com/).
  • Send mixed signals to search engines. This can occur when page X points to page Y, and page Y points to page X. In that scenario, search engines won’t know which page is canonical.
  • Point to the first page of a paginated series. If the second page of a series points to the first page, then the second page will not be indexed.
  • Contain relative rather than absolute URLs. Relative URLs don’t specify the protocol (example.com), while absolute URLs do (https://example.com). If a tag contains a relative URL, search engines will likely interpret it incorrectly.
  • Appear multiple times on the same page. If a single page has more than one canonical tag, search engines will ignore all of them.

When fixing any canonicalization issues, always keep in mind even slight canonical URL differences can matter in the eyes of search engines.

The following URLs are all viewed as distinct by search crawlers:

  • http://www.example.com
  • https://www.example.com
  • http://www.example.com/
  • https://www.example.com/
  • http://example.com
  • https://example.com
  • http://example.com/
  • https://example.com/
  • www.example.com
  • example.com

So, be sure to keep things consistent while addressing existing canonicalization issues or adding new canonical tags, whether you’re using Yoast SEO, the Screaming Frog SEO Spider or another tool altogether.

Give Precedence to HTTPS Pages

Google specifies in its canonicalization guidelines that it prefers HTTPS pages over HTTP pages by default.

What’s the difference between the two? HTTP, or hypertext transfer protocol, is a communications protocol used to transfer information via the internet. HTTPS, or hypertext transfer protocol secure, is the same type of protocol, except it’s encrypted.

With HTTPS, data is transferred using the transport layer security (TLS) protocol. TLS offers three key security benefits: encryption, data integrity and authentication.

In an effort to protect user data and promote widespread encryption, Google has expressly encouraged the adoption of HTTPS over HTTP. Given that 95 percent of Google traffic was encrypted as of September 17, 2022, that effort has proven to be largely successful.

With that in mind, it’s prudent to specify HTTPS pages as being canonical while specifying any duplicate HTTP pages as non-canonical.

Allow Indexing on Canonicalized Pages

You can use the noindex directive to stop Google from indexing pages you don’t want to be included in search results, such as login or thank you pages.

At first glance, it seems logical to include noindex on non-canonical pages, too. If you’re going to point search engines toward one main canonical page anyway, why not block indexing on the pages you don’t want to rank?

The answer has to do with link equity, once known as link juice, a process in which external or internal links pass authority and ranking power to other links.

By adding the noindex directive to a non-canonical page, you’ll be losing any link equity that page may have, which can lower the ranking power of the canonical page. But since canonical tags pass link equity, they don’t create the same problem.

To avoid any negative impact on valuable internal link equity, allow indexing on canonicalized pages and save the noindex directives for pages that truly shouldn’t be indexed.

It’s also worth noting Google no longer supports noindex in robots.txt, so be sure to use noindex in either the HTTP response headers or page HTML instead. To check if search crawlers can access a page you don’t want to be indexed, try using the Google Search Console URL Inspection Tool.
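
Concretely, the two supported placements look like the sketch below; which one you use depends on whether you can edit the page HTML or the server’s response headers:

In the page HTML:
<meta name="robots" content="noindex">

Or as an HTTP response header:
X-Robots-Tag: noindex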

Take Advantage of Cross-Domain Canonicalization

By canonicalizing pages across multiple sites, you can tell search engines you’d like them to index a page’s content on a single domain rather than each one individually.

This is referred to as cross-domain canonicalization, a strategy often used to generate traffic from content syndication, i.e. content that’s re-published on sites other than the original.

For instance, one news website (site A) may publish original content that’s then re-published on another news site (site B). Site A gets exposure and increased organic traffic, and site B gets fresh and relevant content.

Site A can benefit from that scenario even further, though, by canonicalizing the article. Even though the article is re-published on site B, the canonical tag will tell search engines the definitive version of the article is on site A.

As a result, users making relevant search queries are more likely to see the original article as it appears on site A in search results.

Cross-domain canonicalization isn’t just useful for canonicalizing content on third-party sites, either. If multiple domains belong to the same owner and the same article is published on several of them, the site owner can use the canonical tag to specify which domain they want to show up in search results.
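
For example, if site B re-publishes an article that originally appeared on site A, the copy on site B could carry a canonical tag pointing back to the original; the domain and path below are placeholders:

<link rel="canonical" href="https://site-a.example.com/original-article/" />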

Prioritize Responsive Design

For sites with separate mobile URLs, you can set the desktop version as the canonical URL to tell search engines to index it instead of the mobile version.

However, Google explicitly recommends eschewing separate URLs in favor of responsive web design, which automatically adjusts page layout to suit the device type being used to view it.

With responsive design, if a user visits a page on a mobile device, they won’t be redirected to a separate mobile-specific URL (for instance, www.m.example.com).

Instead, the website will identify the type of device they’re using and alter its layout accordingly while using the same URL (www.example.com). This provides a better user experience, eliminates the need to manually create multiple layouts of the same site, and streamlines analytics and performance tracking.

As such, when you want to ensure a positive user experience across platforms and devices, it’s best to prioritize responsive web design over a separate canonical URL when possible.

Know When to Use 301 Redirects

There are times when the canonical tag isn’t the best way to specify which page is canonical. For example, when deprecating a duplicate page, Google recommends using a 301 redirect (also known as a 301 status code) instead.

In users’ eyes, the difference between the two options is that a 301 redirect means they never see the page they were trying to reach in the first place. With the canonical tag, they’re still taken to the URL they entered or clicked on, such as a URL for products in a specific color, even if it’s non-canonical.

What makes 301 redirects different from other types of redirects? Unlike 302 and 307 redirects, 301 redirects tell search engines the page in question has been permanently moved to a new location. By comparison, 302 and 307 redirects indicate a page has been temporarily moved to another location.

The result is that pages with 301 redirects immediately transfer about 95 percent of their link equity to the new destination page.

Preserving link equity this way can significantly impact a site’s authority and search rankings. Although 302, 307, and other types of redirects are no longer directly penalized, it takes some time for Google to realize the redirect is no longer temporary and start passing on link equity accordingly.

You can also replace a site’s 404 pages with 301 redirects when appropriate. For instance, you could use a 301 for a URL leading to a non-existent (but previously well-trafficked) page about custom shoes.

The redirect sends visitors to a current page about custom clothing and ensures Google indexes the correct page. Remember, though, to always guide visitors to the most relevant alternative page possible.

If you do decide a 404 page would be more appropriate (for instance, for a URL that received minimal traffic or was never functional to begin with), consider using custom 404 pages for a better user experience.

The Rel=Canonical Tag? You Can Rel=Conquer It

Proper use of canonical URLs isn’t as complicated a subject as it may initially seem. From implementation to troubleshooting to fine-tuning, anyone can master the rel=canonical tag with the right strategies (and reap SEO benefits in the process).

Once you’ve canonicalized a site’s pages and blog posts with the canonical tag, you’ll be able to unify pesky duplicate versions of identical content, direct search engine traffic where you want it to go while boosting your authority, improve overall search engine optimization, and create a streamlined and intuitive user experience.

Image Credits

Screenshot by author / March 2020

How Many Sites on WPEngine Have a Staging Subdomain Indexed? https://iloveseo.com/seo/how-many-sites-on-wpengine/ https://iloveseo.com/seo/how-many-sites-on-wpengine/#respond Wed, 24 Mar 2021 17:02:58 +0000 https://iloveseo.com/?p=1678 How Many Sites on WPEngine Have a Staging Subdomain Indexed? on iloveseo.com by Brian Harnish

How Many Sites on WPEngine Have a Staging Subdomain Indexed? on iloveseo.com by Brian Harnish

We posed an intriguing question to ourselves: How many people have their staging servers indexed, and how many of those are also hosted on WPEngine?

During the course of our SEO work, we’ve frequently run into clients who have their staging servers indexed. This happened often enough that we began to wonder if there are any specific hosting servers to blame for this phenomenon.

In the end, we think that it’s a matter of who the webmaster is and whether or not they’re savvy enough to know that they should not be indexing the staging site.

But even though it’s the webmaster rather than the hosting provider who’s responsible for managing important indexing details, less experienced site administrators can encounter duplicate content issues stemming from indexed staging servers.

Why Should Staging Servers Be Noindexed?

The reason behind this optimization point is that the staging site can be a 1:1 duplication of the client’s main site. This is what we find to be true in the majority of cases, and it’s a significant cause of client site performance issues on Google.

It is well-known that Google does not like duplicate content, and duplicate content does the affected websites no favors, either.

However, many webmasters never check whether their staging site is indexed, or whether the site is also indexed under its IP address. Both can damage the site’s overall ranking and traffic performance.

Our Findings Regarding WPEngine

At the time of writing, we’ve discovered that approximately 2,050,000 WPEngine sites have this basic issue that isn’t resolved.

Here is how we arrived at this conclusion.

Using the site: operator, we searched Google for the wpengine.com domain name with a wildcard subdomain.

In other words, site:*.wpengine.com

Using this Google operator, it’s possible to see how many sites with staging addresses on wpengine.com are indexed.

Screenshot of Google Search Operator used to determine indexation of WPEngine staging sites.

While this won’t reveal sites that have their own staging.domainname.com on WPEngine, it does give a rough approximation.

This means that sites which are not addressing this issue properly could be missing SEO opportunities and failing to overtake their competition.

As such, resolving this issue may make SEO much easier for these sites.

What Is the Best Way to Resolve This Issue?

Do a quick check and see if your site is actually being indexed through the staging subdomain or its IP address.

You can do so by using the site: search operator for your full staging address and entering it on Google search. Your query should look like the following:

site:https://staging.sitename.com/

This will provide you with data directly from Google that shows, in detail, the staging subdomain(s), IPs and other elements that are currently indexed.

All you have to do is make sure the offending URLs are removed from Google via the robots.txt file, noindex directive, or Google’s URL Removals tool.

We recommend exercising extreme caution when you are using Google’s URL Removals tool, however, because if used improperly it can cause irreparable damage.
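
As a gentler alternative, you can keep the staging site reachable but tell search engines not to index it. A minimal sketch, assuming the staging address used earlier, is a robots noindex meta tag on every page of the staging environment (or the equivalent X-Robots-Tag HTTP header set at the server level):

<!-- Added to the <head> of every page on https://staging.sitename.com/ -->
<meta name="robots" content="noindex, nofollow">

Keep in mind that a robots.txt block on its own only stops crawling; it won’t necessarily remove URLs that are already in the index, which is why a noindex directive or the URL Removals tool is usually needed as well.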

Fixing This Issue Will Improve Your SEO

You would be surprised how many times we run across this issue in an SEO audit, and it’s not limited to WPEngine sites. It happens so often, in fact, that we have made it a checklist item in our official audit process.

This is why it’s so important not to leave any stone unturned during the audit process: You may find an issue (like duplicate content) that would otherwise make it almost impossible for you to compete at all.

Why not make it easier on yourself at the outset and achieve higher rankings in the process?

Google Has an Update for Page Speed Insights Scores https://iloveseo.com/seo/google-has-an-update-for-page-speed-insights-scores/ https://iloveseo.com/seo/google-has-an-update-for-page-speed-insights-scores/#respond Thu, 11 Mar 2021 03:52:30 +0000 https://iloveseo.com/?p=1496 Google Has an Update for Page Speed Insights Scores on iloveseo.com by Brian Harnish

Google Has an Update for Page Speed Insights Scores on iloveseo.com by Brian Harnish

It appears that Google has made sweeping changes to how their page speed insights scores are calculated, resulting in a significant improvement of overall scores across the board.

Don’t get too excited, though: It appears that the improvements occurred on their side. Let’s explore what that entails and what it could mean for the site you’re optimizing.

What Exactly Is Google Page Speed Insights?

Google’s Page Speed Insights (PSI) is a tool used by many webmasters to gauge page speed and performance on both mobile and desktop devices. It also displays Core Web Vitals, which are crucial components of the upcoming page experience update:

Page Speed Insights metrics for Twitter's homepage URL, twitter.com.

There are certain weaknesses in terms of how it processes data, but that is easily rectified by also looking at other tools such as GTMetrix, Pingdom, or WebPageTest.org.

Ideally, one could use a single tool to capture the entire page experience from both the server and client side. Sadly, no single tool does it all, so it’s necessary to use more than one in order to gauge your site’s performance accurately against industry benchmarks and metrics.

But, when it comes to Page Speed Insights, it does the job.

Myths about Page Speed Tools

Myth 1: It’s Possible to Capture Page Speed Using One Metric

Unfortunately, this is not the case. Identifying specific metrics involves examining a series of key milestones in your users’ journeys. You must understand all metrics that are involved in creating the final numbers and then you must ensure that they are accurately represented by your tool of choice.

Myth 2: If I Create One Score, My Rankings Will Magically Increase

This is, sadly, not how things unfold. It takes significant tweaking to identify the specific improvements a site’s page speed needs, and only then can you focus on the metrics that will be most meaningful within a given tool.

There are many different tools available, though, and not every tool will give you the exact solution you need. This is where an SEO analyst comes in. The analyst, if they are doing their job correctly, will assess specific metrics according to many different tools and put together recommendations that will allow you to make prioritized improvements to your page speed numbers.

Myth 3: Every Tool Is Equal When It Comes to Considering Different Devices

Again, there are too many factors in this equation for any single tool to give an accurate assessment. Certain page speed tools give measurements that apply to particular devices at a particular point in time, but devices and software keep evolving. Tools are never completely equal (or accurate), and it’s up to the analyst to come up with recommendations that can be tailored accordingly.

Myth 4: Fast Websites Will Immediately Result in Better Conversions

This is seldom the case. On their own, faster websites do not necessarily lead to better conversions or more customers. You can have the fastest website in the world, but if you don’t have content, links or a good SEO strategy, you will probably not reach the higher conversion rates you’re seeking. Page speed, in combination with the factors mentioned above, is what will help you achieve the conversions you deserve.

Two Different Types of Data Sets

Page Speed Insights uses two different data sets in order to track and report on performance. They use Real-World Field Data and Lab Data.

Real-World Field Data

Page Speed Insights field data showing page speed attributes

Page Speed Insights, when given a specific URL, will look up the URL and its metrics in the Chrome User Experience Report data set. Assuming that Page Speed Insights has this data available, it will examine the metrics behind Core Web Vitals.

These include First Contentful Paint (FCP) alongside the three Core Web Vitals: Largest Contentful Paint (LCP), First Input Delay (FID) and Cumulative Layout Shift (CLS). The Core Web Vitals are the page speed elements you must get right in order to achieve a ranking benefit.

Lab Data

This type of data comes from Google’s open-source Lighthouse tool, which analyzes a given URL and generates performance scores based on Core Web Vitals metrics. But it also gives two additional data points: Time to Interactive (TTI) and Total Blocking Time (TBT).

Why Must I Care about These Metrics?

These metrics underpin Google’s upcoming page experience update, scheduled to roll out in May 2021. They will become the foundation on which page speed is measured and calculated.

Roger Montti reported that “targeted improvements on any specific sites will not be reflected in field data, especially CWV scores, until the next 28-day cycle is reported on.”

Google also announced a significant change in how Page Speed Insights gathers data. Specifically, it’s switching to the HTTP/2 network protocol.

This is why the page speed updates are significant and why we must consider this when working on page speed analyses.

With faster data transfer speeds thanks to the newly implemented protocol, you will not see delays in page speed processing, which is why this improvement is being shown across all metrics.

The one limitation is if your server does not support HTTP/2, but this is highly unlikely. If you’re unsure whether your host supports it, however, you may want to inquire about HTTP/2 and make sure that it’s enabled, just in case.

Google’s Official Page Speed Insights Update Announcement

This is Google’s official announcement regarding their updates to HTTP/2:

“As of March 3, 2021, Page Speed Insights uses http/2 to make network requests, if the server supports it.

 

…With this change, network connections are often established quicker. Given your requests are served in h2, you can likely expect metrics and the performance score to improve.

 

In general, performance scores across all PageSpeed Insights runs went up by a few points.”

A significant update, indeed. We are looking forward to page speed increases based on little effort on our part, and we’re sure there are plenty of SEO pros who feel the same!

What is the Difference Between HTTP and HTTP/2?

In 1997, the very first standardized version of Hypertext Transfer Protocol (HTTP) was created. Designed to facilitate communication between web browsers and servers, HTTP was (and is) what makes web browsing possible. It went through several early iterations of development, and eventually wound up as HTTP/1.1. This is the default for the web as a whole.

In 2015, a brand-new version of HTTP called HTTP/2 was created.

HTTP/2 solves many problems that the creators of HTTP did not foresee. HTTP/2 is significantly faster and much more efficient than HTTP/1.1.

HTTP/2 prioritizes content during the loading process, which is why it is considerably faster in comparison.

How Can I Tell if HTTP/2 is Enabled?

Screenshot of the 'Network' tab within the Chrome Web Developer Console.

The method is quite straightforward. Simply bring up the Chrome Web Developer Console (hit the F12 key on your keyboard to do so), click on the Network tab and turn your attention to the Protocol column. It should list h2 (HTTP/2) as the protocol in use. If it says anything else (as is the case in the screenshot) you have some work to do.

Luckily, the process is relatively easy. If you’re familiar with server development, you might turn to Kaizen’s seven-step guide to learn how. If you’re not, just get in touch with your server’s administrator.
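
If you’d rather check from the command line, a reasonably recent version of curl (built with HTTP/2 support) can confirm the protocol as well; the domain below is a placeholder:

curl -sI --http2 https://www.example.com/ | head -n 1

If the first line of the response reads HTTP/2 (for example, HTTP/2 200), the server is speaking HTTP/2; a response beginning with HTTP/1.1 means it isn’t.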

Does HTTP/2 Require Encryption?

No, HTTP/2 does not technically require the use of encryption. However, some implementations have stated that they will not support HTTP/2 over unencrypted connections.

And here’s the real rub: At the time of writing, no existing web browser actually supports HTTP/2 on unencrypted connections.

So while encryption may not be technically required, it is practically necessary.

Google PSI Scores are Higher, but Probably Not From SEO Work

This is another element to consider beyond some of the specifics mentioned above. If you have made recent changes to your page speed, and you think these have resulted in improvements, you may want to double-check and test these changes on other pages. It’s likely that the improved results we are experiencing can be attributed to Google’s HTTP/2 update rather than any SEO measures implemented in the recent past.

Image credits / Screenshots by author, March 2021

How Google’s Page Experience Update Will Really Impact Your Metrics https://iloveseo.com/seo/googles-page-experience-update-metrics/ https://iloveseo.com/seo/googles-page-experience-update-metrics/#respond Thu, 04 Mar 2021 21:59:57 +0000 https://iloveseo.com/?p=1341 How Google’s Page Experience Update Will Really Impact Your Metrics on iloveseo.com by Carrie Powers

How Google’s Page Experience Update Will Really Impact Your Metrics on iloveseo.com by Carrie Powers

Google’s much-anticipated page experience update is just a few short months away, and SEO practitioners everywhere are anxious to see how their sites’ rankings will be affected.

Fortunately, Google’s public Search Liaison Danny Sullivan took to YouTube and Twitter to shed some light on the ways the new update will (and won’t) impact key metrics.

Chaos Will Not Ensue

On February 23, Danny Sullivan joined Cherry Sireetom Prommawin, Martin Splitt and Ashley Berman Hale for a Search Central Live event streamed on YouTube:

A little over 20 minutes into the discussion, Splitt asked a question that’s crossed the minds of countless SEO pros over the past year: “Do we expect the [page experience update’s] impact on the metrics to be significant or more subtle?”

Sullivan’s response is sure to soothe some of your concerns:

I mean, I think if you go back and look at how we’ve had these sorts of things over time, it really isn’t that, OK, then the next day everything completely changes. There’s no intent to try to do that, even though we might say we start using [page experience] as a factor.

And in a February 24 tweet, he reiterated that “it shouldn’t be the case that overnight, we flip some sort of switch and there’s a massive change. That’s not typically how rollouts of this nature (such as speed, mobile-friendly) have worked.”

Danny Sullivan on the Page Experience Update

Translation? Don’t panic! While the addition of a brand new ranking factor sounds earth shattering, the page experience update won’t immediately tank (or boost) your rankings.

Relevant and High-Quality Content Will Still Be King

In the Search Central Live discussion, Sullivan emphasized that as a ranking factor, page experience will not supersede relevant, high-quality content:

First of all, [page experience] remains one of many things. Secondly, it’s always the case that we’re going to try to return the best content based on this basket or bucket of things.

 

So maybe you don’t have the best page experience. But if you’re still the most relevant content, that is overall on various things we’re looking at.

 

So I think it’s not a case of start being all super concerned, and understand that we want to make sure that this is coming in a moderated fashion.

In other words, there are still hundreds of ranking factors that will determine whether the site you’re optimizing sinks or swims in the SERPs, with content relevance and quality being some of the most important among them.

To make your site’s content meet Google’s exacting quality standards:

  • ensure each page accomplishes both its own unique purpose and the site’s overall purpose;
  • only publish copy that’s well-written, original and on-topic; and
  • work to improve the site’s expertise, authority and trust, or E-A-T.

As long as you never let those best practices fall by the wayside, a great page experience will serve to augment your site’s current rankings, not define them.

Page Experience Will Gain Importance Over Time

While the page experience update might not wreak havoc on rankings right away, it is expected to have a comparatively greater impact on sites’ metrics in the long run.

As Sullivan explained at the Search Central Live event, “over time, what will happen is, as more and more content is coming up in page experience, and if you’re in a situation where things are all relatively equal, the things that are more page experience-oriented are likely to start doing better.”

In other words, the more SEO practitioners upgrade their sites’ page experience, the more important page experience will be as a ranking factor.

Sullivan elaborated further on Twitter, adding that “when mobile-friendliness began, plenty of pages still needed to become that way. So while it was a factor, using it more heavily as a factor initially doesn’t make much sense. But over time, it (like any factor) might become more valuable.”

Danny Sullivan on Page Experience Update and How it Affects Mobile Friendliness

“So with page experience,” he continued in the same thread, “it could become a more important factor over time than with an initial launch as a great page experience becomes more common to pages. But also, and as we’ve kept saying, it’s one of many factors.”

He then linked to a Google Search Central document about the page experience update and highlighted a paragraph expressing a similar sentiment:

Understand How Page Experience Will Affect Ranking

The bottom line is that while Google’s rollout of the page experience update might not immediately take a monumental toll, there’s a good chance it will make a greater and greater impact as time goes on.

So, it’s crucial for SEO practitioners to avoid complacency and make top-notch page experience a long-term goal.

A Little Less Conversation, a Little More Action, Please

As Danny Sullivan emphasized in the Search Central livestream, if there’s something you should be doing that you’re not already, Google will typically make sure you’re aware. That’s because Google doesn’t want sites’ rankings to plummet; rather, it wants SEO pros to have actionable information they can use to achieve the best rankings possible.

So if you want to make the most of the page experience update and earn the highest possible rankings, Sullivan recommends spending less time speculating and more time taking action to generate tangible results.

Image credits
Screenshots by author / March 2021

Keyword Cannibalization: How to Stop Your Site from Getting Eaten Alive https://iloveseo.com/seo/technical-seo/keyword-cannibalization-how-to-stop-your-site-from-getting-eaten-alive/ https://iloveseo.com/seo/technical-seo/keyword-cannibalization-how-to-stop-your-site-from-getting-eaten-alive/#respond Mon, 26 Oct 2020 13:19:07 +0000 https://iloveseo.com/?p=359 Keyword Cannibalization: How to Stop Your Site from Getting Eaten Alive on iloveseo.com by Carrie Powers

Keyword Cannibalization: How to Stop Your Site from Getting Eaten Alive on iloveseo.com by Carrie Powers

After spending time and effort optimizing a high-quality page to rank for its target keywords, the last thing you want to see is it eclipsed by another page from the same site. Unfortunately, such a scenario is entirely possible when keyword cannibalization rears its head.

Want to prevent keyword cannibalization from ruining your pages’ hard-earned rankings? All you need to do is implement a few simple tactics to maximize conversions, improve the user experience and ensure your best pages achieve the highest rankings possible.

What Is Keyword Cannibalization and Why Does It Matter?

Let’s say you’re trying to rank for the keyword phrase custom t-shirts. The more blog articles you publish including that keyword phrase, the better off you’ll be, right?

Not necessarily: If your site has dozens of blog articles all including the keyword phrase custom t-shirts, they’ll all compete with each other to rank for that same phrase. This is an issue known as keyword cannibalization, and it’s especially important to monitor since Google stopped showing more than two listings from the same domain in its top results.

Google does this in the name of site diversity, a concept designed to provide users with more choices. To see site diversity in action, perform a Google search for any keyword and notice how many different domains you see on the search engine results page (SERP).

When searching for drug store, for instance, Walgreens, CVS, The Online Drugstore and Rite Aid all have a place on the first page, with none appearing more than once:

Google search results for the query drug store, showing site diversity in action.

When keyword cannibalization gets out of hand, a site’s less important pages may end up ranking higher for a main keyword than its most important pages. When trying to rank for custom t-shirts, for example, you probably wouldn’t want an outdated blog post ranking higher than the site’s main page.

Keyword cannibalization can also reduce page rankings by diminishing both authority and links. Think of it this way: Instead of having one highly authoritative page with lots of inbound links, you might have several moderately authoritative pages with only a small number of inbound links each.

All in all, keyword cannibalization is a pesky issue that can sneak up on SEO practitioners without warning, especially when the site they’re optimizing has been up for several years and may have many pages targeting the same keywords.

Luckily for you, keyword cannibalization is far from impossible to fix. With a few straightforward tactics, you can stop your pages from gobbling each other up and start enjoying higher rankings, more clicks and increased conversions.

Find Keyword Cannibalization Issues

To identify a site’s existing instances of keyword cannibalization, you’ll need to examine its organic keyword data. This can be accomplished with the help of an organic keyword analytics tool that reports which keywords each of a site’s pages ranks for.

Once your chosen tool is ready to go, analyze the site you’re optimizing to see which pages are ranking for which keywords. Whenever you see multiple pages ranking for the same keyword, it’s time to dig deeper. Ask yourself:

  • Which page is ranking highest for the given keyword?
  • Which generates the most conversions?
  • Do any of the pages fail to meet the site’s quality standards?

If the highest-ranking page drives significantly fewer conversions or is much lower-quality than its lower-ranking counterparts, you’ve got yourself a classic case of keyword cannibalization.

But before you start nixing every instance of keyword cannibalization in sight, it’s important to recognize when they aren’t an issue. For instance, if you have two high-quality pages ranking first and second for a given keyword, and especially if each page provides visitors with a uniquely valuable experience, there’s no need to give up that valuable SERP real estate.

Merge Similar Pages

If a site has two medium- or low-performing pages ranking for the same keyword, or a high-quality page ranking lower than a low-quality page on the same topic, you may benefit from combining those pages into one.

For example, let’s say you’re optimizing a site for an urgent care, and it has two pages ranking for the keyword how to choose a doctor. Both are performing moderately well: Each is linked to by a few other sites, and each regularly generates a modest number of conversions.

But what if they could be combined into one ultra-authoritative page that’s linked to by many other sites and generates a higher number of conversions than both individual pages put together? You may be able to achieve such a result by merging the two original pages.

As Google Webmaster Trends Analyst John Mueller explained in a 2018 livestream:

…if you take two or three or four kind of weaker pages and merge them into one, even within the same site or externally, then…we can say this is a stronger page. We can see that…more parts of the site are referring to this one single piece of content, so it’s probably more relevant than those individual small pieces that you had before.

Before you start combining all your site’s remotely similar pages, remember to follow a few key guidelines:

  • Only merge pages that are highly relevant to each other, aim to achieve the same purpose and cover the same topic.
  • During the merging process, take the time to edit the pages into one cohesive piece of content.
  • Add even more value to the merged page than each of the original pages previously had—in other words, aim for a whole that’s greater than the sum of its parts.

De-optimize Low-Performing and Less Relevant Pages

Let’s say you have two pages ranking for the same keyword. Page A is less relevant to the keyword and doesn’t generate many conversions, but it appears higher in the SERP. Page B is more relevant to the keyword and boasts a higher conversion rate, but it doesn’t get as much traffic.

If page A serves its own unique purpose but you don’t want it ranking higher than page B for the given keyword, you can choose to solve the issue by de-optimizing page A for that same keyword.

The simplest way to do this is by removing the keyword from page A’s title and headers, as well as most of its text. Be sure to replace the keyword with one you do want the page to rank for. By doing so, you’ll ensure both pages rank for the most relevant keywords possible without compromising each other’s SERP position.

Delete Low-Quality Pages

In some cases, the page ranking highest for a given keyword isn’t just less relevant or underperforming—it’s flat-out low-quality.

Pages can be low-quality for a variety of reasons. For instance, they could:

 

  • contain too little content;
  • provide little or no value to users;
  • contain uncanonicalized duplicate content (more on that next);
  • be severely outdated;
  • be incompatible with modern browsers;
  • be poorly written; or
  • fail to meet their purpose.

Whatever the case, there’s no reason to allow low-quality pages to leech SERP rankings from high-quality ones. In such instances, it may be best to save yourself the trouble of merging or de-optimizing and simply delete the offending pages.

Prevent Indexation

In the case of duplicate content-related keyword cannibalization, merging, de-optimizing or deleting pages may not be the best option—instead, you could benefit most from disabling search engine indexation altogether.

Duplicate content can occur for a variety of reasons, including:

  • URL variations such as www.example.com versus http://example.com;
  • printer-friendly or PDF versions of existing pages; and
  • product descriptions copied across multiple pages.

In such situations, keyword cannibalization comes into play when a page containing duplicate content (i.e., a non-canonical page) ranks higher for a keyword than the page you do want to rank (i.e., a canonical page).

Several tactics can be employed to resolve this issue and prevent non-canonical pages from being indexed. For instance, you can use:

  • the canonical tag to tell search engines which version of a page is canonical (and thus should be indexed) and which version is non-canonical (and thus shouldn’t be indexed);
  • 301 redirects to redirect visitors and search engines from non-canonical pages to canonical ones; or
  • the noindex meta tag to directly tell search engines not to index a page.

With indexation disabled in the right places, you’ll be able to take control of your pages’ rankings and direct search engines to those that matter most.
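
To make the canonical tag and noindex options concrete, here’s a minimal sketch using the custom t-shirts example from earlier (URLs are placeholders): a printer-friendly duplicate can point a canonical link element at the main version of the page, while a page that should stay out of the index entirely can carry a noindex meta tag.

<!-- In the <head> of the printer-friendly duplicate -->
<link rel="canonical" href="https://www.example.com/custom-t-shirts/">

<!-- In the <head> of a page that shouldn't appear in search results at all -->
<meta name="robots" content="noindex">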

Kill Keyword Cannibalization to Bring Your Rankings to Life

No SEO practitioner wants to see their pages eat each other’s rankings, but when keyword cannibalization runs rampant that’s exactly what can happen.

Fortunately, it only takes a few quick and straightforward tricks to nip keyword cannibalization in the bud and enable your pages to achieve the rankings they deserve.

Image credits

Screenshot by author / October 2020

Write a Killer Meta Description to Hook Readers and Reel in Clicks https://iloveseo.com/seo/on-page-seo/write-a-killer-meta-description-to-hook-readers-and-reel-in-clicks/ https://iloveseo.com/seo/on-page-seo/write-a-killer-meta-description-to-hook-readers-and-reel-in-clicks/#respond Mon, 26 Oct 2020 12:18:17 +0000 https://iloveseo.com/?p=362 Write a Killer Meta Description to Hook Readers and Reel in Clicks on iloveseo.com by Carrie Powers

Write a Killer Meta Description to Hook Readers and Reel in Clicks on iloveseo.com by Carrie Powers

As an SEO practitioner, your daily tasks likely involve thinking about lots of ranking factors. So, hearing that something isn’t a ranking factor might make you think your attention is better spent elsewhere.

But when it comes to meta descriptions, nothing could be further from the truth. With a polished meta description designed to pique readers’ interest and set each site page apart, you can achieve a higher click-through rate and climb the search engine ranks.

How a Great Meta Description Can Benefit You

We know you’re eager to start learning about the ideal meta description length and uncover the best techniques, but we first need to clear up one fundamental issue: What is a meta description, exactly?

Simply put, it’s a brief piece of text included in the metadata of a page. In just one or two sentences, it serves to tell readers and search engines what the page is about and the value it offers.

In some cases, Google may even use a page’s meta description to create a results snippet. This is the text that shows up under the page title on the search engine results page (SERP):

Meta description snippet

After analyzing nearly 200,000 pages ranking in the top 10 Google results for their respective keywords, Ahrefs found that Google uses a page’s custom meta description 37.22 percent of the time:

Ahrefs chart showing Google uses pages’ custom meta descriptions 37.22 percent of the time.

But if meta descriptions aren’t a ranking factor and aren’t always used by Google to form results snippets, how can they aid your SEO efforts? The answer concerns humans rather than bots: When search engine users see a well-written meta description that not only shows them a preview of the page’s content but also captures their attention, they’re more inclined to click on that result.

To understand why, put yourself in the reader’s shoes. Let’s say you’re searching Google for hotel discounts and skim over the top two results. The first has a meta description reading:

Best hotel discounts.

On the other hand, the second result has a meta description reading:

Make your next trip a breeze. Compare hotel prices and find the best rates at hundreds of hotels around the world.

Unless you value brevity above all else, we’re willing to bet you’ll choose the second result. That’s the true value of a thoughtfully crafted meta description: If Google opts to use it for the results snippet, it has the ability to draw in visitors who might have otherwise clicked on a different result. And, as the page’s click-through rate (CTR) improves, so can its overall SERP ranking—in this way, meta descriptions act as indirect ranking factors.

Long story short, if you’re trying to maximize a page’s SERP ranking then you can’t afford to overlook the power of a fantastic meta description.

How to Add a Meta Description

If you don’t already know how to add a meta description to a site page, it’s time to get familiar with the nuts and bolts of the process. Luckily for you, it couldn’t be simpler—if you can copy and paste text, then you can insert a meta description.

To start, open up the page’s HTML file and find the <head> section located at the top of the file. Then, copy the following meta tag and paste it anywhere within that section:

<meta name="description" content="Your meta description here">

While you’re at it, you can also add other meta tags such as those for keywords and the author’s name.
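
Put together, the top of a page’s HTML file might end up looking something like the sketch below; every value here is a placeholder you’d replace with the page’s own details:

<head>
  <title>Your Page Title</title>
  <meta name="description" content="Your meta description here">
  <meta name="keywords" content="keyword one, keyword two">
  <meta name="author" content="Author Name">
</head>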

What’s the Best Meta Description Length?

Although Google has its own meta description guidelines, it’s frustratingly silent on recommended length. Perhaps that’s because meta descriptions can technically be any length—however, there is a limit on the number of pixels (and therefore characters) Google will display in a results snippet.

Fortunately, plenty of SEO experts have already gone through the trouble of counting those pixels and the number of characters that can fit within them. For example, Moz has observed Google cutting off snippets more than 155–160 characters in length. As such, we recommend you err on the side of caution and limit your meta description to 155 characters or less.

On the flip side, you also want to give readers a comprehensive preview of the page by making your meta description as detailed as possible—we’ve found that an adequately thorough description typically clocks in at 100 characters minimum.

To get the length right, try writing your meta description right in a free word counting tool. EasyWordCount.com is a simple, no-frills option, while WordCounter also offers grammar and spell check, an undo button and other handy features.

Decide Which Keyword to Emphasize

Although Google doesn’t use meta descriptions as a ranking factor, it does appear to more frequently use a page’s custom meta description for queries that include fat-head keywords (i.e. those that are short, widely-used and have a high search volume) rather than long-tail keywords (i.e. those that are long, less widely-used and have a lower search volume).

The Ahrefs study found that Google is about six percent more likely to use a page’s custom meta description for fat-head keywords than it is for long-tail keywords:
Ahrefs chart comparing how often Google uses custom meta descriptions for fat-head versus long-tail keywords.

But if users decide which type of keyword to search for and Google decides whether or not to display the custom meta description, why does this matter for you? Because when you include a fat-head keyword in your meta description, Google will bold that keyword in its search results. Take for example the search results for WordPress tutorials:
Google search results for WordPress tutorials, with the searched keyword bolded in each snippet.

Those bolded keywords aren’t just eye-catching—they also tell users the page in question addresses the exact topic they’re searching for.

The lesson? Don’t focus on long-tail keywords. Instead, identify the fat-head keyword a page is most likely to rank for and add it to your meta description.

Get Specific and Demonstrate Value

The number one most important function of a meta description is to tell readers (and search engine bots) what the page it’s attached to is about. This is true for every type of page, from individual articles to landing pages to download forms.

For you, this means you need to strive to make each meta description as specific as possible. For instance, let’s say you’re creating a meta description for a page where visitors can download an e-book about fitness. A more generic description might look something like this:

Download our e-book today.

While such a description would be better than none at all, you can do much better with a more specific representation of the page’s contents. For example:

Sick of spending hours at the gym? Download our e-book for dozens of equipment-free exercises you can do anytime, anywhere, no membership required.

A description like this won’t just tell readers that the page will let them download an e-book, or even that the e-book is fitness-related. Instead, it will tell them exactly what kind of exercises the e-book contains as well as how they can benefit from it—in this case, by avoiding a pricey gym membership.

The benefit you choose to highlight is the key meta description ingredient that serves to establish value and rouse readers’ interest before they even open the page. To better understand how the best meta descriptions accomplish this, check out some real-world examples.

A Google search for vegetarian recipes yields more than one billion results, all of which have their own meta description or auto-generated results snippet. And yet, the listing for Allrecipes’ vegetarian recipe collection stands out because it tells us what to expect inside (hundreds of user-reviewed recipes) and how we can benefit as a result (by achieving a healthier diet). As an added bonus, it hasn’t been truncated:
Google result snippet for Allrecipes’ vegetarian recipe collection.

By the same token, a search for airpods review returns over four million results. The Airpods Pro review from Wired magazine sets itself apart with its concise description of the page’s content (a review of Apple’s AirPods Pro) and attention-grabbing language—saying the Pros are “what the original AirPods should have been” is certainly a strong statement:

Google result snippet for Wired’s AirPods Pro review.

With such a carefully crafted combination of specificity and offered value, any page can start seeing more clicks and higher engagement.

Stick to the Style Guide

You’re already using a style guide to inform the syntax and tone of a site’s content (or at least we hope you are), but you might not realize you should also be using it to write meta descriptions.

Think of it this way: Besides the title, the very first impression search engine users get of any given page is the snippet displayed under its listing. By writing a meta description that meshes with the site’s overarching voice, you can create a cohesive user experience and ensure readers get the best first impression possible.

Start with grammar and spelling. For instance, if a site spells e-book as ebook, the meta description should too. The same goes for stylistic details such as Oxford commas and OK versus okay.

Next, incorporate the same voice used on the rest of the site. For example, maybe a site’s voice is predominantly:

  • friendly and informative;
  • authoritative and wise;
  • casual and humorous; or
  • witty and succinct.

Whatever the case, it’s your job to ensure the site’s voice is present in the meta description too.

Research the Competition

While creating meta descriptions entirely from scratch can yield unique and creative results, you can also get useful ideas by looking at the competition’s.

Let’s say you’re writing a meta description for a landing page selling camping tents. To get a feel for your competitors’ meta descriptions, perform a Google search for camping tents. Look for results whose snippets haven’t been cut off by Google and which appear to be based on an actual meta description; in other words, look for those that don’t end in an ellipsis. The meta description for REI’s camping tents page is a prime example:

Google result snippet showing the meta description for REI’s camping tents page.

Taking inspiration from that meta description, you may decide to enhance your own description by adding a sentence advertising the site’s reasonable shipping fees, or you could choose to highlight its knowledgeable staff who can help customers choose the right product.

Have a specific competitor in mind? You can see any page’s metadata by viewing its source code in your browser. Then, press Ctrl+F (or Command+F for Mac) and type name="description". This will jump you straight to the meta description:

A page’s meta description highlighted in its source code.

Revise Old Meta Descriptions

As you hone your skills and learn how to craft increasingly high-quality meta descriptions, it’s crucial to routinely revise a site’s old meta descriptions. This will allow you to draw more visitors to both high- and low-ranking pages, all while making their subject matter easier for search engines to identify.

When refreshing old descriptions, remember to check for:

  • length;
  • clarity and specificity;
  • demonstration of value;
  • style guide adherence; and
  • quality compared to the competition.

Updating previously published meta descriptions will also give you the chance to keep up with any fluctuations in Google’s length restrictions; they’re known to change unpredictably, sometimes shifting down and up again within a span of just a few months. So, it pays to stay on top of the latest changes and promptly tweak a site’s existing meta descriptions accordingly.

Craft Descriptions That Would Hook You

Meta description creation is an art, not a science. To write ones that successfully lure in users, you need to craft them with humans like you in mind (unless you’re a robot—if so, 01101000 01101001!).

With a focus on creativity and originality, as well as a keen sense of what makes people click on results in the first place, your meta descriptions can become powerful tools for intriguing users, boosting CTR and driving traffic, all in 155 characters or less.

Image credits

Screenshots by author / September 2020

Ahrefs / September 2020

Assessing Crawlability to Remove Ranking Roadblocks https://iloveseo.com/seo/technical-seo/assessing-crawlability-to-remove-ranking-roadblocks/ https://iloveseo.com/seo/technical-seo/assessing-crawlability-to-remove-ranking-roadblocks/#respond Mon, 14 Sep 2020 14:00:48 +0000 https://iloveseo.com/?p=315 Assessing Crawlability to Remove Ranking Roadblocks on iloveseo.com by Carrie Powers

Assessing Crawlability to Remove Ranking Roadblocks on iloveseo.com by Carrie Powers

Google may be a tech industry colossus (the kraken of search engines does have a certain ring to it), but even its most robust algorithms aren’t perfect. Mistakes are made, crawlers get stumped and some pages inevitably fall through the cracks. While typical internet users may never notice them, these crawlability errors can have a big impact on SEO practitioners and the results they’re striving to achieve.

What does this mean for you as an SEO pro? It’s your job to make a site as crawlable as possible. Make it easier by assessing and improving a site’s crawlability with an arsenal of targeted techniques designed to guide Googlebot on its merry way.

What Is Crawlability?

Before Google can add a site to its index and assign it a ranking, it has to crawl the site.

Crawling is the automated retrieval of a site’s content, whether in the form of text, videos or images. Once it’s crawled a site, Google adds each page’s information to its index and analyzes it using algorithms.

The crawling process is executed by Google’s bots, collectively (and perhaps affectionately) referred to as Googlebot. On every page it visits, it diligently scans content and follows links, eventually creating a comprehensive overview of the entire site.

Its job isn’t always a walk in the park, though. From explicit crawling blocks to accidental 404 errors, plenty of obstacles can get in the way. The ease with which Google can crawl a given page is called crawlability. The fewer obstacles a page contains, the more crawlable it is.

Fortunately, site crawlability isn’t outside your control. Implement a few simple tactics to expedite the crawling process, and you’ll pave the way for better rankings and more clicks.

Check CMS Settings

If a site isn’t being crawled as frequently or efficiently as it could be, the solution may be as simple as toggling a setting in its content management system (CMS).

Every major CMS includes an option to dissuade search engines from crawling or indexing a site. In WordPress, mouse over Settings in the left-hand navigation bar, choose the Reading option, and look at the checkbox next to Search Engine Visibility (it should be left unchecked if you want the site crawled and indexed):

The Search Engine Visibility checkbox in WordPress’s Reading settings.

Or, you can manually edit the site’s robots.txt file for the same result. Squarespace also allows users to discourage indexing via either the settings or robots.txt file, while HubSpot users can only disable indexing via robots.txt.

If your goal is to make a site more crawlable, verify these settings. It’s quick and easy, and might provide a one-click solution to your crawlability issues.

Reduce Index Bloat

No, not the kind of bloating you get after a wild night at Taco Bell. We’re talking about index bloat, a problem that arises when Google indexes pages it shouldn’t. Index bloat can cause a site’s most valuable pages to be overlooked in favor of obsolete, irrelevant or duplicate ones.

The headaches don’t end there. In the case of large sites, index bloat can devour crawl budget faster than you can say, “chalupa overload.” In case you need a refresher, crawl budget refers to “the number of URLs Googlebot can and wants to crawl.” Put differently, if Google starts crawling a glut of pages it shouldn’t, higher-quality pages might not get the attention they deserve.

To check for index bloat, verify how many pages a site has listed in its sitemap, perform a Google search for pages within the site (using the format site:www.example.com) and note the number of results Google returns:

Google site: search results used to compare indexed pages against the sitemap.

If the number of results is significantly greater than the number of pages listed in the sitemap, you’re looking at a piping hot serving of index bloat supreme. As with drive-through nachos, more website indexation isn’t necessarily better website indexation.

If you want to end a site’s index bloat woes, the noindex tag is your new best friend. To use it, open any page’s HTML editor and add a quick line of code to the head section:

<meta name="robots" content="noindex">

Alternatively, you can delete any pages that shouldn’t be indexed and provide no value to users. For example, a completely outdated or low-quality page should neither be indexed nor shown to visitors.

Contrast this with a “thank you for registering” page (for example), which shouldn’t be indexed, but should still be shown to users.

Maximize Page Speed

If you’ve ever waited an eternity (a.k.a. anything longer than ten seconds) for a site to load, you already know how vital page speed is to user experience.

What may be less obvious is how strongly page speed can also affect a site’s crawlability. While Googlebot can’t really get impatient, it does emulate human users by navigating sites using internal links.

The longer each page takes to load, the more a site’s crawl budget is depleted. In the worst cases, Google may stop crawling a site entirely and move on to a less sluggish destination.

To improve page speed and help search engine crawlers navigate sites with ease, use techniques such as:

  • implementing image optimization;
  • installing a WordPress caching plugin;
  • reducing the number of server requests;
  • reducing server response times;
  • using asynchronous loading; and
  • using a content delivery network (CDN).
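
A couple of the techniques above can be applied directly in a page’s HTML. As a small sketch (file names are placeholders), non-critical scripts can be loaded asynchronously so they don’t block rendering, and offscreen images can be lazy-loaded with the browser’s native loading attribute:

<!-- Load non-critical JavaScript without blocking page rendering -->
<script src="analytics.js" async></script>

<!-- Defer offscreen images until the user scrolls near them -->
<img src="product-photo.jpg" alt="Product photo" loading="lazy" width="800" height="600">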

Be More Mobile-Friendly

If you happened to make a bet ten years ago that mobile traffic would someday overtake desktop traffic, it’s time to cash in. (Lucky you!)

In July 2010, desktop devices accounted for just over 97 percent of internet traffic, with mobile devices producing just 2.86 percent. By July 2020, the tables had turned—50.88 percent of all internet traffic was from mobile devices, while 46.39 percent of traffic came from desktops.

StatCounter chart of desktop versus mobile internet traffic share.

How does this affect crawlability? The answer lies in Google’s crawling methods: In an effort to adapt to the internet’s massive mobile traffic, Google has switched to mobile-first indexing for all sites.

This means Google primarily relies on a site’s mobile version to assign rankings. As a result, optimizing a site for mobile use isn’t just a good idea—it’s a necessity.

If you’re not sure exactly how to make a site as mobile-friendly as possible, start by following Google’s mobile-first indexing best practices:

  • Ensure Google can access and render mobile pages’ content and resources.
  • Confirm a site’s mobile pages contain the same content as its desktop pages.
  • If you use structured data, include it on both desktop and mobile pages.
  • Use the same metadata on a site’s mobile pages as its desktop pages.
  • Follow the Coalition for Better Ads recommendations for mobile ad placement.
  • Adhere to Google Image best practices.
  • Adhere to Google’s video best practices.
  • Take extra precautions to ensure separate mobile and desktop URLs function properly (you’d be better off not using separate URLs at all, though).

Once you’ve successfully optimized a site for mobile use, Google will have a much easier time crawling (and ranking!) its pages.

Fix Broken Pages

Crawl budget depletion can again rear its ugly head when a site has one or more broken pages.

Think of Googlebot like a person driving through a city: It wants to travel quickly and efficiently, but the more detours that crop up, the longer the trip takes. Broken pages are those detours, and too many can seriously impede site crawling.

Worse yet, broken pages can also prevent link equity from freely flowing between pages.

The first step to solving this pesky problem and making a site more crawlable is locating broken pages. You can do this quickly and automatically with the help of comprehensive paid tools like Semrush’s Site Audit or Ahrefs’ Site Explorer, or you can use a simpler (but free) tool like Dead Link Checker.

For example, we scanned the New York Times food section landing page using Dead Link Checker and uncovered a total of three broken pages:

Dead Link Checker results for the New York Times food section landing page.

Once you’ve identified a site’s broken pages, determine how best to mend the breaks. Search engines displaying an incorrect link, or users linking to a nonexistent page, are beyond your control. But you can absolutely control broken pages that occur as a result of a site migration or a change in site structure.

For the broken pages you can control, use 301 redirects to keep both crawlers and human visitors moving in the right direction. For those you can’t, create a custom 404 page to provide the best possible user experience.

Organize Blog Pages

While Google’s algorithms are pretty darn good at identifying context and related content within a site, clear organization and labeling expedite the process considerably. This is especially true for blogs, where disorganized blog posts can quickly turn into a maze of hundreds or even thousands of uncategorized pages.

Keep the chaos under control by organizing a site’s blog posts using categories, tags, an archiving system or any combination of the three.

For instance, the food blog Cookie and Kate divides posts into broad categories such as breakfast, salad, soup and dinner:

Cookie and Kate’s blog posts organized into broad recipe categories.

From there, individual posts are organized even further with the help of category and ingredient tags like baked goods, dairy free, coconut oil and maple syrup:

Individual Cookie and Kate posts organized with category and ingredient tags.

This type of stringent organization helps users find exactly what they’re looking for and boosts crawlability.

To start categorizing, tagging and archiving a blog yourself, first find out which types of organization the site’s CMS supports.

Take the time to categorize, tag or archive all the blog’s previous posts, and implement protocols to organize new posts, too.

Optimize JavaScript and CSS

To understand how different languages affect Google’s crawling behavior, think of websites as cars.

Markup languages such as HTML organize the internal parts working to keep gas and electricity flowing, programming languages such as JavaScript create the knobs and dials drivers actually interact with, and style sheet languages such as CSS determine those buttons’ design, colors and position.

HTML, JavaScript and CSS are all astoundingly prevalent across the web.

Given their ubiquity, you’d be forgiven for thinking all three languages are equally crawlable in Google’s eyes. As with many aspects of SEO, though, things aren’t quite so simple, and both JavaScript and CSS can trip up Google’s crawlers if used improperly.

Follow Google’s recommendations to ensure its bots can crawl and index a site’s JavaScript and CSS files.

  • Use the URL Inspection tool to see how Google views a page.
  • Use the robots.txt Tester to make sure crawlers aren’t blocked from JavaScript and CSS content.
  • Use the Mobile-Friendly Test to confirm a site’s JavaScript and CSS files can be properly rendered on mobile devices.
  • Test both desktop and mobile URLs for crawlability (if you’re using separate ones for each).

As an added bonus, following these recommendations ensures image crawlability, too.

Tidy up the Sitemap

When exploring an unfamiliar city, one of the first things you’re likely to do is refer to a map—maybe even a paper one if you’re feeling particularly old school.

Googlebot is no different. As a crawler programmed by humans, it uses sitemaps to get around. A sitemap is a file outlining a site’s pages and media, and a well-organized one will show Google how a site’s parts relate to each other:

Diagram of a sitemap showing how a site’s pages relate to each other.

Google outlines three basic steps for building and submitting a sitemap:

  1. Decide which of a site’s pages Google should crawl, and use the canonical tag to identify the original version of any duplicate pages.
  2. Choose which format you want to use (options include XML, RSS and text).
  3. Ensure Google can access the sitemap by adding it to the robots.txt file or submitting it directly to Google.
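
For the XML option, a sitemap is simply a list of URLs (optionally annotated with details such as a last-modified date), and one line in robots.txt tells crawlers where to find it. The sketch below uses placeholder URLs and dates:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-08-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/</loc>
  </url>
</urlset>

And in robots.txt:

Sitemap: https://www.example.com/sitemap.xml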

Improve Crawlability and Rise through the Ranks

With thousands of tactics available to you as an SEO practitioner looking to boost sites’ rankings, it’s easy to get swept up in granular techniques and algorithm updates. But without a crawlable site, all your other SEO efforts will be moot.

Use the variety of crawlability-enhancing tools at your disposal to give Google and other search engines a clear path to your content. Banish bloat, make your site mobile friendly and keep your content clean and organized, and get set for colossal ranking performance that’ll help your sites rise above the competition.

Image credits
Screenshots by author / August 2020
StatCounter / July 2020
Illustration by author / August 2020

Optimize for Featured Snippets to Earn Rankings, Visibility and Authority https://iloveseo.com/seo/technical-seo/optimize-for-featured-snippets-to-earn-maximum-organic-traffic/ https://iloveseo.com/seo/technical-seo/optimize-for-featured-snippets-to-earn-maximum-organic-traffic/#respond Mon, 08 Jun 2020 14:00:01 +0000 https://iloveseo.com/?p=70 Optimize for Featured Snippets to Earn Rankings, Visibility and Authority on iloveseo.com by Carrie Powers

Imagine you have a burning question about elephants (what are those tusks for, anyway?) and also happen to be friends with a librarian. So you pose your question to your...

]]>
Optimize for Featured Snippets to Earn Rankings, Visibility and Authority on iloveseo.com by Carrie Powers

Imagine you have a burning question about elephants (what are those tusks for, anyway?) and also happen to be friends with a librarian. So you pose your question to your librarian friend. In all likelihood, they won’t show you every book on elephants in the library. Instead, they’ll pick out the book with the best possible answer.

That’s the idea behind the featured snippet. After entering a search query, users can immediately see the information they’re looking for without having to decide which source to click. By optimizing for featured snippets, you can boost organic traffic, build online authority and win users’ trust.

The Featured Snippet: A Modern Solution to a Timeless Problem

You may have heard the story of how Guinness World Records was created to give people an easy way to find answers unavailable in standard reference books. The question that spawned the idea was simple: “What is the fastest bird in the world?”

While Guinness World Records is still alive and well, most people now turn to a faster, more easily accessible source to find the answers they want: Google. Depending on the query, users don’t even need to comb through search results. Instead, Google displays the relevant information right at the top of the search engine results page (SERP).

Featured snippet example

This type of result is known as a featured snippet. In Google’s words, a featured snippet “provides a quick answer or summary with a snippet of content from a relevant website,” and is more likely to appear for a search written in the form of a question.

Each featured snippet contains:

  • information quoted from the source page;
  • a link to the page;
  • the page’s title; and
  • the page’s URL.

For users, featured snippets are a convenient way to instantly find the answer they’re looking for and quickly click through to the linked page if they want more information. For you, they’re nothing short of prime search engine real estate.

How Featured Snippets Can Affect SEO Metrics

If Google uses a page to create a featured snippet, everyone who enters the right query will be greeted with a box of information pulled straight from that page. Should the searcher wish to find out more about the topic at hand, the link to the page is right there—no scrolling required.

In a 2020 study, Ahrefs found that just over 12 percent of search queries have a featured snippet. (Software like Semrush can tell you how many featured snippets a website already has.)

As a result, traffic ends up getting diverted from the first “natural” search result in favor of the featured snippet. In SERPs with a featured snippet, the snippet gets 8.6 percent of clicks while the result right below it gets 19.6 percent. In SERPs without one, however, the first organic result gets 26 percent of clicks:

Graphs showing Ahrefs study data

In short, featured snippets seem to take a portion of clicks from their standard, non-featured counterparts.

Keep in mind SERPs with the snippets get 4 percent fewer clicks overall than those without, presumably because users get the answer they want without having to click at all. Nevertheless, these results still indicate that if a site is used for a featured snippet, it could capture a significant percentage of clicks that would otherwise go to the top organic result.

The caveat? Per Ahrefs’ data, sites chosen for Google featured snippets already rank in the top ten organic results 99.58 percent of the time. In other words, you’ll need to build a great SEO foundation for your entire site if you want to reap the benefits of a featured snippet.

Create Snippet-Friendly Content

Let’s say a page already ranks in the top ten results for a given query, but it’s still not showing up as a featured snippet. In this case, the solution may lie in better optimizing the page’s content to suit the Google featured snippet format.

Perform the Right Keyword Research

Not all keywords are created equal. Rather than searches using exact-match keywords (i.e., those precisely describing what the user is looking for), searches triggering featured snippets often use long-tail keywords. Often (but not always), this means the query takes the form of a question.

For example, instead of searching for “Harajuku,” a user might enter a question like “what is Tokyo’s fashion district.” This tells Google it’s a good time to display a featured snippet:

Featured snippet example for "Harajuku" search

When a user searches for the same thing without using a question format, Google displays a slightly different featured snippet. However, it still comes from the same blog post on the same site:

Featured snippet example for "Tokyo fashion district" search

With this in mind, remember to focus on keywords users are likely to enter when they don’t know the answer to a question, whether they write their query in question format or not.

To start your keyword research, consider the types of questions a page may be able to answer. For instance, let’s say you’re optimizing a marine conservation group’s website. You might position the site’s content to answer common questions about marine life, such as “how big is a blue whale?” or “how do dolphins communicate?”

For help discovering the questions users are asking, try using a site like AnswerThePublic. Upon entering a keyword, you’ll be presented with a visualization of the questions people are asking about it:

Answer the Public keyword visualization

Choose a few questions the site’s content will be able to answer, and run them through a keyword research tool such as Moz’s Keyword Explorer (ten free queries available per month per account), Wordtracker (free) or Google’s own Keyword Planner (requires a free Google Ads account) to gauge search volume and competition.

With Moz’s Keyword Explorer, for example, you’ll be able to see a variety of key metrics for your query, as well as a list of related suggestions and the current top-ranked pages:

Moz Keyword Explorer data example

Of course, you’ll also want to enter your search term into Google to see which page is used in the featured snippet, if there is one. Once you’ve decided which keyword you want a page to rank for, incorporate it throughout the page’s content—just don’t overdo it.

Use Strategic Formatting

To ensure Google can easily use content to create a featured snippet, try to pre-format it accordingly.

The first step is to understand the different types of featured snippets:

Paragraph

Paragraph featured snippet example

List

List featured snippet example

Video

Video featured snippet example

Table

Table featured snippet example

Paragraphs are by far the most common, accounting for 82 percent of all featured snippets.

But this doesn’t mean you should go out of your way to try to rank for paragraph snippets. Instead, focus on whichever style makes sense for the queries you’re aiming to answer.

Remember to consider length, too. Paragraph snippets average about 45 words.

Those in list format have an average of 4.2 items with 10.8 words per item, and those in table format have an average of 3.6 rows and 2.5 columns:

Optimal featured snippet lengths

In terms of minimum length, Google has no specified requirements. This is because it’s “variable based on a number of factors, including—but not limited to—the information in the snippet, the language, and the platform (mobile device, app, or desktop).”

So don’t stress over minimum length. Instead, try to answer your targeted query as thoroughly as possible without greatly exceeding the average snippet length.
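
For a rough, back-of-the-envelope check while drafting, you can compare an answer’s word count against that roughly 45-word average with a few lines of Python. The helper, its threshold and the sample draft below are illustrative assumptions rather than hard rules.

    def snippet_length_report(answer: str, target: int = 45) -> str:
        """Compare a draft answer's word count to the typical paragraph-snippet length."""
        words = len(answer.split())
        if words <= target + 10:
            return f"{words} words: close to the typical paragraph-snippet length."
        return f"{words} words: consider tightening the answer before publishing."

    # Hypothetical draft answer for the "Tokyo fashion district" query used above.
    draft = (
        "Harajuku is Tokyo's best-known fashion district, centered on Takeshita Street "
        "and famous for its street style, independent boutiques and youth culture."
    )
    print(snippet_length_report(draft))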

If you’re not sure which type of featured snippet to target, perform a Google search for the query you want to rank for, and see for yourself what kind of snippet appears.

Provide Value to Readers

Remember, all the keyword research and proper formatting in the world won’t help you land a featured snippet if the information you’re providing isn’t valuable.

What determines value, exactly? In this case, it’s the page’s ability to answer a specific query in a succinct, accurate and easily comprehensible way. In other words, a user should be able to get the answer they’re looking for in just a few seconds.

Ask yourself if the content you’re optimizing for featured snippets:

  • immediately answers the user’s question;
  • provides completely accurate information;
  • is logically organized;
  • is free of extraneous content (e.g. self-promotions or irrelevant product details);
  • can be easily understood by readers of all kinds; and
  • is well written with proper spelling and grammar.

By ensuring the content meets all those criteria, you’ll be able to create a fantastic user experience which can generate higher search rankings and boost your chances of securing the featured snippet you’re pursuing.

Win a Featured Snippet, Win Users’ Trust

Just as you’re more likely to trust a book recommended by a librarian, search engine users may be more likely to trust information prominently featured at the top of the SERP. This means the perks of earning featured snippets go far beyond a higher ranking.

With a featured snippet, a page can also gain more visibility, stronger authority and elevated status as a reliable source. Are those benefits worth pursuing? We don’t need a featured snippet to know the answer.

Image credits
Screenshots by author / June 2020
AnswerThePublic / June 2020
Moz / Retrieved June 2020

How to Make a Custom 404 Page on iloveseo.com by Carrie Powers
https://iloveseo.com/seo/technical-seo/how-to-make-a-custom-404-page/ (published Mon, 27 Apr 2020)

Think of the last time you couldn’t find your keys. You probably felt annoyed and unsure what to do next. That’s the effect a poorly designed 404 page can have on a site’s visitors. And if you’re not careful, it can bring the user experience to an abrupt halt. As a result, visitors can get frustrated, think less of the site or simply leave in search of greener (or at least more user-friendly) pastures.

Custom 404 pages can help prevent that scenario altogether. With a little planning and creativity, you can use these pages to keep users engaged and impressed, drive your bounce rate down and keep visitor satisfaction up.

What are Default and Custom 404 Pages?

Imagine you’re driving down a road (once you’ve found your keys, of course) and you encounter a sign reading, “Dead End, Do Not Enter.” Your only choice is to get back on the highway and choose another road. The internet equivalent is a user visiting a website, clicking on a link and being greeted with a 404 page.

This error page lets users know they’ve reached a dead end and must go back to the main website to find a valid destination.

What Causes a 404 Error?

In some cases, there’s nothing you can do to prevent the 404 page from popping up for some of your visitors. For example, it may happen when:

  • search engines display an outdated link; or
  • other sites link to outdated pages.

But you do have control over a couple of causes, such as:

  • a page being removed from the site; or
  • the site being relaunched.

And even without invalid links either within the website or from external sources, users sometimes type in the wrong URL, which will also take them to the 404 page.

Regardless of the reason, you need a way to keep them engaged.

Why Use Custom 404 Pages?

When visitors request a page that can’t be found, every site relies on its 404 page to break the bad news. Learning how to make a 404 page with custom content gives you the chance to redirect visitors’ frustration as well as their traffic through an enhanced user experience.

In terms of user experience, though, there’s a big difference between a default 404 page and a custom one. Usually comprising black text on a plain white background, default 404 pages are pretty basic:

Basic 404 page

But custom 404 pages can be specially designed to reflect a site’s unique style and improve the overall user experience. Take this one from Google, featuring a clickable company logo, playful robot illustration and simple explanation of what happened:

Google 404 page

Some companies take it to the next level by adding humor, helpful links and even animated GIFs. Discord’s page for lost visitors demonstrates this with a tongue-in-cheek explanation of the issue, helpful links to follow and a charming GIF for good measure:

Discord 404 page

Though the pages vary, the end result is the same: The user has to go to another (valid) page. With Google’s and Discord’s custom 404 pages, however, users can quickly understand what happened, click on convenient links to get where they want to go, and even get a glimpse of the brands’ individual personalities.

In that respect, the difference is clear—users enjoy a more pleasant experience with a well-designed and easily comprehensible custom 404 page. When you learn how to make a custom 404 page, visitors are more likely to stick around longer and less likely to go to another site.

How to Make a 404 Page Users Will Love

To get started, create a blank site page like you would for any other type of content. When you’re ready, following a simple set of guidelines makes it easy to create an attention-grabbing custom 404 page:

  • Tell the user why they’re there. Explain in plain language why they’re seeing an error page.
  • Give the user a way to proceed. Help them navigate without having to click the back button. For instance, offer a search bar, a few helpful links, an option to chat with customer support, a clickable main logo or all of the above.
  • Maintain visual appeal. Just because it’s an error page doesn’t mean it has to be ugly. Play with design elements until you end up with a page that’s helpful and attractive.
  • Stay consistent. Custom 404 pages should have a design consistent with their sites. Stick with the same language, color palette, theme and brand identity.

Finally, set the page to appear automatically in place of the default 404 page. This can be done by configuring the site’s web server software (e.g., Apache’s ErrorDocument directive or Nginx’s error_page directive). On WordPress sites, you can also set a custom 404 page using a plugin.
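
Whichever method you use, it’s worth confirming the custom page is still served with a real 404 status code rather than a 200 (a so-called soft 404), since search engines may otherwise try to index the error page itself. Here’s a minimal check using Python’s standard library; the URL is a placeholder for a path that doesn’t exist on your site.

    import urllib.request
    from urllib.error import HTTPError

    def status_for_missing_page(url: str) -> int:
        """Request a deliberately nonexistent URL and return the HTTP status code."""
        try:
            with urllib.request.urlopen(url) as response:
                return response.status  # 200 here suggests a soft 404
        except HTTPError as error:
            return error.code  # 404 is the desired result

    # Hypothetical URL; substitute a path that does not exist on the site being checked.
    print(status_for_missing_page("https://www.example.com/this-page-does-not-exist"))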

Six Custom 404 Pages to Emulate

There’s no single right way to make a fantastic custom 404 page—the success of each one depends on how well it meshes with the rest of the site.

To spark some ideas, try looking at the custom 404 pages of other sites you love, like some of our favorites.

Netflix

Friendly language paired with a beautiful still from a fitting show (Lost in Space) entices visitors to keep looking for content directly from Netflix’s error page.

Netflix 404 page

ModCloth

Featuring an amusing play on words, ModCloth’s error page keeps visitors charmed by the brand’s personality, links to product pages to facilitate continued browsing, and includes clickable customer service options to ensure shoppers can get prompt help.

ModCloth 404 page

Twitter

A straightforward explanation and multiple ways to search for something new make Twitter’s not found page short and sweet.

Twitter 404 page

Allrecipes

On the Allrecipes error page, an apt photo maintains visual appeal, a short pun keeps things light and friendly language encourages visitors to keep browsing.

Allrecipes 404 page

Ars Technica

The brand’s “moonshark” mascot makes a memorable appearance on Ars Technica’s error page, while a search bar helps visitors find what they’re looking for.

Ars Technica 404 page

Pixar

In what is probably the epitome of branding a custom 404 page, Pixar uses a character from its movie Inside Out to perfectly illustrate the feeling of landing on an error page.

Pixar Sadness 404 page

404 Pages May Be Unavoidable, but Frustrated Visitors Aren’t

The array of user-friendly options you have when creating a custom 404 page is evidence that even when things go wrong, you still have a chance to win over visitors and keep a site’s bounce rate down.

Next time a user follows an invalid link, make sure their experience remains positive by crafting a custom 404 page that’s helpful, engaging and bursting with personality.

Image Credits
Screenshots by author / April 2020
