Before you should even dive into strategies for your website’s search engine optimization, you have to consider your website’s Google indexing.
Without Google indexing, your website is essentially invisible to search queries, which means next to no organic web traffic.
It’s important to understand that Google discovers new web pages by crawling the internet. Doing so then leads to Google adding new pages to its index (often referred to as indexing).
Then, when you search (or Google) for something, you’re asking Google to give you all relevant pages from its index. These days, there can easily be millions of results for any given search, so Google sorts these results, ranking those that seem most relevant up top for the user to see first.
Learn more about the process with this video from Google.
Of course, indexing and ranking are two different things. The key point is that you must be indexed in order to rank at all.
First things first, let’s find out whether your website has been indexed by Google yet or not. (This is especially important to check if you just launched or are about to launch your website.)
- Go to Google.com and search for site:yourwebsite.com. For example, here at DailyStory, we would search for site:dailystory.com.
- The search result number shows about how many of your web pages have been indexed by Google.
- You can get more specific with the same process by adding a page’s slug to the end of the URL in your search, for example site:dailystory.com/blog/7-tips-to-level-up-your-content-marketing/.
- No results will show if the page is not indexed.
To get a more accurate understanding of what is indexed and what isn’t, you can use the free Google Search Console. Just navigate to Index and then the Coverage report. The number of “valid pages” (with and without warnings) reflects the number of pages that have been indexed. If the number is 0, your website has not been indexed. If the number is very low (lower than the number of pages on your website), your site has only been partially indexed. You can also check on a specific page by using the URL Inspection tool in Google Search Console.
Perhaps only part of your website is indexed, or maybe your newest web pages aren’t getting indexed fast enough.
The following are 13 tips to get your website indexed by Google faster so that you’re not waiting for an indefinite period of time. (Several of our suggestions are made easier to perform by using Google Search Console.)
Request indexing from Google
This might sound obvious, but plenty of businesses are not aware that they can use Google Search Console to request indexing directly from Google.
- Log into the Google Search Console.
- Navigate to the URL Inspection tool.
- Paste the URL you’d like Google to index into the search bar.
- Google will check the URL.
- Click the “Request Indexing” button.
While this can be an effective method for new pages, it might not help with old pages if there is an underlying issue that has prevented indexing thus far.
Check your robots.txt file for any crawl blocks
When Google isn’t indexing your entire website, it could be because of a crawl block in your robots.txt file.
You can check for this by going to yourdomain.com/robots.txt and looking for a User-agent: Googlebot or User-agent: * line followed by Disallow: /.
A Disallow: / rule under either of these user-agents tells Google that it’s not allowed to crawl any pages on your website. To fix this, simply remove the Disallow: / rule. (A User-agent line on its own doesn’t block anything; it only names which crawler the rules beneath it apply to.)
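As an illustration, here is a hypothetical robots.txt contrasting a rule that blocks all crawling with one that only blocks a single directory:

```txt
# Blocks Googlebot from crawling EVERY page on the site:
User-agent: Googlebot
Disallow: /

# Blocks all crawlers from /private/ only; the rest of the site stays crawlable:
User-agent: *
Disallow: /private/
```

The /private/ path is just an example; the point is that Disallow: / (a bare slash) blocks the whole site, while removing the rule or narrowing the path restores crawling.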
To see if there is a crawl block for a specific page, paste the URL into the URL Inspection tool in Google Search Console. You’ll be able to find out more by clicking on the coverage block and looking for the error that says “Crawl allowed? No: blocked by robots.txt.”
Remove any rogue noindex tags
Because you may want to keep some web pages private, you can ask Google not to index those pages. But if these tags get onto other pages that you want to be indexed, you’ll obviously want to find and remove them.
- Look for the meta tag in the <head> section of your web page. It could say <meta name="robots" content="noindex"> or <meta name="googlebot" content="noindex">. The key is that you’re looking for anything that says “noindex.”
- Look for “noindex” in the X-Robots-Tag HTTP response header.
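For reference, the meta-tag version sits in the page’s HTML, while the header version is sent by the server. A hypothetical example of the former:

```html
<!-- Meta tag in the <head>: asks all crawlers not to index this page -->
<meta name="robots" content="noindex">
```

The header equivalent appears in the raw HTTP response instead, as X-Robots-Tag: noindex, and is typically set in your server configuration rather than in the page markup.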
Add any unindexed pages to your sitemap
While Google is capable of finding web pages that are not in your sitemap, it’s still a best practice to include them.
Sitemaps tell Google which pages on your website are important and which aren’t. They also suggest how often the pages should be recrawled.
You can check to see if a page is in your sitemap by using the URL inspection tool within Google Search Console. If you get an error saying, “Sitemap: N/A” within a “URL is not on Google” message, then the page isn’t in your sitemap.
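If the page is missing, add it to the sitemap file (conventionally found at yourdomain.com/sitemap.xml) and resubmit the sitemap in Search Console. A minimal sitemap entry, with a hypothetical URL, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The page you want Google to discover and index -->
    <loc>https://www.example.com/blog/my-new-post/</loc>
    <!-- Optional hint about when the page last changed -->
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```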
Delete any rogue canonical tags
Canonical tags tell Google which version of a web page is the preferred one. Many pages don’t have one at all, in which case Google treats the page itself as the canonical version.
However, if a page has a canonical tag that directs Google to a preferred version of that page that doesn’t exist, your page won’t get indexed. These tags are referred to as rogue canonical tags.
To find whether this is the case, use the URL Inspection tool. You’ll see a warning saying, “Alternate page with canonical tag” if the page in question is pointing to another page. If that’s not supposed to be there, remove the canonical tag so that the page can be indexed.
Keep in mind that canonical tags are not always bad and do serve a purpose. Before removing a canonical tag, ask yourself if the other page being referred to is actually the preferred version. If not or if the other page doesn’t exist, then definitely remove the tag.
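For context, a canonical tag is a single line in the page’s <head> (hypothetical URL shown):

```html
<!-- Tells Google that the preferred version of this content lives at the
     URL below. If that URL is wrong or no longer exists, this page may
     never be indexed. -->
<link rel="canonical" href="https://www.example.com/blog/my-post/">
```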
Double-check for any orphaned pages
An orphaned page is a web page that doesn’t have any internal links (within the website) pointing to it.
The problem with an orphaned page is that not only will Google not find it, but website visitors won’t either.
If the orphaned page is not important, feel free to delete it and remove it from your sitemap. If the page is important, incorporate it into the internal linking structure of your website.
Fix any nofollow internal links
Nofollow links are hyperlinks marked with rel="nofollow", which asks Google not to follow the link from your page, so the destination URL gets no crawling or ranking credit from it.
They originally came about to prevent spam in comment sections from gaining any credit in search rankings by littering a well-ranked page.
However, if the rel="nofollow" attribute appears on internal links within your website, it can prevent those destination pages from being crawled and indexed. Remove it from any internal link pointing to a page you want indexed.
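The difference is a single rel attribute on the link. A hypothetical example of each:

```html
<!-- Followed internal link: Google will crawl the destination and pass
     ranking signals to it -->
<a href="/blog/my-post/">Read our post</a>

<!-- Nofollow link: asks Google not to follow it. Useful for untrusted
     external links, but avoid it on internal links to pages you want indexed -->
<a href="https://example.com/untrusted-site" rel="nofollow">External link</a>
```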
Determine if your pages are high-quality and valuable
If it’s not a technical issue preventing your web page from being indexed, then it could be a quality issue.
That sounds subjective, and to be honest, it is somewhat. However, it is important to frequently review your pages and ask yourself:
- Would a user value this page if they clicked on it from a search results page?
- Does my page offer high-quality, useful information?
- Is my page valuable?
A regular website audit can help you stay on top of reviewing content, asking those questions and then improving content wherever necessary.
If the page itself is low-quality and not valuable, consider removing it altogether. A bonus of doing this is optimizing your “crawl budget”: the number of pages on your site that Googlebot can and wants to crawl in a given period.
Find out more about crawl budget from Google.
Build up your quality backlinks
Backlinks are links from other websites to a page or pages on your website. Quality backlinks are links from authoritative, high-ranking websites.
Your pages don’t need backlinks to be indexed, but the more quality backlinks you have, the faster Google tends to discover and index your pages.
See our seven tips to grow quality backlinks for your website.
Don’t forget about social media
Sharing your web pages across social media can help Google index your pages faster.
You’ll want to ensure that your social media strategy includes the social publishing of your most valuable content in a way that is engaging for your followers so that they will share it as well with their own networks.
Is your business just getting started on social media? Make sure you’re starting with the right social media platform for you and your goals.
Submit a post through Google My Business
This recommendation only works if the page you’re sharing via a Google My Business post makes sense to publicly appear on your knowledge panel.
In other words, not all pages may make sense to do this with.
That being said, you can give Google a push to crawl and index a page by:
- Signing into Google My Business
- Selecting the location you want to submit the post for (if applicable)
- Clicking “Create post” and then the “What’s New” type
- Adding a photo (if applicable)
- Writing the post
- Selecting the “Learn more” option for “Add a button”
- Filling in your URL in the “Link for your button” field
- Hitting “Publish”
Use the Google Indexing API for short-lived content
If your website features short-lived pages and content, such as job postings, event announcements and/or live-stream videos, you should consider using the Google Indexing API.
This tool enables you to automatically request the crawling and indexing of new content and content changes.
Specifically, the Google Indexing API can help you:
- Update a URL
- Remove a URL
- Obtain the status of a request
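Under the hood, each of these actions is a single authenticated HTTPS request to the API. As a sketch (the page URL below is hypothetical; see Google’s Indexing API documentation for the required service-account authentication), updating a URL looks like this:

```http
POST https://indexing.googleapis.com/v3/urlNotifications:publish
Content-Type: application/json

{
  "url": "https://www.example.com/jobs/software-engineer-1234/",
  "type": "URL_UPDATED"
}
```

Sending "type": "URL_DELETED" instead requests removal of the URL, and a GET request to the urlNotifications/metadata endpoint returns the status of recent notifications for a URL.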
Be mindful of duplicate content
Duplicate content refers to very similar (or even identical) content that appears on multiple pages within your website or across other websites.
Essentially, duplicate content can confuse Google since the search engine aims to index only one URL for each set of unique content. This can be a common issue for ecommerce websites due to product descriptions being reused across pages and websites.
Find out more about duplicate content.
Looking to level up your digital marketing process as you get your website indexing right? Consider DailyStory, which features automation, audience segmentation and more. Schedule your free demo with us today.