Do you want more organic search traffic to your site? I’m willing to bet the answer is yes – we all do!
Organic search traffic absolutely matters. On average, it accounts for over half of all site traffic (as much as 64%, according to Conductor.com), compared with just 5% from social media.
But that stat doesn’t matter much if your site doesn’t show up in the search results at all.
How do you get your new site or blog indexed by Google, Bing and other search engines? Well, you’ve got two choices.
You can take the “tortoise” approach – just sit back and wait for it to happen naturally.
Or you can expend a little effort and make it happen now, giving you more time and energy to put towards increasing your conversion rate, improving your social signals — and, of course, writing and promoting great and useful content.
I don’t know about you, but I’d rather get my sites indexed as quickly as possible, because it gives me more time to build my audience.
If ranking your sites sounds good to you, too, read on for 11 simple things you can do today to get your new site or blog indexed as quickly as possible.
Step 1: Understand How Search Engines Work
Search engines rely on complicated algorithms to do their magic, but the basic process isn’t all that hard to understand.
Essentially, search engines like Google rely on spiders: little bits of computer code that each search engine sends out to “crawl” the web (hence, “spider”). The more efficiently those spiders can crawl your site, the better.
The spider’s job is to look for new stuff on the web and figure out what it’s about. That “new stuff” can be a new page on an existing site, a change to an existing webpage or an entirely new site or blog.
Once the spider finds a new site or page, it needs to figure out what that new site or page is about.
Way back in the Wild, Wild West of the early web, search engine spiders weren’t nearly as smart as they are today. You could force a spider to index and rank your page based on nothing more than how many times a particular search phrase appeared on it. Those old-school search engine optimization tactics won’t work for today’s content.
And the keyword didn’t even have to be in the body of the page itself. Many people ranked for their biggest competitor’s brand name just by stuffing dozens of variations on that brand name in a page’s meta tags!
Fortunately for Google search users and for ethical website owners, those days are long gone.
Today, keyword and meta tag stuffing will get you penalized, not rewarded. And, meta keyword tags aren’t really part of the algorithm at all (though there are still good reasons to use them).
These days, Google is much more concerned with the overall user experience on your site and the user intention behind the search — i.e., does the user want to buy something (commercial intent) or learn something (informational intent)?
Don’t get me wrong — keywords still matter. Other factors are also important — up to 200 altogether, according to Brian Dean of Backlinko, including things like quality inbound links, social signals (though not directly) and valid code on all your pages.
But none of that will matter if the spiders don’t even tell the search engines your pages are there to begin with. And that’s where indexing comes in.
Indexing is simply the spider’s way of gathering and processing all the data from pages and sites during its crawl around the web.
The spider notes new documents and changes, which are then added to the searchable index Google maintains, as long as those pages are quality content and don’t trigger alarm bells by violating Google’s user-oriented mandate.
So the spider processes both the content (text) on the page and where on the page the search terms are placed. It also analyzes the title tag, meta tags and alt attributes for images.
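To make that concrete, here is a hypothetical page marked up with the on-page elements a spider reads; the store name, file names and copy are invented for illustration:

```html
<!-- Hypothetical product page: the on-page signals a spider processes -->
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- The title tag is one of the strongest on-page signals -->
  <title>Handmade Leather Wallets | Example Store</title>
  <!-- The meta description feeds the search snippet shown to users -->
  <meta name="description" content="Shop handmade leather wallets, crafted in small batches.">
</head>
<body>
  <h1>Handmade Leather Wallets</h1>
  <!-- Alt text tells the spider what the image depicts -->
  <img src="wallet-brown.jpg" alt="Brown handmade leather bifold wallet">
</body>
</html>
```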
That’s indexing. When a search user comes along and looks for information related to the same keywords, Google’s algorithm goes to work, deciding where to rank that page among all the other pages related to those keywords.
But, how do search engine spiders find new content — pages, sites or changes to pages — in the first place?
The spider starts with pages that have already been indexed via earlier crawl sessions, so having pages indexed already tends to improve your crawl rate.
Next, it adds in sitemap data (more on that in a little bit).
That’s the short and somewhat simplified version of how Google finds, analyzes, and indexes new sites like yours. Many other search engines follow similar procedures, though there can be variations in the specifics and each engine has its own algorithm.
If you’ve recently published a new site on the web, you’ll want to first check to see if Google’s already found it.
The easiest way to check this is to use a site:domain.com search in Google. If Google knows your site exists and has crawled it, you’ll see a list of results similar to the one for NeilPatel.com in the screenshot below:
If Google hasn’t yet found your site, you’ll get no results at all, similar to this:
Step 2: Add a Blog
Why do you need a blog?
It’s simple: blogs are hard-working SEO machines. Blog content gets crawled and indexed more quickly than static pages. In fact, websites with blogs get an average of 434% more indexed pages and 97% more indexed links.
Blogs also bring in more traffic. Businesses that blog regularly generate 55% more visitors to their sites than those that don’t, according to HubSpot.
And blogging works for every kind of business, industry, or niche, as well as for almost all business models — even B2C and ecommerce sites. For instance, 61% of online shoppers have actually bought something based on the recommendation of a blogger.
Don’t be afraid of committing to a blog. Yes, it does require consistent effort. You do have to write (or outsource) high-quality, in-depth blog posts on a regular basis. But the rewards, I’ve found, are absolutely worth it.
And you don’t have to blog every single day — although 82% of those marketers who do blog daily report they get customers from these web page posts.
If you have an ecommerce site, blogging doesn’t have to be terribly complex or difficult.
For example, when you create a new product page, write and publish a blog post about the new product. Add some good-quality images of the product and link to the product page. This helps the product page get crawled and indexed more quickly by search engines.
Step 3: Use Robots.txt
If you’re not an expert coder or developer, you might have seen a file called “robots.txt” in your domain’s files and wondered what it is and what it does.
It’s a basic, plain text file that should reside in the root directory of your domain. If you’re using WordPress, it’ll be in the root directory of your WordPress installation.
The “what it does” is a little more complex. Basically, robots.txt is a file that tells search engine bots which pages they may crawl and index, and which pages to stay away from. (It’s a convention that reputable bots honor, not an enforcement mechanism.)
When search spiders find this file on a new domain, they read the instructions in it before doing anything else. If they don’t find a robots.txt file, the search bots assume that you want every page crawled and indexed.
Now you might wonder “Why on earth would I want search engines not to index a page on my site?” That’s a good question!
In short, it’s because not every page that exists on your site should be counted as a separate page for search result purposes.
Say, for instance, that you’ve got two pages with the same content on your site, perhaps because you’re split-testing visual features of your design while the content of the two pages stays exactly the same. Search engines treat duplicate content as a quality problem, so you’d tell the spiders to index only one of the two versions.
Your first step is to confirm that your new site has a robots.txt file. You can do this either by FTP or by clicking on your File Manager via CPanel (or the equivalent, if your hosting company doesn’t use CPanel).
If it’s not there, you can create one fairly simply using a plain text editor like Notepad.
Note: It’s very important to use only a plain text editor, and not something like Word or WordPad, which can insert invisible codes into your document that will really mess things up.
WordPress bloggers can optimize their robots.txt files by using a reliable WordPress plugin like Yoast’s SEO plugin.
The format of a robots.txt file is pretty simple. The first line usually names a user agent, which is just the name of the search bot, e.g., Googlebot or Bingbot. You can also use an asterisk (*) as a wildcard identifier for all bots.
Next comes a string of Allow or Disallow commands, telling the search engines specifically which parts of your domain you want them to crawl and index and which they should ignore.
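Putting those pieces together, a minimal robots.txt might look like this (the blocked paths are hypothetical examples; yours will differ):

```text
# Applies to all search bots
User-agent: *
# Keep the duplicate split-test page out of the index
Disallow: /test-variant-b/
# Keep admin pages out
Disallow: /wp-admin/
# Everything else may be crawled
Allow: /

# Optionally point the bots at your sitemap
Sitemap: https://www.example.com/sitemap.xml
```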
So to recap: the function of robots.txt is to tell search engines what to do with the content/pages on your site. But does it help get your site indexed?
Harsh Agrawal of ShoutDreams Media reports getting sites indexed within 24 hours using a combination of strategies, including robots.txt and on-page SEO techniques.
All that being said, it’s crucial to be very cautious when revising your robots.txt file, because it’s easy to make a mistake if you don’t know what you’re doing.
An incorrectly configured file can hide your entire site from search engines, which is the exact opposite of what you want. That’s why it pays to understand exactly what the file does before you change it.
If you’re not comfortable with that risk, hire an experienced developer to take care of the job and leave this file alone.
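Since a mistake here can deindex your whole site, it’s also worth sanity-checking your rules before you upload the file. Here’s a small sketch using Python’s standard-library `urllib.robotparser`; the file contents and paths are made-up examples:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents you're about to upload
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /test-variant-b/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Verify that bots may crawl the pages you care about...
assert parser.can_fetch("Googlebot", "https://example.com/blog/new-post/")
# ...and that the blocked paths really are blocked
assert not parser.can_fetch("Googlebot", "https://example.com/wp-admin/")
print("robots.txt rules behave as intended")
```

Running a quick check like this catches the classic mistake of a stray `Disallow: /` that blocks everything.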
Step 4: Create a Content Strategy
In case I haven’t said it enough, let me say it again: It’s to your own benefit to have a written content marketing strategy focused on search results.
But don’t take my word for it. From the Content Marketing Institute: “Business-to-business (B2B) marketers who have a documented strategy are more effective and less challenged with every aspect of content marketing.”
That’s absolutely true in my experience, but a documented content strategy also helps you get your site’s pages indexed when you follow through by creating new pages of content.
According to HubSpot’s “State of Inbound 2014” report, marketers who prioritize blogging are 13x more likely to see positive ROI.
Doing it properly, as Alex Turnbull of GrooveHQ says, means “doing your best to publish valuable, interesting and useful content and then doing everything you can to make sure that your potential customers see it.”
Here’s an example: when I create and publish a professional infographic on my site and then it gets shared on another web page with a link back to my page, I get content marketing “credit” for both.
And since it’s an infographic, I’m more likely to engage my audience on both sites.
Other examples of “offsite” content you can publish that will help grow your audience (and, with it, your crawl rate) include:
- Guest blog posts on other sites in your niche
- Press releases submitted to sites that publish that kind of content
- Articles on high-quality article directory sites (Note: Be careful here — the vast majority of article directories are not high-quality, and can actually hurt your brand, reputation, and SEO.)
- Videos hosted on Vimeo or your YouTube channel
Of course, any content you put your name and brand on must be high quality content and published on a reputable, authoritative site. Otherwise you’re defeating your own purpose in search engine optimization.
Content that’s published on “spammy” sites with a link back to your site suggests to Google that your site is spammy, too.
A well-thought-out and written content marketing plan helps you avoid getting tripped up in the mad rush to publish more content. It puts you in the driver’s seat of search engine optimization, so you can focus on generating leads and increasing your conversion rate.
Creating a written content strategy doesn’t have to be complex or difficult. Simply follow a framework:
- What are your goals? Specify SMART goals and how you’ll measure your progress (i.e., metrics).
- Who is your target audience? Customer profiles or personas are essential to understanding your audience and what they want/need.
- What types of content will you produce? Here, too, you want to make sure you’re delivering the content types that your target audience most wants to see.
- Where will it be published? Of course, you’ll be hosting your own content on your new site, but you may also want to reach out to other sites or utilize platforms such as YouTube, LinkedIn and Slideshare.
- How often will you publish your content? It’s far better to produce one well-written, high-quality article a week consistently than to publish every day for a week, then publish nothing for a month.
- What systems will you adopt for publishing your content? Systems are basically just repeatable routines and steps to get a complex task done. They’ll help you save time and write your content more quickly, so you can stay on schedule. Anything that helps you publish content in less time without sacrificing quality will improve your bottom line. Include the blogging/content tools and technology you’ll use and how they fit into your system.
Once you have your content marketing plan documented, you’ll find it easier to publish great content on a consistent schedule, which will help your site’s new web pages get indexed more quickly.
Step 5: Create and Submit a Sitemap
You’ve undoubtedly seen the word “sitemap” before – but maybe you never knew exactly what it meant and how it relates to search engine optimization.
A sitemap is basically a list (in XML format) of all the pages on your site. Its primary function is to let search engines know when something’s changed – either a new web page or changes on a specific page – as well as how often the search engine should check for changes.
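For illustration, a bare-bones sitemap for a two-page site might look like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2015-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/blog/first-post/</loc>
    <lastmod>2015-01-10</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

The `changefreq` value is a hint to the spiders about how often each page changes, not a command they must obey.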
Do sitemaps affect your search rankings? Probably not – at least, not significantly. But they will help your site get indexed more quickly with a more efficient crawl rate.
In today’s Hummingbird-driven world of search, there are a lot of SEO myths you need to be wary of. But one thing remains the same: all things being equal, great content will rise to the top, just like cream.
Sitemaps help your great content get crawled and indexed, so it can rise to the top of SERPs more quickly, according to the Google webmaster blog. In Google’s own words, “Submitting a Sitemap helps you make sure Google knows about the URLs on your site.”
Is it a guarantee your site will be indexed immediately? No, but it is definitely an effective webmaster tool that helps in that process.
And it might help even more than Google has acknowledged thus far. Casey Henry wondered just how much sitemaps would impact crawl rate and indexing, so he decided to conduct a little experiment of his own.
Casey talked to one of his clients who ran a fairly popular blog using both WordPress and the Google XML Sitemaps Generator plugin (more on that below).
With the client’s permission, Casey installed a tracking script, which would track the actions of Googlebot on the site, as well as when the bot accessed the sitemap, when the sitemap was submitted, and each page that was crawled. This data was stored in a database along with a timestamp, IP address, and the user agent.
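Casey’s exact script isn’t public, but the core idea (log each hit whose user agent looks like a search bot, along with a timestamp, IP and path) can be sketched in a few lines of Python. The bot list and log format here are my assumptions, not his implementation:

```python
import time

# Substrings that identify major search engine bots (an assumed list)
BOT_SIGNATURES = ("Googlebot", "Bingbot", "Slurp")

def log_bot_visit(user_agent, ip, path, log):
    """Append a record to `log` if the user agent is a known search bot."""
    bot = next((b for b in BOT_SIGNATURES if b in user_agent), None)
    if bot:
        log.append({"bot": bot, "ip": ip, "path": path, "ts": time.time()})
    return bot

visits = []
# Simulated hits: one from Googlebot, one from an ordinary browser
log_bot_visit(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "66.249.66.1", "/sitemap.xml", visits)
log_bot_visit("Mozilla/5.0 (Windows NT 10.0) Chrome/40.0",
              "203.0.113.7", "/", visits)

print(len(visits))  # only the Googlebot hit was logged, so this prints 1
```

In a real deployment you’d call something like this from your server-side request handler and write to a database instead of a list.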
The client just continued his regular posting schedule (about two or three posts each week).
Casey called the results of his experiment nothing short of “amazing.” But judge for yourself: when no sitemap was submitted, it took Google an average of 1,375 minutes (nearly 23 hours) to find, crawl, and index the new content.
And when a sitemap was submitted? That average plummeted to 14 minutes.
And, the numbers for Yahoo!’s search bot followed a similar trend.
How often should you tell Google to check for changes, by submitting a new sitemap? There’s no set-in-stone rule. However, certain kinds of content call for more frequent crawling and indexing.
For example, if you’re adding new products to an ecommerce site, each with its own product page, you’ll want Google to check in frequently. The same is true for sites that regularly publish hot or breaking news items.
But there’s a much easier way to go about the sitemap creation and submission process, if you’re using WordPress: simply install and use the Google XML Sitemaps plugin.
This is the same plugin Casey Henry used in the case study I mentioned above.
Its settings allow you to instruct the plugin on how frequently a sitemap should be created, updated and submitted to search engines. It can also automate the process for you, so that whenever you publish a new page, the sitemap gets updated and submitted automatically.
Other sitemap tools you can use include the XML Sitemaps Generator, an online tool that should work for any type of website, and Google Webmaster Tools, which lets you take a more “hands-on” approach.
To use Google Webmaster Tools, simply log in to your Google account, then add your new site’s URL to Webmaster Tools by clicking the “Add a Property” button on the right.
In the popup box, enter your new site’s URL and click the “continue” button.
Follow Google’s instructions to add an HTML file that Google creates for you, link your new site through your Analytics account or choose from another of the options Google will outline.
Once your site has been added to Google’s Webmaster Tools dashboard, simply click the URL to go to the Dashboard for that site. On the left, under “Crawl,” click “Sitemaps” then in the upper right corner click “Add/Test Sitemap.”
You can also use Bing’s Webmaster Tools to do the same for Bing and it’s good to cover all of your bases.
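Beyond the dashboards, both Google and Bing have historically accepted a simple HTTP “ping” that carries your sitemap URL. This sketch just builds those ping URLs; the endpoint paths are assumptions based on how the services were documented at the time, so verify them before relying on this:

```python
from urllib.parse import urlencode

# Historical sitemap "ping" endpoints (assumed; check current docs)
PING_ENDPOINTS = {
    "google": "http://www.google.com/ping",
    "bing": "http://www.bing.com/ping",
}

def build_ping_url(engine, sitemap_url):
    """Build the URL that notifies a search engine about an updated sitemap."""
    return PING_ENDPOINTS[engine] + "?" + urlencode({"sitemap": sitemap_url})

print(build_ping_url("google", "https://www.example.com/sitemap.xml"))
# http://www.google.com/ping?sitemap=https%3A%2F%2Fwww.example.com%2Fsitemap.xml
```

Fetching the resulting URL (for example, with `urllib.request.urlopen`) performs the actual notification; your sitemap plugin typically does this for you on every update.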
Step 6: Install Google Analytics
You know you’re going to need some kind of access to basic analytical data about your new site, right? So why not go with Google Analytics and maybe – just maybe – kill two birds with one stone, so to speak?
Installing Google Analytics may give Google a little wake-up nudge, letting the search engine know that your site is there. That, in turn, may help trigger crawling and indexing.
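Installing it amounts to pasting the tracking snippet into every page, just before the closing </head> tag. This is the standard analytics.js snippet of the era; the “UA-XXXXXXX-1” property ID is a placeholder you’d replace with the ID from your own Analytics account:

```html
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');

ga('create', 'UA-XXXXXXX-1', 'auto');
ga('send', 'pageview');
</script>
```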
Then you can move on to more advanced tactics with Google Analytics, such as setting goals and tracking conversions.
Step 7: Submit Website URL to Search Engines
You can also take the direct approach and submit your site URL to the search engines.
Before you do this, you should know that there’s a lot of disagreement about site URL submission as a method of getting a site indexed.
Some bloggers suggest that it’s unnecessary at best, and since other methods work efficiently, most bloggers and site owners skip this step.
On the other hand, it doesn’t take long and it can’t hurt.
To submit your site URL to Google, simply log in to your Google account and navigate to Submit URL in Webmaster Tools. Enter your URL, click the “I’m not a robot” box and then click the “Submit Request” button.
To submit your site to Bing, use this link, which simultaneously submits to Yahoo as well.
Step 8: Create or Update Social Profiles
Do you have social media profiles set up for your new site or blog? If not, now’s the time.
Why? Because one component of search engine optimization is paying attention to social signals. Those signals can potentially prompt the search engines to crawl and index your new site.
What’s more, social signals will help you rank your pages higher in the search results.
Matt Cutts of Google fame said a few years back:
“I filmed a video back in May 2010 where I said that we didn’t use ‘social’ as a signal, and at the time, we did not use that as a signal, but now, we’re taping this in December 2010, and we are using that as a signal.”
It’s obvious by now that a solid social media marketing plan helps SEO. But social profiles for your website also give you another place to add links to your site or blog.
Twitter profiles, Facebook pages, LinkedIn profiles or company pages, Pinterest profiles, YouTube channels and especially Google+ profiles or pages — all of these are easy to create and the ideal places to add links pointing to your website.
If, for whatever reason, you don’t want to create new social profiles for your new site or blog, you can simply add the new site’s link to your existing profiles instead.
Step 9: Share Your New Website Link
Another simple way to get links to your new site or blog is through your own social status updates.
Of course, these links will be nofollow, but they can still alert the search engines to your new content, since we know that Google and Bing, at least, track social signals.
If you’re on Pinterest, select a good, high-quality image or screenshot from your new site. Add the URL and an optimized description (i.e., make sure you use appropriate keywords for your site) and pin it to either an existing board or a new one you create for your site.
If you’re on YouTube, get creative! Record a short screencast video introducing your site and highlighting its features and benefits. Then add the URL in the video description.
If you have an existing email list from another site that’s related to the same niche as your new site, you can send out an email blast to the entire list introducing your new site and including a link.
Finally, don’t forget about email. Add your new URL and site name to your email signature.
Step 10: Set Up Your RSS Feed
What is RSS? And how does it impact indexing and crawling?
Well, before we get to that, let’s clear one thing up: many people think RSS is dead. In my opinion, that’s not so, though it is evolving and its user base has been steadily shrinking, especially since Google shut down Google Reader in 2013.
But even Danny Brown, who wrote that last linked-to article in which he called RSS “Really So-Over-It Syndication,” has changed his tune a bit.
RSS stands for Really Simple Syndication or Rich Site Summary, and it’s good for both users and site owners: it generally helps increase readership and conversion rates, and it can also help get your pages indexed.
To users, RSS feeds deliver a much easier way to consume a large amount of content in a shorter amount of time.
Site owners get instant publication and distribution of new content, plus a way for new readers to “subscribe” to that content as it’s published.
Setting up your RSS feed with Feedburner (Google’s own RSS management tool) helps notify Google that you have a new site or blog that’s ready to be crawled and indexed.
RSS will also let Google know whenever you publish a new post or page which Google needs to index.
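For context, an RSS feed is just another XML file listing your newest content; platforms like WordPress generate it automatically. A minimal feed (URLs, dates and copy are placeholders) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>https://www.example.com/</link>
    <description>Fresh posts from Example Blog</description>
    <item>
      <title>Our Newest Post</title>
      <link>https://www.example.com/blog/newest-post/</link>
      <pubDate>Thu, 15 Jan 2015 09:00:00 GMT</pubDate>
      <description>A short summary of the newest post.</description>
    </item>
  </channel>
</rss>
```

Each time you publish, a new `<item>` appears at the top of the feed, which is exactly the change subscribers and crawlers pick up on.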
Step 11: Submit to Blog Directories
You probably already know that submitting your new URL to blog directories can help your site “get found” by new potential users.
But it can also help the crawl rate and indexing take place more rapidly — if you go about it the right way.
Once upon a time, free blog directories littered the digital landscape. There were literally hundreds – if not thousands – of these sites and way too many of them provided little to no value to blog readers.
The quality problem got so bad that, in 2012, Google purged many free site directories from its index, penalizing directory pages that offered little content value.
Moz examined the issue by analyzing 2,678 directories, finally concluding that “[o]ut of the 2,678 directories, only 94 were banned – not too shabby. However, there were 417 additional directories that had avoided being banned, but had been penalized.”
So what’s the answer? If you’re going to submit to directories, then make sure you only submit to decently ranked and authoritative directories.
Best-of lists of directories compiled by industry and authority blogs can help you weed out the good from the bad, but make sure the list you’re using is current. For instance, this one from Harsh Agrawal has been updated as recently as January 2015.
Other options that you might want to explore are TopRank, which has a huge list of sites you can submit your RSS feed and blog to; Technorati, which is one of the top blog directories around; and — after you’ve published a decent amount of high-quality content — the Alltop subdomain for your niche or industry.
Submitting to high quality sites with decent Domain Authority ratings can not only open your content up to a whole new audience, but also provide incoming links that can nudge the search engines to crawl and index your site.
There you have it – eleven methods for getting your new site or blog indexed quickly by Google and other search engines.
As with most content marketing-related strategies and concepts, things change quickly, especially where search engines are concerned. It’s vital to stay current with industry news and double-check any new suggested technique with your own independent research.
What crawling and indexing tactics have you tried? What were your results?