SEO Advice from Dan Thies

Getting Your Site Noticed by the Big Crawlers

When you launch a new website, the first instinct is to make sure search engines find it. Four major crawlers still dominate the landscape: Google, Altavista, FAST, and Inktomi. Although Altavista and FAST sound like old‑school brands, many sites still submit pages to them because each retains a small but dedicated user base. The good news is that submitting to the three free options - Google, Altavista, and FAST - is straightforward and quick. Start by typing your full address, for example www.cannedbooks.com, into the search box of each engine. If your site is listed, it is already indexed and no further action is needed. If it isn’t, add your home page using the engine’s “Submit a URL” or “Add Site” feature. Treat submission as a one‑time action: resubmitting the same address more than about once a month risks getting you flagged as a spammer.
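If you are submitting to several engines by hand, a small log helps you honor the once‑a‑month rule. The sketch below is only illustrative - the engine names and dates are placeholders, and nothing here talks to a real submission form:

```python
from datetime import date, timedelta

# Minimum interval between submissions of the same URL to one engine.
RESUBMIT_INTERVAL = timedelta(days=30)

def ok_to_resubmit(last_submitted: date, today: date) -> bool:
    """Return True if enough time has passed since the last submission."""
    return today - last_submitted >= RESUBMIT_INTERVAL

# Hypothetical submission log: engine -> date of last manual submission.
log = {
    "Google": date(2003, 1, 5),
    "Altavista": date(2003, 2, 1),
    "FAST": date(2003, 2, 20),
}

today = date(2003, 3, 1)
for engine, last in log.items():
    status = "safe to resubmit" if ok_to_resubmit(last, today) else "wait"
    print(f"{engine}: {status}")
```

A 30‑day interval is a conservative reading of "once a month"; adjust it if an engine publishes its own guidance.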

Altavista, FAST, and Inktomi also offer paid inclusion plans for businesses that want a guarantee that their home page will always appear in search results. For most small and medium‑sized sites, paying for inclusion is unnecessary. The search engines crawl the web automatically, and any site that earns enough inbound links will eventually appear in their indexes. Building links remains the core of site promotion. Spend at least an hour a week on link‑building activities - guest posting, outreach, broken link replacement, and social media promotion. Even a modest, consistent effort keeps search engines coming back for fresh content.

Don’t rely on automated submission services that claim to “submit to hundreds of search engines.” In practice, there are only a handful of major search engines, and most smaller services are designed to generate spam traffic. The result is an inbox that never empties. If you prefer a hands‑on approach, manually submitting to the four main engines is more reliable and takes less than five minutes. For those who want to experiment, you can run a test by submitting a fresh page to each engine and checking whether it appears in their search results after a day or two.

Another useful avenue for increasing visibility is the Open Directory Project (ODP) and its community forums. The ODP editors are volunteers, but they review each submission carefully. Find the “Open Directory Public Forum” and read the welcome messages before posting. A clear, well‑structured category request that matches your site’s content will help you get accepted quickly. In addition to ODP, keep an eye on newer volunteer directories such as GoGuides.org and JoeAnt.com. These directories, run by former Go.com editors, are easy to submit to and can generate direct traffic. Remember that editors may be offline for weeks, so polite follow‑ups to the most active editor in your chosen category often lead to quicker acceptance.

In short, submitting to the four major engines, building a healthy link profile, and keeping your site in respected directories create a strong foundation for search visibility. Use the free tools provided by each engine, avoid over‑submission, and nurture relationships with directory editors and webmasters to keep your site ranking well over time.

Choosing the Right Keywords and Testing Their Value

One of the most common questions from new site owners is how to choose among close variants of a keyword, such as singular versus plural forms or near‑synonyms. Google treats each variation as a distinct search term: “web hosting” and “website hosting” have different search volumes and may attract slightly different audiences. Rather than guessing, use Google’s advertising tools to estimate the number of clicks each term could bring. The Keyword Planner is free to access; simply sign in to your Google Ads account and click “Tools” > “Keyword Planner.” Select “Discover new keywords,” enter your search terms, and review the suggested metrics. The planner shows average monthly searches, competition level, and estimated cost per click, but more importantly, it lists the average number of clicks per day you might receive for each term.

Set up a mock ad campaign to test these terms. Choose a language and country that matches your target market - most small businesses start with English in the United States. Click “Create ad and continue,” and fill out the ad fields with placeholder text; you are not actually launching the campaign, just exploring the data. Once you’ve entered your keyword list, use the planner’s keyword suggestions to identify related terms that might have higher search volume, and paste them into the ad creation form to see how they compare.

After the data populates, click “Save and continue” and then “Calculate estimates.” The table will show, for each keyword, the projected clicks per day, the cost per click, and the competition level. For instance, “web hosting” may yield 370 clicks per day, while “website hosting” only brings 40 clicks. These numbers help you decide which keyword deserves more emphasis on your home page, in meta tags, and in your content. If a keyword’s click potential is low, you might still use it in niche posts or in long‑tail variations that attract highly qualified traffic.
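The comparison itself is simple enough to script. This sketch ranks terms by their estimated daily clicks, using the figures from the example above (your own planner run will show different numbers):

```python
# Estimated daily clicks from the mock campaign (figures from the text;
# a real planner export would replace this dictionary).
estimates = {
    "web hosting": 370,
    "website hosting": 40,
}

# Rank terms by click potential to decide where each belongs on the page.
ranked = sorted(estimates.items(), key=lambda kv: kv[1], reverse=True)
primary, secondary = ranked[0][0], ranked[1][0]

print(f"Primary keyword (title, first paragraph): {primary}")
print(f"Secondary keyword (subheadings, body): {secondary}")
```

With more than two candidate terms, the same sort gives you the full pecking order for headings and long‑tail posts.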

Remember that the Keyword Planner estimates clicks for paid campaigns, not organic traffic. However, the relative ranking of terms remains useful because the search intent is the same. A high click‑through rate in paid ads usually correlates with strong organic interest. Use this information to shape your on‑page SEO strategy: place the highest‑volume keyword near the top of your page title, in the first paragraph, and in the meta description. For the secondary keyword, weave it naturally into subheadings and body text. By aligning your content with the data, you improve the likelihood of ranking higher in the search results.
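You can sanity‑check that placement with a few string tests. The page fields below are hypothetical - substitute your own title, opening paragraph, and meta description:

```python
def keyword_placement(page: dict, keyword: str) -> dict:
    """Report whether a keyword appears in the key on-page locations."""
    kw = keyword.lower()
    return {
        "title": kw in page["title"].lower(),
        "first_paragraph": kw in page["first_paragraph"].lower(),
        "meta_description": kw in page["meta_description"].lower(),
    }

# Hypothetical page fields for illustration only.
page = {
    "title": "Affordable Web Hosting for Small Businesses",
    "first_paragraph": "Our web hosting plans include daily backups.",
    "meta_description": "Compare web hosting plans and prices.",
}

print(keyword_placement(page, "web hosting"))
```

A check like this catches the common mistake of optimizing the body text while leaving the keyword out of the title or meta description entirely.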

Once you have selected your primary keyword, revisit your content regularly to ensure it remains relevant and updated. Search engines reward fresh, well‑structured pages, so an annual review of your keyword list and content can keep you ahead of competitors who may not be updating their pages as often.

Ensuring Search Engines Crawl Your Site More Often

Keeping search engines crawling frequently is crucial for timely indexation and better rankings. Google has no paid inclusion program, but there are still practical steps you can take to increase crawl frequency. First, focus on generating high‑quality inbound links. Links are a primary signal that your site is worth exploring again. A popular page that receives a surge of links in a short period will see the search engine bots visit it more often.

Second, update your content regularly. If your home page changes with each visit - new blog posts, fresh case studies, or updated product information - search engines interpret this as a sign of activity. Add a “last modified” timestamp in the page header or footer so bots can easily detect changes. This practice also benefits visitors who appreciate current information.

Third, use a robots meta tag to confirm that you want bots to index the page and follow its links. Place a tag in the page’s <head> section like <meta name="robots" content="index,follow">. Be aware that the robots meta tag cannot request a crawl schedule - directives such as max-snippet or max-video-preview control how your result is displayed, not how often bots return. To hint at freshness instead, list your pages in an XML sitemap and keep each entry’s <lastmod> date accurate; search engines weigh those hints against their own crawl budgets.
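As a minimal sketch, both the meta tag and a sitemap entry can be generated from small helpers. The URL and date are placeholders; the <url>/<loc>/<lastmod> structure follows the standard sitemaps.org format:

```python
def robots_meta(directives=("index", "follow")) -> str:
    """Build the robots meta tag for a page's <head> section."""
    return f'<meta name="robots" content="{",".join(directives)}">'

def sitemap_entry(loc: str, lastmod: str) -> str:
    """Build one <url> entry in sitemaps.org format; <lastmod>
    hints at freshness but does not guarantee a crawl schedule."""
    return (
        "<url>"
        f"<loc>{loc}</loc>"
        f"<lastmod>{lastmod}</lastmod>"
        "</url>"
    )

print(robots_meta())
print(sitemap_entry("http://www.cannedbooks.com/", "2003-03-01"))
```

For a site of any real size, a template engine or sitemap library would replace these helpers, but the emitted markup is the same.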

Fourth, continue building a strong directory presence. Listing your site in major directories such as the Open Directory Project or newer volunteer directories can serve as a signal that your page is worth visiting again. Each directory link is a backlink that improves overall link equity, making search engines more likely to crawl your main site on a regular basis. The combination of directory links and external backlinks from reputable sites forms a robust link profile that encourages frequent bot visits.

Finally, monitor your site’s crawl stats through Google Search Console. The “Coverage” and “Performance” reports show how many pages are indexed and how often bots are crawling them. Use this data to identify pages that are under‑indexed and adjust your strategy accordingly. If a page consistently shows low crawl frequency, consider adding internal links from high‑traffic sections or publishing a new update that includes the target keywords.
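If you export that data for analysis, flagging under-crawled pages is a one-liner over the rows. The CSV layout below is hypothetical - Search Console's actual exports vary by report - but the filtering idea carries over:

```python
import csv
import io

# Hypothetical export: one row per page with its crawl count over 30 days.
EXPORT = """page,crawls_last_30d
/,42
/blog/keyword-research,11
/archive/old-page,1
"""

def under_crawled(csv_text: str, threshold: int = 5) -> list:
    """Return pages crawled fewer than `threshold` times."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["page"] for row in reader
            if int(row["crawls_last_30d"]) < threshold]

print(under_crawled(EXPORT))  # pages to target with internal links
```

The threshold of 5 is arbitrary; pick a cutoff relative to your site's overall crawl volume.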

By following these tactics - link building, frequent content updates, smart use of meta tags, directory listings, and crawl monitoring - you can significantly improve how often search engines revisit your site. Consistent crawling leads to faster indexation of new pages, higher rankings for updated content, and ultimately better visibility for your business.
