
Submitting to Inktomi + 5 More SEO Tips


How Existing Sites Get Indexed Without Manual Submission

Many site owners stumble over the idea that a page has to be handed to each search engine in order for it to appear in results. That belief dates back to the early days of the web, when search engines were new, slow, and relied on hand‑rolled directories. The reality today is that major engines - Google, Bing, Yahoo, and the legacy Inktomi system - crawl the internet automatically on a regular basis. If a site has a clean, accessible structure and a sitemap, crawlers will discover its pages without any extra effort from the webmaster.

When Mary Beth reached out, she wondered whether she needed to pay for inclusion in Inktomi after her site had already been indexed by MSN and HotBot. Jill pointed out that paid submission is an optional service that can shave days or weeks off the time it takes for new content to be indexed, especially for very large or newly launched sites. For a site that has been live for months and already has a working link profile, waiting 2 to 6 weeks for a full crawl cycle is often the safest, most cost‑effective approach. The cost of paid inclusion may outweigh the benefit, particularly if the primary goal is long‑term visibility rather than a quick boost.

The key to reliable indexing is ensuring the technical foundation is solid. A functional robots.txt file, correct use of noindex tags, and a comprehensive XML sitemap are all part of that foundation. Search engines rely on these signals to determine what content is ready for indexing. In the case of Inktomi, which is now largely integrated into the Yahoo! and Bing ecosystems, the same crawling logic applies. A well‑structured site will automatically surface in the search results without any manual steps.
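As a concrete illustration of that foundation, the pieces can be as simple as the fragments below. The domain, file names, and dates are placeholders, not a real site:

```text
# robots.txt - allow all crawlers and point them at the sitemap
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2004-05-10</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services.html</loc>
  </url>
</urlset>
```

An empty Disallow line permits everything; a page you want kept out of the index should instead carry a noindex meta tag, since robots.txt only controls crawling, not indexing.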

Some site owners still invest in paid inclusion because they want to avoid the uncertainty that comes with waiting. If time is a critical factor - say, for an event or a limited‑time promotion - then paying for early access can be justified. But most of the time, a steady crawl schedule provides a more sustainable, long‑term strategy. It allows search engines to re‑index pages gradually, ensuring that each update receives fresh, accurate coverage. In practice, the difference between a free crawl and a paid submission is marginal for the average business site, and the risk of being dropped from an index because you didn't pay is virtually nonexistent.

In short, focus on building a clean site architecture, submit an XML sitemap to each major engine, and give the crawlers a couple of weeks to do their work. If you notice new content isn't appearing after that window, check your logs for crawl errors or indexation issues, and consider a one‑time paid inclusion only if you need a definitive, fast result.
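Checking the logs for crawl activity can be scripted. The Python sketch below scans combined-format access-log lines for hits from the major crawlers of the day and flags non-200 responses; the sample lines, IP addresses, and paths are invented for illustration, so substitute your own log file in practice:

```python
import re

# Hypothetical combined-format access-log lines; in practice, read these
# from your web server's log file.
LOG_LINES = [
    '66.249.66.1 - - [10/May/2004:13:55:36 -0400] "GET /about.html HTTP/1.1" 200 2326 "-" "Googlebot/2.1"',
    '10.0.0.5 - - [10/May/2004:13:56:01 -0400] "GET /index.html HTTP/1.1" 200 5120 "-" "Mozilla/4.0"',
    '68.142.249.1 - - [10/May/2004:13:57:12 -0400] "GET /news.html HTTP/1.1" 404 512 "-" "Slurp/Inktomi"',
]

# User-agent substrings for the major crawlers of the era.
CRAWLERS = ("Googlebot", "Slurp", "msnbot")

def crawler_hits(lines):
    """Return (path, status, agent) for every request made by a known crawler."""
    hits = []
    for line in lines:
        m = re.search(r'"GET (\S+) HTTP/[\d.]+" (\d{3}) .*"([^"]*)"$', line)
        if m and any(bot in m.group(3) for bot in CRAWLERS):
            hits.append((m.group(1), int(m.group(2)), m.group(3)))
    return hits

for path, status, agent in crawler_hits(LOG_LINES):
    flag = "" if status == 200 else "  <-- crawl error"
    print(f"{agent}: {path} ({status}){flag}")
```

A run of 404s or 500s against crawler user-agents is exactly the kind of indexation problem worth fixing before considering a paid submission.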

Building Links the Right Way: Avoiding Invisible Tactics

Link building remains one of the most powerful ways to signal relevance to search engines. Rob, a client who had been using small text links at the bottom of his sites, decided to switch to transparent GIFs that carried the same anchor text in the alt attribute. He was concerned that search engines might view the hidden links as deceptive, and wondered whether the practice was safe.

Search engines evaluate links based on their intent and context. The most transparent and reliable approach is to present the link in a way that is visible to both users and crawlers. Hidden links - those that are invisible to visitors but still count as votes for a site - have been flagged by major search engines for violating their webmaster guidelines. Google’s guidelines specifically advise against hidden text or links, noting that they can be abused to manipulate rankings.
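To make the contrast concrete, here is the sort of markup those guidelines flag, alongside the safe alternative. The URL, image name, and anchor text are invented for illustration:

```html
<!-- Risky: a 1x1 transparent GIF link that visitors never see -->
<a href="https://www.example.com/">
  <img src="clear.gif" width="1" height="1" alt="widget supplies" border="0">
</a>

<!-- Safe: a plain text link visible to users and crawlers alike -->
<a href="https://www.example.com/">Widget supplies from Example.com</a>
```

The two links carry the same anchor signal, but only the second one can be clicked and evaluated by a human visitor.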

From a practical standpoint, a link that appears on a page for human readers earns two benefits. First, it provides a real navigation point for anyone who clicks it, reinforcing the user experience. Second, it signals to crawlers that the link is a legitimate signal of trust, not a crafted attempt to inflate page authority. Even if the link is placed on a low‑traffic page, visibility helps build a natural link profile that search engines favor.

If a client prefers a more subtle link strategy, the recommended solution is to style the link so that it blends into the surrounding content without obscuring it. Using muted colors, small font sizes, or placing the link within a sidebar can maintain a low visual profile while remaining fully transparent to search engines. In most cases, this approach is safer and more effective than hiding the link entirely.
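For example, a footer credit link can stay understated without being hidden; the names and colors below are placeholders:

```html
<!-- A low-profile footer link: small and muted, but still rendered text -->
<p style="font-size: 0.8em; color: #777777;">
  Site design by
  <a href="https://www.example.com/" style="color: #777777;">Example Studio</a>
</p>
```

The text remains readable against the page background, so it is subtle rather than invisible, which is the line the guidelines draw.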

Ultimately, the safest path is either to keep the link visible in a design that fits the site’s aesthetic or to remove it altogether if it provides no clear benefit to users. By avoiding deceptive practices, site owners maintain a strong relationship with search engines and ensure that every link earns genuine value.

Speed vs. Sustainability: What to Expect from SEO Efforts

Tim, the VP of Marketing for a medical center, reached out with a common dilemma: he wanted to see tangible improvements in search engine rankings before a full site redesign could be launched. He had already invested in a Keyword Effectiveness Index study and was looking for a quick win.

SEO, unlike paid advertising, is a gradual process that relies on repeated, consistent efforts. The search engines evaluate pages over time, considering factors such as content quality, site speed, mobile friendliness, and external signals. Expecting a dramatic jump in rankings overnight is unrealistic, especially when competing against well‑established competitors in the healthcare niche.

For a brand that needs immediate visibility, the most straightforward option is to invest in paid campaigns on Google AdWords and Overture. These platforms deliver search results in seconds and allow you to target specific keywords, geographic regions, and even patient intent. However, they do not contribute to organic rankings and should be used as a bridge while the organic strategy takes shape.

Building a sustainable organic foundation begins with a technical audit that checks crawlability, index status, page speed, and mobile optimization. Next, perform keyword research to identify high‑intent terms that are realistic to rank for. Then, create a content calendar that addresses those terms with well‑structured, user‑centric articles, FAQs, and multimedia assets. Finally, implement a link building plan that focuses on earning natural, authoritative backlinks from relevant industry sites.

Monitoring progress through tools like Google Search Console, Ahrefs, and Screaming Frog allows you to see how your pages evolve. Adjust the strategy based on data, not assumptions. Patience, consistency, and adherence to best practices are the real drivers of long‑term SEO success. In the meantime, paid search can keep the brand visible while the organic engine builds its momentum.

Optimizing Titles and Subtitles to Drive Click‑Throughs

Katie, the editor of a scientific magazine, had a habit of using evocative, poetic titles for her articles - “The Pursuit of Shadows,” for example, even when the piece was an in‑depth study of cloud science. She was wondering whether this creative approach might hurt her search visibility.

Search engines read title tags as one of the most important signals that define a page’s content. If the headline fails to contain the primary keyword phrases that users are typing, the page may rank lower for those queries, regardless of the quality of the content inside. While a poetic title can capture attention in print or on social media, it often misses the keyword cue that drives organic clicks.

A practical compromise is to pair the creative phrase with a descriptive subtitle that includes the keyword. For instance, “The Pursuit of Shadows: An In‑Depth Look at Cloud Science.” The primary keyword appears in a prominent position while the artistic flair remains. This structure satisfies both editorial intent and SEO requirements.

When you write the actual title tag that appears in search results, keep it concise - under 60 characters is best - so the full text shows in the SERP. Use the main keyword at the beginning of the tag, followed by a brand or descriptor. The meta description can then elaborate, offering a compelling teaser that encourages clicks. By aligning the visible headline with the keyword strategy, you boost both relevance and engagement.
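Putting the advice together, a page head might look like the fragment below; the title and description text are invented for Katie's cloud-science example:

```html
<head>
  <!-- Keyword-leading title, kept under 60 characters -->
  <title>Cloud Science Explained: The Pursuit of Shadows</title>
  <!-- Snippet text for the SERP, kept within the recommended length -->
  <meta name="description"
        content="An in-depth look at cloud science: how clouds form, why they cast the shadows they do, and what they tell us about the weather ahead.">
</head>
```

Note how the keyword phrase leads the title tag while the poetic phrase survives as the subtitle, so neither the editor nor the search engine loses out.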

Over time, search engines reward pages that consistently match user intent. A well‑crafted title that balances creativity with keyword relevance can increase click‑through rates, improve dwell time, and provide the organic signals that elevate rankings. Experiment with variations, monitor performance, and refine the approach as you gather data from your audience’s behavior.

Meta Tags 101: When to Use Keywords and Descriptions

Matus, a new subscriber to the newsletter, noticed that the site’s meta keywords often contained generic terms that didn’t appear in the body copy. He wondered whether this was a cover‑up or a deliberate tactic.

For most search engines, especially Google, the meta keyword tag no longer carries any ranking weight. The tag exists in the HTML, but the crawlers skip it when building the index. Because of this, the content of the meta keyword field has little to no direct impact on visibility. That explains why many sites simply omit the tag entirely or fill it with broad descriptors that reflect the overall theme of the site.

Meta descriptions, on the other hand, still play an important role in attracting clicks from search results. Even though the description doesn’t directly influence rankings, it is a key component of the snippet that users see in the SERP. Craft a compelling summary that includes the primary keyword and a call to action, keeping the length between 150 and 160 characters. A well‑written description can improve click‑through rates, which indirectly supports SEO success.

In practice, a unique meta description for each page provides a better user experience. It signals to both users and crawlers that the content is distinct and tailored to a specific query. When a page contains a high volume of keywords, the description should avoid keyword stuffing; instead, focus on clarity and relevance. By matching the description to the content, you increase the likelihood that the snippet will match user intent and entice them to visit.
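A simple audit script can catch both problems at once. The Python sketch below, using invented page paths and description text, flags descriptions that fall outside the 150 to 160 character window or duplicate another page's:

```python
# Audit helper: flag meta descriptions that miss the recommended
# 150-160 character window or duplicate another page's text.
# The page/description pairs below are invented for illustration.
PAGES = {
    "/index.html": "Example Medical Center offers cardiology, orthopedics, "
                   "and family medicine with same-week appointments across "
                   "three convenient locations in the metro area.",
    "/contact.html": "Contact us.",
    "/about.html": "Contact us.",
}

def audit_descriptions(pages, lo=150, hi=160):
    """Return {path: [issues]} for descriptions that are mis-sized or duplicated."""
    issues = {}
    seen = {}  # description text -> first path that used it
    for path, desc in pages.items():
        problems = []
        if not lo <= len(desc) <= hi:
            problems.append(f"length {len(desc)} outside {lo}-{hi}")
        if desc in seen:
            problems.append(f"duplicates {seen[desc]}")
        else:
            seen[desc] = path
        if problems:
            issues[path] = problems
    return issues

for path, problems in audit_descriptions(PAGES).items():
    print(path, "->", "; ".join(problems))
```

Running a check like this across a whole site quickly surfaces the thin or copy-pasted descriptions that waste snippet real estate.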

When developing your SEO strategy, prioritize on‑page factors that actually influence rankings - content quality, heading structure, internal linking, and technical health. The meta keyword tag can be left out or used sparingly, while the meta description should be crafted thoughtfully for each page. By following these guidelines, you ensure that your site communicates clearly to both search engines and human visitors.

Jill Whalen of High Rankings is an internationally recognized search engine optimization consultant and host of the free weekly High Rankings Advisor search engine marketing newsletter.

She specializes in search engine optimization, SEO consultations and seminars. Jill's handbook, "The Nitty-gritty of Writing for the Search Engines" teaches business owners how and where to place relevant keyword phrases on their Web sites so that they make sense to users and gain high rankings in the major search engines.
