Paid Search Engine Inclusion Programs - The Details (1) Inktomi + AltaVista

Inktomi Paid Search Inclusion Overview

Paid search inclusion, often marketed as pay-for-inclusion (PFI), is a way for site owners to guarantee that their pages enter a search engine’s index, even if the pages are new or attract little organic traffic. Inktomi, one of the early players in the paid search space, offers a program that works through a network of submission partners. The process begins with the webmaster submitting the URL to one of the partners, who in turn alerts Inktomi’s crawling engine. From there the Inktomi spider follows a predictable pattern: it visits the page, evaluates its content, and, if the page meets the inclusion criteria, adds it to the index. Inclusion happens a day or two after the first crawl, making the program useful for businesses that need a quick online presence.

The key advantage of Inktomi’s model is its partnership structure. By collaborating with several third‑party vendors, Inktomi can cover a wide geographic and technical base, ensuring that submitted URLs reach a variety of user groups. For webmasters, this means that the choice of partner can influence the timing of the initial crawl, though the eventual indexing is controlled solely by Inktomi. The program’s pricing is uniform across partners, simplifying budgeting. A single click in any partner’s dashboard triggers the same behind‑the‑scenes workflow, so users need not worry about different fee structures.

Because Inktomi’s system is built around a dedicated spider, the crawl frequency is relatively high compared to other paid inclusion programs. Once a page is accepted, the spider revisits it every two days, unless an external event (such as a major site update or a new submission from a different partner) prompts a faster cycle. This ensures that the index stays fresh, but it also means that webmasters must monitor their server logs for potential crawl pressure. The high revisit rate can impact server resources, especially for large sites, so planning for additional bandwidth is advisable.

Another notable aspect is the transparency of the submission process. Most partners provide a management area where the webmaster can see the status of each URL - whether it is pending, approved, or rejected. The dashboard also displays the number of times the page has been crawled, giving a rough sense of how active the inclusion process is. While this information is useful, it is important to remember that the dashboard shows only the state of the submission, not the real‑time indexing status. To confirm that a page has truly entered the search results, a search query using the exact URL or a unique keyword should be performed on the Inktomi interface.

In sum, Inktomi’s PFI offers a straightforward, partner‑driven workflow that guarantees quick inclusion for those who need a fast, paid path into the search index. The program’s consistency, coupled with its high crawl frequency, makes it a reliable choice for marketers who want to bypass the slower, organic route.

Inktomi Partner Breakdown

Inktomi’s paid inclusion program is delivered through a variety of submission partners, each with its own strengths and network. The primary partners are ineedhits.com, Outrider, Position Technologies, and VeriSign. These partners differ in their geographic focus, technical capabilities, and the way they integrate with the core Inktomi crawler.

Ineedhits.com, based in the United Kingdom, offers a simple web interface that accepts a list of URLs and processes them in batches. Once the webmaster submits the URLs, the site sends an automated confirmation back to the user and forwards the data to Inktomi’s ingestion pipeline. The partner’s dashboard shows each URL’s status in real time, along with click‑through statistics and keyword lists. However, the click‑through data is known to be incomplete - often only 70–75% of actual clicks are recorded. Despite this shortfall, the system remains popular because it provides a quick turnaround and the ability to monitor performance across multiple search engines.

Outrider, a London‑based firm affiliated with Position Technologies, operates a different model. Its submission portal focuses on a smaller set of high‑traffic URLs and integrates closely with Inktomi’s proprietary spider. After submission, Outrider’s system triggers an immediate crawl from the Inktomi spider, usually within one to two hours. The partner’s account service page displays the current status and click data, but users have reported that less than 50% of the actual clicks appear in the reports. Outrider’s interface includes a feature that shows a graphical view of click trends, which helps in quick diagnostics, but the incomplete reporting remains a drawback.

Position Technologies, the parent company of Outrider, offers an even more comprehensive control center. In addition to the standard URL status and click logs, Position Tech’s dashboard provides keyword analytics that show the exact search terms visitors used to find the site. This depth of data is valuable for refining marketing strategies. Position Tech also offers a “Pure Web Search” function, which allows users to see search results stripped of paid advertisements or other enhancements. This function is available publicly, giving marketers a glimpse into how their pages appear in a clean, organic environment.

VeriSign, a well‑known domain registration and security provider, runs a PFI program that focuses primarily on domain registration and SSL certification. After a webmaster submits a URL via VeriSign’s portal, the Inktomi spider is triggered in a similar manner to the other partners. However, VeriSign’s reporting system appears to have a significant flaw: the platform reports no clicks even when server logs show multiple hits. This disconnect means that users must rely on third‑party analytics tools to gauge the true impact of their paid inclusion efforts.

While the partners differ in user experience and reporting detail, the underlying technical process is consistent. All submissions funnel into the same Inktomi crawler, ensuring uniform indexing. The choice of partner therefore boils down to how much insight and data a webmaster requires from the submission process. For those who need a detailed analytical view, Position Technologies stands out, whereas users looking for quick, simple submissions may prefer ineedhits.com or VeriSign.

Spidering Behavior and Timing

After a URL has been submitted to Inktomi via one of its partners, the search engine’s spider begins its routine. The initial crawl typically occurs within 24 to 48 hours of submission. On the first day, the spider visits the URL to confirm that the content is accessible and that the page meets the inclusion criteria. The spider’s user agent string is easily identifiable: it includes “Slurp” followed by the spider’s contact address, for example, Mozilla/3.0 (Slurp/si; slurp@inktomi.com). Recognizing this pattern lets administrators log the exact times the spider touches their site.
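Spotting Slurp in server logs can be automated. The sketch below assumes the common Apache “combined” log format; the sample log lines, IPs, and timestamps are illustrative, not taken from any real Inktomi crawl.

```python
import re

# Apache "combined" log format:
# ip - - [timestamp] "request" status size "referer" "user-agent"
LOG_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<req>[^"]*)" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"$'
)

def slurp_visits(lines):
    """Yield (timestamp, request, user-agent) for hits whose UA mentions Slurp."""
    for line in lines:
        m = LOG_RE.match(line)
        if m and "Slurp" in m.group("ua"):
            yield m.group("ts"), m.group("req"), m.group("ua")

# Illustrative log lines (hypothetical IPs and times)
sample = [
    '10.0.0.1 - - [12/Mar/2002:09:14:02 +0000] "GET /index.html HTTP/1.0" '
    '200 5120 "-" "Mozilla/3.0 (Slurp/si; slurp@inktomi.com)"',
    '10.0.0.2 - - [12/Mar/2002:09:15:10 +0000] "GET /index.html HTTP/1.0" '
    '200 5120 "-" "Mozilla/4.0 (compatible; MSIE 5.5)"',
]
for ts, req, ua in slurp_visits(sample):
    print(ts, req)
```

A variant of the same filter can tally crawler hits per day to watch for the revisit pressure described earlier.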

Once the first crawl is complete, the Inktomi spider moves into a maintenance phase, revisiting the page every two days. This every‑other‑day rhythm is not rigid; the spider may pause for a day or skip a cycle if it detects that the page has not changed. In practice, however, most pages are visited on a strict schedule. Some webmasters have noticed deviations, especially after the first month of the program. On certain occasions, the spider paused for up to four days before the next visit, possibly due to a system maintenance window or heavy crawl load.

The crawler does more than simply fetch page content; it also requests the site’s robots.txt file on its own schedule - roughly every five days, rather than on every page visit. By inspecting the directives, the spider ensures compliance with the webmaster’s crawl preferences, and the periodic request doubles as a checkpoint that the site is still reachable. If the file blocks the crawler, the page may be excluded from the index despite the paid submission.

When the page is first submitted, the spider also visits the domain’s root URL (for example, http://www.example.com/) to assess overall site health. This initial check is separate from the page-level crawl and typically occurs within the same 24‑hour window. The spider’s user agent string in this case remains the same; it only changes the target URL. This process helps Inktomi determine whether the site is generally reachable and whether it should consider indexing other pages under the same domain.

In addition to the dedicated Inktomi spider, each partner sometimes runs its own crawler. For instance, Position Technologies employs a shared spider that visits the site an hour after submission. The partner’s spider’s user agent is a standard browser string (e.g., Mozilla/4.0 (compatible; MSIE 4.01; Windows NT)). These partner spiders perform a quick health check and feed the data back to Inktomi. While the partner spider’s visits are less frequent, they can provide early indicators that the page is accessible before the Inktomi crawler fully engages.

Understanding the spider’s behavior is essential for maintaining optimal site performance. Webmasters should ensure that the server can handle the potential spike in traffic when the crawler lands. Monitoring server logs for the specific user agent strings can help isolate crawler traffic from regular visitors, enabling accurate traffic analysis. By aligning server capacity with crawler activity, site owners can avoid performance bottlenecks and guarantee that the page stays in the index without unnecessary downtime.

Reporting and Analytics Quality

The PFI program’s real value lies not only in ensuring that a page appears in the search results, but also in providing actionable data about how users interact with the site. Each partner offers a dashboard where the webmaster can view the status of each URL and the associated click‑through data. However, the depth and accuracy of this data vary significantly across partners.

In the case of ineedhits.com, the subscription management area displays the current status of every submitted URL. The dashboard also offers a “Click‑Thru Hits Report” that lists the number of clicks and the search terms that led users to the site. Users can then click through to the search engine results on AOL, HotBot, or MSN to see the exact page in context. Despite the promise of this feature, the reporting system captures only a fraction of the actual clicks. In practice, roughly 70–75% of the total hits and keywords are recorded. This discrepancy can lead to an underestimation of a page’s reach, especially for high‑traffic sites.

Outrider’s account service page follows a similar pattern. The status and click counts are visible, but the platform states that it captures less than half of all clicks generated by the Inktomi crawler. Webmasters who rely solely on this data may miss important trends or fail to notice that their pages are not performing as expected. The graphical view of click trends can be helpful, yet it still suffers from the same reporting gap.

Position Technologies offers the most robust analytics. Their subscriber area not only shows URL status but also provides a comprehensive keyword list. The platform logs the exact search terms visitors used, allowing for a detailed understanding of user intent. In addition to this, Position Tech’s dashboards include a graphical representation of click traffic, helping webmasters spot seasonal or campaign‑driven spikes. However, even this advanced system has a limitation: only about 40% of the actual hits and keywords are displayed. This shortfall indicates that the platform may not capture all user interactions, perhaps due to a filtering mechanism or server‑side logging limitations.

VeriSign’s reporting is the most opaque. In the subscriber area, no click or keyword data appears, even when server logs show clear evidence of traffic. This lack of visibility forces users to rely on third‑party analytics, such as Google Analytics or Adobe Analytics, to gauge the effectiveness of their paid inclusion. While this approach can be effective, it introduces additional overhead for the webmaster.

Given these variations, it is prudent for site owners to complement partner dashboards with their own analytics solutions. By comparing data from multiple sources, webmasters can identify gaps in reporting and make more informed decisions about marketing spend and content strategy. Accurate analytics also help in refining keyword targeting, optimizing landing pages, and ultimately maximizing return on investment from paid search inclusion.
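The comparison the paragraph above recommends can be reduced to simple arithmetic: divide the clicks a partner dashboard reports by the matching crawler-referred hits found in your own server logs. The per-partner counts below are purely illustrative and not figures from this article.

```python
def coverage(reported, logged):
    """Fraction of log-verified hits that the partner dashboard reported,
    or None when there is no log traffic to compare against."""
    return reported / logged if logged else None

# Hypothetical figures for one reporting period (illustrative only)
observed = {
    "ineedhits.com":         {"reported": 145, "logged": 200},
    "Outrider":              {"reported": 90,  "logged": 200},
    "Position Technologies": {"reported": 80,  "logged": 200},
    "VeriSign":              {"reported": 0,   "logged": 200},
}

for partner, d in observed.items():
    c = coverage(d["reported"], d["logged"])
    print(f"{partner}: {c:.0%} of logged hits reported")
```

Tracking this ratio over time shows whether a dashboard’s under-reporting is stable (and can be corrected for) or erratic.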

PositionTech and Outrider Specifics

Position Technologies and Outrider share a close operational relationship, with Outrider functioning as a London‑based front for Position Tech. Their joint focus on the UK market offers unique insights into how paid inclusion performs in a competitive environment. The two partners not only submit URLs to Inktomi but also run their own crawler to verify site accessibility before the main spider arrives.

When a webmaster submits a URL through Outrider, the site’s system immediately triggers a quick crawl from the Inktomi spider. This initial visit typically lands within one to two hours of submission, which is remarkably fast compared to other partners. The user agent string is the standard Inktomi signature: Mozilla/3.0 (Slurp/si; slurp@inktomi.com). If the page is accessible, the crawler logs the visit and signals that the page is ready for indexing. Outrider’s own crawler, identified by a generic browser string (e.g., Mozilla/4.0 (compatible; MSIE 4.01; Windows NT)), lands an hour after submission to perform a quick health check. This dual‑crawling approach ensures that any server errors are caught early.

From a reporting perspective, Outrider’s dashboard offers real‑time status updates and click data. However, the data is incomplete - less than half of the actual clicks generated by the Inktomi crawler appear in the reports. Users can view a graphical representation of traffic, but the underlying numbers may mislead decision makers. In contrast, Position Technologies provides a more detailed analytics suite. The dashboard includes keyword lists, click counts, and a “Pure Web Search” feature that removes paid elements from search results. Position Tech’s graphical view is more sophisticated, offering trends over time and the ability to drill down into specific search terms.

One advantage of Position Tech’s offering is the inclusion of a search form that exposes the “Pure Web Search” interface to the public. By visiting the URL for the pure search function, marketers can see how their pages appear in an unfiltered environment. This feature is especially useful when trying to balance paid inclusion with organic search visibility. However, the system’s reporting still suffers from a significant gap: it captures only about 40% of the true click and keyword data. Users must supplement the data with external analytics to get a full picture.

Because Position Tech and Outrider rely on a shared Inktomi spider, the crawl schedule remains the same for both partners. After the initial two‑hour crawl, the page is revisited every two days. If the page receives a major update or a new URL is submitted, the cycle may reset, prompting a faster crawl. The partner spiders provide quick feedback, but it is the Inktomi crawler that ultimately dictates when the page is indexed. For webmasters who need fast, predictable indexing, Outrider’s quick initial visit can be a decisive factor. For those who need deeper analytics, Position Technologies offers a more comprehensive suite, albeit with the caveat of incomplete reporting.

In practice, many webmasters use both partners in tandem. Outrider handles bulk submissions, while Position Tech provides the analytical depth required to refine marketing strategies. By combining the strengths of both partners, site owners can achieve a fast inclusion timeline and a robust understanding of user engagement.

VeriSign Insights

VeriSign’s paid search inclusion program distinguishes itself primarily through its brand recognition and integration with domain registration services. While the core technical process mirrors that of other partners - submitting URLs, triggering the Inktomi spider - the experience for the webmaster differs in several key areas.

After a URL is submitted via VeriSign’s portal, the first Inktomi spider visit typically occurs within 24 hours. The user agent string remains consistent with other partners, making it easy to spot in server logs. Subsequent crawls follow the standard every‑other‑day pattern, though VeriSign does not provide detailed timing data to the user. Unlike ineedhits.com or Position Technologies, VeriSign’s dashboard offers no click‑through statistics. Even if a webmaster monitors server logs, the platform’s reporting system appears to ignore those hits entirely, displaying zero clicks.

For webmasters who rely on VeriSign’s platform alone, this lack of visibility can be frustrating. Without real‑time data on clicks or keywords, it is difficult to gauge whether the paid inclusion is delivering the desired traffic. This gap forces users to depend on external analytics tools such as Google Analytics, which, while powerful, require additional configuration and ongoing maintenance.

Despite the reporting shortcomings, VeriSign’s inclusion process has its own advantages. The company’s deep integration with domain registration means that site owners can bundle domain management, SSL certificates, and paid search inclusion into a single billing and support package. For businesses that prefer a one‑stop solution, this can reduce administrative overhead. Additionally, VeriSign’s infrastructure is highly reliable, with minimal downtime during crawler visits. Webmasters have reported that the site remains stable even during the initial Inktomi crawl, a testament to VeriSign’s robust hosting environment.

It is worth noting that the incomplete reporting does not reflect a technical failure of the Inktomi spider. The spider operates normally, crawling and indexing the page as expected. The problem lies in VeriSign’s analytics backend, which does not capture or display the click data. Consequently, the webmaster’s experience is more about data management than crawl performance.

For those who need both quick inclusion and actionable data, using VeriSign in conjunction with a dedicated analytics platform may be the best approach. By combining VeriSign’s reliable hosting with an external analytics solution, webmasters can ensure that their pages are indexed promptly while still monitoring user engagement effectively.

Summary of Inktomi PFI

The Inktomi paid search inclusion program offers a uniform pricing structure across all partners, making budgeting straightforward. Once a URL is submitted, the Inktomi spider initiates the crawl within a day, revisiting the page every two days. The partner choice primarily influences the initial crawl timing rather than the overall indexing process.

In terms of reporting, the partner dashboards provide varying levels of detail. Position Technologies leads with comprehensive keyword and click analytics, but still reports only about 40% of the actual data. Ineedhits.com and Outrider offer basic click reports that capture roughly 70–75% and less than 50% of hits, respectively. VeriSign’s reporting is the weakest, displaying no click data at all. These gaps highlight the need for webmasters to supplement partner dashboards with third‑party analytics.

Partner spiders play a secondary role, performing quick health checks an hour or two after submission. These early visits help identify server issues before the Inktomi spider takes over. The Inktomi spider itself is the sole crawler responsible for indexing, and its schedule is fixed regardless of the partner.

Overall, Inktomi’s PFI program delivers fast inclusion and a predictable crawl cycle. However, the inconsistent reporting quality across partners can hinder accurate performance measurement. Webmasters who rely on paid inclusion must therefore consider additional analytics tools to fully understand traffic patterns and return on investment.

AltaVista infoSpider Process

AltaVista’s paid inclusion system is distinct from Inktomi’s. The program is managed by an independent partner, infoSpider, which oversees the submission and initial crawling of URLs. When a webmaster submits a page, infoSpider first runs a quick URL check to verify that the page is reachable. This check is logged with the user agent infoSpider URL Checker, and the requests typically originate from the IP address 208.185.243.149.

Within four days of submission, the AltaVista crawler, known as Scooter-3.0.3, lands on the site. The user agent string includes the crawler’s hostname, scooter2.sv.av.com, and the source IP is 209.73.162.172. The crawler visits the page, evaluates its relevance, and, if approved, adds it to AltaVista’s paid index. The inclusion typically occurs the day after the crawler’s visit, meaning that a webmaster can expect to see their page in search results within a week of submission.

Unlike Inktomi’s predictable bi‑daily cycle, AltaVista’s crawling schedule is driven by its weekly index run for paid inclusions. The crawler visits submitted pages one to three days before the index run, but the exact timing is not publicly documented, making it difficult to predict when a page will be indexed. The lack of a regular rhythm is a significant difference from the Inktomi model, where the crawler’s schedule is transparent.

AltaVista runs at least one dedicated spider for PFI pages. A secondary crawler, Scooter-3.2, also visits the site. This spider uses the hostname scooter3.sv.av.com and the source IP 209.73.162.143. The purpose of this secondary spider appears to be a backup or to handle edge‑case pages that the primary crawler might miss.
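Because the AltaVista crawlers announce themselves with distinct user agents and source IPs, log entries can be classified mechanically. The sketch below uses the signatures listed above; treat them as historical data, and the label strings as arbitrary choices.

```python
# AltaVista PFI crawler signatures as described in the text (historical)
KNOWN_CRAWLERS = {
    ("infoSpider URL Checker", "208.185.243.149"): "infoSpider URL check",
    ("Scooter-3.0.3",          "209.73.162.172"):  "primary Scooter crawl",
    ("Scooter-3.2",            "209.73.162.143"):  "secondary Scooter crawl",
}

def classify(user_agent, ip):
    """Return a label for a known AltaVista crawler hit, or None for other traffic."""
    for (ua_prefix, known_ip), label in KNOWN_CRAWLERS.items():
        if user_agent.startswith(ua_prefix) and ip == known_ip:
            return label
    return None

print(classify("Scooter-3.0.3", "209.73.162.172"))
print(classify("Mozilla/4.0 (compatible; MSIE 5.5)", "10.0.0.9"))
```

Matching on both user agent and IP guards against ordinary visitors (or other bots) that spoof a crawler’s user agent string.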

Reporting for AltaVista is minimal. The infoSpider service section only allows the webmaster to check the status of submitted URLs. No click‑through or keyword data is provided. This contrasts with the richer dashboards offered by Inktomi partners. Webmasters who rely on AltaVista’s paid inclusion will need to use external analytics solutions to gauge traffic and performance.

In summary, AltaVista’s paid inclusion process is slower and less transparent than Inktomi’s. The initial check occurs shortly after submission, but the final indexing can take up to a week. The crawler’s schedule is driven by a weekly index run, with no predictable rhythm. Reporting is limited to status checks, so webmasters must supplement with third‑party analytics to monitor success. Despite these drawbacks, AltaVista remains a viable option for those targeting its specific user base or seeking additional diversity in their paid search strategy.
