
Reading Your IIS Log Files with ColdFusion


Parsing a Single IIS Log File with ColdFusion

When an IIS web server records traffic, it writes each request into a plain‑text file. Those files are great for troubleshooting, but they are not immediately usable for analytics or reporting. ColdFusion offers a straightforward way to turn each line of an IIS log into a set of variables you can inspect, display, or store in a database. Below you’ll find a step‑by‑step walkthrough that starts from the beginning of a log file and ends with a ready‑to‑use data set.

First, decide where your log files live. By default IIS writes them to a folder like C:\inetpub\logs\LogFiles, but you can choose any directory you prefer. In the code snippets that follow we’ll use a placeholder variable file2load to hold the full path to the log file. ColdFusion’s <cfparam> tag lets us give this variable a sensible default value while still allowing a developer or administrator to override it if necessary.

Next, load the entire file into a single string variable with <cffile>. This tag is efficient for moderate‑sized log files, and it makes it easy to break the file into individual lines later. The READ action reads the file contents, and the variable attribute tells ColdFusion where to store the data. After the file is loaded, you’ll be ready to start looping through the lines.

In a log file, lines beginning with a hash symbol (#) are comments generated by IIS to describe the column layout. Those lines aren’t real requests, so they must be ignored. The loop that follows reads each line and checks its first character. If the character is a hash, the loop skips that line and continues on to the next one. If not, the line is a real request and we move on to parsing it.

The real parsing logic is a two‑step process. First, split the line on spaces to get an array of values. Each position in the array corresponds to a specific field: the date, time, client IP, username, server IP, server port, HTTP method, URI stem, query string, status code, user agent, and referrer. In ColdFusion you can use a nested <cfloop> to iterate through the array, counting the index so you know which field you’re at. A simple Value counter does the trick: increment it with each iteration, then use a series of <cfif> tests to assign each piece of data to a variable with a descriptive name.

Once the line is parsed, you can display the values, store them, or perform any other operation you need. In the example below, the parsed data is echoed to the browser inside a styled block for quick visual confirmation. This makes it easy to spot parsing errors right away - if a field looks wrong, you know that the split logic or the indexing might need adjustment.

<!--- Define defaults --->
<cfparam name="file2load" default="C:\inetpub\logs\LogFiles\W3SVC1\u_ex20060101.log">
<cfset entry_ws3_number = "ws3srv">
<cfset entry_URL = "www.yoursite.com">

<!--- Load the file --->
<cffile action="READ"
        file="#file2load#"
        variable="log_data">

<cfoutput>
<hr>
<!--- Break the file into lines --->
<cfloop index="rc" list="#log_data#" delimiters="#chr(13)##chr(10)#">

    <!--- Skip comment lines; ## is a literal pound sign inside cfoutput --->
    <cfif left(rc, 1) eq "##">
        <!--- Comment line: do nothing --->
    <cfelse>
        <!--- Parse the line: split on spaces and assign each field --->
        <cfset Value = 0>
        <cfloop index="id" list="#rc#" delimiters="#chr(32)#">
            <cfset FirstValue = id>
            <cfset Value = Value + 1>
            <cfif Value eq 1><cfset date = FirstValue>
            <cfelseif Value eq 2><cfset time = FirstValue>
            <cfelseif Value eq 3><cfset c_ip = FirstValue>
            <cfelseif Value eq 4><cfset cs_username = FirstValue>
            <cfelseif Value eq 5><cfset s_ip = FirstValue>
            <cfelseif Value eq 6><cfset s_port = FirstValue>
            <cfelseif Value eq 7><cfset cs_method = FirstValue>
            <cfelseif Value eq 8><cfset cs_uri_stem = FirstValue>
            <cfelseif Value eq 9><cfset cs_uri_query = FirstValue>
            <cfelseif Value eq 10><cfset sc_status = FirstValue>
            <cfelseif Value eq 11><cfset User_Agent = FirstValue>
            <cfelseif Value eq 12><cfset Referer = FirstValue>
            </cfif>
        </cfloop>

        <!--- Display values for quick inspection --->
        Date: #date#<br>
        Time: #time#<br>
        Cust_IP: #c_ip#<br>
        Username: #cs_username#<br>
        Site_IP: #s_ip#<br>
        Site_Port: #s_port#<br>
        Method: #cs_method#<br>
        Site_path: #cs_uri_stem#<br>
        Query: #cs_uri_query#<br>
        Status: #sc_status#<br>
        Browser: #User_Agent#<br>
        Referer: #Referer#<hr>
    </cfif>
</cfloop>
</cfoutput>

That snippet does the heavy lifting. It shows the full life cycle of a log file: from disk, through parsing, to human‑readable output. In a production scenario you would replace the <cfoutput> block with logic that writes each record to a database or pushes it to a reporting tool. The rest of this guide shows how to capture the parsed data and keep it organized for later analysis.

Storing Parsed Data in a Relational Database

Once you have the individual fields extracted, the next step is to persist them. Storing IIS log data in a database turns raw text into actionable metrics: you can run SQL queries, generate charts, or feed the data into a business‑intelligence platform. The database schema you create should mirror the columns you’re interested in tracking. In the example below the table iis_logs contains columns for date, time, IP addresses, HTTP method, URI, status code, user agent, referrer, and two custom columns that tie the log back to your application environment.

ColdFusion’s <cfquery> tag lets you run INSERT statements against your data source. For the date and time fields, the CreateODBCDate() and CreateODBCTime() functions convert the string values produced by IIS into the correct format for most relational databases, such as MySQL or SQL Server. Using these helpers removes the need to write custom parsing logic for the date and time, and it keeps the code portable across database engines.

Below is a complete <cfquery> block that inserts a single log record. Notice how the variables we parsed earlier map directly to the column names in the INSERT statement. In a real application you would wrap this INSERT inside the loop that parses each line, so each request results in a new row in the table.

<cfquery name="qInsertEntry" datasource="iis_logs">
    INSERT INTO iis_logs (
        entry_date,
        entry_time,
        entry_Cust_IP,
        entry_Username,
        entry_Site_IP,
        entry_Site_Port,
        entry_Method,
        entry_Site_path,
        entry_Query_Strings,
        entry_Status,
        entry_Browser,
        entry_Referer,
        entry_ws3_number,
        entry_URL
    )
    VALUES (
        #CreateODBCDate(date)#,
        #CreateODBCTime(time)#,
        '#c_ip#',
        '#cs_username#',
        '#s_ip#',
        '#s_port#',
        '#cs_method#',
        '#cs_uri_stem#',
        '#cs_uri_query#',
        '#sc_status#',
        '#User_Agent#',
        '#Referer#',
        '#entry_ws3_number#',
        '#entry_URL#'
    )
</cfquery>

There are a few practical considerations to keep in mind when writing logs to a database:

  • Indexing. The most common query patterns for log data involve filtering by date range, status code, or IP address. Add indexes on entry_date, entry_Site_IP, and entry_Status to accelerate those lookups.
  • Batch inserts. For large log files, inserting each row individually can be slow. ColdFusion’s <cfqueryparam> and bulk insert techniques can improve performance. If you’re using SQL Server, the BULK INSERT command or the OPENROWSET feature can load thousands of rows in a single round‑trip.
  • Data retention. Log tables can grow quickly. Implement a retention policy that archives older data to a separate table or compresses it into a file, and automate that process with a Windows scheduled task.
  • Data integrity. Validate the incoming data before insertion. For example, check that sc_status is a numeric HTTP status code or that the IP addresses are in valid dotted‑decimal format.
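For example, the INSERT shown earlier can be hardened with <cfqueryparam>, which validates each value's type and lets the database cache the execution plan. This is a minimal sketch with a shortened column list, and it assumes entry_Status is an integer column; use cf_sql_varchar instead if you kept it as text:

```cfml
<cfquery name="qInsertEntry" datasource="iis_logs">
    INSERT INTO iis_logs (entry_date, entry_time, entry_Cust_IP, entry_Status)
    VALUES (
        <cfqueryparam cfsqltype="cf_sql_date"    value="#date#">,
        <cfqueryparam cfsqltype="cf_sql_time"    value="#time#">,
        <cfqueryparam cfsqltype="cf_sql_varchar" value="#c_ip#">,
        <cfqueryparam cfsqltype="cf_sql_integer" value="#sc_status#">
    )
</cfquery>
```

A non-numeric sc_status is now rejected at the cfqueryparam level instead of silently landing in the table, and the parameterized statement also protects against SQL injection if a query string or referrer contains unexpected characters.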

Once your data sits in a relational table, you can start querying it. Simple reports like “daily page views” or “traffic by status code” become a matter of writing straightforward SQL. More advanced analyses - such as detecting bots or spotting unusual traffic spikes - can be built on top of the same data source.
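For instance, a “daily page views” report against the iis_logs table above is a single aggregate query (a sketch; entry_Status is compared as text here because the INSERT stored it that way):

```cfml
<cfquery name="qDailyViews" datasource="iis_logs">
    SELECT entry_date, COUNT(*) AS page_views
    FROM iis_logs
    WHERE entry_Status = '200'  -- successful requests only
    GROUP BY entry_date
    ORDER BY entry_date DESC
</cfquery>
```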

Extending the Solution to Process Multiple Log Files

In a production environment, IIS typically generates one log file per day, and each file can be several megabytes. To automate the ingestion of all log files in a directory, you’ll want a loop that walks through the folder, reads each file, and runs the same parsing logic. ColdFusion’s <cfdirectory> tag makes it simple to list files that match a pattern, such as all files ending with .log.

The workflow is essentially the same as for a single file, but you add a few layers of iteration:

1. Use <cfdirectory action="list"> to capture the filenames.
2. Loop over the resulting query object, setting file2load to the full path of each file.
3. Read the file into a string variable.
4. Parse each line and insert the resulting record into the database.

The code below shows the entire process. To reduce duplication further, you could move the parsing logic into a custom function or a CFC and call it from the main loop, keeping the script cleaner.

<cfparam name="LogsDir" default="C:\inetpub\logs\LogFiles\W3SVC1">
<cfset entry_ws3_number = "ws3srv">
<cfset entry_URL = "www.yoursite.com">

<!--- List all log files in the directory --->
<cfdirectory action="list"
             directory="#LogsDir#"
             name="LogFiles"
             filter="*.log">

<!--- Loop through each file --->
<cfloop query="LogFiles">

    <!--- Build the full path to the current file --->
    <cfset file2load = LogsDir & "\" & Name>

    <!--- Load the file into a string variable --->
    <cffile action="READ"
            file="#file2load#"
            variable="log_data">

    <!--- Parse and insert each line --->
    <cfoutput>
    <cfloop index="rc" list="#log_data#" delimiters="#chr(13)##chr(10)#">

        <!--- Skip comment lines; ## is a literal pound sign inside cfoutput --->
        <cfif left(rc, 1) eq "##">
            <!--- Comment line: do nothing --->
        <cfelse>
            <cfset Value = 0>
            <cfloop index="id" list="#rc#" delimiters="#chr(32)#">
                <cfset FirstValue = id>
                <cfset Value = Value + 1>
                <cfif Value eq 1><cfset date = FirstValue>
                <cfelseif Value eq 2><cfset time = FirstValue>
                <cfelseif Value eq 3><cfset c_ip = FirstValue>
                <cfelseif Value eq 4><cfset cs_username = FirstValue>
                <cfelseif Value eq 5><cfset s_ip = FirstValue>
                <cfelseif Value eq 6><cfset s_port = FirstValue>
                <cfelseif Value eq 7><cfset cs_method = FirstValue>
                <cfelseif Value eq 8><cfset cs_uri_stem = FirstValue>
                <cfelseif Value eq 9><cfset cs_uri_query = FirstValue>
                <cfelseif Value eq 10><cfset sc_status = FirstValue>
                <cfelseif Value eq 11><cfset User_Agent = FirstValue>
                <cfelseif Value eq 12><cfset Referer = FirstValue>
                </cfif>
            </cfloop>

            <!--- Insert the parsed record --->
            <cfquery name="qInsertEntry" datasource="iis_logs">
                INSERT INTO iis_logs (
                    entry_date, entry_time, entry_Cust_IP, entry_Username,
                    entry_Site_IP, entry_Site_Port, entry_Method, entry_Site_path,
                    entry_Query_Strings, entry_Status, entry_Browser, entry_Referer,
                    entry_ws3_number, entry_URL
                )
                VALUES (
                    #CreateODBCDate(date)#,
                    #CreateODBCTime(time)#,
                    '#c_ip#', '#cs_username#', '#s_ip#', '#s_port#',
                    '#cs_method#', '#cs_uri_stem#', '#cs_uri_query#', '#sc_status#',
                    '#User_Agent#', '#Referer#', '#entry_ws3_number#', '#entry_URL#'
                )
            </cfquery>
        </cfif>
    </cfloop>
    </cfoutput>
</cfloop>

Running this script once a day - via a Windows scheduled task - keeps your iis_logs table up to date without manual intervention. Note that the script as written reprocesses every file in the directory on each run, which would insert duplicate rows; before re-running it, either move processed files to an archive folder or record each processed filename in a bookkeeping table and skip files you have already loaded.

For larger deployments where logs grow fast, you might consider a streaming approach: instead of reading the entire file into memory with <cffile action="READ">, read it line by line. ColdFusion 8 and later provide the FileOpen(), FileReadLine(), and FileIsEOF() functions for exactly this, which reduces memory overhead and lets the script handle gigabyte‑sized logs without exhausting memory.
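A minimal sketch of that approach, assuming ColdFusion 8 or later; processLine() is a hypothetical helper standing in for the parse-and-insert logic shown earlier:

```cfml
<cfscript>
    // Open the log file for reading; only one line is held in memory at a time
    logFile = FileOpen(file2load, "read");
    while (NOT FileIsEOF(logFile)) {
        line = FileReadLine(logFile);
        // Skip IIS comment lines and blank lines
        if (len(trim(line)) AND left(line, 1) NEQ chr(35)) {
            processLine(line);  // hypothetical helper wrapping the parse/insert logic
        }
    }
    FileClose(logFile);
</cfscript>
```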

Keeping Your Logs Fresh: Automation and Maintenance Tips

Once you have a working ingestion pipeline, the real power comes from automation. Scheduling the ColdFusion script to run after each IIS log rollover ensures that data is always current. Use Windows Task Scheduler to trigger the script at a convenient time - often late at night when traffic is low - to avoid any performance impact on the web server.

In addition to scheduling, it helps to add basic logging to your ColdFusion job itself. Capture the start time, the number of files processed, and any errors that occur. Store that information in a separate table or write it to a plain text file; the logs of your log processor become valuable troubleshooting data if something goes wrong.

Remember that your database will accumulate data over time. Implement a retention policy that moves older records to an archive table or compresses them into a CSV archive. A simple DELETE statement that runs once a month can keep the main table at a manageable size, while the archived data remains accessible for compliance or forensic purposes.
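As a sketch, the monthly cleanup might look like this, assuming a 90‑day retention window; the cutoff date is computed in CFML so the query stays portable across database engines:

```cfml
<!--- Purge log rows older than 90 days (adjust the window to your policy) --->
<cfquery name="qPurgeOldLogs" datasource="iis_logs">
    DELETE FROM iis_logs
    WHERE entry_date < <cfqueryparam cfsqltype="cf_sql_date"
                                     value="#DateAdd('d', -90, Now())#">
</cfquery>
```

Run it from the same scheduled task that loads the logs, or copy the matching rows to an archive table before deleting them if you need long‑term access.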

Finally, consider building dashboards that pull directly from the iis_logs table. Tools like Power BI, Tableau, or even custom CFML pages can visualize key metrics: daily visitors, bounce rates, popular pages, and more. With the data already in a relational format, creating these insights becomes a matter of writing a few SQL queries and turning them into interactive charts.

By following the steps outlined above, you transform raw IIS logs into a powerful analytics resource. The ColdFusion scripts handle the heavy lifting of parsing and inserting data, while the database keeps everything structured and searchable. Automating the process ensures that you always have up‑to‑date information at your fingertips, ready to drive decisions and improve your web presence.
