Monitoring an Iguana instance and storing the data in a CSV file

One of our customers came to us seeking a method of monitoring the performance of an Iguana instance over a period of time. Thankfully, the foundations needed to accomplish this task were already in place. All we needed to do was create a fairly straightforward script to tie it all together.

We present such a solution below. The script should act as the main module for a From Translator component. The polling interval for the component is up to you, but we suggest a conservative setting such as 5 minutes (300,000 milliseconds). Keep in mind that every time the script is triggered, a new row is added to the CSV log, so the shorter the polling interval, the more rapidly the file will grow.

This script makes use of the monitor module that Eliot provides here. Before using the code below, you must copy and paste the source for that module into a shared module named “monitor” and add it to your project.

IMPORTANT: The main module provided here makes use of a few functions that were added in Iguana 5.5+. Specifically, the openLogFile() function uses a few functions in the os.fs table to perform file operations that are not available in vanilla Lua. If these functions are not available in your version of Iguana, we recommend that you either replace those calls with the appropriate shell commands passed to os.execute() or io.popen(), or remove them. You can create the “monitor_logs” directory manually if desired, but there should always be some logic in place to check whether the CSV file exists before you write to it.
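As a rough illustration, the os.fs calls could be replaced with plain Lua like the following. This is only a sketch: the ensureLogDirectory() and fileExists() helpers are hypothetical names introduced here, and the mkdir command assumes a Unix-like host (use the equivalent command on Windows).

-- Hypothetical replacement for os.fs.mkdir(); "mkdir -p" quietly succeeds
-- if the directory already exists.
local function ensureLogDirectory(Dir)
   os.execute('mkdir -p "' .. Dir .. '"')
end

-- Hypothetical replacement for os.fs.access(): try to open the file for
-- reading to see whether it already exists.
local function fileExists(Path)
   local Handle = io.open(Path, "r")
   if Handle then
      Handle:close()
      return true
   end
   return false
end

With helpers like these in place, the os.fs calls in openLogFile() below could be swapped out directly.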

--[[
This program continuously monitors an Iguana instance for performance
statistics and writes out the data values collected to a CSV log.
]]

monitor = require 'monitor'

-- The logs generated by the program will be stored in this directory.
MONITOR_LOGS_DIR = "monitor_logs"

-- Controls whether or not the program will perform file operations while
-- being edited in the Translator.
AUTO_CREATE_FILES = not iguana.isTest()

-- No-op stub; calling trace() with a value lets that value show up in the
-- Translator's annotations while the script is being edited.
function trace(Arg) end

function main()
   local Success, Result = pcall(generateLogEntry)

   -- Attempt to send an email to a specified address if the program failed to
   -- generate a report.
   if not Success then
      local EmailMessage = 'The Iguana channel "' .. iguana.channelName() ..
         '" failed to generate a monitor log due to the following error:n' ..
         Result

      -- Replace the "server", "to", and "from" fields in the following function
      -- call with your own information. You may also need to supply the "username",
      -- "password", and possibly "use_ssl" fields as well depending on the amount
      -- of authentication required by your email server.
      local EmailSuccess = pcall(net.smtp.send, {server="mail.domain",
         to={"name@domain"},
         from="name@domain",
         header={Subject="Iguana Channel Failure"},
         body=EmailMessage,
      })

      -- If we failed to send an email for whatever reason then we can at least
      -- log a warning message for the channel.
      if not EmailSuccess then
         iguana.logWarning("Channel failed to generate a monitor log due to " ..
            "the following error:n" .. Result)
      end
   end
end

function generateLogEntry()
   -- You will need to modify the arguments given to monitor.status() if these
   -- values differ for the Iguana instance being monitored.
   local Report = monitor.status{user='admin',
      password='password',
      url='http://localhost:6543/'}

   -- Open a file handle for the CSV log which can be written to.
   local FileHandle, FileExisted = openLogFile()

   -- If we're writing to a new log file then we should add a header row.
   local Data = ""
   if not FileExisted then
      local Header = generateRow(Report, node.nodeName)
      Data = Data .. Header .. "\n"
   end

   local Row = generateRow(Report, node.nodeValue)
   Data = Data .. Row .. "\n"

   if FileHandle ~= nil then
      FileHandle:write(Data)
      FileHandle:close()
   end

   -- The data is returned so that it can be viewed in annotations.
   return Data
end

function openLogFile()
   -- Create a directory to store the logs in if one doesn't exist already.
   if not os.fs.access(MONITOR_LOGS_DIR) and AUTO_CREATE_FILES then
      os.fs.mkdir(MONITOR_LOGS_DIR)
   end

   -- You can change how often the program will create a new log file by
   -- modifying the format string given to os.date(). For example, to create a
   -- new file daily instead of weekly, change the "%W" in the format string to
   -- "%w".
   local LogFilepath = MONITOR_LOGS_DIR .. os.date("/Week%W-%Y.csv")

   -- If the log file exists already then we will append to it, otherwise a new
   -- one will be made.
   local Mode, FileExisted
   if os.fs.access(LogFilepath) then
      Mode = "a"
      FileExisted = true
   else
      Mode = "w"
      FileExisted = false
   end
   trace(Mode)

   -- Don't open the file unless the appropriate flag has been set. We perform
   -- this check to prevent modifying the file system while editing until you're
   -- ready to test the script.
   local FileHandle, ErrorMsg
   if AUTO_CREATE_FILES then
      FileHandle, ErrorMsg = io.open(LogFilepath, Mode)
      assert(FileHandle ~= nil, ErrorMsg)
   end
   return FileHandle, FileExisted
end

-- NodeFunc should be a function that can be called on a node in the report to
-- retrieve some value from it. This will vary with the type of row being
-- generated.
function generateRow(Report, NodeFunc)
   local Status = Report.IguanaStatus
   local Row = {}

   for i = 1, #Status do
      local Node = Status[i]
      if filterNode(Node) then
         -- Add a comma to the row if this isn't the first entry.
         if #Row > 0 then
            table.insert(Row, ",")
         end
         table.insert(Row, escapeCSVvalue(NodeFunc(Node)))
      end
   end

   return table.concat(Row)
end

-- To ignore specific attributes in the report, add the attribute's name to
-- this table and assign it a value of true.
IGNORE_LIST = {
   LogDirectory = true,
   LicenseKey = true
}

-- This function filters a node to determine if its values should be inserted
-- into the CSV log. We filter out non-attribute nodes to prevent adding the
-- Channel tags to the log. We also use the list defined above to filter out
-- specific attributes that aren't relevant to monitoring Iguana's performance.
function filterNode(Node)
   return Node:nodeType() == "attribute" and not IGNORE_LIST[Node:nodeName()]
end

-- Embedded commas in CSV values are dealt with by surrounding the value with
-- a pair of double-quotes. As a result, we must also escape any double-quotes
-- in the value by doubling them up. For the most part the values found in the
-- report should be comma-free, but we do this escaping anyway in case
-- something like a file path has an embedded comma.
function escapeCSVvalue(Value)
   return '"' .. Value:gsub('"', '""') .. '"'
end

Additional Notes

Monitoring other Iguana instances

By default, the script monitors the host Iguana instance; however, this is not a necessity. Changing the parameters passed to the monitor.status function in generateLogEntry() will allow you to monitor other Iguana instances instead (so long as you enter the correct URL and authentication information). In fact, the script could be tweaked to monitor a whole farm of Iguanas should the need arise.
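As a rough sketch of that multi-instance idea (reusing the same monitor.status() call as generateLogEntry(), with a hypothetical INSTANCES table that you would fill in with your own servers and credentials):

-- Hypothetical list of Iguana instances to poll; adjust to your environment.
local INSTANCES = {
   {name="local",  user="admin", password="password", url="http://localhost:6543/"},
   {name="remote", user="admin", password="password", url="http://otherhost:6543/"}
}

-- Collect a status report from each instance in turn. Each report could then
-- be written to its own CSV file, e.g. "monitor_logs/remote-Week%W-%Y.csv".
local Reports = {}
for i = 1, #INSTANCES do
   local Instance = INSTANCES[i]
   Reports[Instance.name] = monitor.status{user=Instance.user,
      password=Instance.password,
      url=Instance.url}
end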

Controlling how often the log file is generated

The file name for the CSV log is specified in openLogFile(). You can control how often a new log file is created by tweaking the format string given to os.date(). For example, changing the “%W” to “%w” in the format string would cause a new log file to be created every day instead of every week. You will likely want to play around with this depending on the polling time that you set for the From Translator component.

Furthermore, the script could be modified to write out log entries to multiple CSV files rather than a single one. For example, you could write out each log entry to a daily CSV file as well as a monthly one. This would provide you with different views of Iguana’s performance over time.
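For instance, the two file paths might be built like this (the exact os.date() patterns shown are just one possible choice); openLogFile() would then need to be adapted to take a path argument and be called once per file:

-- A daily log and a monthly log for the same data.
local DailyPath   = MONITOR_LOGS_DIR .. os.date("/Day-%Y-%m-%d.csv")
local MonthlyPath = MONITOR_LOGS_DIR .. os.date("/Month-%Y-%m.csv")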

Excluding data in the XML report

An important thing to note about the script is that the XML data structure returned by the monitor module contains a lot of information, most of which is written to the CSV log by default. We opted to leave most of this data intact since the relevant information will vary per application. Instead, we have provided the framework for a method to exclude information from the CSV logs for your specific needs. You can add attribute names found in the XML report to the IGNORE_LIST table to prevent the corresponding data values from being added to the CSV log.

This approach works well if you want to store most of the data found in the XML report. If you only want to store a small number of data values in the CSV log, a better approach may be an inclusive one. Specifically, the script could be modified to use a data structure like the IGNORE_LIST to determine which data values in the XML report should be saved, and then ignore any values not found in said data structure.
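A minimal sketch of that inclusive variant, replacing filterNode() with a hypothetical INCLUDE_LIST (the attribute names shown are placeholders; use names that actually appear in your report):

-- Only attributes named in this table are written to the CSV log.
INCLUDE_LIST = {
   TotalErrors = true,
   MemoryUsage = true
}

function filterNode(Node)
   return Node:nodeType() == "attribute" and INCLUDE_LIST[Node:nodeName()] == true
end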

Warning! Modifying the IGNORE_LIST table in the middle of the week could cause the columns in your CSV log to become misaligned, since the header row for each CSV log is added when the file is first created. As such, if there is already a CSV log for the current week present on the host machine when you change this table, we recommend that you rename the old log file first before restarting the channel. This way, a new log file will be created with the correct header row.
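If you prefer to set the old file aside from a script rather than by hand, something like the following (plain Lua, no extra modules) would ensure the next poll starts a fresh log with an up-to-date header row:

-- Sketch: move the current week's log out of the way before restarting
-- the channel.
local LogFilepath = MONITOR_LOGS_DIR .. os.date("/Week%W-%Y.csv")
os.rename(LogFilepath, LogFilepath .. ".old")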

Error handling and notification

The script includes a nice feature that attempts to send out an email to a specified email address if it can’t add a log entry to the current CSV file. We recommend that you modify the arguments given to net.smtp.send in the main() function to include your own information. We also recommend that you test out this functionality before starting the channel. Regardless, even if it can’t send out an email, it will at least log a warning message for the channel when a problem occurs. It’s a good idea to check the state of the channel now and then to make sure it’s still generating log entries.
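One way to test the notification from the editor is to call net.smtp.send directly with your own settings. A rough sketch, assuming your Iguana version supports the live flag for sending mail while editing:

-- Replace the server and address placeholders with your own details before
-- running this in the Translator. The live flag (where supported) forces the
-- message to actually be sent from the editor.
if iguana.isTest() then
   net.smtp.send{server="mail.domain",
      to={"name@domain"},
      from="name@domain",
      header={Subject="Iguana monitor log test"},
      body="Test message from the monitoring channel.",
      live=true}
end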
