Import Logs into Excel from Iguana

Introduction

The Excel Adapter Modules can be used to import and export data to and from Iguana. The two channels Log 1: Export Logs to DB and Log 2: Import DB Logs to Excel work together to demonstrate how to import data from Iguana into Excel.

The code for this channel was built as part of our own reporting requirements for dashboards analyzing data out of our own IT systems. We make use of pivot tables in Excel to analyze sales, financial and other data.

Most of the time the data comes from applications with web service APIs. Iguana is a convenient tool to extract and transform this data into databases which we can query, and then serve up the data to Excel.

This channel gives a demonstration of the technology to analyze Iguana’s own logs. There are two channels needed:

  • Log 1: Export Logs to DB: This channel queries Iguana via the web service log query API and populates a SQLite database with information on each log message.
  • Log 2: Import DB Logs to Excel: This is a web service channel that makes SQL queries against the database populated by the Log 1: Export Logs to DB channel and serves up an Excel spreadsheet with the data from this database.

For best results I recommend using the Office 365 version of Excel on Windows or OS X, although many customers have had success with the previous couple of versions of Excel. The last two versions on Windows are quite reasonable. Excel 2013 on Mac treats dates a little oddly, so on Mac I would recommend only using the latest version of Excel.

Some hospital networks have very locked down security settings with Microsoft Office. It may be better to try this on a home computer where Excel is installed before trying it within a work context.

If you have any questions please contact us at support@interfaceware.com.

Using the Channels [top]

  • Make sure you have a working copy of Microsoft Excel installed
  • Import the Log 2: Import DB Logs to Excel and Log 1: Export Logs to DB channels from the Builtin: Iguana Excel repository
  • Make sure both channels are running
  • Click on the URL for the Log 2: Import DB Logs to Excel channel, it should be something like http://localhost:6544/loginfo/
  • You will be asked to login, use a valid Iguana username and password
  • After successfully logging in you should see something like:
  • Click on the link to Download a spreadsheet, then open it
  • It’s super important to Enable macros! This is required because the communication between Excel and Iguana is implemented using VBA (Visual Basic for Applications) macros within Excel.
    Different versions of Excel will display slightly different alarming dialogs warning you that the Excel spreadsheet has macros.
    Note: While a malicious programmer can write malware in this language, in this case you can trust the code.
  • Once the spreadsheet is open, select the GetData tab
  • Click on the Get Data From Iguana button
  • Enter the Iguana password for the user shown in field C3 into the dialog box
    Note: Older versions of Excel may open the dialog in an odd place off the screen, so look around if you do not see it.
  • This will fetch up to 4000 rows of data from Iguana and put it into the LogData tab:

    Tip: This spreadsheet is very similar to the one used for the Export from Excel to Iguana example. We simply greyed-out the buttons we are not using, and set the enabled field to FALSE for the associated Excel Table. The reason we did this was to highlight how we used different Excel Tables (Query and Export) for the two channels.

    You can use the spreadsheet with both examples, simply change Enabled to TRUE.

  • Then click on the Refresh Reports button. This will refresh all the Pivot Tables in the spreadsheet
  • Examine the pivot table reports in the MessageByChannelByType and TypeByChannel tabs

Querying Iguana is implemented using VBA code in the spreadsheet, although in most cases it is not necessary to edit this code. The queries are driven by an Excel table called Query which configures what web services will be called and where to put the data.

The Query table is in the GetData tab. These are the columns in the table and their meanings:

  • Enabled: The line will be ignored if this is not set to TRUE. This is useful for temporarily disabling a query without deleting it
  • Status: Gives information on how many rows were last retrieved and when, and displays errors in red if they occur
  • Table: This is used by the VBA code to determine the name of the tab and Excel table to put the data coming from Iguana into. If it cannot find a tab and table of the given name then it will automatically create them
  • URL: This is the address of the Iguana web service that the data is being requested from. Not just any web service will do – it has to serve up data using a specific comma delimited format which I describe further down in this document
  • The six ParamName[1,2,3] and ParamValue[1,2,3] values allow up to three GET variables to be passed through to Iguana when the request for the web service data is made. ParamName gives the name of the GET variable and ParamValue its value

For the Log 2: Import DB Logs to Excel channel web service at https://localhost:6544/loginfo/feed there are two GET variables it recognizes:

  • channel: If present then the log source is restricted to only log messages from the given named channel
  • limit: This limits the number of rows of data that will be served up by the back end web service – the default value is 5000
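The real request logic lives in the spreadsheet's VBA code; as a rough illustration (the function name here is invented for the sketch), the parameter values are simply assembled into an ordinary query string on the base URL:

```python
from urllib.parse import urlencode

def build_feed_url(base, params):
    """Append the ParamName/ParamValue pairs as GET variables; blanks are skipped."""
    query = urlencode({k: v for k, v in params.items() if v not in (None, "")})
    return base + "?" + query if query else base

# Spaces become + signs, which is why a channel name appears as
# channel=Log+Excel+Server in the request log later in this article.
url = build_feed_url("https://localhost:6544/loginfo/feed",
                     {"channel": "Log Excel Server", "limit": 4000})
```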

In the example shown below you can see there are two queries defined. Both go to the same https://localhost:6544/loginfo/feed web service but they populate different tabs and have different GET parameters defined.

The second query uses the channel parameter to restrict the log entries returned to just those from the Log 2: Import DB Logs to Excel channel, and puts the results into the MyNewDataSet Excel table in the tab of the same name. The first one has a limit of 4000 rows, puts the data into the LogData tab and takes all log data. Here’s a screenshot showing this configuration:

If a table does not exist it will be created in a tab of the same name; if the tab does not exist it will also be created. This makes it very easy to add more log imports.

We can see the data in the MyNewDataSet tab, created from the second query defined in the table:

To get a feel for how the data is generated it’s best to look at the translator code of the Log 2: Import DB Logs to Excel channel. A handy way to see it is to leverage the import sample data feature inside the IDE:

  1. Click on the middle of the sample data control to see the Add From HTTP Request Log:
  2. Click on the Add From HTTP Request Log button to see the list of requests:
  3. Select the one with channel=Log+Excel+Server. Then click on the sample data edit button to see the HTTP transaction:
  4. You can see the GET variable channel is defined with the value “Log+Excel+Server” – the channel name with its spaces replaced by + sign characters to conform with URI escaping rules.
  5. The channel uses the web service dispatching framework shown in the Webservices with Permissions channel from the Builtin: Iguana Webservices repository. The benefit of this is that you can deliver different data to different people depending on the roles they have. You can see that the /loginfo/feed path is assigned a function handler which is defined in the getloginfo file imported using require:
  6. We can navigate to that file in the Project Files panel of the Editor:

    Alternatively we can click the serve (function) link in main(), and then the Action (function) link:

  7. This takes you to the actual code which handles the request in Iguana.
  8. Here you can see it is a simple handler that takes in the web request and makes a SQL select query.
  9. It makes use of some helper functions to convert the result into an HTTP response with the data in comma delimited format. This format is designed to be very simple and efficient to parse in Excel:
    HTTP/1.1 200 OK
    Cache-Control: max-age=0
    Content-Type: text/html; charset=utf-8
    Content-Length: 4601
    
    MessageId,Channel,TimeStamp,LogType,EventCode,EventName,PatientName
    20160511-13959,Log 2: Import DB Logs to Excel,Wed May 11 15:00:00 2016,Info,NULL,NULL,NULL
    20160511-14095,Log 2: Import DB Logs to Excel,Wed May 11 15:00:00 2016,Info,NULL,NULL,NULL
    20160511-14192,Log 2: Import DB Logs to Excel,Wed May 11 15:00:00 2016,Info,NULL,NULL,NULL
    20160511-52826,Log 2: Import DB Logs to Excel,Wed May 11 15:00:00 2016,Info,NULL,NULL,NULL
    20160511-52904,Log 2: Import DB Logs to Excel,Wed May 11 15:00:00 2016,Info,NULL,NULL,NULL
    20160511-56529,Log 2: Import DB Logs to Excel,Wed May 11 16:00:00 2016,Info,NULL,NULL,NULL
    20160511-56607,Log 2: Import DB Logs to Excel,Wed May 11 16:00:00 2016,Info,NULL,NULL,NULL
    20160511-56704,Log 2: Import DB Logs to Excel,Wed May 11 16:00:00 2016,Info,NULL,NULL,NULL
    20160511-57015,Log 2: Import DB Logs to Excel,Wed May 11 16:00:00 2016,Info,NULL,NULL,NULL
    20160511-67620,Log 2: Import DB Logs to Excel,Wed May 11 16:00:00 2016,Info,NULL,NULL,NULL

Creating a new handler is easy. These are the steps:

  1. Add a new local file called report with this content:
    local excel = require 'excel.converter'
    
    local function Report(R,A)
       local Table =  {}
       for i=1,10 do
          local Row = {}
          Table[#Table+1] = Row
          Row.item = "Item "..i
          Row.data = i
       end
       trace(Table)
       local Prepped = excel.transpose(Table)
       local Body = excel.flatwire(Prepped)
       net.http.respond{body=Body}
    end
    
    return Report
    
  2. Then insert two lines into the main file to register the new handler:
  3. That registers the new handler; now save a milestone and restart the channel.
  4. Then go to the Excel spreadsheet and edit the Query parameter table to call the new service. The values for the new row are:
    • Enabled: TRUE
    • Table: Can be any value – NewReport will do.
    • URL: This must be the base URL of your Iguana web service followed by /loginfo/report. In the example below I have HTTPS enabled on my Iguana instance, which is listening on port 6544, so the URL is https://localhost:6544/loginfo/report – yours might be http://localhost:6544/loginfo/report etc.
  5. Click the Get Data from Iguana button – you may need to enter your Iguana password.
  6. This should populate the data into whatever tab is named with the value you put into the Table column in step 4, like so:

In this case we have trivial data but from here it should be possible to see how one can:

  1. Add new handlers.
  2. Customize what parameters they take.

This gives you the tools to generate live Excel spreadsheets of data for any kind of problem you need to approach.

How it works [top]

Let’s take a closer look at how the code works. The Log 1: Export Logs to DB channel polls periodically and implements what I call the “web watcher” pattern.

There are lots of web APIs where you can query for all the new items that have changed in a given time: invoices, patients, new appointments etc. A good way to work with such APIs in Iguana is to leverage the persistent store2 module to store the time of the last object queried and use it when we poll. This is exactly what we do in this channel, in this case keeping track of the timestamp of the last log entry we examined and querying for all the log entries since then.
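The channel itself does this in Lua with the store2 module; the sketch below (in Python, with invented names) just shows the shape of the web watcher pattern: remember the timestamp of the last entry seen, and on each poll query only for newer entries.

```python
class Store:
    """Minimal stand-in for Iguana's persistent store2 module."""
    def __init__(self):
        self._data = {}
    def get(self, key, default=None):
        return self._data.get(key, default)
    def put(self, key, value):
        self._data[key] = value

def poll(store, fetch_since):
    """One polling cycle: fetch entries newer than the saved marker, then advance it."""
    last = store.get("last_timestamp", 0)
    entries = fetch_since(last)          # e.g. a call to the log query API
    if entries:
        store.put("last_timestamp", max(e["timestamp"] for e in entries))
    return entries
```

Because the marker only advances when new entries arrive, a poll that returns nothing leaves the state untouched and the next poll asks the same question again.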

For each log entry it populates a row in a SQLite database table called LogInfo. The primary key for the table is MessageId, the unique message ID that identifies each log entry in Iguana. The other basic fields are:

  • Channel: The name of the channel or Iguana Service for service entries.
  • TimeStamp: The timestamp of the message, stored in Unix epoch time – the number of seconds since 1970.
  • LogType: The type of log message – info, error, message etc.

There are also some HL7 specific fields:

  • EventCode: The HL7 event code like ADT, ORM etc.
  • EventName: The HL7 event name like A01, A03, etc.
  • PatientName: The name of the patient as found in the PID segment.

When a log entry is encountered that is a message, the channel attempts to parse the message and extract the HL7 data fields. If the entry is not an HL7 message these fields will be NULL.
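Pulling the fields above together, the LogInfo table amounts to a schema roughly like this (sketched here with Python's sqlite3; the exact column types the channel's Lua code creates may differ):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # the channel writes to a file-based SQLite database
conn.execute("""
    CREATE TABLE LogInfo(
        MessageId   TEXT PRIMARY KEY,  -- unique Iguana log message ID
        Channel     TEXT,              -- channel name, or Iguana Service for service entries
        TimeStamp   INTEGER,           -- Unix epoch time, seconds since 1970
        LogType     TEXT,              -- info, error, message, etc.
        EventCode   TEXT,              -- HL7 fields; NULL when the entry is not an HL7 message
        EventName   TEXT,
        PatientName TEXT
    )
""")
conn.execute(
    "INSERT INTO LogInfo VALUES (?,?,?,?,?,?,?)",
    ("20160511-14095", "Log 2: Import DB Logs to Excel", 1462993200, "Info", None, None, None))
```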

Of course this is all very arbitrary – this code could easily be extended into a full clinical repository by putting in the effort to map out more data into a set of tables.

You can browse the two main files of this channel online on GitHub:

Moving on to the Log 2: Import DB Logs to Excel channel: it has two Excel spreadsheets incorporated into it:

  • IguanaFeed.xlsm: This is the template for the spreadsheet that is served up when requested.
  • other/excel/core.xlsm: This contains the master copy of VBA code that we use to power the Excel side of the data exchange between Iguana and Excel.

The VBA code required to make this system work is quite substantial, so separating it from the template Excel files makes it easier to maintain. One of the challenges with VBA code is that Microsoft Office doesn’t give you a modern environment for maintaining change history on that code, so one has to be very careful. By maintaining the VBA code separately in core.xlsm it is easier to change the IguanaFeed.xlsm template file without fear of modifying that core VBA code. It also makes it easier to upgrade multiple channels using this Excel adapter to the latest version of the VBA code.

So the code in excel.sheet implements the following:

  1. It reads in IguanaFeed.xlsm.
  2. It unzips it into memory – everything is stored in a Lua table.
  3. It modifies the URLs of the web services referenced in the spreadsheet to match the base URL the user used to reach the web service. So if you access the web page using http://192.168.0.12/loginfo/ then all the URLs in the configuration will be rewritten to start that way. This is useful if your Iguana is on a host with a proper domain name like myiguana.mycompany.com – the URLs are automatically remapped.
  4. It modifies the name of the user that is stored in the spreadsheet on the fly to be the name of the logged in user. So if you log in as “fred.bloggs” and request a spreadsheet then that name will appear in the username cell.
  5. It swaps out the entire VBA code of each spreadsheet with the code we have in other/excel/core.xlsm.
  6. It re-zips up the new Excel file we have generated on the fly.
  7. And dispatches it back as an HTTP response to the user.
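The real implementation is the Lua excel.sheet module; as a hedged sketch of the unzip/modify/re-zip idea in steps 2–6 (using Python's zipfile, with an invented function name), since an .xlsm file is just a zip archive of XML parts:

```python
import io
import zipfile

def rewrite_spreadsheet(template_bytes, replacements):
    """Unzip an .xlsm held in memory, apply byte-level replacements to every
    part (e.g. swapping base URLs or the username), and re-zip the result."""
    out = io.BytesIO()
    with zipfile.ZipFile(io.BytesIO(template_bytes)) as src, \
         zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as dst:
        for item in src.infolist():
            data = src.read(item.filename)
            for old, new in replacements.items():
                data = data.replace(old, new)
            dst.writestr(item.filename, data)
    return out.getvalue()
```

The VBA swap in step 5 works the same way: the vbaProject part from core.xlsm replaces the one in the template before the archive is re-zipped.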

Another part of the implementation is the comma delimited format used between Iguana and Excel. Writing parsing code in VBA is not that convenient so the design of the comma delimited format is intended to be easy to parse. There are no quotations around strings. We use the following escaping protocol:

  • Comma , –> @C
  • Newline \n –> @N
  • At sign @ –> @A

The beauty of this is that parsing the comma delimited data on the Excel side is simple since one can just split on newlines and commas without worrying if the comma is in the middle of the actual data.
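A sketch of that escaping in Python (the real encoder is in the channel's Lua code and the decoder in VBA; the function names are invented here). The order matters: @ must be escaped first and unescaped last, or the substitutions collide:

```python
def escape(value):
    """Escape one field for the flat comma-delimited wire format."""
    return (value.replace("@", "@A")   # must come first
                 .replace(",", "@C")
                 .replace("\n", "@N"))

def unescape(field):
    """Reverse the escaping; @A must be handled last."""
    return (field.replace("@N", "\n")
                 .replace("@C", ",")
                 .replace("@A", "@"))

def parse_feed(body):
    """Split on newlines and commas, then unescape each cell."""
    return [[unescape(cell) for cell in line.split(",")]
            for line in body.splitlines() if line]
```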

For reading the VBA code the best environment is the Windows version of Excel since the VBA environment is better. The code does however work on both Mac OS X and Windows.

To achieve that we had to:

  1. Avoid using formats like JSON. There is a JSON parser available for Visual Basic but it only works on Windows since it makes specific WIN32 API calls. The comma delimited format is simpler to parse and also has less overhead than putting the equivalent data in JSON.
  2. On Windows the code uses a Microsoft HTTP library called WinHttp.WinHttpRequest.5.1 for the HTTP(S) calls. On Mac OS X the code uses popen to invoke the curl command line tool, which is part of the Mac OS X environment.
  3. Avoid popular extra components commonly used for Excel scripting, like the Microsoft Scripting library, which only work on Windows.

The code is written efficiently with respect to memory management in Visual Basic to try and maximize speed – it can load a few thousand rows in a fraction of a second.

The code is well commented. For investigation at that level I would recommend using the Windows version of Excel, since its Visual Basic debugger environment is much more advanced than the Mac OS X version of Excel.

More information [top]
