Working with Databases


If you are running a high-volume data center – something like Pharmacy One – one topic you might be interested in is what I call pipelining: batching data from several messages together into a single database transaction.

Iguana itself is extremely fast – see the introductory video – so the database becomes the biggest bottleneck.

Now, if you think about the underlying mechanics of database transactions, the bottleneck is always the time needed to physically sync data down to disk storage. So, with databases, if you insert, say, 100 rows within one transaction, you get dramatically better performance than if you run 100 individual transactions. Relating this to HL7: if the nature of your data set is that you are always inserting data, this is an optimization worth making.
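To make the difference concrete, here is a minimal Lua sketch of the two approaches. It assumes an Iguana database connection `conn` created with `db.connect{}`, that your database accepts explicit BEGIN/COMMIT statements through `conn:execute{}`, and a hypothetical `patient` table – all names here are illustrative only:

```lua
-- Slow: one transaction per row -- the engine must sync to disk
-- once for every single INSERT.
local function insertOneByOne(conn, rows)
   for i = 1, #rows do
      -- Note: real code should escape or parameterize values rather
      -- than concatenating them into the SQL string.
      conn:execute{sql="INSERT INTO patient(id, name) VALUES ("..
         rows[i].id..", '"..rows[i].name.."')", live=true}
   end
end

-- Fast: all the rows inside a single transaction -- one disk sync
-- covers the whole batch.
local function insertBatched(conn, rows)
   conn:execute{sql='BEGIN', live=true}
   for i = 1, #rows do
      conn:execute{sql="INSERT INTO patient(id, name) VALUES ("..
         rows[i].id..", '"..rows[i].name.."')", live=true}
   end
   conn:execute{sql='COMMIT', live=true}
end
```

The work done per row is identical; only the number of commits – and therefore disk syncs – changes.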

This technique is worth considering if you are processing large batches of HL7 transactions. Say, for instance, you have files which each contain hundreds of HL7 messages. Then an algorithm you might want to consider is this:

  1. Use a From Translator component and the Lua file handling APIs to break the incoming files into chunks of, say, 100 messages and enqueue these blocks.
  2. In a To Translator component, break each chunk of 100 back into individual messages.
  3. Parse each message and map its data into a single set of tables as obtained from db.tables{}.
  4. Then make one call to conn:merge{}, which commits all the data mapped from the 100 messages in a single database transaction.
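The steps above can be sketched in Lua roughly as follows. This assumes Iguana's channel APIs (queue.push, hl7.parse, db.tables, conn:merge); the vmd name, file name, separator convention, and the mapPatient helper are hypothetical placeholders you would replace with your own:

```lua
local CHUNK = 100

-- Step 1 (From Translator): split an incoming batch file into chunks
-- of 100 messages and push each chunk onto the queue as one unit.
function main()
   local f = assert(io.open('batch.hl7', 'r'))  -- hypothetical file name
   local Content = f:read('*a')
   f:close()
   local Chunk = {}
   -- Assumes messages in the file are separated by carriage returns.
   for Msg in Content:gmatch('[^\r]+') do
      Chunk[#Chunk + 1] = Msg
      if #Chunk == CHUNK then
         queue.push{data=table.concat(Chunk, '\r')}
         Chunk = {}
      end
   end
   if #Chunk > 0 then
      queue.push{data=table.concat(Chunk, '\r')}  -- final partial chunk
   end
end

-- Steps 2-4 (To Translator): break the chunk back into messages, map
-- them all into one set of tables, then merge in a single transaction.
function main(Data)
   local T = db.tables{vmd='demo.vmd', name='Batch'}   -- hypothetical vmd
   for MsgText in Data:gmatch('[^\r]+') do
      local Msg = hl7.parse{vmd='demo.vmd', data=MsgText}
      mapPatient(Msg, T)  -- hypothetical mapping helper you would write
   end
   local conn = db.connect{api=db.SQLITE, name='demo.sqlite', live=true}
   conn:merge{data=T, live=true}   -- one transaction for all 100 messages
end
```

The two main() functions live in separate components – the first in the From Translator, the second in the To Translator – with the queue between them carrying one 100-message chunk per queued item.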

This has the potential to give you much faster throughput for large batches of HL7 data. The performance gains can be dramatic: writing 100 records rather than a single record adds very little extra overhead for a database engine, so saving the data may not be 100 times faster, but it will certainly be many times faster!

If you are interested in having something like this implemented, contact us and we can arrange a quote from our professional services team to assist you.
