At our 2015 user conference I gave a presentation on building an adapter for salesforce.com. This year, as part of managing our own growth, we chose to migrate our company to salesforce.com.
It was a great opportunity to use our own salesforce.com adapter on a real project and see how it performed.
It worked well – it made migrating the data from our previous CRM easy. In the old CRM the sales team had worked around some of that system's limitations by putting structured data into text fields; with Iguana it was a breeze to parse that data and map it into the salesforce adapter.
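That kind of parsing comes down to a few lines of Lua. Here's a minimal sketch of the pattern, assuming a hypothetical "key=value;key=value" format (the real CRM data looked different):

```lua
-- Parse a hypothetical "key=value;key=value" structured text field
-- into a Lua table. The format is illustrative, not our actual data.
local function parseStructuredField(Text)
   local Result = {}
   for Key, Value in Text:gmatch('(%w+)=([^;]+)') do
      Result[Key] = Value
   end
   return Result
end

-- parseStructuredField('phone=555-1234;region=EMEA')
-- --> { phone = '555-1234', region = 'EMEA' }
```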
I made a few improvements to the original adapter. That adapter was where I first came up with the idea of caching HTTP read operations. Since then we have refined that idea a few times into the HTTP with caching module – thanks to Bret Dawson for his input on that. I also ported the rest of the caching code over to the new store2 module. The big advantage is that store2 reduces the chance of one channel's usage interfering with other channels that make use of the store module.
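To give a flavor of the technique, here's a minimal sketch of caching an HTTP read with store2. The connect/put/get calls follow the usage shown in the module's documentation; the cache-expiry scheme and names are my own illustration, not the HTTP with caching module's actual internals:

```lua
local store2 = require 'store2'

-- Connect to a storage file unique to this project, so this cache
-- cannot collide with other channels using the shared store module.
local Cache = store2.connect(iguana.project.guid()..'_http_cache')

-- Fetch a URL, reusing the cached response if it is younger than
-- MaxAge seconds. The expiry logic here is illustrative only.
local function cachedGet(Url, MaxAge)
   local Entry = Cache:get(Url)
   if Entry then
      local Cached = json.parse{data=Entry}
      if os.ts.time() - Cached.time < MaxAge then
         return Cached.body
      end
   end
   local Body = net.http.get{url=Url, live=true}
   Cache:put(Url, json.serialize{data={time=os.ts.time(), body=Body}})
   return Body
end
```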
Other tools I found useful were a module to encrypt passwords in an external file, and tools.global.find to eliminate global symbols that had been created unintentionally.
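In rough terms the pattern looks like this; I'm sketching the calling conventions from memory, so treat the module paths and parameter names as assumptions and check the actual module documentation:

```lua
-- Keep the salesforce.com password encrypted in an external file
-- instead of in source code that lands in a Git repository.
-- (Module path and parameters are assumptions -- see the wiki docs.)
local encrypt = require 'encrypt.password'

-- One-time setup, typically run once from the editor:
-- encrypt.save{config='sf_password', password='secret', key='my-salt'}
local Password = encrypt.load{config='sf_password', key='my-salt'}

-- Report any global symbols the project's code created by accident.
local FindGlobals = require 'tools.global.find'
FindGlobals()
```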
Iguana 6 has made it much easier to create and maintain a comprehensive set of libraries. The new capabilities – local project files, nested directories and Git repo support – have been game changers for ease of development within Iguana.
Eating my own dog food on the salesforce.com migration showed me a few ways to take the salesforce adapter to the next level. One thing that became apparent is that in practice, when you implement salesforce.com, the custom fields are always changing. So I took another look at how my code queried Salesforce's self-describing APIs.
In my first salesforce adapter this was a manual process. In the new version I figured out how to make it happen automatically – the adapter queries the salesforce APIs for the object descriptions and caches the information in a local JSON file. This all happens under the hood; from the user's perspective the adapter "just works", capturing all the custom fields without any special effort.
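The underlying pattern is simple. Salesforce's REST API exposes a describe resource for every object; this sketch shows the idea of fetching and caching a description, with the instance URL, API version and cache location as placeholders rather than the adapter's actual internals:

```lua
-- Fetch the field description for a salesforce.com object, caching
-- it in a local JSON file. URL, API version and cache path are
-- placeholders; the describe endpoint itself is standard Salesforce.
local function describeObject(ObjectName, AccessToken)
   local CacheFile = iguana.project.root()..ObjectName..'_describe.json'
   local F = io.open(CacheFile, 'r')
   if F then
      local Cached = F:read('*a')
      F:close()
      return json.parse{data=Cached}
   end
   local Response = net.http.get{
      url='https://na1.salesforce.com/services/data/v35.0/sobjects/'
          ..ObjectName..'/describe/',
      headers={Authorization = 'Bearer '..AccessToken},
      live=true}
   F = io.open(CacheFile, 'w')
   F:write(Response)
   F:close()
   return json.parse{data=Response}
end
```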
The last improvement was finding a better way to make it easy to type in the fields for the adapter. In the original adapter I leveraged Iguana's help system to autocomplete the fields for each object. This screenshot shows how it works for Contacts, for instance:
The problem I found was that salesforce.com objects tend to be quite large, with many fields. That got awkward when mapping many fields – I could write the call over multiple lines, but auto-completion doesn't work well with that approach.
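For reference, registering autocompletion through the help system looks roughly like this; contactModify and its field list stand in for the adapter's generated code:

```lua
-- Stub standing in for a generated adapter function.
local function contactModify(T)
   -- ... submit the contact to salesforce.com ...
end

-- Register help data so the editor can autocomplete the fields.
-- The help_data table follows Iguana's help.set convention; the
-- fields shown are a small illustrative subset.
help.set{input_function=contactModify, help_data={
   Title = 'contactModify',
   Usage = 'contactModify{FirstName=..., LastName=..., Email=...}',
   Desc  = 'Insert or update a Contact in salesforce.com.',
   ParameterTable = true,
   Parameters = {
      {FirstName = {Desc='First name of the contact.'}},
      {LastName  = {Desc='Last name of the contact.'}},
      {Email     = {Desc='Email address.', Opt=true}},
   },
}}
```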
So I came up with the idea of leveraging DBS schemas. I wrote a library that lets me create a DBS schema on the fly, programmatically. With this technique the API takes on a new structure: each object has a method that returns a new, empty instance of that object – for Accounts, the method is accountNew(). This is a table record object just like you would get with a DBS schema. You populate the object and then submit it using the accountModify() method. As you can see from the following screenshot, the result object is extremely annotation friendly:
I put the same change into accountList() and the other generated List() methods, so that they too return DBS grammar based objects. It makes for a much richer Translator experience by leveraging the annotations much more heavily.
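In use, the new style looks something like this (sf stands in for a connected adapter instance, and the parameter names are illustrative):

```lua
-- Create an empty Account record object, populate it, and submit it.
-- "sf" is a hypothetical adapter instance; parameter names are
-- illustrative, not the adapter's exact signatures.
local Account = sf:accountNew()
Account.Name        = 'Acme Corporation'
Account.BillingCity = 'Toronto'
Account.Industry    = 'Healthcare'
local Result = sf:accountModify{account=Account}

-- List methods now also return DBS-grammar-based records, which
-- render beautifully in the annotation windows.
local Accounts = sf:accountList{limit=10}
```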
It’s been a fun journey – the exciting thing is that these same techniques can be leveraged to improve the experience for any web API adapter for Iguana.
Documentation and information on obtaining this adapter can be found on the adapter's page in the repository section of this wiki.