Back in February I was lucky enough to be asked to speak at London’s Calling on building performant applications on the Salesforce Platform. You can see a full recording of the session here, but I wanted to write a small post on the background to the talk: why I thought it would be an interesting session (hopefully it was!) and why I feel performance is an important but underappreciated area that needs more focus.
Back in Time
In the past I have had the fortune to develop solutions for a number of different platforms, including Windows desktop apps, .NET, Ruby and Java server backends, and mobile apps for iOS and Android. One thing these platforms had in common was a pre-defined amount of resources available to me. The desktop apps were built for a common machine setup within the organisations I was working for, and I knew the application had to be usable on a machine of a certain specification, running alongside something like Outlook or Word, otherwise it was not going to be acceptable to the end user. The same was true for my mobile applications: the specifications of an iPhone (for example) are fixed, so I knew that for the app to meet the user’s expectations it had to feel snappy and responsive. For server solutions, the aim was to maximise performance so that the server - an actual physical server that had been purchased - was used as effectively as possible.
Then Along Came the Cloud…
And scaling resources became super easy. Heroku, AWS, Azure and the like all provide extremely simple and effective ways to scale your solutions as you need to. Yes, you should still work to ensure that your app is performant and uses its resources as well as it can, but with the low cost of scaling it is a problem you can defer if you are willing to pay to do so. Desktop apps, meanwhile, started to fade as more experiences moved to the web and to mobile - and a modern phone is now so powerful a machine that, for most applications people create, performance tuning is not a major requirement (games are probably where it is needed most).
But Salesforce Is Different
Salesforce scales in a different way - the platform is multi-tenant, so you are not paying for a specific set of resources or a certain sized box, but rather for access to a set of application capabilities as part of a license. It has been a few years since Salesforce even released details of how many servers power the platform, but the number is largely irrelevant as it is abstracted away from users and developers.
To keep this all running in a fair and orderly fashion, Salesforce has added governor limits to the platform that help ensure no single tenant can monopolise shared resources and slow things down for everyone else. For developers, however, working with these limits is different to working on other platforms and often requires careful thought and planning around what can and cannot be done in different situations.
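To make that concrete, Apex exposes how much of each governor limit the current transaction has consumed through the standard Limits class, which is one way to build that planning into the code itself. Here is a minimal sketch - the class name, the debug-only response and the 80% threshold are illustrative assumptions on my part, not a platform recommendation:

```apex
// Illustrative sketch: inspecting governor limit consumption mid-transaction.
// The Limits class is standard Apex; the 80% threshold is an arbitrary example.
public class LimitHeadroomCheck {

    public static void logHeadroom() {
        // SOQL queries used so far vs the per-transaction allowance
        System.debug('SOQL queries: ' + Limits.getQueries() + ' of ' + Limits.getLimitQueries());
        // DML statements used so far vs the per-transaction allowance
        System.debug('DML statements: ' + Limits.getDmlStatements() + ' of ' + Limits.getLimitDmlStatements());
        // CPU time consumed (ms) vs the synchronous ceiling
        System.debug('CPU time (ms): ' + Limits.getCpuTime() + ' of ' + Limits.getLimitCpuTime());
    }

    // Returns true once query consumption passes an example threshold,
    // so calling code could choose to defer further work (e.g. to async processing).
    public static Boolean nearQueryLimit() {
        return Limits.getQueries() >= (Limits.getLimitQueries() * 0.8);
    }
}
```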
Governor Limits Do Not Imply Performance
Salesforce is a stateless system that relies on the database as the system of record for the solution at any one time. All interactions you make typically end with some CRUD operation that changes the data stored in the database, but the app servers themselves do not hold state in the background. From Salesforce’s perspective this means app servers can be rebooted quickly if there are issues, but for developers it means we need to think about transaction time, as all of our users follow the same typical thread (the final step is sketched in code below the list):
- we load a record or set of records
- we interact with these records
- a change is saved to the database (insert/update/delete/undelete)
- we perform some background logic based upon this change
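That last step is where most custom Apex sits, running in the same transaction as the user’s save. A rough sketch of what it might look like as an after-update trigger - the object, field and handler names here are assumptions for illustration, not a prescribed pattern:

```apex
// Illustrative sketch of the final step: background logic triggered by a change,
// running inside the user's transaction. Object/field choices are example assumptions.
trigger AccountAfterUpdate on Account (after update) {
    // Collect only the records whose relevant field actually changed,
    // so the follow-on work is proportional to the real change.
    List<Account> changed = new List<Account>();
    for (Account acc : Trigger.new) {
        Account oldAcc = (Account) Trigger.oldMap.get(acc.Id);
        if (acc.Industry != oldAcc.Industry) {
            changed.add(acc);
        }
    }
    if (!changed.isEmpty()) {
        // Hand the whole batch to a (hypothetical) handler class in one call,
        // so the logic runs once per transaction rather than once per record.
        AccountIndustryHandler.handleIndustryChange(changed);
    }
}
```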
As Salesforce developers we need to focus on making the code involved in this process as efficient as possible to improve the performance of our application for the user. We should aim not only to stay within the governor limits but to minimise our resource usage so that our application is more responsive. In practice, though, a typical Salesforce developer works hard to stay within the governor limits and, when testing, is happy as long as everything works and is bulkified.
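The gap between “within the limits” and “efficient” is easiest to see in code. In the sketch below (the Account/Contact relationship is just an illustrative choice), both methods pass a small test and stay inside the limits for modest batch sizes, but the first issues one SOQL query per record while the second issues one per transaction:

```apex
// Illustrative contrast between per-record and bulkified querying.
public class ContactRollupExample {

    // Stays within the limits for small batches, but issues one SOQL query
    // per Account, so the cost grows with every record processed.
    public static void perRecordQueries(List<Account> accounts) {
        for (Account acc : accounts) {
            List<Contact> contacts = [SELECT Id FROM Contact WHERE AccountId = :acc.Id];
            // ... work with contacts
        }
    }

    // Bulkified: one query for the whole batch, then in-memory lookups.
    public static void singleQuery(List<Account> accounts) {
        Set<Id> accountIds = new Map<Id, Account>(accounts).keySet();
        Map<Id, List<Contact>> contactsByAccount = new Map<Id, List<Contact>>();
        for (Contact c : [SELECT Id, AccountId FROM Contact WHERE AccountId IN :accountIds]) {
            if (!contactsByAccount.containsKey(c.AccountId)) {
                contactsByAccount.put(c.AccountId, new List<Contact>());
            }
            contactsByAccount.get(c.AccountId).add(c);
        }
        for (Account acc : accounts) {
            List<Contact> contacts = contactsByAccount.get(acc.Id);
            // ... work with contacts
        }
    }
}
```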
Many people will have read the oft-cited Nielsen research on response times and perception - but if we only work to the governor limits, we could have up to 10 seconds of CPU processing going on in a transaction (plus the time spent in DML, SOQL, and HTTP callouts, which is not counted towards that limit), which would take us well into the territory where the interface feels unresponsive and the user’s attention has to be actively held. And it is not just inefficient code that can cause this - Salesforce has added more and more tools that allow non-coders to build complex processes on the platform. What impact do they have on the time a transaction takes to run? What does this mean for the user?
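One way to keep that honest is to measure it. Apex reports CPU consumption via the Limits class, and wall-clock time can be captured around a unit of work. The class below is a minimal sketch of that idea - the class name and the 1-second threshold are my own illustrative choices, echoing the Nielsen guidance rather than any platform limit:

```apex
// Illustrative sketch: timing a block of work so CPU time and elapsed time
// can be compared against response-time expectations.
public class TransactionTimer {
    private Long startMillis;
    private Integer startCpu;
    private String label;

    public TransactionTimer(String label) {
        this.label = label;
        this.startMillis = System.currentTimeMillis();
        this.startCpu = Limits.getCpuTime();
    }

    public void stop() {
        Long elapsed = System.currentTimeMillis() - startMillis;
        Integer cpuUsed = Limits.getCpuTime() - startCpu;
        System.debug(label + ': ' + elapsed + ' ms elapsed, ' + cpuUsed
            + ' ms CPU of a ' + Limits.getLimitCpuTime() + ' ms budget');
        if (elapsed > 1000) {
            // Example threshold: past the point where the system stops feeling instant.
            System.debug(LoggingLevel.WARN, label + ' exceeded 1 second');
        }
    }
}
```

A handler could create one of these at the start of its work and call stop() at the end; the debug log then shows how much of the 10-second CPU budget - and of the user’s patience - each piece of logic consumed.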
That’s Why I Think We Should Start Talking Performance Again
Partly because a lot of people have got out of the habit of tuning for performance - the "if the limits aren’t broken then all is good" mentality. Partly because the number of tools at our disposal that are not traditional code but still consume transaction time is growing, and we need to make sure they do not hurt our app’s responsiveness. And partly because I feel this is an under-researched area that we, as a community, can do more to work on. Most of the articles and information you will find on the subject come from Salesforce, which is great - they are actively trying to help make things better. But that also highlights how much more we as a community could do to grow this pool of knowledge and make our applications leaner and quicker.