The agileBase strapline is ‘fast, friendly, flexible’. By ‘fast’ we usually mean fast to develop and prototype, i.e. you can typically get working business databases built and running in minutes or hours.
However, it has another meaning too – we’ve always strived to have the fastest response times as users interact with apps built on the platform. That’s down to the many optimisations we build into both the server and user interface – too many to list here – as well as the choice of technology (props to PostgreSQL, the world’s most advanced open source database) and high performance hosting (thanks Linode).
As the number of customers grows and the size of their systems balloons, we always need to bear performance in mind.
Today we’re excited to announce the most effective single change we’ve ever made to improve performance – the ability to use caching on high transaction rate views.
Caching in general has been an option for a while (materialization in database terminology). When you have complex views that are slow to query, but don’t update that much, it’s great. A common example would be a financial report totalling up sales per month. There may be millions of rows of data involved. Caching can make querying, filtering and charting much more responsive, so data loads in milliseconds rather than seconds.
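In PostgreSQL terms, this kind of cache is a materialised view. As a rough sketch only – the table and column names here are made up for illustration and aren’t agileBase’s actual schema – a monthly sales summary might look like this:

  -- Hypothetical sales table, with a materialised view totalling sales per month.
  CREATE MATERIALIZED VIEW sales_by_month AS
  SELECT date_trunc('month', sold_at) AS month,
         sum(amount)                  AS total_sales,
         count(*)                     AS order_count
  FROM sales
  GROUP BY date_trunc('month', sold_at);

  -- Reports now read the pre-computed rows instead of scanning millions of sales:
  SELECT * FROM sales_by_month ORDER BY month DESC LIMIT 12;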
To date, the options have been to update the cache every 10 minutes or once a day, which is fine for those sorts of views where new data isn’t added that often.
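A scheduled refresh like that is simply REFRESH MATERIALIZED VIEW run on a timer. One way to sketch it in plain PostgreSQL – using the pg_cron extension, which is an assumption for illustration, not necessarily how agileBase schedules its refreshes – would be:

  -- Refresh the cached view every 10 minutes (pg_cron syntax; illustrative only).
  SELECT cron.schedule('refresh_sales_by_month', '*/10 * * * *',
                       'REFRESH MATERIALIZED VIEW sales_by_month');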
However, what about views which are complex, perhaps with many calculations and lots of data, but which are also updated and used very frequently? Stock figures, sales jobs or orders, for example. These are the ones which have the most impact on people in day-to-day use, and on the system in general, which affects everyone.
The system will now update the cache whenever someone saves a new record, updates one or deletes one. This makes querying the data very fast. Typically, there are many more queries of data than there are edits, so the performance of the whole system improves. For further efficiency, not every edit triggers a cache update; instead, all changes are rolled up into one refresh when the user moves from the editing screen back to the list of records.
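To give a feel for the mechanics, one way to approximate refresh-on-save in plain PostgreSQL is a trigger on the underlying table. This is a sketch only – agileBase rolls several edits into a single refresh at the application layer rather than refreshing per statement, and the names below continue the hypothetical schema above:

  -- Illustrative only: refresh the cached view after edits to the sales table.
  CREATE OR REPLACE FUNCTION refresh_sales_by_month() RETURNS trigger AS $$
  BEGIN
    REFRESH MATERIALIZED VIEW sales_by_month;
    RETURN NULL;
  END;
  $$ LANGUAGE plpgsql;

  CREATE TRIGGER sales_cache_refresh
  AFTER INSERT OR UPDATE OR DELETE ON sales
  FOR EACH STATEMENT
  EXECUTE FUNCTION refresh_sales_by_month();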
What little delay there is also falls in the right place, both psychologically and practically. When users first open a tile to look at data, they want the response to be immediate. It’s important that searching for data is also very rapid. However, a short delay just after pressing save is tolerable – the user has just finished doing something, as opposed to starting it. And of course the UI reacts immediately to let them know the system is ‘saving’.
Google has some good notes on the user perception of performance delays. As they say,
0 to 100ms: Respond to user actions within this time window and users feel like the result is immediate. Any longer, and the connection between action and reaction is broken.
We try to keep the data querying actions within this timeframe. Then
300 to 1000ms: Within this window, things feel part of a natural and continuous progression of tasks. For most users on the web, loading pages or changing views represents a task.
Reloading the list of data after a record save falls into this category.
Setting this up
As an administrator, if you have a view you’d like to speed up, how do you do so?
In the admin interface, just go to the manage tab for the view, click ‘advanced options’ and under ‘cache view rows’, select ‘cache view, update on record save’.
Welcome to a faster agileBase!
Source: Agilebase