How We Built a Snappy User Experience with Caching
At Impart, we are obsessed with delivering a great experience for security users. A great security experience spans the entire practitioner journey and requires deep empathy for the security professional: it starts with a deep understanding of their pain points and day-to-day job duties, and ends with a snappy experience that gives them what they want, when they want it. Don't take my word for it; check out our Lighthouse scores for our Console:
These scores, in aggregate, are better than the homepage of the company that invented the Lighthouse benchmark!
However, providing this type of experience isn't free. Deep product and engineering work has to happen to deliver on the promise of a great experience, and too many products skip steps, focusing on delivering checkboxes to buyers rather than amazing experiences for users. This is the opposite of how we do things at Impart: every aspect of what we do is thoughtful (including how we named our company) and carefully crafted to delight our users.
One example of the engineering investment that we make in our product experience is caching.
What is Caching?
Caching is a well-known distributed systems technique: after you access data for the first time, you keep a local copy so you don't have to fetch it again the next time you need it. By storing the data locally, you avoid the round trip between wherever the data was originally stored and where you need it.
The trade-off is simple: storage for speed. By spending local storage capacity (which is cheap), you gain speed in your application's performance and, ultimately, in the user experience.
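The storage-for-speed trade-off can be sketched in a few lines. This is a minimal, hypothetical illustration (not Impart's code): a `Map` spends a little memory so that repeated lookups skip the expensive round trip.

```typescript
// Minimal cache sketch: trade memory (the Map) for speed (skipped round trips).
// `fetchUserProfile` stands in for any expensive remote lookup.

type Profile = { id: string; name: string };

const profileCache = new Map<string, Profile>();
let originFetches = 0; // counts simulated round trips to the origin

function fetchUserProfile(id: string): Profile {
  originFetches++; // simulate the expensive round trip
  return { id, name: `user-${id}` };
}

function getProfile(id: string): Profile {
  const cached = profileCache.get(id);
  if (cached) return cached; // cache hit: no round trip needed
  const fresh = fetchUserProfile(id); // cache miss: fetch, then store locally
  profileCache.set(id, fresh);
  return fresh;
}
```

The first call for a given `id` pays the round trip; every later call is served from local memory.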
In practice, there are many places a developer can cache data: a database, a network element, a browser, a server, memory, local storage, or anywhere else data can be stored. Here are some examples of how we've implemented caching to optimize the security practitioner's experience with Impart.
API Response Caching within the Browser
We optimize the experience of our single-page application by caching API responses within the browser. Caching API responses saves the round-trip time of requesting duplicate data and lets us instantly render the front-end experience we need without going back to the server. From a user standpoint, this accomplishes a few things:
First, when users click the back button in their browser, our single-page application instantly renders their prior view with all of the previously retrieved data. This is very applicable for the security practitioner who, according to our user research, often spends time "peeking" at different metrics and vulnerabilities to check for anomalies or errors on specific API endpoints, then going back to reference other parts of their API.
Second, this gives the overall experience the snappy feel of a native application rather than a traditional client-server web application. Since we provide a SaaS security service, most of our data lives outside the browser, so most of our experience involves retrieving data from other sources, including our cloud.
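The pattern above can be sketched as a response cache keyed by URL. This is an illustrative example, not our actual implementation: the first request for a URL goes to the server, and any later request (such as a back-button navigation) resolves from memory. Caching the `Promise` itself also dedupes in-flight requests.

```typescript
// Sketch of browser-side API response caching (hypothetical, not Impart's
// actual code): responses are keyed by URL so repeat views render instantly.

const responseCache = new Map<string, Promise<unknown>>();

function cachedGet<T>(
  url: string,
  doFetch: (url: string) => Promise<T>, // injectable for testing; fetch() in a browser
): Promise<T> {
  let hit = responseCache.get(url) as Promise<T> | undefined;
  if (!hit) {
    hit = doFetch(url); // cache miss: one round trip; in-flight calls are deduped too
    responseCache.set(url, hit);
  }
  return hit; // cache hit: resolves immediately from memory
}
```

A real single-page application would also invalidate entries after mutations or on a TTL so views never show stale data indefinitely.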
Caching CORS requests in the Browser
One of the lesser-known aspects of API-first applications is that initial page load often triggers many hidden OPTIONS requests that don't actually result in user-facing experiences. These are called CORS preflight requests. We had actually forgotten about these until we dogfooded our product on our own SaaS application and noticed dozens of new API endpoints showing up in our dashboards!
While these CORS preflight requests are perfectly harmless and actually promote better application development, enough of them can add noticeable latency to the application and affect initial page load times. In our case, we noticed enough OPTIONS calls that they were adding up to 80ms to page load.
By caching these requests in the browser, we were able to reduce latency and start making impactful API calls right away, saving almost 80ms in our application's page load time.
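Browsers cache preflight results when the server opts in via the `Access-Control-Max-Age` response header. Here is a hedged sketch of the kind of headers a server might return on an OPTIONS request (the header names are standard CORS headers; the specific values are illustrative, not our configuration):

```typescript
// Sketch of the server-side headers that let a browser cache a CORS
// preflight (OPTIONS) response. Access-Control-Max-Age tells the browser
// how many seconds it may reuse the preflight result, skipping the extra
// OPTIONS round trip on subsequent requests to the same endpoint.

function preflightHeaders(origin: string): Record<string, string> {
  return {
    "Access-Control-Allow-Origin": origin,
    "Access-Control-Allow-Methods": "GET, POST, PUT, DELETE, OPTIONS",
    "Access-Control-Allow-Headers": "Content-Type, Authorization",
    // Let the browser cache this preflight result for 10 minutes:
    "Access-Control-Max-Age": "600",
  };
}
```

Note that browsers cap this value (for example, Chromium caps it at two hours), so it reduces repeated preflights rather than eliminating them entirely.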
Serverless Edge Functions for Cache Control
We use serverless technology at Impart to optimize the experience for our users. One example: we wrote our own serverless functions to handle cache control for inbound and outbound requests. By implementing these controls with serverless functions at the edge of our infrastructure, we minimize latency and enable efficient performance by making a per-request decision about caching policy based on the Content-Type and other header information, rather than relying on less granular policies that may not be as efficient or precise.
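As a sketch of what a per-request policy decision can look like (hypothetical, not our actual edge function), an edge handler can map the response's Content-Type to an appropriate `Cache-Control` value instead of applying one coarse site-wide rule:

```typescript
// Sketch of a per-request cache-control decision at the edge (illustrative
// only): pick a Cache-Control policy from the response's Content-Type.

function cacheControlFor(contentType: string): string {
  if (contentType.startsWith("image/") || contentType.includes("font")) {
    return "public, max-age=31536000, immutable"; // static assets: cache aggressively
  }
  if (contentType.includes("application/json")) {
    return "private, no-store"; // API responses: keep out of shared caches
  }
  if (contentType.includes("text/html")) {
    return "no-cache"; // HTML: revalidate with the origin on each use
  }
  return "no-store"; // unknown types: default to the safest policy
}
```

Running this logic at the edge means the decision adds essentially no latency, since it executes before the response ever crosses the wide-area network.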
Caching all Authorization Requests
Lastly, with most API-first architectures, it is a highly recommended security best practice to authorize every API call before it is served. This helps avoid potential data vulnerabilities, for example, malicious users accessing data that doesn't belong to them by reusing API endpoints while changing the endpoint parameters (a very common API security problem).
While it is critical for developers to secure every endpoint and API call, repeatedly calling the authorization service can create bottlenecks and slow the user experience as the calls add up. We've addressed that by using smart caching within our infrastructure to cache authorization decisions and improve performance.
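One common way to implement this, sketched below as a hypothetical example rather than our actual infrastructure, is a short-TTL cache keyed by (principal, action, resource), so hot request paths reuse a recent decision instead of calling the authorizer every time:

```typescript
// Sketch of caching authorization decisions with a short TTL (illustrative,
// not Impart's implementation). Each (principal, action, resource) decision
// is cached briefly so repeated calls skip the round trip to the authorizer.

type Decision = { allowed: boolean; expiresAt: number };

const authCache = new Map<string, Decision>();
const TTL_MS = 30_000; // keep the TTL short: a long TTL delays permission revocation

function isAuthorized(
  principal: string,
  action: string,
  resource: string,
  authorize: () => boolean, // the real (expensive) authorization call
  now: () => number = Date.now,
): boolean {
  const key = `${principal}|${action}|${resource}`;
  const hit = authCache.get(key);
  if (hit && hit.expiresAt > now()) return hit.allowed; // fresh cached decision
  const allowed = authorize(); // miss or expired: ask the authorization service
  authCache.set(key, { allowed, expiresAt: now() + TTL_MS });
  return allowed;
}
```

The key design tension is TTL length: longer TTLs save more round trips, but they also lengthen the window during which a revoked permission still appears valid, so security-sensitive caches like this one should favor short lifetimes.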
Want to see for yourself? Give our product a try by signing up for our beta program. Stay tuned for more about how we provide great experiences for security teams.