How we boosted Magnolia response times with dynamic page caching

Published on March 23, 2016 by Guest blogger



A few months ago my colleague Alexander Wert and I were helping a client from the banking sector improve the performance of their newly deployed web application. The application is used for wizard-based, step-by-step data submission, but at the time it suffered from unsatisfactory load times.

We started with a quick check using the Google developer tools, just to get a feeling for where the time was being lost. Since the time was split roughly equally between server and browser, we decided to check the server side first. The server side was running Magnolia on top of Tomcat, so nothing special actually. We chose inspectIT (an open-source APM tool for Java) for the performance diagnosis and adapted the Tomcat start script to launch with the inspectIT agent.
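
Attaching the agent boils down to adding a -javaagent option to the JVM arguments. As a rough sketch, the addition to Tomcat's setenv.sh (or wherever your start script sets CATALINA_OPTS) could look like the lines below; the path, repository address and agent name are placeholders, and the exact system properties depend on the inspectIT version in use:

    CATALINA_OPTS="$CATALINA_OPTS \
      -javaagent:/opt/inspectit/agent/inspectit-agent.jar \
      -Dinspectit.repository=cmr-host:9070 \
      -Dinspectit.agent.name=magnolia-app"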

Out of the box we were able to get basic information about HTTP request timings: with one user accessing the start page, the average response time was approximately 1.6 seconds. Knowing that Magnolia was used as the CMS, we instrumented the application services as well as Magnolia's filters and rendering engine classes. This was done to get more information on where exactly the time was being lost, in Magnolia or in the application services.

As inspectIT provides detailed trace information for every request, we got the information we were looking for. Here is a screenshot from the tool showing the execution path of one request:

[Screenshot: inspectIT invocation trace of the start-page request]

As you can see, 99.9% of the HTTP request time (1.617s out of 1.624s) was spent in Magnolia's DefaultRenderingEngine.render() method. The method appears to be recursive, rendering one by one all the elements that will be displayed on the HTML page. In addition, the reported CPU time of the method is approximately the same as its duration, meaning the method fully utilizes the CPU. This was an obvious problem, because the CPU would become the bottleneck as soon as more than one user accessed the page. Furthermore, we noticed that the start page displayed to the user is always the same and has no dynamic content whatsoever, yet it was being rendered on every request. So we wondered whether caching could help us here.
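
To illustrate the idea, here is a minimal, hypothetical sketch of what page caching in a servlet filter boils down to. This is not how Magnolia's module is implemented, and all class and variable names are made up for the example: the expensive rendering happens only on the first request, and every following request for the same URI is answered from memory.

// Minimal sketch of the page-caching idea, NOT Magnolia's actual module.
// Assumes the Servlet 3.1 API; all names are illustrative.
import java.io.*;
import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import javax.servlet.*;
import javax.servlet.http.*;

public class SimplePageCacheFilter implements Filter {

    // Rendered pages, keyed by request URI.
    private final Map<String, byte[]> cache = new ConcurrentHashMap<>();

    public void init(FilterConfig config) { }

    public void destroy() { }

    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        HttpServletResponse response = (HttpServletResponse) res;
        String key = request.getRequestURI();

        byte[] page = cache.get(key);
        if (page == null) {
            // Cache miss: let the rendering engine do its expensive work once,
            // but capture the generated markup instead of streaming it out.
            CapturingResponse wrapper = new CapturingResponse(response);
            chain.doFilter(request, wrapper);
            page = wrapper.getCapturedBytes();
            cache.put(key, page);
        }
        // Cache hit (or freshly rendered page): write the bytes directly,
        // skipping the rendering engine entirely.
        response.setContentType("text/html;charset=UTF-8");
        response.setContentLength(page.length);
        response.getOutputStream().write(page);
    }

    /** Response wrapper that buffers everything the application writes. */
    private static class CapturingResponse extends HttpServletResponseWrapper {

        private final ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        private PrintWriter writer;

        CapturingResponse(HttpServletResponse response) {
            super(response);
        }

        @Override
        public ServletOutputStream getOutputStream() {
            return new ServletOutputStream() {
                public void write(int b) { buffer.write(b); }
                public boolean isReady() { return true; }
                public void setWriteListener(WriteListener listener) { /* synchronous sketch */ }
            };
        }

        @Override
        public PrintWriter getWriter() {
            if (writer == null) {
                writer = new PrintWriter(new OutputStreamWriter(buffer, StandardCharsets.UTF_8), true);
            }
            return writer;
        }

        byte[] getCapturedBytes() {
            if (writer != null) {
                writer.flush();
            }
            return buffer.toByteArray();
        }
    }
}

A real page cache of course also needs cache invalidation when content changes, proper response headers and rules for what not to cache, which is exactly what a dedicated module takes care of.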

Luckily enough, Magnolia introduced the Dynamic Page Caching module in version 5.4. The operations team installed the module quickly and we were able to test again. inspectIT confirmed that the caching was doing its job: the requests were now executed in a few milliseconds. How cool is that:

[Screenshot: inspectIT request timings with dynamic page caching enabled]

To wrap up, have a look at the comparison in the Google Developer Tools Timeline for that first HTML request. Dynamic caching gave us a great performance boost and a better experience for the users. So take inspectIT or any similar tool, get better insight into your application, and start improving its performance today.

[Screenshot: Google Developer Tools Timeline without dynamic caching]

[Screenshot: Google Developer Tools Timeline with dynamic caching]


Original post published on the NovaTec blog

Ivan Senić, NovaTec Consulting GmbH


Ivan Senić has been part of the Application Performance Management department at NovaTec Consulting GmbH for more than five years. He is involved in the core development of the open-source performance monitoring tool inspectIT and in other APM-related topics. He has taken part in numerous performance firefighting projects, helping to identify performance problems and provide solutions. Beyond performance diagnosis and the development of APM solutions, his goal and passion is to establish tools, activities and mindsets as an integral part of the software development process to guarantee high-performance applications.


