Ensuring Peak Performance for a Magic Circle Law Firm's Applications

Our client, a leading Magic Circle law firm, wished to implement Intapp Time, a cloud-based solution designed to enhance timekeeping processes for law firms. As the software is a business-critical element in tracking billable hours and is often used simultaneously by a large number of employees, it was important for the firm to ensure the system could cope with the expected levels of concurrent usage before going live. Prolifics Testing took responsibility for the complete management of the performance testing programme, including co-ordinating testing activities across a number of key global locations.

As Intapp Time can be installed and used on a wide range of devices, including desktop, tablet and mobile, the decision was taken to drive the load via its APIs. The API communication was handled via an integration layer, which was also within the scope of these tests.

Scope of Performance Testing:

The API activities in scope included Save, Release, Get Client, Get Matter, Get Task, and Get Time Entries. To accurately simulate typical activity on the system during a normal working day, testing also covered the synchronisation of data from offline devices, along with the integration to the Global Practice Management System (GPMS). We selected Tricentis NeoLoad as the most suitable tool for the tests and deployed load generators locally in each of the global locations under test.
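
For illustration, the sketch below models the kind of per-user mix of these API activities that the load profile was built around; the endpoint paths, payloads and weightings are hypothetical, as the actual tests were scripted and driven in NeoLoad against the firm's integration layer.

```python
import random
import requests

# Hypothetical integration-layer endpoint; the real URLs, payloads and weightings differed.
BASE_URL = "https://integration.example.com/api"
SESSION = requests.Session()

# Weighted mix of the in-scope API activities, approximating a fee earner's working day.
ACTIONS = [
    ("get_clients",      0.20, lambda: SESSION.get(f"{BASE_URL}/clients")),
    ("get_matters",      0.20, lambda: SESSION.get(f"{BASE_URL}/matters")),
    ("get_tasks",        0.15, lambda: SESSION.get(f"{BASE_URL}/tasks")),
    ("get_time_entries", 0.15, lambda: SESSION.get(f"{BASE_URL}/time-entries")),
    ("save_entry",       0.20, lambda: SESSION.post(f"{BASE_URL}/time-entries",
                                                    json={"matterId": "M-001", "minutes": 30})),
    ("release_entry",    0.10, lambda: SESSION.post(f"{BASE_URL}/time-entries/T-001/release")),
]

def run_one_iteration() -> None:
    """Pick one action according to its weight and record its response time."""
    names, weights, calls = zip(*ACTIONS)
    idx = random.choices(range(len(ACTIONS)), weights=weights, k=1)[0]
    response = calls[idx]()
    print(f"{names[idx]}: HTTP {response.status_code} in {response.elapsed.total_seconds():.3f}s")

if __name__ == "__main__":
    for _ in range(10):  # a real virtual user loops for the full test, with think time between actions
        run_one_iteration()
```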

Performance Testing Approach: 

Benchmarking exercises were first performed by a team comprising project users and testers. These benchmarks targeted key scenarios outlined by subject matter experts (SMEs) and were run both over a direct connection and via Citrix, to establish a baseline and enable comparative analysis against the results from the formal tests.

NeoLoad was then used to generate thousands of requests against the application, simulating realistic end-user behaviour. Load generation was distributed across multiple global locations, including the UK, Central Europe, the Americas, Asia-Pacific, and the Middle East.
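
As a rough illustration of how that distribution works, the sketch below splits a target number of concurrent virtual users across regional load generators in proportion to assumed weightings; the user counts, weightings and generator hostnames are examples only, not the firm's actual profile.

```python
# Illustrative split of concurrent virtual users across regional load generators.
TARGET_VIRTUAL_USERS = 2000  # assumed figure for illustration

REGIONS = {
    "UK":             {"weight": 0.35, "generator": "lg-uk-01.example.net"},
    "Central Europe": {"weight": 0.20, "generator": "lg-eu-01.example.net"},
    "Americas":       {"weight": 0.20, "generator": "lg-us-01.example.net"},
    "Asia-Pacific":   {"weight": 0.15, "generator": "lg-apac-01.example.net"},
    "Middle East":    {"weight": 0.10, "generator": "lg-me-01.example.net"},
}

def allocate_users(total: int) -> dict:
    """Allocate virtual users to each region in proportion to its weight."""
    allocation = {region: int(total * cfg["weight"]) for region, cfg in REGIONS.items()}
    allocation["UK"] += total - sum(allocation.values())  # give any rounding remainder to the largest region
    return allocation

if __name__ == "__main__":
    for region, users in allocate_users(TARGET_VIRTUAL_USERS).items():
        print(f"{region}: {users} virtual users on {REGIONS[region]['generator']}")
```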

Test Execution:

As well as driving the performance tests directly using NeoLoad, the team ran a series of benchmarking tests in parallel to measure performance on laptops, both directly and via Citrix. We also employed a network-wide desktop management and monitoring tool (SysTrack) to monitor the local load of the application, particularly during the initial sync, where the handshake involves the transfer of clients, matters and other information.

Executing the performance tests at load quickly revealed a bottleneck: the API integration platform was unable to reliably forward records to the GPMS. Configuration changes were made to the Message Queue (MQ) size, which partly resolved the issue but highlighted that data was taking up to 2 hours to reach the GPMS.
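
Lag of this kind is straightforward to quantify by timestamping each record on submission and polling the downstream system until it appears. The sketch below outlines the idea; the GPMS lookup is deliberately left as a stub, since how records are queried depends on the GPMS in use.

```python
import time

def record_exists_in_gpms(entry_id: str) -> bool:
    """Stub: in practice this would query the GPMS (or its reporting database) for the entry."""
    raise NotImplementedError

def measure_propagation_lag(entry_id: str, timeout_s: int = 2 * 60 * 60, poll_s: int = 30) -> float:
    """Poll until the entry submitted upstream is visible in the GPMS; return the lag in seconds."""
    submitted_at = time.monotonic()
    while time.monotonic() - submitted_at < timeout_s:
        if record_exists_in_gpms(entry_id):
            return time.monotonic() - submitted_at
        time.sleep(poll_s)
    raise TimeoutError(f"Entry {entry_id} did not reach the GPMS within {timeout_s} seconds")
```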

To troubleshoot these issues, the team split the testing up, focusing first on Save-only entries. These showed no issues, proving the system could reach its target of 10,000 Save actions within 30 minutes while the health of the other systems remained stable.
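
For context, that target equates to a sustained rate of roughly 333 Save actions per minute, or about 5.6 per second, as the quick calculation below shows.

```python
# Target: 10,000 Save actions completed within 30 minutes.
target_saves = 10_000
window_minutes = 30

per_minute = target_saves / window_minutes          # ~333 Save actions per minute
per_second = target_saves / (window_minutes * 60)   # ~5.6 Save actions per second

print(f"Required sustained rate: {per_minute:.0f}/min ({per_second:.1f}/s)")
```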

Subsequent testing focusing only on Posted entries revealed significant limitations in throughput, as only 6,000 of the 10,000 posted entries were successfully recorded in the GPMS. Delays were attributed mainly to slow writes back to Intapp Time from the API integration layer, Jitterbit. Configuration changes were made to the processing method (from sequential to parallel) and the parallel processing count was increased. The tests were then re-run, resulting in a proven throughput of 10,000 Posted time entries reaching the GPMS within approximately 80 minutes, aligning with the team's performance objectives and criteria.
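
The effect of that change can be illustrated in miniature: forwarding records one at a time is bounded by the per-record write latency, whereas a pool of parallel workers overlaps those waits. The sketch below is a generic demonstration of the principle, not Jitterbit configuration, and the latency figure is an assumption.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def forward_to_gpms(entry_id: int) -> int:
    """Stand-in for forwarding one posted time entry; assume ~0.5s of I/O wait per record."""
    time.sleep(0.5)
    return entry_id

def run_sequential(entries) -> float:
    start = time.monotonic()
    for entry in entries:
        forward_to_gpms(entry)
    return time.monotonic() - start

def run_parallel(entries, workers: int = 8) -> float:
    start = time.monotonic()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(forward_to_gpms, entries))
    return time.monotonic() - start

if __name__ == "__main__":
    batch = list(range(40))
    print(f"sequential: {run_sequential(batch):.1f}s")  # ~20s: 40 records x 0.5s each
    print(f"parallel:   {run_parallel(batch):.1f}s")    # ~2.5s with 8 workers overlapping the waits
```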

After these adjustments, an initial rollout was planned for 80 early adopters, as the system had demonstrated the capacity to cope with the expected load. Monitoring showed that CPU usage on the Remote Desktop Server spiked at 80% during the initial sync, but levelled off at 40% afterwards. These metrics led to precautionary scaling recommendations to prevent system overloads during broader deployment phases.

Summary:

Performance testing Intapp Time yielded significant value for both the project and the client. Without carrying out performance testing, these issues would have been difficult to troubleshoot and fix in a production environment, especially considering the criticality of this application to the business.

Prolifics Testing consultants used our expertise in the legal sector and our experience performance testing Intapp Time to identify the issues with the integration layer. Using the test scripts and by mapping output logs to the input data and the records in the GPMS, we also ensured that data integrity was not compromised under load and that all records were recorded correctly despite the errors that occurred, demonstrating the resilience of the solution and reflecting the importance of the billing data.
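
A minimal sketch of that reconciliation step is shown below, assuming the injected entry identifiers are captured in the test output logs and that the GPMS records can be exported with a matching identifier; the file and column names are hypothetical.

```python
import csv

def load_ids(path: str, column: str) -> set:
    """Read a CSV export and return the set of entry identifiers in the given column."""
    with open(path, newline="") as f:
        return {row[column] for row in csv.DictReader(f)}

def reconcile(submitted_log: str, gpms_export: str) -> None:
    """Compare the entries submitted under load with those recorded in the GPMS."""
    submitted = load_ids(submitted_log, "entry_id")  # hypothetical column name
    recorded = load_ids(gpms_export, "entry_id")

    missing = submitted - recorded     # submitted under load but never reached the GPMS
    unexpected = recorded - submitted  # in the GPMS but not part of the injected test data

    print(f"submitted: {len(submitted)}, recorded: {len(recorded)}")
    print(f"missing from GPMS: {len(missing)}")
    print(f"unexpected in GPMS: {len(unexpected)}")

if __name__ == "__main__":
    reconcile("load_test_output_entries.csv", "gpms_export.csv")  # hypothetical file names
```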

In summary, performance testing unearthed several foundational issues that, if left unresolved, would have likely escalated into larger, more complex problems. This pre-emptive action has been invaluable in mitigating risks and ensuring a more stable and reliable system for our client.

Click here to connect with one of our Legal Sector QA experts for a no-obligation discussion about how we can help.

Jonathan Binks - Head of Delivery
Prolifics Testing UK

