In March 2023, the University of Kent embarked on an expansion of its Tribal SITS:Vision platform to accommodate its growing administrative and academic needs. This expansion necessitated a rigorous evaluation of the platform's performance, particularly under the high-load scenarios typical of key points in the academic cycle, such as enrolment and examinations.
Prolifics Testing were selected to carry out the performance testing, which was tailored to assess the robustness and scalability of the SITS platform under load. The focus was on four critical user journeys: 'Clearing on the Web', 'Home Student Enrolment', 'Module Selection', and 'Staff/Exam Board' processes. These were identified as some of the most traffic-intensive operations on the platform, representing a broad spectrum of user interactions during peak periods.
The objective of testing was to simulate real-world usage through the e:Vision portal, ensuring that the system could handle increased user loads efficiently without any degradation in performance. This step was crucial for the University of Kent to maintain a high level of service quality and system reliability, essential for its daily operations, especially around these key peaks in system usage.
Prolifics Testing's approach was to use our JMeter-in-the-cloud service, drawing on our pre-built load injectors and in-house accelerators and dashboards to increase the efficiency of the tests while giving our customers real-time visibility into the status of the running tests.
A thorough test plan was developed, detailing the test scope, expected loads, and the roles and responsibilities of the collaborative teams. This plan was instrumental in guiding the testing process, ensuring that every aspect of the performance evaluation was methodically covered. Once the plan was agreed and signed off, our team could proceed with the detailed test preparation.
Using our SITS experience and pre-existing re-usable functions, we developed JMeter scripts against the University’s application aiming to simulate real-world user interactions on the platform. These scripts were crucial for generating the necessary system load to realistically test the platform's performance.
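The scripts themselves are JMeter (JMX) test plans rather than code, but the logic of a scripted user journey can be sketched in Python. The step names below are purely illustrative, not taken from the actual test plans:

```python
import random
import time

# Hypothetical sketch of one scripted user journey. The real artefacts are
# JMeter (JMX) test plans; the step names below are illustrative only.
JOURNEY_STEPS = ["login", "open_module_list", "select_module",
                 "confirm_selection", "logout"]

def run_journey(user_id, send_request):
    """Execute each step in order, recording its response time and pausing
    briefly between steps to mimic user think time (a JMeter Timer)."""
    timings = {}
    for step in JOURNEY_STEPS:
        start = time.monotonic()
        send_request(user_id, step)   # in JMeter, an HTTP Request sampler
        timings[step] = time.monotonic() - start
        time.sleep(random.uniform(0.0, 0.01))  # scaled-down think time
    return timings
```

Keeping each journey as an ordered sequence of timed requests, with think time between them, is what makes the generated load resemble real user behaviour rather than a raw request flood.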
Utilising AWS load injectors allowed our team to instantly provision the hardware needed to simulate over 3,000 concurrent users at low cost. This setup allowed for a realistic replication of the platform's usage during peak periods, providing a solid foundation for the tests.
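Conceptually, each injector drives a pool of concurrent virtual users, each running its own journey, in the way a JMeter thread group drives its configured number of threads. A minimal Python sketch of that fan-out pattern (the `simulate_user` stub is hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_user(user_id):
    """Placeholder for one complete scripted journey; a real virtual user
    would log in, perform its journey steps, and log out."""
    return {"user": user_id, "ok": True}

def run_load(concurrent_users):
    """Fan out one worker per virtual user, much as a JMeter thread group
    drives its configured number of threads, then collect the results."""
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        results = list(pool.map(simulate_user, range(concurrent_users)))
    errors = [r for r in results if not r["ok"]]
    return results, errors
```

Collecting per-user results and errors in one place is what later allows error rates and response times to be reported against each load level.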
The testing comprised nine distinct scenarios, each designed to evaluate the system under different load combinations. This included individual peak load tests for focused analysis and a comprehensive stress test combining all scenarios. The stress test was particularly significant as it aimed to determine the system's maximum capacity and identify any potential breaking points.
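The actual ramp profile used in the stress test is not detailed here, but the general technique of stepping load upwards until a breaking point appears can be expressed as a schedule of load levels. All numbers in this sketch are hypothetical:

```python
def step_schedule(start_users, step, max_users, hold_seconds):
    """Hypothetical stepped ramp profile for a stress test: raise the user
    count by `step` at each interval until `max_users`, holding each level
    for `hold_seconds` so the system can be observed before the next rise."""
    levels = range(start_users, max_users + 1, step)
    return [(users, hold_seconds) for users in levels]
```

Holding each level steady for a period, rather than ramping continuously, makes it easier to attribute the first errors or resource spikes to a specific user count.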
The performance testing was run over five days and yielded significant insights. The initial tests, focusing on individual user journey scenarios, were largely successful, indicating that the platform could handle specific tasks under peak load conditions without major issues.
The first test, targeting the Clearing process, was executed smoothly. With 350 simulated users, the system showed low CPU and RAM usage on both the database and application servers, suggesting good performance under this specific load.
However, the second test, combining Clearing and a portion of the Enrolment load, encountered setup issues with enrolment accounts, leading to a test failure. This necessitated script adjustments and a rescheduled run.
The rerun of the first test in a distributed mode across five machines replicated the previous day's success, with a slight increase in resource usage but no critical errors.
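In distributed mode, JMeter runs the same test plan on several injector machines, each generating a share of the total load. A small helper (hypothetical, not part of JMeter) shows the arithmetic of splitting, for example, 350 users across five machines:

```python
def split_load(total_users, injectors):
    """Divide a target virtual-user load evenly across remote load
    injectors, spreading any remainder one user at a time."""
    base, extra = divmod(total_users, injectors)
    return [base + 1 if i < extra else base for i in range(injectors)]
```

Spreading the load this way keeps each injector's CPU and network usage low enough that the injectors themselves do not become the bottleneck being measured.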
The second test, now including both Clearing and Enrolment loads, was successful with minimal errors, indicating improved system handling post-adjustments.
Subsequent tests, such as the Enrolment and Module Selection peaks, were executed successfully, though minor connection timeout errors and application server issues were noted.
Tests combining Module Selection and Enrolment loads showed the system's ability to handle combined user activities, despite a few errors and high resource usage, particularly in CPU and RAM on application servers.
On days four and five, the Staff Board test and the combined load tests involving Clearing, Enrolment, and Staff Board processes ran without major issues, demonstrating the system's resilience.
The final stress test, however, revealed limitations. With a load surpassing the system's expected peak capacity, numerous connection errors and high resource usage were observed, pinpointing areas needing enhancement.
These results provided a clear picture of the system's strengths and limitations, highlighting its capability to handle specific and combined user loads efficiently while also identifying critical stress points requiring attention.
The performance testing delivered substantial benefits. It provided a clear understanding of the system's capacity and its limits under various operational scenarios. This knowledge is crucial for the university's future planning, especially in managing high-traffic periods efficiently.
A significant outcome of the testing was the identification of specific areas where improvements were needed. These insights are vital for making targeted enhancements to the system, ensuring better resource allocation and server capacity during peak usage times; such enhancements will directly improve the user experience for both students and staff, particularly during critical academic periods.
Moreover, the testing played a key role in risk mitigation. By pinpointing potential system breakdown points, the university could proactively implement measures to maintain system stability and reliability. This proactive approach is essential for maintaining continuous service availability.
Additionally, the detailed results from the testing provide a solid foundation for informed decision-making regarding IT infrastructure investments and upgrades. With a better understanding of the system's capabilities and limitations, the university can make more strategic decisions in its IT planning.
Overall, the performance testing has instilled confidence in the platform's ability to meet real-world demands, supporting the university's commitment to operational efficiency and high levels of user satisfaction. It also sets a precedent for integrating ongoing performance monitoring into the university's IT management practices, ensuring continual system optimisation.