
As we’ve written about previously, there are huge benefits to automating the test suite when integrating Azure or other Cloud solutions.
Benefits include reducing bottlenecks, increasing accuracy and predictability, and freeing up project time for new developments rather than ongoing maintenance.
Of course, this relies on knowing what is best to automate in the first place. Many types of tests can be automated to streamline the software release lifecycle; they are broken down below:
Functional Testing
This type of testing validates the implementation of the software at various levels and confirms that it conforms to the original system requirements.
Unit Tests – These tests are generally developed alongside the code by the development team and help ensure the code implementation is aligned with the requirements or specification.
Integration Tests – These combine multiple components or sub-components of the system to ensure the software works as intended across system boundaries and that the configuration is correct (a sketch follows this list).
E2E Tests – A form of integration test that takes the concept a step further by exercising the end-to-end process from start to finish. This type of testing is more complicated and harder to repeat, as system state can be altered by the act of testing.
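To make the distinction concrete, the sketch below shows what a small integration test might look like in C# with NUnit. The OrderService and repository classes are hypothetical, used for illustration only; the point is that a real (in-memory) implementation is wired in rather than a mock, so the components are exercised together.

```csharp
using System.Collections.Generic;
using System.Linq;
using NUnit.Framework;

// Hypothetical components, for illustration only.
public interface IOrderRepository
{
    void Add(string orderId);
    IEnumerable<string> GetAll();
}

// A real (in-memory) implementation is used instead of a mock,
// so the test exercises the service and repository together.
public class InMemoryOrderRepository : IOrderRepository
{
    private readonly List<string> _orders = new();
    public void Add(string orderId) => _orders.Add(orderId);
    public IEnumerable<string> GetAll() => _orders;
}

public class OrderService
{
    private readonly IOrderRepository _repository;
    public OrderService(IOrderRepository repository) => _repository = repository;

    public void PlaceOrder(string orderId) => _repository.Add(orderId);
    public int CountOrders() => _repository.GetAll().Count();
}

[TestFixture]
public class OrderServiceIntegrationTests
{
    [Test]
    public void PlaceOrder_StoresOrder_AcrossServiceAndRepository()
    {
        var service = new OrderService(new InMemoryOrderRepository());

        service.PlaceOrder("ORD-001");

        Assert.That(service.CountOrders(), Is.EqualTo(1));
    }
}
```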
Non-Functional Testing
This testing validates the aspects of the system that are built around user or owner expectations rather than specific functionality; for example, a user expects a website to respond within one second of clicking a button.
Performance Tests – These validate the overall performance of the system wherever performance matters to the business or is noticeable to users, for example an overnight batch process that is expected to complete within one hour, or a website page that must respond within a reasonable time.
Security / Penetration Tests – These are critical tests before putting a system live, especially for a publicly available solution. This is always the case with Cloud software, as the software is potentially available to the public if not configured securely. The principle here should be to expose only the minimum surface area to the outside world that is absolutely necessary for the system to operate.
Tooling
NUnit and Moq – These are publicly available libraries used by the development team to automate unit testing of classes and methods in the codebase. We integrated the test suite into the automated builds using Azure DevOps YAML pipelines to compile and execute all tests on check-in to the central repository. This gives excellent visibility of the state of the codebase and offers quick feedback to the team if something is broken. Test coverage reports were then used to indicate which areas of the codebase had lower coverage and could be targeted by the team. On average we achieved around 75% code coverage across the entire solution. F-tek believe anything above 75-80% code coverage of the business-critical logic of a system is adequate; beyond this the return on investment diminishes, with the effort often spent covering very basic code structures rather than business logic.
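As a minimal sketch of the approach (the BasketCalculator and IPriceProvider names below are hypothetical, not taken from the actual codebase), a typical NUnit test uses Moq to isolate the class under test from its dependencies:

```csharp
using Moq;
using NUnit.Framework;

// Hypothetical dependency and class under test, for illustration only.
public interface IPriceProvider
{
    decimal GetUnitPrice(string sku);
}

public class BasketCalculator
{
    private readonly IPriceProvider _prices;
    public BasketCalculator(IPriceProvider prices) => _prices = prices;

    public decimal Total(string sku, int quantity) => _prices.GetUnitPrice(sku) * quantity;
}

[TestFixture]
public class BasketCalculatorTests
{
    [Test]
    public void Total_MultipliesUnitPriceByQuantity()
    {
        // Arrange: mock the dependency so only BasketCalculator is under test.
        var priceProvider = new Mock<IPriceProvider>();
        priceProvider.Setup(p => p.GetUnitPrice("ABC-123")).Returns(2.50m);
        var calculator = new BasketCalculator(priceProvider.Object);

        // Act
        var total = calculator.Total("ABC-123", 4);

        // Assert: check the result and that the dependency was called exactly once.
        Assert.That(total, Is.EqualTo(10.00m));
        priceProvider.Verify(p => p.GetUnitPrice("ABC-123"), Times.Once());
    }
}
```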
Specflow – This is a tool intended to help bridge the gap between the development team and the business users or analysts. Tests are written in a more human-readable format in stages of Given X, When Y, Then Z should be true/false. This pattern is akin to the unit-testing pattern of Arrange (Setup), Act (Execute), Assert (Check or Validate). F-tek introduced Specflow to enable the manual tests to be automated in a fashion that made sense to the testing team and could be read and understood by the business. The development and test teams could then work together to implement the underlying test code, driven by the human-readable steps, to operate the system in the desired way.
One great feature of Specflow is that steps can be re-used and combined to form new tests, reducing maintenance and offering re-usable components for new test scenarios. These tests were also integrated into the CI/CD process of the solution in Azure DevOps, but required deployment of the Azure artefacts before they could run, as they are generally more focused on the runtime behaviour of the system and cover the Integration and E2E test suites.
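To illustrate the Given/When/Then pattern, the sketch below shows a hypothetical Specflow binding class; the scenario wording and step implementations are assumptions for illustration, with the Gherkin the business would read included as a comment.

```csharp
using NUnit.Framework;
using TechTalk.SpecFlow;

// Feature file (written and read by the business/test team):
//
//   Scenario: Registered customer places an order
//     Given a registered customer
//     When they place an order for 2 items
//     Then the order is accepted

[Binding]
public class OrderSteps
{
    private bool _isRegistered;
    private int _itemCount;
    private bool _orderAccepted;

    [Given(@"a registered customer")]
    public void GivenARegisteredCustomer()
    {
        _isRegistered = true;
    }

    [When(@"they place an order for (\d+) items")]
    public void WhenTheyPlaceAnOrder(int itemCount)
    {
        _itemCount = itemCount;
        // In a real suite this step would drive the deployed system or its API.
        _orderAccepted = _isRegistered && _itemCount > 0;
    }

    [Then(@"the order is accepted")]
    public void ThenTheOrderIsAccepted()
    {
        Assert.That(_orderAccepted, Is.True);
    }
}
```

Because each step is bound by its wording, the Given and Then steps above could be re-used unchanged in other scenarios, which is where the maintenance saving comes from.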
However, as of 31st December 2024, Specflow is no longer supported and is not compatible with .NET 8 libraries. Companies relying on Specflow for BDD testing will need to select an alternative technology to maintain relevant and effective testing practices. Several options are available for replacing Specflow, including Reqnroll, FitNesse, and Gauge, among others. These tools provide modern features and ongoing support, ensuring that teams can continue to leverage Behaviour-Driven Development (BDD) effectively in their workflows.
JMeter – Used for performance and load testing the system and individual APIs. It enables us to replicate real user load by running multiple workers against the target system and recording the response times and error rates. The initial step was to identify target workloads and define relevant performance metrics that met the business and user requirements. Once these were defined, we were able to create relevant test scenarios which could be run at scale. Because the solution is a Service Oriented Architecture (SOA), we performance tested individual services of the system; however, the performance of the whole system is not simply the sum of its parts, as the interaction between dependent services introduces extra latency, so the system should also be tested as a whole to confirm that overall performance meets the business requirements.
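JMeter test plans are normally built in the JMeter GUI or stored as JMX files rather than written as code, but the underlying idea of running concurrent workers against an endpoint and recording response times and error rates can be sketched in C# as below; the URL and worker counts are placeholders, not the real test plan.

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

public static class SimpleLoadTest
{
    // Placeholder values; a real run would use the target service URL and
    // worker counts derived from the agreed performance requirements.
    private const string TargetUrl = "https://example.com/api/health";
    private const int Workers = 10;
    private const int RequestsPerWorker = 20;

    public static async Task Main()
    {
        using var client = new HttpClient();

        // Each worker issues requests in sequence and records the elapsed time
        // of every successful response.
        var workerTasks = Enumerable.Range(0, Workers).Select(async _ =>
        {
            var timings = new List<double>();
            for (var i = 0; i < RequestsPerWorker; i++)
            {
                var watch = Stopwatch.StartNew();
                using var response = await client.GetAsync(TargetUrl);
                watch.Stop();
                if (response.IsSuccessStatusCode)
                    timings.Add(watch.Elapsed.TotalMilliseconds);
            }
            return timings;
        });

        var successes = (await Task.WhenAll(workerTasks)).SelectMany(t => t).ToList();
        var total = Workers * RequestsPerWorker;

        Console.WriteLine($"Requests: {total}, errors: {total - successes.Count}");
        if (successes.Count > 0)
        {
            Console.WriteLine($"Average response time: {successes.Average():F0} ms");
            Console.WriteLine($"Max response time: {successes.Max():F0} ms");
        }
    }
}
```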
Penetration Testing
This activity was, and normally is, outsourced to an external party to perform an independent evaluation of the system. This approach removes any prior knowledge or bias from the testing phase and gives a realistic view of the techniques an external attacker would use against an IT system. Using the software best practices of “least privilege” access and “security in depth”, we passed the assessment easily at the first attempt, as there were no unexpected parts of the system accessible to external users. In addition, the underlying code practices were implemented to prevent common attack vectors such as SQL injection, cross-site scripting or session hijacking, amongst others.
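As an illustration of one of those code practices (a hedged sketch with a made-up table and connection string, not the project's actual data access code), parameterised queries keep user input out of the SQL text and are the standard defence against SQL injection:

```csharp
using Microsoft.Data.SqlClient;

public static class CustomerLookup
{
    // Placeholder connection string and table name, for illustration only.
    private const string ConnectionString =
        "Server=.;Database=ExampleDb;Integrated Security=true;TrustServerCertificate=true";

    public static string? GetCustomerName(string customerId)
    {
        using var connection = new SqlConnection(ConnectionString);
        connection.Open();

        // The user-supplied value is passed as a parameter, never concatenated
        // into the SQL text, so it cannot change the structure of the query.
        using var command = new SqlCommand(
            "SELECT Name FROM Customers WHERE CustomerId = @customerId", connection);
        command.Parameters.Add(new SqlParameter("@customerId", customerId));

        return command.ExecuteScalar() as string;
    }
}
```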