Book Title: Performance Testing - Microsoft Dynamics 365 Finance and Operations For Sales Order Creation Web Service by Deploying Blazemeter and Jmeter
All content following this page was uploaded by Shahid Ali on 08 September 2020.
ABSTRACT
Microsoft Dynamics 365 is a cloud-based product that serves users with widely varying production
capacities. To keep Microsoft Dynamics 365 reliable and fast, and to maintain the product's leading
position in its segment of the software industry, performance testing is a vital measure. Performance
testing is an important and necessary test process to ensure that the product is free of performance
problems such as long load times, poor response times and poor scalability. These testing activities
also help to identify bottlenecks in the product's hardware-software platform that can lead to
significant delays and even to the failure or crash of the resource as a whole. Apache JMeter and
BlazeMeter are automation tools well suited to accelerating the performance-testing process at this
stage. Based on the results of the various performance tests, software and hardware engineers can
develop and implement measures to improve the performance of both individual modules and the
resource as a whole. After fixes and updates are implemented, and after scheduled or restoration
maintenance in the hardware-software environment of the product, a regression test should be
conducted to validate the performance of Microsoft Dynamics 365.
KEYWORDS
Microsoft Dynamics 365, Performance Testing (Smoke, Scalability, Load, Volume, Stress, Spike), Automation
Testing Tools (Apache JMeter, BlazeMeter)
1 Executive Summary
This report intends to give an overview of the work done for the performance testing of Microsoft
Dynamics 365 for Finance and Operations order creation web service.
Based on requirements for the performance testing of Microsoft Dynamics 365 for Finance and
Operations order creation web service, the following work was carried out:
• Materials were collected and reviewed on the work performed in the project and related areas
for further use in the implementation of the project.
• The functional features of the Microsoft Dynamics 365 system, the order creation web service
function and the data exchange protocol were studied.
• In accordance with the Agile approach, a project implementation schedule was developed,
and a communication plan was drawn up.
• Based on the standards and best practices of software testing, the following were identified:
collection metrics, test deliverables, Pass/Fail criteria, suspension and resumption criteria and
the sign-off process as well as a defect report and the tracking process.
• Based on the analysis of the system and the test environment, the risks were identified, and
the test environment and project structure were described.
• A market analysis of performance testing tools was also conducted, and BlazeMeter and
JMeter tools for the implementation of the project were selected.
• Based on the business requirements and the Microsoft Dynamics 365 for Finance and
Operations order creation web service analysis, a Traceability Matrix was described, and
manual and automation test cases were prepared.
• Using the selected tools, scripts were written for automated performance testing, and both
manual and automated testing were performed. In the process of preparing the automated
tests, it was found that the testing must use requests in JSON format.
• The collected results were systematised, analysed and discussed with subsequent findings
and recommendations.
• According to the test results, a pattern was revealed: the speed of writing a new sales order
decreases as the number of lines increases. Restrictions were also identified: it was not
possible to create an order containing 150 or more lines, errors occurred while writing orders
containing 70 or more lines, and the service occasionally failed when overloaded. No system
crashes were detected.
• It is recommended that attention be given to the detected issues, with further testing if
necessary. An action plan to further monitor and improve system performance is also
recommended.
2 Introduction
IASI company specialises in software testing and works as a Microsoft subcontractor. The aim of
the project is performance testing of Microsoft Dynamics 365, intended to accelerate and facilitate
decision-making based on data analysis and recommendations. Microsoft
Dynamics 365 is one of the products of Microsoft Corporation. According to the company's annual
report, in the fiscal year 2018 Microsoft Corporation delivered $110.4 billion in revenue and $35.1
billion in operating income [20]. Dynamics 365 represents the next generation of modular business
applications in the cloud. The system combines enterprise resource planning (ERP) with advanced
customer relationship management (CRM) capabilities. CRM functions provide users with the ability
to unify relationship data both inside and outside the company. ERP functionality includes tools of
artificial intelligence (AI), machine learning and mixed reality (a hybrid of reality and virtual reality).
Through automation and optimisation of manual tasks and through the introduction of intelligent
workflows, system users can work more efficiently [21].
Epic # | User Story # | User Story
EPIC1. New sales order creation | US01. | As a user I want to login to the system
EPIC1. New sales order creation | US02. | As an authorised user I want to create a new “sales order”, add 10 items and save the order, so that I can verify correct “sales order” creation, adding items and saving the order
For testing it is necessary to:
• Obtain authorised access to the service.
• Study the structure and function algorithm of the service.
• Prepare data for testing.
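The data-preparation step can be sketched programmatically. The test scripts later consume an Items.csv file, so a small generator is shown below as a minimal sketch; the column names and item-number format are illustrative assumptions, not the actual Dynamics 365 schema:

```python
import csv
import io


def build_items_csv(count):
    """Generate CSV test data for order lines (illustrative columns).

    Column names and value formats are placeholders for whatever the
    order creation web service actually expects.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["ItemNumber", "Quantity", "UnitPrice"])
    for i in range(1, count + 1):
        # Item numbers like D0001, D0002, ... with small varied quantities.
        writer.writerow([f"D{i:04d}", 1 + i % 5, round(9.99 + i * 0.5, 2)])
    return buf.getvalue()
```

A file produced this way can be fed to a CSV-driven parameterisation element so that each request picks up different item data.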
Figures 1, 2 and 3 show the structure of the service, namely: the access page of Microsoft Dynamics
365, the page with the menu for creating a new sales order, and the page for adding new lines and
items and saving the order.
The following is the architecture of the testing environment in which the testing process is conducted.
Testing Environment
(Diagram: the testing environment comprises a local network with a local router, the BBA-SPARK
global-gateway cloud service and ax.dynamics.com.)
It is also necessary to consider the unavailability of the service for unauthorised users during
preparation of a project.
Since the system under test may be in a state of peak loads or even failure during the performance
testing process, the service testing schedule is coordinated and flexibly adapted to meet the company's
production requirements.
4.2 Methodology
The software development lifecycle in the modern IT industry is now so fast and flexible that it is
hard to imagine it without the Agile approach.
Even though this is a short-term project for educational purposes, the Scrum [19] approach will be
followed during its implementation.
The project Implementation is divided into five sprints:
Sprints for the Project
4.4 Schedule
Conventionally, the project implementation process can be divided into three parts:
• The initialisation part includes familiarisation with the task and the development of proposals
for the project implementation.
• The implementation of the practical part of the project and report-writing.
• Finalisation of the project and discussion of its results.
Schedule for the Project
Table 4 presents a detailed schedule for the project. The table shows the three stages of the project
with their time estimates, namely the initialisation, the implementation and the finalisation.
Task name
• Setting and initial familiarisation with the task
• Proposals presentation
• The study of the system function block under the test
• Preparation of the Test Execution Environment
• Report writing
• Finalised presentation
4.6 Collection Metrics
To monitor the test execution process and further analyse its results and issue recommendations, the
following metrics are collected:
Metrics Collection
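The aggregate statistics typically collected here (average response time, 90th percentile, error rate, throughput) can be reproduced from raw samples. A minimal sketch, using a rank-based percentile as an approximation of how an Aggregate Report summarises a run:

```python
import math


def aggregate(samples, duration_s):
    """Summarise response-time samples as an aggregate report would.

    samples: list of (elapsed_ms, success_bool) pairs.
    duration_s: wall-clock span of the test, in seconds.
    """
    times = sorted(t for t, _ in samples)
    n = len(times)
    errors = sum(1 for _, ok in samples if not ok)
    # Rank-based 90th percentile: the ceil(0.9 * n)-th smallest sample.
    p90 = times[min(n - 1, math.ceil(0.9 * n) - 1)]
    return {
        "average_ms": sum(times) / n,
        "p90_ms": p90,
        "error_pct": 100.0 * errors / n,
        "throughput_rps": n / duration_s,
    }
```

Running the same summary over each test's sample log makes the per-test results in the later sections directly comparable.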
4.8 Risks
The table below provides an analysis of the risks that may arise during testing.
Risks identification
(Diagram: defect-tracking workflow — a new defect receives Status = Open; unless it is a complex
issue, it is fixed (Status = Fixed) and the tester re-tests; if the re-test passes, Status = Closed.)
(Diagram: test architecture — users drive JMeter, which sends requests to ax.dynamics.com and
produces reports.)
Accordingly, BlazeMeter will be used to record test scripts and Apache JMeter will be used for
performance testing to implement part of the project related to recording, parameterisation and
execution of automated test scenarios.
Selected Tools
Tool Usage
BlazeMeter Test scripts recording
Apache JMeter with plugins Performance, Load, Volume, Spike and Stress
testing; Metrics tracking
Traceability Matrix 2
Req ID | Requirement Description | TS ID | Test Scenario | TC ID | Test Case Description

Manual test cases:
User story ID | Test case ID | Test case description
US01. | TC01. | Enter into the Microsoft Dynamics 365 system with the valid parameters
US02. | TC02. | Create a new sales order, add 10 items and save the order

Automation test cases:
User story ID | Test case ID | Test case description
US01. | TCA01. | Enter into the Microsoft Dynamics 365 system with the valid parameters
US02. | TCA02. | Create new sales orders 100, 200, 300 and so on, add 10, 20, 30 and so on items and save the order
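The stepped pattern behind TCA02 (100, 200, 300 orders with 10, 20, 30 items and so on) can be written down as a simple load plan. The step multiples come from the test case; the function itself is just an illustration:

```python
def load_plan(max_step):
    """Return (orders, lines_per_order) pairs for the stepped run:
    step 1 -> 100 orders x 10 lines, step 2 -> 200 x 20, and so on."""
    return [(100 * k, 10 * k) for k in range(1, max_step + 1)]
```

Each pair then drives one iteration of the automated scenario, so the load grows in both the number of orders and the size of each order.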
4.17 Screenshot of Scripts
The screenshots below show the body of the project and explain the idea of its architecture, scripting
and request generation.
Thread Group contains the base script library:
• Transaction controller
  • Simple controller (authorisation)
    • Authorisation HTTP request
    • Authorisation parameters configurator
    • Extractor for JSON format
    • Authorisation assertion “200”
  • Simple controller (order creation)
    • New sales order creation HTTP request, x1 line
      • Parameterisation generator for WEBORDID
      • Parameterisation generator for lines in JSON
      • Assertion “200”
      • Recorder of response body
      • Recorder of request body
      • Configurator of order HTTP request parameters
    • New sales order creation HTTP request, x10 lines
      • Parameterisation generator for WEBORDID
      • Parameterisation generator for lines in JSON
      • Assertion “200”
      • Recorder of response body
      • Recorder of request body
      • Order HTTP request parameters configurator
    • Items parameters configurator (Items.csv)
Figure 14. Script for saving the response from service to file
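The “parameterisation generator for WEBORDID” and “for lines in JSON” elements amount to building a fresh JSON payload per sample. A hedged Python sketch of that step; the field names (WEBORDID aside, which appears in the script element names) and overall payload shape are assumptions, not the actual service contract:

```python
import itertools
import json

# Module-level counter so each generated order gets a unique id,
# mirroring the per-request parameterisation in the script.
_order_counter = itertools.count(1)


def build_order_payload(line_count, items):
    """Build a JSON body for one 'create sales order' request.

    line_count: number of order lines to generate.
    items: pool of item numbers to cycle through.
    """
    order_id = f"WEB{next(_order_counter):06d}"
    lines = [
        {"lineNumber": n, "itemNumber": items[(n - 1) % len(items)], "quantity": 1}
        for n in range(1, line_count + 1)
    ]
    return json.dumps({"WEBORDID": order_id, "lines": lines})
```

Generating the body this way makes it trivial to scale the line count from 1 to 150 and beyond, which is exactly the axis along which the defects described later were found.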
Using Logic controllers on the example of the implementation of one of the groups:
Test case ID | Test case description | Expected result | Actual result | Status
TC01. | Enter into the Microsoft Dynamics 365 system with the valid parameters | User should be able to see the application start page | As expected | PASS
TC02. | Create a new Sales order, add 10 items and save the order | User should be able to create a new Sales order, add 10 items and save the order | As expected | PASS
The smoke test execution passed smoothly and validated the workability of the basic script with basic
parameters.
Figure 29. Smoke test. The scalability of the system response for 20 lines Aggregate Graph
Figure 30. Smoke test. The scalability of the system response for 30 lines Aggregate Graph
Figure 31. Smoke test. The scalability of the system response for 40 lines Aggregate Graph
Figure 32. Smoke test. The scalability of the system response for 50 lines Aggregate Graph
Figure 33. Smoke test. The scalability of the system response for 100 lines Aggregate Graph
Figure 34. Smoke test. The scalability of the system response for 140 lines Aggregate Graph
A comparative analysis of the time taken by the system to process the recording of a new order with
100 and 140 lines revealed a “CLRError” defect in the record. This defect is described in detail in the
section “6 Discussion”. To identify the moment this defect first occurs, additional testing of the
system was carried out in the range from 10 to 140 lines, stepping up by 10 lines each time.
Figure 35. Smoke test. The scalability of the system response for 10 to 140 lines Aggregate Graph
Figure 36. Smoke test. The scalability of the system response for 10 to 140 lines Aggregate Report
No overloads of the load-generating test rig that could have affected the results were observed
during test execution.
Figure 39. Smoke test. The scalability of the system response
for 10 to 140 lines Response Time Graph
Figure 40. Defect “CLRError”. Smoke test. The scalability of the system response
for 10 to 140 lines View Results Tree
According to the test results, it was possible to establish the turning point at which the defect begins
to appear, namely in order records with 70 lines. Further discussion of the defect continues in the
section “6 Discussion”. For clarity, a number of metrics are shown that demonstrate that the
load-generating rig did not influence the test result.
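The step-up procedure used to locate the turning point can be viewed as a simple search loop: probe the order creation service at increasing line counts and report the first count at which an error appears. Here `probe` is a stand-in for a real JMeter sample against the service:

```python
def find_error_threshold(probe, start=10, stop=140, step=10):
    """Return the smallest line count in [start, stop] for which the
    probe reports a failure, or None if every step succeeds.

    probe(line_count) -> True on success, False on a recording error;
    in the real test each probe was a JMeter sample against the service.
    """
    for lines in range(start, stop + 1, step):
        if not probe(lines):
            return lines
    return None
```

With the observed behaviour (errors from 70 lines onward), this loop would stop at 70, matching the turning point reported above.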
The Load test Aggregate Graph provides a clear visual indication of a significant decrease in the
performance of the data recording service as the load increases, up to the appearance of errors.
Figure 44. Load test. Bytes Throughput Over Time
No overloads of the load-generating test rig that could have affected the results were observed
during test execution. For clarity, a number of metrics are shown that demonstrate that the
load-generating rig did not influence the test result.
Figure 62. Spike and Stress test. Thread Group New Order LOAD
Figure 63. Spike and Stress test. Thread Group New Order STRESS
Figure 64. Spike and Stress test. Active Threads Over Time
This graph displays the activity of two Thread Groups during testing. It was planned to generate two
peak activities, but the test was interrupted due to critical recording errors.
Figure 65. Spike and Stress test. Summary Aggregate Graph
Figure 66. Spike and Stress test. Base load Aggregate Graph
Figure 67. Spike and Stress test. Spike load Aggregate Graph
The above graphs clearly reflect the increase in the delay in saving a new order during peak loads.
The results of the Spike and Stress test confirm the results obtained in the tests above. No overloads
of the load-generating test rig that could have affected the results were observed during test
execution.
6 Discussion
In this section the project implementation stages, the issues encountered during the implementation
of the project and the defects found are discussed.
Performance testing of Microsoft Dynamics 365 for the order creation web service has been
carried out. Based on the methodologies and standards of the software testing industry, a test plan
was developed, automated performance scripts were prepared and executed, and metrics were
collected and analysed, followed by conclusions and future recommendations.
Using BlazeMeter for script recording revealed an authorisation problem. The need to use requests
in JSON format for testing was also identified. In practice, the value of BlazeMeter was limited to
familiarisation with the format of the data packets, so the script was developed manually. The
request in JSON format was processed and prepared for use with a Logic Controller system.
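The JSON extractor step in the script amounts to pulling a value, typically an authorisation token, out of a JSON response body for reuse in later requests. A minimal Python equivalent using a dotted path; the field names in the example are placeholders, not the actual Dynamics 365 response schema:

```python
import json


def extract_json_value(response_text, dotted_path):
    """Extract a nested value from a JSON response body.

    Example: extract_json_value(body, "auth.access_token") walks the
    parsed JSON one key at a time, like a simple JSON-path extractor.
    """
    value = json.loads(response_text)
    for key in dotted_path.split("."):
        value = value[key]
    return value
```

The extracted value can then be substituted into the headers or body of subsequent requests, which is what makes the recorded authorisation flow replayable.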
Logic Controllers were used in the creation of Thread Groups for the different types of
performance testing. The following performance testing types were conducted:
• Smoke test.
• Smoke test: the scalability of the system response.
• Load test.
• Volume test.
• Spike and Stress test.
In the smoke test, an examination of the base script was carried out to validate its performance. In the
smoke test for scalability of the response of the system, the dependence of the recording speed of the
sales order on the number of lines was checked. In the Load test, the reaction of the system to an
increase in load was studied, and the approximate allowable load range was determined for further
testing of the system without significant failures. In the volume test, the stability of the system under
sustained load was checked. Because the order recording speed depends heavily on the number of
lines and errors occurred rather frequently, it was decided not to overload the system and to
combine the spike and stress tests into one. In this combined test, the plan was to simulate the base
component of the load, create two peaks by adding extra load, and interrupt the test if an error
occurred. One error occurred and the test was interrupted. All the above types of performance
testing were carried out many times because of the need to scale the load on the system.
The first issue was getting access to the Microsoft Dynamics CRM 365. The BlazeMeter recorded
script was not able to perform testing because of authentication issues. The system has a high level
of security and requires special privileges to run tests. To solve this problem, I turned to my supervisor
and received all the necessary information to complete the task.
The second issue was the slowness of the sales order saving process. Even during manual testing, it
was revealed that saving sales orders was slow. Based on this observation, it was decided to conduct
a scalability test of the system response, which revealed a sharp increase in the time required to save
an order as the number of lines increased. A bug or restriction was also found when creating an
order of more than 150 lines, along with a recording error when there are 70 or more lines in the
order.
As a result of the work done on performance testing Microsoft Dynamics 365 for the order creation
web service function, the following defects were discovered: 1) inability to save an order containing
more than 150 lines; 2) service outages when saving an order with 70 or more lines; 3) failure of the
service under heavy loads.
The low performance of this service when saving a new order in the database should also be
emphasised.
The error “Inability to save an order containing 150 or more lines” (Figure 73) can be caused by a
bug in the code or by a restriction in the service.
Service outages when saving an order with 70 or more lines are undoubtedly a bug and need to be fixed.
The “Failure of the service under heavy loads” defect can be caused by a software error, by incorrect
configuration of one of the services that support the saving operation, by insufficiently powerful
hardware, or by a combination of all of the above. To study the causes of this defect, access to all
components of the system must be provided. As for the low performance of the service when saving
a new order in the database, it can be assumed with high probability that it is caused by a lack of
resources allocated to the database, although a security-protocol algorithm or other configuration
problem may also have influenced the processing of the package. Here, too, access to all components
of the system is needed to study the cause.
7 Conclusion
Well-designed and prepared project documentation is an important part of the initialisation of the
project, its implementation, analysis of the results, drawing conclusions about the work done and
drawing up plans for future improvements.
At this stage, the implementation plan determines and lays down the principles, methods, tools and
means necessary for the implementation. The project is also divided into sub-tasks, and milestones
are determined that help to plan correctly and subsequently control all stages of implementation.
With a properly designed implementation plan, production time, resources and financial costs are
reduced. For example: firstly, correctly selected tools reduce the time spent on recording,
parameterising and executing scripts, as well as on collecting the metrics necessary for issuing
recommendations; secondly, a properly designed communication plan reduces the likelihood of
conflicts and misunderstandings during implementation. It is well known that a well-prepared
implementation plan gives a project a much greater chance of success than failure.
Moreover, at the end of the project, the Implementation plan is used to compare the results obtained
with the expected ones and this knowledge can be further used to optimise a similar project or for
evaluative analysis when developing a proposal on a new topic. That is why all the effort spent on
the Implementation plan will pay off in the future.
Undoubtedly, to automate a process, it is necessary first to "touch" it. The tester must know and
understand the system under test and correctly describe the necessary actions for future use in test
cases and script writing. Therefore, manual testing is an integral part of any testing automation
process.
The modern tester must know and understand hardware architecture and network technologies, as
well as various operating systems, software systems and cloud web services, whose various parts
can be located in different places around the planet.
Using modern specialised software allows the tester to significantly speed up the testing process;
quickly record, parameterise, find errors in automation scripts, and debug them, as well as collect the
necessary metrics for analysis. A clear confirmation of this is the JMeter tool and its various features
and plug-ins, such as Thread Groups, Config elements, Listeners, Pre- and Post-Processors,
Assertions and Controllers, all of which were used in this project. Apache JMeter is a convenient and
sufficient tool not only for performance testing of Microsoft Dynamics 365 for the order creation
web service, but also for further support and development of system performance testing.
Using parameterisation in script writing can significantly expand the range of data changes, thereby
increasing the coverage of tests. Collecting various types of metrics in a run-time test allows for
qualitative data analysis.
In the process of preparing the project, materials were used that were studied both in the school
curriculum and from external sources. Practical application of this knowledge brings valuable
experience necessary for future work in the industry.
To briefly summarise, this project confirms the significance of performance testing.
Obviously, performance testing helps to identify performance weaknesses in the functioning of both
software and hardware environments and to develop a plan for their elimination and performance
improvement.
8 Future recommendations
The defects identified during this research work can be re-tested if necessary, and regression testing
will be conducted once the defects have been clarified and fixed.
Since Microsoft Dynamics 365 ships in 24 languages, localisation testing of the system should be
conducted to rule out a drop in performance under different regional settings, by preparing more
data and performing deeper request parameterisation.
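Extending the request parameterisation across regional settings could look like crossing the existing load plan with a locale dimension. The locale list and header name below are illustrative assumptions:

```python
import itertools


def localised_cases(line_counts, locales):
    """Cross order line counts with locales to widen test coverage.

    Each case carries an Accept-Language header value; a real test
    would also need locale-specific item data for each region.
    """
    return [
        {"lines": n, "headers": {"Accept-Language": loc}}
        for n, loc in itertools.product(line_counts, locales)
    ]
```

This keeps the performance dimension (line count) and the localisation dimension (regional settings) in one flat list of cases that a script can iterate over.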
To validate stable performance of the Microsoft Dynamics 365 system and to extend performance
test coverage it is also necessary to:
• Conduct separate performance testing of all system services and, based on the results, develop
recommendations for increasing their performance.
• Develop a comprehensive plan for testing overall system performance.
• Design a regression performance test for the system as a whole and conduct both regular
scheduled testing and system testing after detecting errors or making changes.
• Conduct web-application security testing.
For further work on improving and automating performance-testing processes, identifying potential
problems and reducing costs, it is important to draw on modern research ideas such as:
• Reducing the cost of testing performance and identifying potential problems with service
performance through the implementation of new inventions [27].
• Developing special testing mechanisms to evaluate the correctness and good performance of
the system [25].
• Using a modern methodology for testing web application performance, which allows testers
to increase the performance of all testing processes, from the development of test cases and
the generation of test scripts to the test case execution [13] and for facilitating and simplifying
this process [11].
9 References
[1] Abbors, F., Ahmad, T., Truscan, D., & Porres, I. (2013, April). Model-based performance testing
in the cloud using the mbpet tool. In ICPE (pp. 423-424).
[2] Baksi, A. K. (2013). Exploring nomological link between automated service quality, customer
satisfaction and behavioural intentions with CRM performance indexing approach: Empirical evidence
from Indian banking industry. Management Science Letters, 3(1), 1-22.
[3] Batada, I., & Rahman, A. (2012). Measuring system performance & user satisfaction after
implementation of ERP. In Proceedings of Informing Science & IT Education Conference (InSITE)
(pp. 603-611).
[4] Beckner, M. (2017). Administering, Configuring, and Maintaining Microsoft Dynamics 365 in the
Cloud. Berlin, Germany: Walter de Gruyter GmbH & Co KG.
[5] Chhetri, M. B., Chichin, S., Vo, Q. B., & Kowalczyk, R. (2013, June). Smart CloudBench--
Automated Performance Benchmarking of the Cloud. In 2013 IEEE Sixth International Conference on
Cloud Computing (pp. 414-421). IEEE.
[6] Erinle, B. (2017). Performance Testing with JMeter 3. Birmingham, England: Packt Publishing.
[7] Grechanik, M., Fu, C., & Xie, Q. (2012, June). Automatically finding performance problems with
feedback-directed learning software testing. In 2012 34th International Conference on Software
Engineering (ICSE) (pp. 156-166). IEEE.
[8] Hooda, I., & Singh Chhillar, R. (2015). Software test process, testing types and techniques.
International Journal of Computer Applications, 111(13).
[9] How to run performance test for dynamics 365? (2018). Retrieved from
https://community.dynamics.com/365/financeandoperations/b/howtodynamics365/posts/how-to-run-
performance-test-for-dynamics-365
[10] Huang, P., Ma, X., Shen, D., & Zhou, Y. (2014, May). Performance regression testing target
prioritization via performance risk analysis. In Proceedings of the 36th International Conference on
Software Engineering (pp. 60-71). ACM.
[11] Jayasinghe, D., Swint, G., Malkowski, S., Li, J., Wang, Q., Park, J., & Pu, C. (2012, June).
Expertus: A generator approach to automate performance testing in iaas clouds. In 2012 IEEE Fifth
International Conference on Cloud Computing (pp. 115-122). IEEE.
[12] Jin, G., Song, L., Shi, X., Scherpelz, J., & Lu, S. (2012). Understanding and detecting real-world
performance bugs. ACM SIGPLAN Notices, 47(6), 77-88.
[13] Kao, C. H., Lin, C. C., & Chen, J. (2013, July). Performance testing framework for rest-based web
applications. In 2013 13th International Conference on Quality Software (pp. 349-354). IEEE.
[14] Khanapurkar, A., Parab, O., & Malan, S. (2014). U.S. Patent No. 8,756,586. Washington, DC: U.S.
Patent and Trademark Office.
[15] Kiran, S., Mohapatra, A., & Swamy, R. (2015, August). Experiences in performance testing of web
applications with Unified Authentication platform using JMeter. In 2015 international symposium on
technology management and emerging technologies (ISTMET) (pp. 74-78). IEEE.
[16] Lewis, W. E. (2017). Software testing and continuous quality improvement. Auerbach publications.
[17] Load Testing Dynamics CRM / 365 with LoadRunner. (2018). Retrieved from
https://community.dynamics.com/crm/b/crmperformancetesting/posts/dynamics-crm-365-
performance-testing-with-loadrunner
[18] Luszczak, A. (2018). Using Microsoft Dynamics 365 for Finance and Operations: Learn and
understand the functionality of Microsoft's enterprise solution. Springer.
[19] Maximini, D. (2015). The Scrum Culture. Switzerland: Springer International Publishing.
[20] Microsoft Annual Report 2018. (2018). Retrieved from https://www.microsoft.com/en-
us/annualreports/ar2018/annualreport
[21] Microsoft Dynamics 365 Software. (n.d.). Retrieved from
https://www.softwareadvice.com/nz/crm/dynamics-365-profile/
[22] Patil, S. S., & Joshi, S. D. (2012). Identification of Performance Improving Factors for Web
Application by Performance Testing. Int. J. Emerg. Technol. Adv. Eng., 2(8), 433-436.
[23] Rodrigues, A. G., Demion, B., & Mouawad, P. (2019). Master Apache JMeter - From Load Testing
to DevOps: Master performance testing with JMeter. Birmingham, England: Packt Publishing.
[24] Sadiq, M., Iqbal, M. S., Malip, A., & Othman, W. M. (2015). A Survey of Most Common Referred
Automated Performance Testing Tools. ARPN Journal of Science and Technology, 5(11), 525-536.
[25] Segura, S., Galindo, J. A., Benavides, D., Parejo, J. A., & Ruiz-Cortés, A. (2012, January). BeTTy:
benchmarking and testing on the automated analysis of feature models. In Proceedings of the Sixth
International Workshop on Variability Modeling of Software-Intensive Systems (pp. 63-71). ACM.
[26] Varela-González, M., González-Jorge, H., Riveiro, B., & Arias, P. (2013). Performance testing of
LiDAR exploitation software. Computers & Geosciences, 54, 122-129.
[27] Zhou, J., Zhou, B., & Li, S. (2014, July). Automated model-based performance testing for PaaS cloud
services. In 2014 IEEE 38th International Computer Software and Applications Conference
Workshops (pp. 644-649). IEEE.