1 Introduction
The performance tests for Blueriq follow a user-centric approach. Since Blueriq is a platform and not an end-application, a reference application is used to test the performance from a user perspective.
The performance tests focus on Dynamic Case Management applications. DCM applications combine all three Blueriq propositions: Decision Management (decision logic for (automated) decisions), Customer Experience Management (intelligent forms and documents), and Dynamic Case Management (dynamic handling of cases).
The performance tests are based on:
- A realistic Dynamic Case Management application;
- A typical infrastructure configuration (hardware, database, …) with one application server and one database server;
- A realistic load (number of users and the actions they perform) for a single server.
For reporting the performance, a standard metric (Apdex) is used that represents the performance from the viewpoint of the end user. Apdex normalizes the measurements into a single number between 0 and 1 that reflects end-user satisfaction, ranging from satisfied (1) through tolerating to frustrated (0).
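To illustrate how this metric works, the sketch below implements the standard Apdex formula: samples at or below a threshold T count as satisfied, samples between T and 4T count as tolerating (with half weight), and slower samples count as frustrated. The threshold of 0.5 seconds and the sample values are illustrative only; they are not the values used in the Blueriq test reports.

```python
# Minimal sketch of the standard Apdex calculation (illustrative, not Blueriq-specific).
def apdex(response_times_s, threshold_s=0.5):
    # Satisfied: response time <= T; tolerating: T < response time <= 4T; the rest is frustrated.
    satisfied = sum(1 for t in response_times_s if t <= threshold_s)
    tolerating = sum(1 for t in response_times_s if threshold_s < t <= 4 * threshold_s)
    return (satisfied + tolerating / 2) / len(response_times_s)

print(apdex([0.2, 0.4, 0.9, 1.5, 3.0]))  # 0.6: two satisfied, two tolerating, one frustrated
```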
This performance report gives customers insight into what they can expect of the performance of Blueriq applications. In a customer application, however, the performance depends on a large number of variables such as infrastructure, custom code, the Blueriq model, etc. Therefore, it is strongly recommended to perform appropriate performance tests on the application that is modelled with Blueriq before taking it into production.
2 Reference application
The reference application used for performance testing is a Dynamic Case Management portal for subsidy applications for the stimulation of renewable energy. The DCM application consists of a self-service portal for applicants (producers of renewable energy) where they can apply for a specific subsidy (the case), enter the relevant data and follow the progress of the case. For the civil servants (in several roles) of the government agency, the DCM application offers a mid-office portal that supports decision making and collaboration in the handling of applications. For the mid-office manager, a business activity dashboard with management information graphics is available.
User roles
The application has the following user roles:
Role | Description |
Applicant | Entrepreneur applying for a grant. Uses the self-service portal for the application. |
Intaker | Civil servant responsible for the intake of the application. |
Assessor | Grant application assessor, making the final decision. |
Advisor | Expert consulted in some scenarios before the application is granted or rejected. |
Manager | A manager who uses the Business Activity Monitoring dashboard to monitor the cases. |
Application flow
A subsidy application has 3 main phases: Apply (aanvragen), Assess (beoordelen), Grant (uitreiken). Within these main phases the flow of processes and tasks is dynamic. The possible tasks, decisions, milestones and roles are visualized in the following picture (Dutch):
Application interface
The user interface of the reference application is fully based on the out-of-the-box Blueriq dashboards.
Screen | Content |
Applicant Dashboard | The personal dashboard of the applicant. |
Applicant Case dashboard | A dashboard with an overview of the actual case. |
Application Form | A dashboard with an intelligent form for entering application information. |
Knowledge Worker Overview dashboard | The personal dashboard of the knowledge worker. |
Knowledge Worker Case dashboard | A dashboard for the knowledge worker with an overview of the actual case. |
Knowledge worker tasks | A dashboard with a form for entering application-relevant information. |
Manager BAM dashboard | A basic manager information dashboard with 4 graph widgets. |
Database contents
The database of the typical Dynamic Case Management system contains ~100.000 cases:
- Cases: 99.782 cases in database
- Tasks: 947.804 tasks in database
- Aggregates: 115.859 aggregates in database
- Trace: 951.000 entries in the trace database
Decision complexity
The decision logic in the reference application is kept simple and consists of straightforward validations on pages, preconditions on tasks, milestones, etc.
Timer
The reference application includes a timer to re-evaluate tasks. For testing purposes this timer is set to run hourly to simulate additional load on the system.
3 Simulating application usage
The simulation that is executed on the application during the performance test depends on three variables:
- the user load that must be simulated on the system during the test
- the activities these users perform (application scenarios)
- the total duration of the test execution
User load
The user load on the application is based on an estimation of 1.000.000 cases per year:
- 1.000.000 cases / year
- ~2.739 cases / day
- ~225 cases / hour (distributed over 12 hours daily, 9:00-21:00)
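As a back-of-the-envelope check of these figures (assuming 365 days per year and the 12-hour window mentioned above):

```python
# Rough check of the user load derivation; 365 days/year and 12 hours/day are the
# assumptions stated above, the exact configuration is in the test reports.
cases_per_year = 1_000_000
cases_per_day = cases_per_year // 365        # 2739
cases_per_hour = cases_per_day / 12          # ~228, rounded to 225 for the tests
print(cases_per_day, round(cases_per_hour))  # 2739 228
```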
Application scenarios
In a Dynamic Case Management application each case can be handled in a unique way. To simulate different kinds of flows of user activities, the performance tests define a number of user scenarios, as described in the table below. The total user load (225 cases / hour) is distributed over these scenarios as indicated in the column # Cases / hour; a short sketch of this distribution follows the table.
| Scenario | Description | # Cases / hour |
1 | Fully automated handling (rejection) of grant application | Straight-through handling of an application. Only activities by the Applicant. | 22 |
2 | Handling by assessor | Handling of the case by the Intaker and Assessor. | 55 |
3 | Handling by assessor with additional advice by Advisor | Same as scenario 2, but with an additional step by an Advisor. | 66 |
4 | Manager views business activity dashboard | Not really the handling of a case, but a manager opening the BAM dashboard. | 5 |
5 | Handling by assessor (with Caselist) | Same as scenario 2, but with a Caselist. | 55 |
6 | Handling by assessor with timers | | 22 |
| Total | | 225 |
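Expressed as a small sketch (the scenario labels are shorthand; only the # Cases / hour figures come from the table above), the distribution adds up to the total load of 225 cases per hour:

```python
# Scenario mix from the table above; keys are shorthand labels, values are # Cases / hour.
scenario_mix = {
    "1 fully automated rejection": 22,
    "2 handling by assessor": 55,
    "3 assessor with advisor": 66,
    "4 manager BAM dashboard": 5,
    "5 assessor with caselist": 55,
    "6 assessor with timers": 22,
}
assert sum(scenario_mix.values()) == 225  # matches the total user load per hour
```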
Duration of test execution
The current performance test is a stability test: it runs for 2 hours with a virtual user load that matches a typical business day (it is not a stress test).
4 Test environment
For the performance tests a typical hardware configuration is used that consists of two separate servers for the runtime and database. The specification of the test environment is described in the test reports.