Digital Decisioning systems can be built and deployed in any architectural framework. Many successful Digital Decisioning systems are used to extend and modernize legacy applications, with the core Decision Service deployed to a mainframe environment. Some Digital Decisioning systems are deployed to kiosks or are embedded into machinery or control equipment. Most are deployed using a Service-Oriented Architecture and are integrated with Business Process Management Systems, Event Processing Systems, or both.
Using a Service-Oriented Architecture means breaking up a system's functionality into a set of coherent services that cooperate to deliver the total functionality required. Each service has a well-defined set of interfaces that allow other services to rely on it. These interfaces also hide the details of the implementation, allowing services to be built in different languages or following different design approaches, yet still collaborate. The services are said to be “loosely coupled” because they do not need to know much, if anything, about the innards of other services; they can rely on the defined interfaces and standard protocols for accessing them.
The core of a Digital Decisioning system is conceptually a Decision Service, a service that answers decision-making questions for other services. A Decision Service does not have to be deployed as a formal service; it can also be deployed as a mainframe program module or an embedded software component. An SOA is nonetheless a perfect fit for Decision Services, as it allows them to be deployed as loosely coupled, coherent, well-behaved services that mesh easily with other services. They follow all the guiding principles of any service and implement a single decision or a closely related set of decisions.
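Conceptually, a Decision Service can be as simple as a stateless component with a well-defined interface that answers one question for any caller. The Python sketch below shows this shape for a hypothetical loan-approval decision; the field names, thresholds, and rules are purely illustrative, not any product's real API.

```python
# Minimal sketch of a Decision Service: a stateless component with a
# well-defined interface that answers a single decision-making question.
# All names and thresholds here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class LoanRequest:
    credit_score: int
    amount: float
    annual_income: float

@dataclass
class LoanDecision:
    approved: bool
    reason: str

def decide_loan(request: LoanRequest) -> LoanDecision:
    """Answers one question ("approve this loan?") for any calling service."""
    if request.credit_score < 620:
        return LoanDecision(False, "credit score below threshold")
    if request.amount > request.annual_income * 0.5:
        return LoanDecision(False, "amount exceeds half of annual income")
    return LoanDecision(True, "meets basic criteria")
```

Because the service depends only on its declared inputs and outputs, callers need no knowledge of how the decision is made, which is exactly the loose coupling an SOA requires.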
The use of an SOA platform is no different for a Decision Service than for any other service. The definitions of the Decision Service and its interfaces should be kept in a service repository like those of any other service. SOA platform technologies should be used to track the use of Decision Services by other services, to version the Decision Service's interfaces, and to perform impact analysis before making changes.
A Business Process Management System (BPMS) allows the definition and execution of all the tasks necessary to execute a business process and so fulfill a business need. These tasks may involve data entry, integration of multiple systems, human tasks such as inspections or reviews, and automated tasks. Many business processes include decision-making tasks that are best automated using Digital Decisioning systems. In addition, Digital Decisioning can be used to determine which transactions need review and which users they should be routed to.
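To make the last point concrete, the sketch below shows a hypothetical decision task inside a process that determines whether a transaction can be processed straight through or which work list it should be routed to. The queue names and thresholds are invented for the example.

```python
# Sketch of a Digital Decisioning step inside a business process: decide
# whether a transaction is straight-through processed or routed to a
# review work list. Queue names and thresholds are illustrative assumptions.

def route_transaction(txn: dict) -> str:
    """Returns the work list (or auto-approval) for a transaction."""
    if txn["amount"] > 10_000:
        return "senior-review-queue"          # high-value: human review
    if txn.get("flagged_by_fraud_model"):
        return "fraud-review-queue"           # predictive model raised a flag
    return "auto-approve"                     # straight-through processing
```

In a real BPMS, the process engine would invoke such a decision task and then place the transaction on the returned work list or continue the automated flow.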
A BPMS is made up of software components that aid with the following:
- Defining and managing the tasks in a business process
- Integrating multiple systems into a single process flow
- Executing both human and system tasks to achieve a business outcome
- Monitoring and reporting on process execution
Define and manage tasks
Business processes are often long-running sequences of tasks that cut across organizational boundaries. These sequences of tasks must be defined, managed, and successfully coordinated to ensure a business outcome. The order-to-cash process, for instance, handles all the tasks between receiving an order and getting paid for it. A strong BPMS will support a wide range of tasks and will provide a collaborative environment in which business and IT teams jointly define the tasks, the interconnections of those tasks, and their sequence. Some of these tasks will be decision-making tasks. Those that are automated will be represented by Digital Decisioning systems.
Integrate multiple systems
Many business processes cut across organizational silos and require functionality and data embedded in multiple existing systems. A BPMS will allow data to be brought into the process from multiple systems and written back out to them in a well-defined way. It will also allow the invocation of functionality in existing systems, both as services in a SOA and otherwise. This integration must be defined and managed along with the necessary data transformations to move data between them. A BPMS is built on a service-oriented foundation, making the integration of services and of service-enabled applications straightforward. This includes the integration of the necessary Decision Services.
Execute human and system tasks
A process engine handles the execution of automated tasks and the coordination of human tasks. It handles timeouts when systems or people respond too slowly, puts transactions that need review on work lists, parcels out work to available resources, and much more. When these tasks are represented by Digital Decisioning systems, they are invoked appropriately.
Monitor and report
Finally, a BPMS provides accurate information on the execution of business processes. Status information is presented to allow supervision of specific running processes, and performance information shows how well or poorly the process behaves overall. This monitoring information is integrated with information about the performance of the Digital Decisioning systems that support the process.
Event Processing Systems
An Event Processing System allows the correlation of events from any source over any time frame so that an appropriate action can be taken. As shown in Figure 10.3, many different event sources create a “cloud” of events that the event processing system evaluates and correlates so that action can be taken in response. Often, Event Processing Systems are used in conjunction with Digital Decisioning, with the Event Processing System determining what question to ask and the Digital Decisioning system providing the appropriate answer.
Event processing systems are made up of software components that aid in the following:
- Capturing live data from various sources
- Enriching this live data with information from within the business
- Seeking correlations and evaluations of conditions on enriched live data in real time
- Triggering action on detected patterns through a variety of external connectors that provide simplified access to execution components
- Providing tools that allow business users to easily handle constant change in the environment by adding and modifying pattern and action logic expediently and reliably
Live data capture
A distinguishing feature of event processing systems is their ability to work with data in motion, in addition to data that has already come to rest in a database or other secondary storage. Data is acted on directly as it arrives over messages, buses, or data streams, typically by keeping it in an in-memory cache. A superior event processing system will provide many native connectors that directly access many forms of live data, easing integration into existing IT infrastructure.
Enrich live data
Captured live data forms the basis for understanding the constantly changing environment, but it is the components that enrich this data that provide the value-added discrimination a business can bring to interpreting changing conditions. Combining live data with additional information, such as a customer's profile, can yield new intelligence that surfaces opportunities or mitigates risk. This enrichment can include applying existing predictive analytic models to the live data; for instance, a customer churn prediction may be added to live data about customer behavior. Enrichment may also involve building predictive analytic models from the streaming data, possibly in conjunction with non-streaming data.
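The enrichment step described above can be sketched as joining a live event with stored profile data and a predictive score before any pattern evaluation. In this sketch the profile store and the churn model are simple stand-ins, not a real product's API.

```python
# Sketch of event enrichment: a live event is joined with stored profile
# data and a predictive score before correlation. The profile store and
# churn "model" below are illustrative stand-ins.

PROFILES = {"c-42": {"segment": "gold", "tenure_months": 30}}

def churn_score(profile: dict) -> float:
    # Stand-in for a deployed predictive analytic model.
    return 0.8 if profile["tenure_months"] < 6 else 0.2

def enrich(event: dict) -> dict:
    """Returns the event merged with profile fields and a churn prediction."""
    profile = PROFILES.get(event["customer_id"], {})
    enriched = {**event, **profile}
    if profile:
        enriched["churn_risk"] = churn_score(profile)
    return enriched
```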
Evaluation and correlation
Making connections between various data points in real time is a critical feature of an event processing system. A variety of engines are possible, from those that specialize in correlating vast data streams within narrow windows of time to those that can match a wide variety of data inputs across longer time periods. The right kind of event correlation engine can be chosen depending on the primary application. Common characteristics of a good engine include scalability and the ability to handle large throughputs of incoming data in real time.
Typically, instantaneous correlations are made possible by large in-memory caches that classify and collect the various bits of relevant information as they come in. Evaluations on this subset of organized data are constantly performed, using a variety of algorithms, to correlate it to the desired patterns. A syntax similar to that used for business rules is often used to define these evaluations and correlations.
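The in-memory, windowed evaluation described above can be sketched as follows: events are grouped by key and a pattern ("three or more failures within 60 seconds") is re-evaluated as each new event arrives. The window size, threshold, and pattern are illustrative choices, not taken from any particular engine.

```python
# Sketch of windowed correlation over an in-memory cache. Events are
# grouped by key; old entries are evicted as the time window slides.
# Window size, threshold, and the "failure" pattern are illustrative.

from collections import defaultdict, deque

WINDOW_SECONDS = 60
THRESHOLD = 3
_cache: dict = defaultdict(deque)   # key -> timestamps of recent failures

def observe(key: str, timestamp: float, is_failure: bool) -> bool:
    """Records an event and returns True when the pattern is detected."""
    window = _cache[key]
    if is_failure:
        window.append(timestamp)
    # Evict entries that have slid out of the time window.
    while window and window[0] < timestamp - WINDOW_SECONDS:
        window.popleft()
    return len(window) >= THRESHOLD
```

A production engine would distribute this cache and support far richer pattern languages, but the core idea of evaluating conditions over a sliding window of enriched events is the same.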
Trigger action
Once a pattern is detected, the event processing system acts by triggering a defined output channel. A good event processing system will have numerous native connectors to activate common action components. Simple actions can typically be handled by the event processing system itself, but more complex responses are best delegated to the appropriate sub-system within the business infrastructure. For example, if complex conditions must be evaluated to decide on the best course of action once a condition is detected, triggering a response in a Digital Decisioning system may be ideal. Where a business process must be activated in response to a condition, a BPMS can be triggered. In some cases, the action may simply be to update a dashboard showing a business KPI, keeping a human apprised of the occurrence.
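The dispatch from a detected pattern to an output connector can be sketched as a simple routing table. The connector functions and pattern names here are hypothetical stand-ins for real integrations with a dashboard, a Decision Service, or a BPMS.

```python
# Sketch of action triggering: detected patterns are dispatched to output
# connectors. Connector functions and pattern names are illustrative
# stand-ins for real integrations.

def update_dashboard(event: str) -> str:
    return f"dashboard updated: {event}"

def invoke_decision_service(event: str) -> str:
    return f"decision requested for: {event}"

def start_process(event: str) -> str:
    return f"process started for: {event}"

CONNECTORS = {
    "kpi-threshold-crossed": update_dashboard,       # simple: notify a human
    "possible-fraud": invoke_decision_service,       # complex: delegate the decision
    "order-stalled": start_process,                  # activate a business process
}

def trigger(pattern: str, event: str) -> str:
    """Routes a detected pattern to its connector; defaults to notifying a human."""
    handler = CONNECTORS.get(pattern, update_dashboard)
    return handler(event)
```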
Business user tooling
Business user tooling that helps with the authoring and maintenance of event patterns and action triggers is an essential aspect of an event processing system in a business setting. It should include pattern development tools, pattern-testing tools with simulated data, and a strong versioning and audit system to control changes made to the system. Ideally, these tools should be provided through graphical user interfaces that require no programming knowledge, fitting the profile of an empowered business user.