
How RPA and BPMS can solve the real digital transformation bottleneck

Most of the projects we do here at DI Blue are about enabling our client businesses to work smarter, faster and better. Business process automation therefore plays a very important role in almost everything we do. We use low-code platforms such as Netcall Liberty Create as one of the core enablers to build what we call speed companies, but in practice, throughout the evolution of these kinds of speed projects, we run into many different challenges that cannot all be tackled by 'the platform'.



A real-life use case

Imagine a B2B use case where the client (let’s say, a department of a large engineering company, responsible for warehousing, asset maintenance and logistics support) wants to improve the way (internal) customers send material reservations to them and how the fulfillment is done.

Currently, a dedicated team handles order entry; orders come in mainly through email and telephone. The actual order entry is done in the ERP backend, and from there the fulfillment process moves forward semi-"automatically".


However, there is little transparency across the chain for both the customers and the internal team, and the level of process control and quality management is low. In the past, several proofs of concept were done to redesign the business process, but integration with the ERP system repeatedly came out as the key bottleneck, creating uncertainty and continuity risks for the organisation and leaving little confidence to push these projects further. Then low-code comes around the corner.


Igniting digital transformation


A business process like this serves as a good example of a candidate flow (a so-called "seed process") for low-code to gain traction as a rapid digital transformation tool. Imagine that what you design as a model can be transformed into a working application, and vice versa, whenever you want to change it.


As long as the candidate process creates business impact, meaning efficiency improvement (more speed, lower costs) and/or improved customer satisfaction and CX, an "oil stain" effect can take place: the concept of automation spreads across the organisation, connecting the silos, improving product and service quality and eventually creating more headspace and resources for innovation.

Now, this is the theory: in larger companies with a bit of history, reality is a lot different. Let's have a look at the full ecosystem that low-code needs to connect with to make a real impact.


The challenges when implementing low-code

First of all, low-code systems generally do not treat the graphical user experience as a primary goal; they help you automate your business flows. That means it might not be a very good idea to open up your standard low-code portal to your direct customers, even though you need them, as they are the key actors in your business process.

What often happens as a result is that the underlying logic is either embedded in an existing toolset or platform (like an intranet portal page), or specific applications are built for that purpose.


In our example, a dedicated B2B application was created, basically serving as the customer-facing product catalogue and ordering system, using the APIs to send orders and get order statuses. This kind of solution, although very logical and handy, adds an extra layer of complexity.
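
To make that interplay concrete, here is a minimal sketch of how such a front-end application could submit an order to the BPMS and poll its status over a REST API. The base URL, endpoint paths, field names and authentication scheme are illustrative assumptions, not the actual Netcall Liberty Create API.

```python
# Minimal sketch of the ordering front end talking to the BPMS over REST.
# The base URL, paths, field names and token below are assumptions for
# illustration only.
import requests

BPMS_BASE_URL = "https://bpms.example.com/api"       # assumed base URL
HEADERS = {"Authorization": "Bearer <api-token>"}    # assumed auth scheme


def submit_reservation(customer_id: str, items: list[dict]) -> str:
    """Send a material reservation to the BPMS and return its case id."""
    response = requests.post(
        f"{BPMS_BASE_URL}/reservations",
        json={"customer": customer_id, "lines": items},
        headers=HEADERS,
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["case_id"]


def get_order_status(case_id: str) -> str:
    """Ask the BPMS for the current fulfillment status of a reservation."""
    response = requests.get(
        f"{BPMS_BASE_URL}/reservations/{case_id}",
        headers=HEADERS,
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["status"]


if __name__ == "__main__":
    case_id = submit_reservation("CUST-042", [{"material": "M-1001", "quantity": 5}])
    print(case_id, get_order_status(case_id))
```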

From there on, all goes well within the scope of the BPMS. Using the case management concept, the process flows forward and back, taking business rules into account and serving the organisation with workflows, sensors, analytics and many more automations that make the process run super-fast during the business process runtime.
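
As a rough illustration of that forward-and-back, rule-driven movement, the sketch below models a reservation case that advances through a few stages, with a single business rule deciding whether an approval step is needed. The stage names and the approval threshold are assumptions made up for the example; they do not come from any particular BPMS product.

```python
# Illustrative sketch of case management: a reservation case moves forward
# (or back) through stages, with a simple business rule deciding whether a
# manual approval step is needed. Stage names and the threshold are made up.
from dataclasses import dataclass, field


@dataclass
class ReservationCase:
    value: float
    stage: str = "intake"
    history: list[str] = field(default_factory=list)

    def advance(self) -> None:
        """Move the case forward to its next stage."""
        self.history.append(self.stage)
        if self.stage == "intake":
            # Business rule: high-value reservations need manual approval.
            self.stage = "approval" if self.value > 10_000 else "fulfillment"
        elif self.stage == "approval":
            self.stage = "fulfillment"
        elif self.stage == "fulfillment":
            self.stage = "closed"

    def send_back(self, reason: str) -> None:
        """The flow can also move backwards, e.g. when data is incomplete."""
        self.history.append(f"{self.stage} (returned: {reason})")
        self.stage = "intake"


case = ReservationCase(value=12_500)
case.advance()   # intake -> approval (business rule triggered)
case.advance()   # approval -> fulfillment
print(case.stage, case.history)
```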


However, once data has to be persisted, you run into problems. Although it is certainly possible, the goal of the BPMS should never be to hold master or transactional data itself (respecting the key principle of "separation of concerns", which keeps the IS/IT landscape agile) – therefore you want to store the data in the source systems; in our case, an ERP system.


Integration is the key bottleneck


So there you go: you end up with the same set of problems mentioned earlier. Integration is difficult, expensive and, unfortunately, the key inhibitor of change. This is not really due to the BPMS itself, but because you need to build new integration capabilities between the BPMS and the source system(s).


And yes, of course many companies have middleware layers or even full-blown enterprise service buses in place – but, really, have you ever seen them used without major hiccups and high costs? That's exactly what you wanted to avoid by introducing a BPMS.


And it gets even worse when you need to connect with legacy systems that lack basic integration capabilities or are so outdated that it is unwise to add new features to them. These kinds of situations will instantly kill your BPMS opportunity, and the problem is that you can't really explain it: your key audience will most probably be "business people" who may lack the technical background to understand the problem ("How hard can this be?") and whose opinion may have hardened because of past experience with these kinds of problems. Your project will be killed without the chance for a next one.


Robotic Process Automation and its magic

Now think of the following. Imagine you do not have to build any technical integration, but that you are able to use the same screens that are currently used for manual order entry into the ERP system. You add a virtual worker to your team.


Instead of "replacing human work with robots", also called "Digital Labour" (which is more or less the key proposition currently used to sell RPA and, in our opinion, a very negative one), you amplify your solution to overcome the legacy bottleneck, enabling your teams to focus on the really important work. No need to work through technical integration scenarios or complex data mappings: script the scenario into the software robot and connect it to your BPMS API.


Basically, this is how it could work:
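
Below is a minimal sketch of the idea, assuming a generic setup: the robot polls the BPMS for orders that are ready for entry, replays the same keystrokes a human would use in the ERP order-entry screen, and reports the result back. The endpoints, field names and screen coordinates are illustrative assumptions; a real RPA tool would use its own recorder and UI selectors instead of raw coordinates.

```python
# Minimal sketch of the "virtual worker" loop. Endpoints, field names and
# screen coordinates are illustrative assumptions, not a specific product's
# API; a real RPA tool would use its recorder and UI selectors instead.
import time

import pyautogui
import requests

BPMS_BASE_URL = "https://bpms.example.com/api"   # assumed endpoint


def fetch_next_order() -> dict | None:
    """Ask the BPMS for the next order that is ready for ERP entry."""
    response = requests.get(f"{BPMS_BASE_URL}/orders?status=ready", timeout=10)
    response.raise_for_status()
    orders = response.json()
    return orders[0] if orders else None


def enter_order_in_erp(order: dict) -> None:
    """Drive the existing ERP order-entry screen exactly as a human would."""
    pyautogui.click(200, 180)                        # focus the material field
    pyautogui.write(order["material"], interval=0.05)
    pyautogui.press("tab")
    pyautogui.write(str(order["quantity"]), interval=0.05)
    pyautogui.press("enter")                         # submit the entry screen


def report_back(order_id: str) -> None:
    """Tell the BPMS that the order has been entered into the ERP."""
    response = requests.post(
        f"{BPMS_BASE_URL}/orders/{order_id}/confirm",
        json={"status": "entered"},
        timeout=10,
    )
    response.raise_for_status()


while True:
    next_order = fetch_next_order()
    if next_order:
        enter_order_in_erp(next_order)
        report_back(next_order["id"])
    time.sleep(30)   # poll the BPMS every half minute
```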

The key objections to this concept are either in the category of "This is a silly workaround, you are not fixing the real problem" or along the lines of "Screen scraping has been around for a very long time and has never really proved itself in end-to-end business scenarios, so why would this work?".

However, you actually are solving the key problem: IT not being able to deliver value fast enough, or legacy technologies not being able to keep up with the speed of change, resulting in endless and expensive integration projects.


As for the screen-scraping argument: RPA really is something else. With modern RPA tools you can build very complex scenarios, including intelligent feedback loops with data from the source systems and advanced orchestration, just as you do with the BPMS part of your solution. In fact, RPA can be seen as "screen-level" process management, serving very well as a powerful extension of a business process automation solution.


At Digital Innovation we use Netcall to realize these use cases.
