Of course, there is no single key to DX success. But even if everything else is in place, there is one thing without which, I can say with confidence, DX cannot be steadily advanced and brought to success: a permanent, dedicated department that drives DX, a "CoE (Center of Excellence)". In particular, since BPM (Business Process Management), a comprehensive methodology, is effective for transforming the business models and business processes at the heart of any DX initiative, I believe it is important to set up a dedicated team that could be called a BPM-CoE.
BPM (Business Process Management), for its part, is a methodology for running business processes properly. It is comprehensive, covering not only the improvement of current (as-is) processes but also the design and rollout of target (to-be) processes based on new business models, their stable operation, and continuous monitoring.
Note that for the business model layer, which appears in the matrix but is not a target of process mining, tools such as the Business Model Canvas (BMC) and the Process Model Canvas (PMC) can be used.
Reference Matrix for Process Mining Tool Selection
In recent years, process mining has gained further recognition and understanding as a useful solution for promoting and establishing DX. Moreover, as the recent acquisitions of myInvenio by IBM and Signavio by SAP show, there is no doubt that process mining will become increasingly important as an indispensable component of corporate IT system development and operation as it is incorporated into the solutions of major IT companies.
Needless to say, adopting a process mining tool is essential for companies that want to use process mining to improve their business processes and to renovate and develop their systems, and selecting the tool best suited to your company is a major key to success.
In this article, I will explain a matrix that will help you determine what functions of process mining tools are particularly necessary for your company.
●Horizontal axis of the matrix: Time
From the perspective of time, there are three dimensions: processes completed in the past, processes currently in progress, and processes to be executed in the future.
In general, data analysis is performed on completed historical data, and the same is true for process mining. By analyzing completed event log data with a process mining tool, we can automatically model current processes and analyze them from various perspectives (basic analysis).
For example, the following basic analyses are available.
Frequency analysis
Performance analysis (analysis from the perspective of time required and cost)
Variant analysis
Conformance checking (comparative analysis of current process and ideal process)
etc.
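As a minimal illustration of how variant analysis works, the sketch below groups a toy event log (hypothetical case IDs and activity names, not taken from any real tool) by case and counts how often each distinct activity sequence, i.e. each variant, occurs:

```python
from collections import Counter

# Toy event log as (case_id, activity, timestamp) tuples -- entirely hypothetical data
event_log = [
    ("c1", "Receive order", 1), ("c1", "Check credit", 2), ("c1", "Ship", 3),
    ("c2", "Receive order", 1), ("c2", "Ship", 2),
    ("c3", "Receive order", 1), ("c3", "Check credit", 2), ("c3", "Ship", 3),
]

def variants(log):
    """Group events by case, order them by timestamp, and count identical paths."""
    cases = {}
    for case_id, activity, ts in log:
        cases.setdefault(case_id, []).append((ts, activity))
    paths = (tuple(act for _, act in sorted(events)) for events in cases.values())
    return Counter(paths)

for path, freq in variants(event_log).most_common():
    print(" -> ".join(path), ":", freq, "case(s)")
```

Real tools perform this grouping (plus frequency and performance statistics on top of it) at scale, but the core idea is the same.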
Some process mining tools can also monitor processes continuously: by importing ongoing event log data into the tool at near-real-time frequency, they detect problems such as deviations and send alerts to the relevant parties.
For processes to be executed in the future, the following functions are offered.
Simulation (What-IF Analysis)
Simulate how much improvement (e.g., reduction in throughput time or cost) could be obtained if the current process were changed in a particular way.
Modeling
Model the flow of the ideal process to be implemented in BPMN format.
Forecasting
Predict, using AI, how in-progress cases will be processed and how much time they will require.
Recommendations
Based on the results of the above predictions, the tool proposes the best measures to prevent problems from occurring and processing times from dragging on.
Automated Process Improvement (AutoPI)
A process mining tool automatically executes measures for process improvement under certain conditions to achieve a quick remedy.
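Of these forward-looking functions, What-If simulation is the easiest to sketch. The toy Monte Carlo below (hypothetical step names and durations, not a feature of any vendor's tool) estimates the average end-to-end time of a four-step process, then re-runs it with one approval step removed to see the expected throughput-time gain:

```python
import random

random.seed(42)  # make the simulation reproducible

# Hypothetical mean durations per step, in hours (illustrative numbers only)
CURRENT_STEPS = {"register": 1.0, "approve_1": 4.0, "approve_2": 6.0, "pay": 0.5}

def simulate_throughput_time(steps, n_cases=10_000):
    """Average end-to-end time when each step's duration is exponentially distributed."""
    total = 0.0
    for _ in range(n_cases):
        total += sum(random.expovariate(1 / mean) for mean in steps.values())
    return total / n_cases

baseline = simulate_throughput_time(CURRENT_STEPS)
# What-if scenario: eliminate the second approval step
improved = simulate_throughput_time(
    {name: mean for name, mean in CURRENT_STEPS.items() if name != "approve_2"})
print(f"baseline ~{baseline:.1f}h, without approve_2 ~{improved:.1f}h")
```

Commercial tools fit the duration distributions from the actual event log rather than assuming them, but the what-if mechanism is the same: change the model, re-simulate, compare.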
●Vertical axis of the matrix: Business layer
The business layer axis factors the business into progressively more detailed components from a process perspective. Administratively, the higher the layer, the more "strategic" it is; the lower the layer, the more "tactical" and "operational" (day-to-day, on-site management) it becomes.
At the top is the business model. From there, the granularity becomes finer, including the value chain that grasps the processes of the entire company from end-to-end, and the individual processes that make up the value chain.
Any business process can be broken down into a number of sub-processes. Each sub-process is in turn composed of finer-grained activities, and each activity is carried out as a series of task steps.
For example, consider a sub-process called "invoice processing" in the accounting department. It includes activities such as "receiving invoices," "checking the contents of invoices," "registering invoices in the accounting system," and "processing payments for registered invoices."
Within these activities, in the case of "receiving invoices," task steps are executed one by one, such as "open the email with the PDF invoice attached" and "download the attached PDF invoice."
These task steps, in turn, consist of the smallest units of PC operations, such as clicking the mail software icon, clicking to open the mail, and clicking the attachment. These are called "atomic activities" because they cannot be decomposed any further.
Process mining basically analyzes from the process layer down to the activity layer (or, in some cases, the task step layer). In many cases, the transaction data recorded in IT systems is at the relatively coarse-grained activity level.
Task mining is therefore used to analyze task steps and atomic activities at finer granularity. Task mining is still in its infancy: beyond BI-like aggregation, deeper analysis remains at the trial-and-error stage. Used together with process mining, however, it can contribute to process automation, especially with RPA.
Now, in light of your company’s business process issues, which should be the target of analysis: past, present, or future? Also, at what granularity should the process be analyzed as a business layer?
Together with the tool vendor's representative, go over this matrix to confirm to what extent these functions can be implemented for the areas where your company sees issues.
For the business model layer, which is not a target of process mining, tools such as the Business Model Canvas (BMC) and the Process Model Canvas (PMC) can be used.
Robotic process mining tools analyze the UI logs generated by multiple workers over a period of time and discover sequences of frequently repeated steps. These are called "digital work routines." Each routine is then analyzed to determine whether it can be automated, for example via RPA (Robotic Process Automation) bots or application orchestration scripts.
Interest in process mining emerged in Japan in late 2018. About two years have passed since then, and interest has grown even further, with an increasing number of companies, especially major corporations, introducing process mining and achieving a certain level of success.
For example, KDDI, a major telecommunications carrier; IHI, a major heavy-industry company; Hitachi Transport System, a major logistics company; and MISUMI, a major mold components trading company, are actively using process mining, and use cases from their efforts are being reported.
However, quite a few Japanese companies are still skeptical about process mining, and behind that skepticism lie what might be called Japan-specific myths.
This article aims to debunk two myths that Japanese companies tend to hold about process mining.
Myth 1: Japanese companies’ systems are too complex to be analyzed by process mining
It is true that Japanese companies rarely adopt packages like SAP with default settings; they often customize extensively to match current business procedures or develop systems from scratch. In addition, it is not uncommon for the system configuration to have become very complex, even bizarre, after repeated modifications in response to changes in business operations.
For this reason, some people assume that the event log is also complex and that process mining analysis will not work. However, this is an illusion.
Many European companies, which are leading the way in the adoption of process mining, are still running legacy systems that have grown in age and complexity. The transaction data extracted from such systems is certainly not of high quality, and it takes a lot of man-hours to format the data into an event log that is suitable for process mining analysis.
However, it is not impossible. Even when a beautiful process model cannot be drawn, you can still get an overview of how complex the current system is and, using process mining as a starting point, design better business processes and define requirements for the systems that support them.
Reality: As long as there is some kind of event log, process mining is feasible for any complex system.
Myth 2: There are many tasks that are not performed in the system, such as manual processing of various paper documents and Excel operations, so process mining is useless.
The fact that much work is done outside the system, by hand or in Excel, is not limited to Japanese companies: a recent survey by the IT research firm Forrester found that a whopping 70 percent of U.S. and European companies perform some kind of manual work in their operations.
The manual part, such as processing paper slips, cannot be captured as digital data and, of course, cannot be analyzed by process mining. Even operations in office suites such as Excel are not automatically recorded as transaction data.
However, manual operations can be digitized by using OCR and other tools, or incorporated into workflows to leave a high percentage of digital footprints. In addition, Excel operations, for example, can be captured as a PC operation log or user interaction log by task mining.
Thus, with the advancement of digitalization, an environment is being created in which process mining analysis can be easily performed day by day.
In addition, even if a business process involves a great deal of manual and PC work, as long as a business system is used for some part of it, it is possible to describe a process model based on milestone activities and discover where inefficiencies and bottlenecks exist.
Moreover, the inefficiencies and bottlenecks discovered by process mining often turn out to be precisely the manual or Excel operations. As improvement measures based on the mining results, the digitization of manual work and standardization through workflow systems are then promoted, creating a virtuous circle that further increases the effectiveness of process mining analysis.
Reality: Process mining for operations that involve manual operations and Excel can also lead to satisfactory results.
HFS report on Top 10 Process Intelligence Products
In September 2020, HFS Research, a major U.S. IT services research firm, published a report titled "HFS Top 10 Process Intelligence Products 2020."
HFS interviewed more than 40 industry leaders and selected 14 promising process intelligence products. It then evaluated and ranked the 14 products from three broad angles, "Innovation," "Execution," and "Voice of the customer," to determine the top 10.
In this article, I’ll explain the flow of using process mining to improve business processes, contrasting it with the procedure of treatment in a hospital.
Process mining aims to discover various issues and problems hidden in the process by visualizing invisible business processes from the event log data.
In terms of this "visualization of the process," process mining is often likened to an X-ray. However, just as in treating a disease, the ultimate goal is not discovering the lesion (inefficiencies and bottlenecks) but implementing the appropriate treatment (improvement measures) and returning to a healthy state, in other words, realizing an improved "ideal process (to-be process)".
Let’s start by outlining the flow of medical activities in a hospital. Broadly speaking, there are two stages: the “diagnostic stage” and the “treatment stage”.
●Medical activities
Diagnostic stage
Patient
The starting point for treatment is when a patient comes in with some kind of symptom such as fever or cough.
Preliminary interview
First, the doctor asks questions about the extent of the current symptoms in a preliminary interview.
X-rays
Using an X-ray machine, the area where the lesion is thought to exist will be photographed.
X-ray photograph
The presence of the lesion is confirmed by looking at the X-ray photograph.
Diagnosis
From the results of the X-ray photographs, the doctor determines what disease the patient has.
Physical examination
In addition, various physical exams and tests are performed to verify the correctness of the above diagnosis.
●Treatment Stage
Treatment policy
The course of treatment is decided based on the results of the diagnosis and the patient's wishes: for example, whether to perform surgery or how to administer medication.
Surgery
If it is better to remove the lesion, surgery will be performed.
Medication
Treatment is performed by administering medications, either alone or in combination with surgery.
Recovery
The cause has been eliminated and the symptoms are gone. Treatment is complete.
Next, we’ll outline the steps to improve business processes along the path of diagnosis and treatment at the above hospital.
●Business Process Improvement
Understanding the current situation – Diagnostic stage
Process with problems – Patient
Select processes that are experiencing problems as phenomena, such as long throughput times, high operating costs, or customer complaints, as targets for improvement.
Process Setup – Preliminary interview
Basic information related to the process to be improved, such as an overview of the process, the number of cases, and the departments or people in charge, is organized through interviews. If there are specifications or manuals for the systems involved in the process, check them as well.
Process Mining – X-ray
Based on the event log data of the process to be improved, we analyze it using a process mining tool and create a flowchart of the current process.
As is process – X-ray photograph
We analyze the current process from various perspectives, such as frequency and time required.
Problem identification – Diagnosis
Based on the results of the above analysis, we identify the areas causing problems or issues as phenomena, i.e. inefficient procedures that take too long, or bottlenecks where pending cases pile up.
On-site interview and observation – physical examination
To pin down the problem areas, we interview the people in charge on site and conduct observational surveys to identify the root causes.
The root causes of process inefficiencies and bottlenecks include too many meaningless steps, too many mistakes, too much rework, and too few people assigned to the work that has to be done.
Improvement Activities – Treatment Stage
Improvement Policy – Treatment Policy
Once we have identified the various problems and issues related to the process and the root causes of these problems and issues, we plan improvement measures.
As a major improvement policy, it is important to first clarify the objectives, such as reducing throughput time, reducing costs, and improving customer satisfaction.
Implementation of improvement measures – Surgery and Medication
There are a variety of options for improvement measures, ranging from major to minor modifications.
BPR (Business Process Re-engineering), a zero-based redesign of the process, can be compared to surgery. Replacing manual tasks with RPA software robots might be likened to implanting an artificial heart.
If a small change in procedure can improve the time required, it is like a disease treatable with simple medication.
Improved Process (To be process) – Recovery
Once the desired process has been achieved as a result of effective improvement measures, the project is complete.
Just as regular check-ups are necessary in the treatment of a disease, it is important to continuously monitor the target process to ensure that problems do not recur or new problems arise.
Thanks to process mining and task mining, you can find inefficient procedures and bottlenecks in a process. But that is not the end of the job, is it?
Needless to say, process mining and task mining make it easy to uncover problems through data analysis, but they do not tell you how to solve them. (It is plausible that, in the future, advanced AI capabilities will be added to process mining tools to hint at ways to improve processes.)
Therefore, once a process-related problem has been found, process mining and task mining are no longer needed for the time being.
What comes into play after finding a problem is problem-solving techniques, including Lean Six Sigma. With these methods, you can use frameworks such as the 5 Whys and factor analysis (fishbone diagrams) to find the root cause behind the problem, and then plan and implement specific improvement measures.
This article introduces nine redesign methods that can be used as a reference to consider specific improvement measures. These redesign methods are systematized in BPM (Business Process Management). This is a practical method that has been developed empirically through numerous process improvement projects that have been implemented in the past.
In fact, there are nearly 30 rules of thumb for process redesign. Of these, the nine methods I am going to share with you are the most common and the most likely to produce improvements.
The nine process redesign methods can be broadly categorized into three levels (task level, flow level, and process level). Each method is explained under its level.
TASK Level
This is a change of any kind to the individual tasks (activities) that make up the process.
1 Task Elimination
For tasks that take a long time, it is important to ask whether the task is worth doing in the first place, and then eliminate it or reduce the number of times it is performed.
For example, if three levels of approval are a mere formality, reduce them by one level to two. In an inspection process, the number of inspection tasks could be cut drastically by switching to a statistical method that inspects only a small, randomly selected portion of products rather than inspecting every product.
2 Task composition (decomposition)
When tasks are subdivided into small chunks, or when tasks are passed between multiple departments, the time required is often longer. It may therefore be effective to consolidate multiple tasks into a single task, or to keep tasks within your own department rather than passing them to others. (Conversely, multiple tasks can become inefficient when combined into a single task; in that case, it may be useful to break the task down.)
3 Triage
In some cases, running sub-processes behind a conditional branch can reduce the time a process requires. For example, in the procurement process, the task following receipt of a purchase application could branch into different sub-processes for amounts of 10 million yen or more and amounts below that.
Conversely, if the complexity is compounded by too many sub-processes, you may want to consider consolidating some of them.
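A triage rule of this kind is ultimately just a conditional branch. The sketch below (with the hypothetical 10-million-yen threshold and sub-process names chosen for illustration) routes purchase applications to different sub-processes:

```python
# Hypothetical threshold from the example above: 10,000,000 yen
HIGH_VALUE_THRESHOLD = 10_000_000

def route_purchase_application(amount: int) -> str:
    """Triage: send high-value applications down a stricter sub-process."""
    if amount >= HIGH_VALUE_THRESHOLD:
        return "executive-approval"  # longer path: extra approvals and checks
    return "standard-approval"       # fast path for routine purchases

print(route_purchase_application(12_000_000))  # -> executive-approval
print(route_purchase_application(250_000))     # -> standard-approval
```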
FLOW level
It’s an improvement method for the order of tasks, not just a single task.
4 Re-sequencing
Re-sequencing is about reviewing the flow of tasks and rearranging them in the order that is most efficient and requires the least amount of work.
For example, suppose the procurement process includes two approval tasks, A and B, and on average 1% of cases are turned back at task A and 10% at task B. In this case, moving task B, which turns back more cases, before task A reduces the number of cases that reach A, which is more efficient and reduces the overall workload.
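The arithmetic behind this example can be checked with a short sketch. Assuming, for simplicity, that a turned-back case drops out of the flow, for 100 cases the order A-then-B requires 100 + 99 = 199 approvals, while B-then-A requires 100 + 90 = 190:

```python
def total_approvals(n_cases, ordered_tasks):
    """Expected approval executions when tasks run in the given order.
    ordered_tasks: list of (task_name, rejection_probability) in execution order."""
    remaining = n_cases
    total = 0.0
    for _name, p_reject in ordered_tasks:
        total += remaining           # every surviving case passes through this task
        remaining *= (1 - p_reject)  # rejected cases leave the flow
    return total

# A rejects 1% of cases, B rejects 10% (the rates from the example above)
a_first = total_approvals(100, [("A", 0.01), ("B", 0.10)])
b_first = total_approvals(100, [("B", 0.10), ("A", 0.01)])
print(a_first, b_first)  # 199.0 vs 190.0: putting B first saves 9 approvals per 100 cases
```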
5 Parallelism enhancement
In processes that use sequential processing, where the next task starts only after the previous one is completed, changing to parallel processing of multiple tasks can be expected to reduce the time required for the entire process.
Changing sequentially processed tasks to concurrent processing often has a significant effect on reducing throughput time.
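The effect can be seen even in a toy sketch: three hypothetical independent checks, each simulated as a 0.2-second delay, finish in roughly the time of the longest one when run concurrently instead of back to back. (Real processes parallelize across people and systems, not threads; this only illustrates the arithmetic.)

```python
import time
from concurrent.futures import ThreadPoolExecutor

def check(name, seconds):
    """A stand-in for an independent task; the sleep simulates its duration."""
    time.sleep(seconds)
    return name

tasks = [("credit check", 0.2), ("stock check", 0.2), ("fraud check", 0.2)]

start = time.perf_counter()
results_seq = [check(n, s) for n, s in tasks]       # one after another: ~0.6s
sequential = time.perf_counter() - start

start = time.perf_counter()
with ThreadPoolExecutor() as pool:                  # all three at once: ~0.2s
    results_par = list(pool.map(lambda t: check(*t), tasks))
parallel = time.perf_counter() - start

print(f"sequential ~{sequential:.2f}s, parallel ~{parallel:.2f}s")
```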
PROCESS level
These are improvements based on perspectives other than individual tasks and the order between them.
6 Specialization and standardization
Specialization aims to improve efficiency and customer satisfaction by dividing a process into multiple processes and assigning a person in charge to each sub-process to build expertise. For example, a process can be split between VIPs and general customers so that an especially speedy and courteous service process is provided for VIPs.
Conversely, standardization unifies multiple processes that handle the same business but have been separated by product line or similar factors.
7 Resource Optimization
When multiple people run the same business process and the workload is concentrated on particular people while others sit idle, or when a shortage of staff relative to the workload creates bottlenecks, it is necessary to "optimize resources" by rethinking assignments or reviewing shifts.
8 Communication Optimization
If the process flow is driven by some kind of communication, such as a phone call, fax, or email, you may be able to improve efficiency and customer satisfaction by, for example, changing the timing of receiving or processing communications.
9 Automation
For routine tasks where there is a clear set of procedures, automation with RPA can be effective. There are also multiple options for automation, such as developing an application that makes automatic decisions based on the input information.
When working out a solution to an individual problem or issue, first consider whether these nine rules of thumb can be applied to your own process improvement/innovation project or DX (Digital Transformation) promotion project.
As mentioned at the beginning, 29 rules of thumb for process redesign are presented in the “Fundamentals of Business Process Management”.
In addition, please take a look at the MOOC (e-learning) based on the book for detailed explanations of the nine process redesign methods introduced here.
Process mining is an "analytical method." Merely introducing a process mining tool does not start anything: you need to plan a series of steps as an "analysis project" and manage their execution.
However, if you have never run a research or analysis project before, the steps of an analysis project may not be easy to picture. I would therefore like to explain the flow of process mining analysis by comparing it with the flow of cooking.
First, let's look at the cooking flow. The assumed location is a restaurant kitchen. The first activity is purchasing foodstuffs, and the last is serving the dished-up food to the customers' tables.
COOKING FLOW
1 Purchase of foodstuffs
Purchase a variety of foodstuffs from around the world through food wholesalers.
2 Foodstuff
The foodstuffs to be cooked are now at hand. Check them for insect damage and rot.
3 Precooking
Prepare the food, for example by chopping it with a knife or parboiling it to remove bitterness.
4 Cooking
Cook the food using a variety of cooking utensils.
5 Dishing up and serving
Dish up the cooked food and serve the finished dishes to the customers.
Role of Master Chef
Note that the role of the master chef is to oversee the entire cooking process of the restaurant.
Next, let’s explain the steps of the process mining analysis, corresponding to the above cooking steps.
process mining procedure
1 Extraction of data = Purchase of foodstuff
Extract the data from the various systems that record and accumulate the event logs targeted for analysis, such as ERP systems (typified by SAP), CRM systems such as Salesforce, or proprietary business systems.
As a method of data extraction, it is common to pull the data directly from the database with SQL.
Data extraction is basically done by system engineers or system administrators. When the database structure is complex, as with ERP, identifying where the target data resides may require assistance from experts with deep product knowledge, for example SAP specialists.
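As a minimal illustration of this extraction step, the sketch below uses an in-memory SQLite table as a stand-in for a business system's change-history table (the table and column names are hypothetical) and maps its columns onto the case ID / activity / timestamp triple that process mining tools expect:

```python
import sqlite3

# In-memory stand-in for a business system's change-history table (hypothetical schema)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE doc_changes (doc_id TEXT, change_type TEXT, changed_at TEXT)")
conn.executemany("INSERT INTO doc_changes VALUES (?, ?, ?)", [
    ("PO-1", "created",  "2021-03-01 09:00"),
    ("PO-1", "approved", "2021-03-02 14:30"),
    ("PO-2", "created",  "2021-03-01 10:15"),
])

# The typical extraction: map table columns onto case ID / activity / timestamp
rows = conn.execute("""
    SELECT doc_id AS case_id, change_type AS activity, changed_at AS timestamp
    FROM doc_changes
    ORDER BY doc_id, changed_at
""").fetchall()
for row in rows:
    print(row)
```

In a real ERP the event data is scattered over many tables and the SELECT becomes a multi-table join, which is exactly why product experts are needed at this step.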
2 Data to be analyzed = Foodstuff
The data extracted from the system is collectively referred to as the "event log." This is because the history of operations on the system is recorded event by event with a timestamp.
As for the data format, CSV makes the subsequent pre-processing easier. In some cases the event log is provided in JSON format, and pre-processing a JSON event log can be a bit more cumbersome.
3 Data preparation = Precooking
The event log data extracted from a system is often spread across multiple files, often ten or more: files recording activities and timestamps, as well as files containing master data.
Basically, all the files must be combined into a single file before a process mining tool can analyze them. In addition, the original files contain plenty of data that cannot be analyzed as-is, such as garbled text and empty cells that should have contained values.
It is therefore necessary to remove or adjust this noisy data, that is, to perform data cleaning, much like removing the inedible parts of food. Data preparation is the process of turning the raw data into clean data that a process mining tool can analyze.
Data preparation is done by data scientists who know how to make data clean, using ETL tools, Python, and other tools and languages.
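A minimal sketch of this preparation step, using only the standard library and hypothetical file contents: an activity file is cleaned of rows with empty cells and joined with a master-data file into a single analyzable log:

```python
import csv
import io

# Hypothetical raw exports: an activity file and a master-data file, with noise
activity_csv = io.StringIO(
    "case_id,activity,timestamp\n"
    "1001,create,2021-04-01 09:00\n"
    "1001,,2021-04-01 09:05\n"        # empty activity cell: noise to be cleaned out
    "1002,create,2021-04-01 10:00\n"
)
master_csv = io.StringIO("case_id,department\n1001,Sales\n1002,Purchasing\n")

# Build a lookup from the master file, then clean and enrich the activity rows
department = {row["case_id"]: row["department"] for row in csv.DictReader(master_csv)}
clean_log = [
    {**row, "department": department.get(row["case_id"], "")}
    for row in csv.DictReader(activity_csv)
    if row["activity"]                # data cleaning: drop rows with empty activity
]
print(clean_log)
```

In practice this is done with ETL tools or pandas over files of millions of rows, but the operations (cleaning, joining, reshaping) are the same.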
4 Analysis = Cooking
Once the data has been pre-processed and the clean data is ready for analysis, it can finally be fed into process mining tools for various analyses.
A process mining tool is a very versatile tool. It takes some training and experience to become proficient, but it is fun to turn event log data that looks like nothing more than a litany of numbers into a flowchart of the business process and uncover inefficiencies and bottlenecks.
Analysis with a process mining tool requires tool experts familiar with the tool in use, but it is the process analyst who provides the analytical perspective on how to conduct the analysis. The data scientist, who has come to understand the original data through pre-processing, can also assist in the analytical work.
5 Reporting = Dishing up and serving
Create reports using graphs, tables, and so on about the issues and problems in the target process identified from the various analysis results. Since the people receiving the report are not necessarily familiar with data analysis, keep in mind a visual presentation that makes it easy to understand what the issue or problem is.
Ideally, the report should be written by a process analyst, with the assistance of a process consultant with process improvement know-how (Lean, Six Sigma, etc.). It’s also good to have the support of a data scientist or tool expert, as additional analysis may be required.
Role of Project Manager
It is the project manager, corresponding to the restaurant's master chef, who runs the entire process mining analysis project. The project manager does not have to be intimately familiar with every step, but must understand each step well and, above all, have the skills to execute the project smoothly.
So far I have used the culinary metaphor to explain the standard procedure for process mining analysis. Each process is a highly challenging one that requires a certain level of skill and experience, so it is necessary for experts in each field to work well together to advance the project.
This is an approach that strengthens the current way of working. It includes TRIZ, the Theory of Constraints, Lean Management, Six Sigma, and BPR (Business Process Re-engineering). These are methods mainly for improvement rather than innovation: you grasp the current business processes and work content, discover problems such as inefficiencies and bottlenecks, and take improvement measures. BPR carries out fundamental, company-wide improvement, including organizational restructuring.
INTRODUCTION TO PROCESS MINING IN PRACTICE – e-learning course on Udemy
An e-learning course on Udemy that teaches the basic knowledge you need to know when implementing process mining, scheduled for release in early May 2020.
Target participants
Person in charge of implementing process mining in a company or organization
Consultants who are helping to implement process mining
Those who want to become an expert in process mining
Course Features
It’s not about the theoretical aspects of process mining, but more about the content that will help you successfully apply it to your business process improvement.
A comprehensive e-learning course covering the principles of process mining has been offered since 2014 on Coursera by the godfather of process mining, Professor Wil van der Aalst.
This course, however, does not rely on a specific process mining tool and focuses on practical business applications for improving business processes; no such practical e-learning course has yet been offered in Japan or elsewhere, making it the world's first practical introductory course to process mining.
An English version will be released at a later date.
Benefits for participants
You will learn the basics of process mining from practical aspect.
You will be able to effectively communicate the necessity of introducing process mining to your supervisors and other internal stakeholders (persons in charge).
You will be able to convincingly communicate the value of process mining to your prospects (for process mining consultants).
Curriculum
What is process mining?
History of Process Mining
Business environments that make process mining indispensable
Benefits and Expected Returns of Process Mining
Processes to be analyzed
Use cases
Process Mining and Related Solutions (ETL, RPA, BPMs, DWH/Datalake)
What is the event log?
Principles of Process Mining Algorithms
Four Approaches to Process Mining
process discovery
conformance check
process enhancement
Operational Support
How to manage a process mining project
Task Mining (Robotic Process Mining)
Skill sets required for process mining practitioners
process mining tool
Task Mining – Three Analytical Perspectives for Improving Labor Productivity
Based on PC operation logs, “task mining” visualizes the tasks performed on each user’s PC.
In this article, let me explain the three analytical perspectives for task mining.
First, let’s be clear about the purpose of doing task mining. That is improving labor productivity.
Productivity is generally defined as
Output/Input
Then, the “labor productivity” in task mining can be expressed by the following formula;
Labor productivity = amount of value created/labor time(cost) spent
Here, labor input would generally be 8 hours per day, with 40 hours per week and 160 hours per month as the norm (assuming two days off per week).
In a nutshell, increasing labor productivity means creating more value within the hours worked. The point is not to increase value by working longer, but to increase the value created in the same amount of time (cost).
Now, with task mining, PC operations can be recorded and accumulated in detail through sensors installed on individual PCs, allowing analysis aimed at improvement measures for raising labor productivity.
This analysis aimed at improving labor productivity includes the following three analytical perspectives
1 Created value
2 Efficiency
3 Tasks to be improved
I will outline one by one.
1 Created Value
The first perspective of a task mining analysis is how much of your business time is spent engaged in value-creating activities.
As you can see from the labor productivity formula, labor is about creating value. Value, to put it plainly, is whatever contributes directly or indirectly to sales. In the case of factory labor, the value created takes the concrete form of the “product”.
In various types of office work, the link is not as clear as in factory work, but if you are in charge of sales, preparing proposals and quotations are important value creation activities that lead to sales. In any department or job, the time spent creating value in some way is called “value creation time”.
On the other hand, time spent watching YouTube videos or just zoning out during work hours creates no value. This is “non-value creation time”. (Note that lunch and break times are excluded from the analysis in the first place, as they are not business hours.)
The way to improve labor productivity is to increase value-creating activities as much as possible. However, the assumption is that the 8-hour working day itself will not be extended. Therefore, the work is to reduce non-value-creating activities, in other words, the slacking and idleness within those eight hours.
Therefore, first of all, task mining classifies business time into “value creation time” and “non-value creation time” from the perspective of value creation.
Value creation activities can themselves be divided into two categories: “high value” and “low value”. High-value tasks are, for a salesperson, the aforementioned proposal and quotation writing. Low-value tasks include things such as expense reimbursement and travel to customers.
We should aim to reduce low-value operations as much as possible. For example, for expense reimbursement, you can simplify the procedure with a dedicated application, automate it with RPA, and eliminate travel time with web conferencing.
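The value-based classification described above can be sketched as a simple tally; the task names and their category assignments below are hypothetical examples, not the output of any task mining tool.

```python
# Hypothetical sketch: classifying logged business time into the
# value-creation categories from the text. The task-to-category
# mapping is invented for illustration.

CATEGORY = {
    "proposal_writing":      "high_value",
    "quotation_writing":     "high_value",
    "expense_reimbursement": "low_value",
    "video_streaming":       "non_value",
}

def summarize(minutes_by_task: dict) -> dict:
    """Sum minutes per value category ('unclassified' for unknown tasks)."""
    totals = {}
    for task, minutes in minutes_by_task.items():
        cat = CATEGORY.get(task, "unclassified")
        totals[cat] = totals.get(cat, 0) + minutes
    return totals

day = {"proposal_writing": 240, "expense_reimbursement": 60, "video_streaming": 30}
print(summarize(day))  # {'high_value': 240, 'low_value': 60, 'non_value': 30}
```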
2 Efficiency
Even when people create the same value, the time it takes differs between those who work faster and those who work slower. Therefore, after sorting out value creation from non-creation, the next step is to seek efficiency, in other words, to reduce the time needed to create value while keeping the value created the same.
When analyzing efficiency in task mining, it is necessary to set a reference value. In short, even if we do the same work, we can’t judge whether the work is highly efficient without setting some kind of evaluation criteria.
In general, this reference value is based on the average processing time by department or job category. The advantage is that, unlike interview-based business analysis, task mining bases the analysis on actual “PC operation time”.
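As a rough sketch of using an average as the reference value, the snippet below flags executions that take much longer than the group mean; the 1.5x threshold and the timing data are assumptions for illustration only.

```python
# Sketch: flagging slower-than-average task executions, using the mean
# processing time of the group as the reference value (as in the text).
from statistics import mean

def slower_than_average(times_min: list, threshold: float = 1.5) -> list:
    """Indices of executions taking more than `threshold` x the mean time."""
    ref = mean(times_min)
    return [i for i, t in enumerate(times_min) if t > threshold * ref]

# Hypothetical per-person processing times (minutes) for the same task:
times = [12, 14, 11, 13, 30]
print(slower_than_average(times))  # the 30-minute execution stands out: [4]
```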
3 Task to be improved
The third analytical perspective of task mining is the discovery of tasks with improvement potential. While the previous two items (created value and efficiency) focus on processing time, this perspective focuses on the flow of work.
Firstly, we find and extract the tasks that we think could be improved somehow. The main targets for extraction are “routine patterns”, “multiple mistakes” and “repetition”.
A “routine pattern” is one in which several steps are performed in a fixed sequence. Day-off requests and business travel settlements are typical routine patterns. These procedures are often built into business systems, in which case they can be analyzed by process mining, but even when they are not systematized, they can be discovered by task mining. “Multiple mistakes” and “repetitions” are outliers found in the flow of application and file operations, many of which involve a large amount of copy and paste.
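One simple way to surface “repetition” candidates is to count recurring short sub-sequences (n-grams) in the stream of operations; this is only an illustrative heuristic, not the algorithm of any particular task mining tool, and the operation names are invented.

```python
# Sketch: detecting "repetition" in a stream of atomic PC operations by
# counting recurring n-grams (fixed-length sub-sequences). A sub-sequence
# that recurs many times, like repeated copy-and-paste, is a routine candidate.
from collections import Counter

def recurring_ngrams(ops: list, n: int = 3, min_count: int = 2) -> dict:
    """Return n-grams of operations that occur at least `min_count` times."""
    grams = Counter(tuple(ops[i:i + n]) for i in range(len(ops) - n + 1))
    return {g: c for g, c in grams.items() if c >= min_count}

ops = ["copy_browser", "paste_excel", "save",
       "copy_browser", "paste_excel", "save",
       "open_mail"]
print(recurring_ngrams(ops))  # the copy-paste-save triple recurs twice
```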
Although the specific improvement measures for these candidate tasks vary case by case, automation with RPA is the most likely solution.
Above, we have explained that in task mining, the analysis is carried out from the three perspectives of “created value,” “efficiency,” and “task to be improved”.
Task mining can also do other things, such as finding non-compliant processes, but this is less relevant to improving labor productivity and will be discussed another time.
What is RPD – Robotic Process Discovery?
Robotic Process Discovery (RPD) is essentially synonymous with “task mining”. That is, it collects and analyzes the PC interaction log, the history of a user’s operation of applications and files such as Excel, PowerPoint, and browsers on his or her own PC.
“Task mining” is an expression first proposed by US IT advisory firm Gartner in its report, “Gartner, Market Guide for Process Mining, Marc Kerremans, 17 Jun 2019”. The term “task mining” is already gaining fairly high recognition around the world and in Japan as a general name that includes all solutions for “business visualization” based on PC interaction logs.
“RPD”, on the other hand, is a methodology proposed by Marlon Dumas (Professor at Tartu University) and Marcello La Rosa (Professor at Melbourne University) in 2018 through their research on PC interaction log analysis.
While “task mining” only connotes the broad framework of PC interaction log analysis, RPD lays out a concrete analysis procedure for PC interaction logs, mainly aimed at “automation of tasks by RPA”.
The following is an overview of how RPD proceeds to analyze the PC interaction log. The references are shown at the end.
Please note that this is a simplified version based on my own understanding. It should also be noted that the above researchers have recently started to call the approach RPM (Robotic Process Mining) instead of RPD (Robotic Process Discovery), but I will use RPD in this article.
1 Collection and storage of PC interaction logs
A sensor (a lightweight JavaScript program) installed on each PC of the users to be analyzed detects detailed activities such as launching applications, opening files, pressing keys, and clicking the mouse, and sends the data to a designated server, where it is stored as a PC operation log.
The detailed activity captured by the sensor is called “atomic activity” because it is the smallest unit that cannot be decomposed any further.
2 Data Extraction and Noise Filtering
PC operation logs are very detailed; each record is a so-called atomic activity. What’s more, they contain a lot of noise that cannot be analyzed meaningfully, such as activities that were later corrected due to user error.
Therefore, after extracting the PC interaction log data based on certain conditions (target period, target PCs, etc.), it is necessary to remove (filter) the noise first. In addition, if the notation in the recorded data differs slightly even for the same application, it will be treated as a different application, so we also unify the notation, correct garbled characters, and perform various other data processing besides noise removal. This work is commonly referred to as “data preparation”.
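A minimal sketch of this data preparation step, assuming hypothetical application-name variants and noise markers (none of these values come from a real tool):

```python
# Sketch of data preparation: unifying application-name notation and
# filtering noise records. The name variants and noise actions are
# hypothetical, for illustration only.

APP_ALIASES = {"EXCEL.EXE": "Excel", "excel": "Excel", "MSEDGE.EXE": "Browser"}
NOISE_ACTIONS = {"window_focus_lost", "screensaver"}

def prepare(log: list) -> list:
    """Normalize app names and drop noise events."""
    cleaned = []
    for event in log:
        if event["action"] in NOISE_ACTIONS:
            continue  # noise record: drop it
        event = dict(event, app=APP_ALIASES.get(event["app"], event["app"]))
        cleaned.append(event)
    return cleaned

raw = [
    {"app": "EXCEL.EXE",  "action": "open_file"},
    {"app": "excel",      "action": "screensaver"},  # noise: dropped
    {"app": "MSEDGE.EXE", "action": "copy"},
]
print(prepare(raw))
```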
3 Task Segmentation
In RPD, “segmentation” means isolating, from the PC operation log, groups of operations that are assumed to follow a certain procedure. For example, “copying information displayed on a browser screen and pasting it into an Excel file” is a series of operations performed to accomplish some purpose, a so-called “routine task”.
Unlike business systems (e.g., procurement systems) with pre-built procedures, PC operation leaves a high degree of freedom to the user. At a glance, PC operation logs look as if the user is just moving between various applications and files at will, and the business procedure is not clear.
Therefore, it is necessary to perform “task segmentation”, that is, to isolate only the data related to a single task from the PC operation log.
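As a toy illustration of segmentation, the sketch below splits the log wherever the gap between consecutive operations exceeds a threshold; actual RPD research uses far more sophisticated techniques, so this only conveys the idea of isolating task-related spans from a free-form log.

```python
# Sketch: one simple segmentation heuristic. Splits the operation log
# into task candidates at long idle gaps. Assumes a non-empty log of
# (timestamp_seconds, operation) tuples sorted by time.

def segment(events: list, max_gap_s: int = 300) -> list:
    """Split events into segments wherever the idle gap exceeds max_gap_s."""
    segments, current = [], [events[0]]
    for prev, ev in zip(events, events[1:]):
        if ev[0] - prev[0] > max_gap_s:
            segments.append(current)  # long pause: close the segment
            current = []
        current.append(ev)
    segments.append(current)
    return segments

log = [(0, "open_excel"), (20, "copy"), (40, "paste"),
       (2000, "open_mail"), (2010, "send")]
print(len(segment(log)))  # 2 segments: the copy-paste task and the mail task
```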
4 Task Simplification
The tasks extracted by segmentation, such as data transcription, still contain some noise, much of it caused by user mistakes or parallel operations in other applications. Once this noise is removed, the steps of each PC operation become clearly understandable. The aforementioned example reveals the following clear flow of steps:
Excel File Open (Excel) ⇒ Data Display Screen Access (Browser) ⇒ Data Copy (Browser) ⇒ Paste (Excel) ⇒ Data Display Screen Access (Browser) ⇒ Data Copy (Browser) ⇒ Paste (Excel)…
This finishing step, which makes the procedure clearly understandable, is called “task simplification”.
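A minimal sketch of this simplification, assuming a hypothetical list of task-unrelated operations to strip out:

```python
# Sketch of task simplification: removing operations unrelated to the task
# (e.g. a parallel mail check, an undone mistake) so the clean step flow of
# the copy-and-paste routine emerges. The "unrelated" set is hypothetical.

UNRELATED = {"check_mail", "undo"}

def simplify(steps: list) -> list:
    """Drop task-unrelated operations, keeping the original order."""
    return [s for s in steps if s not in UNRELATED]

raw_task = ["open_excel", "access_browser", "check_mail",
            "copy_browser", "paste_excel"]
print(simplify(raw_task))
# ['open_excel', 'access_browser', 'copy_browser', 'paste_excel']
```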
5 Identification of candidate tasks which can be automated
From the PC interaction log data extracted for analysis, we were able to isolate multiple tasks through task segmentation and clearly understand the flow of each task through task simplification.
The next step is to consider which of these tasks are suitable for RPA automation and whether they are likely to produce a reasonable payoff. At this stage, it is advisable to interview in detail the person in the field who actually performs the candidate task. (In reality, even the task segmentation and task simplification stages can be done more quickly with the help of field personnel.)
6 Automatable procedure discovery
This is the stage where the scope of automation with RPA is determined. The tasks identified in the previous step as better suited for automation are not necessarily automatable from beginning to end.
So, we further narrow down the steps that can be automated. For example, if the procedure of the automation candidate identified in the previous step is [A ⇒ B ⇒ C ⇒ D ⇒ E ⇒ F], then only [C ⇒ D ⇒ E ⇒ F] may be automated (A ⇒ B remains manual as it is now).
7 Create specifications for automation procedures
Once you have narrowed down the steps that can be automated, consider the requirements for automatic execution of the task by the chosen RPA tool and create a “basic design document” for the programming in the next step.
8 RPA programming
Here, the actual automation procedure is written in an RPA tool. After testing in the production environment and verifying that it works without problems, the RPA robot is ready to go live.
Although it may not be easy to picture RPD from a verbal explanation alone, the above is the general procedure of RPD.
Whatever you call it, RPD, RPM, or task mining, the main focus is to develop various improvement measures with the primary objective of improving productivity through visualization of the work done on each PC. The specific improvement measures vary, but I hope it is clear that RPD is an analysis method whose basic purpose is, in particular, “task automation”.
It should also be emphasized that RPD, or task mining, is most effective when combined with “process mining”, which visualizes business processes across multiple departments.
[References]
– Robotic Process Mining: Vision and Challenges, by Volodymyr Leno, Artem Polyvyanyy, Marlon Dumas, Marcello La Rosa, and Fabrizio Maria Maggi
– Discovering Automatable Routines From User Interaction Logs, by Antonio Bosco, Adriano Augusto, Marlon Dumas, Marcello La Rosa, and Giancarlo Fortino
– AI for Business Process Management: From Process Mining to Automated Process Improvement, by Marlon Dumas, University of Tartu Institute of Computer Science
“Process mining” was born in the late 1990s and last year turned 20 years old. In 2019, a new concept called “task mining” appeared.
In this article, I would like to sort out the differences in purpose and positioning among process mining, task mining, and a similar solution, “SIEM (Security Information and Event Management)”.
First, the difference between process mining and task mining. In simple terms, the data analyzed is different.
Process mining analyzes the event logs (transaction data) recorded and accumulated in business systems such as ERP, CRM, and SFA. Records are created for activities such as “purchase request” and “purchase approval” when the system’s “send” or “update” button is pressed, so the granularity is coarse, capturing only the “milestones” of the business.
Task mining, on the other hand, targets the detailed operations on the PCs that employees use individually, specifically the “PC operation log” that records application launches, file opens, mouse clicks, copy and paste, and so on. Compared to the event log extracted from a business system, this is “atomic” data that cannot be decomposed any further and can be analyzed at the task level. Since these PC operation logs are not recorded anywhere by default, a mechanism is required that installs software called a sensor or agent on each PC to be analyzed, actively captures PC operations as data, and accumulates them on a server.
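The granularity difference can be illustrated with hypothetical records showing the typical shape of the two log types (the field names and values are invented for illustration):

```python
# Illustration of the granularity difference described above: one coarse
# business-system event per milestone versus many atomic PC operations.
# All records are hypothetical.

business_event_log = [  # process mining input: one row per milestone
    {"case_id": "PO-001", "activity": "purchase request",  "ts": "09:00"},
    {"case_id": "PO-001", "activity": "purchase approval", "ts": "11:30"},
]

pc_operation_log = [    # task mining input: atomic operations
    {"user": "taro", "app": "Browser", "action": "click", "ts": "09:00:01"},
    {"user": "taro", "app": "Excel",   "action": "paste", "ts": "09:00:05"},
    {"user": "taro", "app": "Excel",   "action": "save",  "ts": "09:00:09"},
]

# Two milestones in the business system, many more atomic operations behind them:
print(len(business_event_log), len(pc_operation_log))
```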
“SIEM” is a similar solution adjacent to process mining and task mining. It analyzes security logs, network devices, and the various logs remaining on servers to find security-related issues such as cyber attacks and data leaks, and to manage IT devices as assets.
Now, since these solutions basically analyze data generated in the “workplace”, they can broadly be placed in the framework of “Workplace Analytics”.
Now let’s position process mining, task mining, SIEM, and their key solutions within the framework of workplace analytics. (See the figure below.)
Look at the double arrow at the bottom of the figure. Process mining is “process improvement oriented”, while SIEM is “risk aversion and management oriented”. Task mining sits in the middle: because it captures an employee’s entire day of work on the PC, it can also be used for attendance management. (Process mining, which analyzes only the operations performed on business systems, cannot grasp the entire day’s work.)
In addition, process mining and task mining can be grouped under the framework of “process intelligence”, while SIEM is not included because it does not analyze “processes”.
Process mining can be called “DX-driven” because it is effective for company-wide process reform approached from the viewpoint of digital transformation (DX), while task mining can be called “RPA-driven” because it ultimately aims at task-level automation with RPA.
Let’s look at the key solutions in each category. At this time (February 2020), the two key players in the Japanese process mining market are Celonis and myInvenio. Both are enterprise solutions with rich functionality and excellent operability, and the number of adopting companies, especially large enterprises, is increasing. Recently, both tools have added a “task mining function”. By being able to create flowcharts (process models) not only from the event log data of business systems but also from PC operation logs, they can be said to meet the analysis needs of RPA initiatives aiming at task-level automation.
In the task mining category, Heartcore, myInvenio’s sole agent in Japan, provides Heartcore Task Mining. In addition, MeeCap, which has a track record of adoption in the banking industry, has begun to expand into process mining functionality that analyzes event logs from ERP and other sources.
In the SIEM category, Splunk and Skysea View are well known. Splunk has added a process flowchart function, although it appears that analysis cannot be performed until an event log is imported.