Similar Documents
20 similar documents were found (search time: 31 ms).
1.
Many present-day companies carry out a huge amount of daily operations through the use of their information systems without ever having done their own enterprise modeling. Business process mining is a well-proven solution which is used to discover the underlying business process models that are supported by existing information systems. Business process discovery techniques employ event logs as input, which are recorded by process-aware information systems. However, a wide variety of traditional information systems do not have any in-built mechanisms with which to collect events (representing the execution of business activities). Various mechanisms with which to collect events from non-process-aware information systems have been proposed in order to enable the application of process mining techniques to traditional information systems. Unfortunately, since business processes supported by traditional information systems are implicitly defined, correlating events into the appropriate process instance is not trivial. This challenge is known as the event correlation problem. This paper presents an adaptation of an existing event correlation algorithm and incorporates it into a technique in order to collect event logs from the execution of traditional information systems. The technique first instruments the source code to collect events together with some candidate correlation attributes. Based on several well-known design patterns, the technique provides a set of guidelines to support experts when instrumenting the source code. The event correlation algorithm is subsequently applied to the data set of events to discover the best correlation conditions, which are then used to create event logs. The technique has been semi-automated to facilitate its validation through an industrial case study involving a writer management system and a healthcare evaluation system. The study demonstrates that the technique is able to discover an appropriate correlation set and obtain well-formed event logs, thus enabling business process mining techniques to be applied to traditional information systems.  相似文献   
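The correlation step can be pictured with a minimal Python sketch (not the paper's algorithm): events emitted by instrumented code carry candidate correlation attributes, each candidate is scored by the case partition it induces, and the best one is used to assemble traces. The toy events, field names and scoring heuristic are all invented for illustration.

```python
from collections import defaultdict

# Toy events as they might be emitted by instrumented source code: each carries
# the executed activity, a timestamp, and several candidate correlation attributes.
events = [
    {"activity": "Register", "ts": 1, "order_id": "o1", "customer": "c1"},
    {"activity": "Check",    "ts": 2, "order_id": "o1", "customer": "c1"},
    {"activity": "Register", "ts": 3, "order_id": "o2", "customer": "c1"},
    {"activity": "Ship",     "ts": 4, "order_id": "o1", "customer": "c1"},
    {"activity": "Check",    "ts": 5, "order_id": "o2", "customer": "c1"},
    {"activity": "Ship",     "ts": 6, "order_id": "o2", "customer": "c1"},
]

def partition_by(attribute, evs):
    """Group events into candidate cases using one correlation attribute."""
    cases = defaultdict(list)
    for e in evs:
        cases[e[attribute]].append(e)
    return cases

def partition_score(cases, evs):
    """Naive quality score: prefer partitions that are neither one giant case
    nor all singletons (both are symptoms of a bad correlation attribute)."""
    sizes = [len(c) for c in cases.values()]
    biggest = max(sizes) / len(evs)            # 1.0 -> everything in one case
    singletons = sum(s == 1 for s in sizes) / len(cases)
    return 1.0 - 0.5 * biggest - 0.5 * singletons

candidates = ["order_id", "customer"]
best = max(candidates, key=lambda a: partition_score(partition_by(a, events), events))
print("chosen correlation attribute:", best)

# Build an event log: one trace (sorted by timestamp) per discovered case.
log = {cid: [e["activity"] for e in sorted(evs, key=lambda e: e["ts"])]
       for cid, evs in partition_by(best, events).items()}
print(log)   # {'o1': ['Register', 'Check', 'Ship'], 'o2': [...]}
```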

2.
Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring of processes within PAIS. These approaches include process mining techniques used to discover process models from event logs, find log and model deviations, and analyze performance characteristics of processes. The representational bias (a way to model processes) plays an important role in process mining. The BPMN 2.0 (Business Process Model and Notation) standard is widely used and makes it possible to build conventional and understandable process models. In addition to the flat control flow perspective, subprocesses, data flows, and resources can be integrated within one BPMN diagram. This makes BPMN very attractive for both process miners and business users, since the control flow perspective can be integrated with data and resource perspectives discovered from event logs. In this paper, we describe and justify robust control flow conversion algorithms, which provide the basis for more advanced BPMN-based discovery and conformance checking algorithms. Thus, on the basis of these conversion algorithms, low-level models (such as Petri nets, causal nets and process trees) discovered from event logs using existing approaches can be represented in terms of BPMN. Moreover, we establish behavioral relations between Petri nets and BPMN models and use them to adopt existing conformance checking and performance analysis techniques in order to visualize conformance and performance information within a BPMN diagram. We believe that the results presented in this paper can be used for a wide variety of BPMN mining and conformance checking algorithms. We also provide metrics for the processes discovered before and after the conversion to BPMN structures. Cases for which conversion algorithms produce more compact or more complicated BPMN models in comparison with the initial models are identified.  相似文献   
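The flavour of such control-flow conversions can be illustrated with a deliberately simplified sketch (not the authors' conversion algorithms): labelled transitions become BPMN tasks, a place with several outgoing transitions suggests an exclusive (XOR) gateway, and a transition producing tokens on several places suggests a parallel (AND) gateway. The dictionary encoding and the naming convention for places are assumptions made for this example only.

```python
# Minimal sketch of the usual control-flow intuition behind Petri-net-to-BPMN
# conversion (not the authors' algorithm): labelled transitions become tasks,
# a place with several outgoing transitions corresponds to an exclusive (XOR)
# choice, and a transition producing tokens on several places corresponds to a
# parallel (AND) split.
petri_net = {
    # place -> transitions consuming from it
    "p_start": ["Register"],
    "p_choice": ["Approve", "Reject"],           # XOR split candidate
    "p_a": ["Archive"], "p_b": ["Notify"],
    # transition -> places it produces tokens on
    "Register": ["p_choice"],
    "Approve": ["p_a", "p_b"],                    # AND split candidate
    "Reject": ["p_end"], "Archive": ["p_end2"], "Notify": ["p_end3"],
}

def classify(net):
    tasks, xor_splits, and_splits = [], [], []
    for node, succs in net.items():
        if node.startswith("p_"):                 # place
            if len(succs) > 1:
                xor_splits.append((node, succs))
        else:                                     # transition
            tasks.append(node)
            if len(succs) > 1:
                and_splits.append((node, succs))
    return tasks, xor_splits, and_splits

tasks, xors, ands = classify(petri_net)
print("BPMN tasks:", tasks)
print("exclusive gateways after:", xors)   # [('p_choice', ['Approve', 'Reject'])]
print("parallel gateways after:", ands)    # [('Approve', ['p_a', 'p_b'])]
```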

3.
Process mining can build business process models from the event logs generated by enterprise information systems. When the actual business process changes, deviations arise between the process model and the event log, and the process model then needs to be repaired. For process models containing parallel structures, the precision of some existing repair methods decreases because self-loops and invisible transitions are introduced. A repair method for process models with parallel structures, based on logic Petri nets and token replay, is therefore proposed. First, the insertion position of a sub-model is determined from the relationship between the sub-model's input and output places and the log; then the location of the deviation is determined through token replay; finally, the process model is repaired with the proposed logic-Petri-net-based method. Simulation experiments on the ProM platform verify the correctness and effectiveness of the method, which is compared with the Fahland and other methods. The results show that the precision of the proposed method reaches about 85%, which is 17 and 11 percentage points higher than the Fahland and Goldratt methods, respectively. In terms of simplicity, the proposed algorithm adds no self-loops or invisible transitions, whereas both the Fahland and Goldratt methods add invisible transitions and self-loops. The fitness of all three methods is above 0.9, with the Goldratt method slightly lower. These results demonstrate that the model repaired by the proposed method has higher fitness and precision.  相似文献   
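Token replay itself, used above to locate deviations, can be sketched on a plain (non-logic) Petri net: the trace is replayed transition by transition, missing tokens mark deviation positions, and the classical fitness formula combines missing and remaining tokens. The net, the traces and the force-firing policy are illustrative only and do not reproduce the proposed logic-Petri-net repair method.

```python
from collections import Counter

# A tiny workflow net: transition -> (input places, output places).
net = {
    "Register": ({"start"}, {"p1"}),
    "Check":    ({"p1"}, {"p2"}),
    "Ship":     ({"p2"}, {"end"}),
}

def replay(trace, net, initial="start", final="end"):
    """Replay one trace, reporting missing tokens (deviation locations)."""
    marking = Counter({initial: 1})
    produced = consumed = 1          # account for the initial and final token
    missing, deviations = 0, []
    for act in trace:
        ins, outs = net[act]
        for p in ins:                # force-fire: note missing tokens if needed
            if marking[p] == 0:
                missing += 1
                deviations.append((act, p))
            else:
                marking[p] -= 1
            consumed += 1
        for p in outs:
            marking[p] += 1
            produced += 1
    marking[final] -= 1              # consume the final token
    remaining = sum(v for v in marking.values() if v > 0)
    fitness = 0.5 * (1 - missing / consumed) + 0.5 * (1 - remaining / produced)
    return fitness, deviations

print(replay(["Register", "Check", "Ship"], net))   # (1.0, [])
print(replay(["Register", "Ship"], net))            # deviation at ('Ship', 'p2')
```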

4.
With organisations facing significant challenges to remain competitive, Business Process Improvement (BPI) initiatives are often conducted to improve the efficiency and effectiveness of their business processes, focussing on time, cost, and quality improvements. Event logs which contain a detailed record of business operations over a certain time period, recorded by an organisation's information systems, are the first step towards initiating evidence-based BPI activities. Given an (original) event log as a starting point, an approach to explore better ways to execute a business process was developed, resulting in an improved (perturbed) event log. Identifying the differences between the original event log and the perturbed event log can provide valuable insights, helping organisations to improve their processes. However, there is a lack of automated techniques and appropriate visualisations to detect the differences between two event logs. Therefore, this research aims to develop visualisation techniques to provide targeted analysis of resource reallocation and activity rescheduling. The differences between two event logs are first identified. The changes between the two event logs are conceptualised and realised with a number of visualisations. With the proposed visualisations, analysts are able to identify resource- and time-related changes that resulted in a cost reduction, and subsequently investigate and translate them into actionable items for BPI in practice. Ultimately, analysts can make use of this comparative information to initiate evidence-based BPI activities.  相似文献   
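A minimal sketch of the kind of log-to-log comparison such visualisations are built on (not the proposed technique itself): per activity, the resources used and the mean duration are summarised in both logs and their differences reported. The event structure and all values are invented.

```python
from collections import defaultdict
from statistics import mean

# Two toy logs: each event = (case, activity, resource, duration in minutes).
# "original" is the recorded log, "perturbed" the improved one.
original = [("c1", "Assess", "Alice", 40), ("c1", "Approve", "Bob", 20),
            ("c2", "Assess", "Alice", 45), ("c2", "Approve", "Bob", 25)]
perturbed = [("c1", "Assess", "Carol", 30), ("c1", "Approve", "Bob", 20),
             ("c2", "Assess", "Carol", 32), ("c2", "Approve", "Bob", 22)]

def summarize(log):
    """Per activity: set of resources used and mean duration."""
    res, dur = defaultdict(set), defaultdict(list)
    for _, act, resource, minutes in log:
        res[act].add(resource)
        dur[act].append(minutes)
    return {a: (res[a], mean(dur[a])) for a in res}

before, after = summarize(original), summarize(perturbed)
for act in before:
    r0, d0 = before[act]
    r1, d1 = after.get(act, (set(), float("nan")))
    if r0 != r1:
        print(f"{act}: resources reallocated {sorted(r0)} -> {sorted(r1)}")
    print(f"{act}: mean duration {d0:.1f} -> {d1:.1f} min ({d1 - d0:+.1f})")
```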

5.
Process discovery, as one of the most challenging process analysis techniques, aims to uncover business process models from event logs. Many process discovery approaches were invented in the past twenty years; however, most of them have difficulties in handling multi-instance sub-processes. To address this challenge, we first introduce a multi-instance business process model (MBPM) to support the modeling of processes with multiple sub-process instantiations. Formal semantics of MBPMs are precisely defined by using multi-instance Petri nets (MPNs) that are an extension of Petri nets with distinguishable tokens. Then, a novel process discovery technique is developed to support the discovery of MBPMs from event logs with sub-process multi-instantiation information. In addition, we propose to measure the quality of the discovered MBPMs against the input event logs by transforming an MBPM to a classical Petri net such that existing quality metrics, e.g., fitness and precision, can be used. The proposed discovery approach is properly implemented as plugins in the ProM toolkit. Based on a cloud resource management case study, we compare our approach with the state-of-the-art process discovery techniques. The results demonstrate that our approach outperforms existing approaches to discover process models with multi-instance sub-processes.   相似文献   

6.
It is increasingly common to see computer-based simulation being used as a vehicle to model and analyze business processes in relation to process management and improvement. While there are a number of business process management (BPM) and business process simulation (BPS) methodologies, approaches and tools available, it is more desirable to have a systemic BPS approach for operational decision support, from constructing process models based on historical data to simulating processes for typical and common problems. In this paper, we have proposed a generic approach of BPS for operational decision support which includes business processes modeling and workflow simulation with the models generated. Processes are modeled with event graphs through process mining from workflow logs that have integrated comprehensive information about the control-flow, data and resource aspects of a business process. A case study of a credit card application is presented to illustrate the steps involved in constructing an event graph. The evaluation detail is also given in terms of precision, generalization and robustness. Based on the event graph model constructed, we simulate the process under different scenarios and analyze the simulation logs for three generic problems in the case study: 1) suitable resource allocation plan for different case arrival rates; 2) teamwork performance under different case arrival rates; and 3) evaluation and prediction for personal performances. Our experimental results show that the proposed approach is able to model business processes using event graphs and simulate the processes for common operational decision support which collectively play an important role in process management and improvement.  相似文献   
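The scenario-analysis idea can be illustrated with a toy discrete-event simulation, assuming a single activity served by a pool of interchangeable resources; it merely shows how mean waiting time reacts to different case arrival rates and resource levels, and is not the event-graph simulator described above.

```python
import random
from statistics import mean

def simulate(arrival_rate, n_resources, service_mean=4.0, n_cases=2000, seed=1):
    """Tiny discrete-event simulation of a single-activity process:
    cases arrive with exponential inter-arrival times and wait for one of
    n_resources (FCFS); returns the mean waiting time."""
    random.seed(seed)
    t, free_at, waits = 0.0, [0.0] * n_resources, []
    for _ in range(n_cases):
        t += random.expovariate(arrival_rate)           # next case arrival
        start = max(t, min(free_at))                     # earliest free resource
        waits.append(start - t)
        i = free_at.index(min(free_at))
        free_at[i] = start + random.expovariate(1 / service_mean)
    return mean(waits)

for rate in (0.2, 0.4, 0.6):                             # cases per minute
    for k in (2, 3, 4):
        print(f"arrival rate {rate}/min, {k} resources: "
              f"mean wait {simulate(rate, k):.1f} min")
```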

7.
An automated process discovery technique generates a process model from an event log recording the execution of a business process. For it to be useful, the generated process model should be as simple as possible, while accurately capturing the behavior recorded in, and implied by, the event log. Most existing automated process discovery techniques generate flat process models. When confronted with large event logs, these approaches lead to overly complex or inaccurate process models. An alternative is to apply a divide-and-conquer approach by decomposing the process into stages and discovering one model per stage. It turns out, however, that existing divide-and-conquer process discovery approaches often produce less accurate models than flat discovery techniques, when applied to real-life event logs. This article proposes an automated method to identify business process stages from an event log and an automated technique to discover process models based on a given stage-based process decomposition. An experimental evaluation shows that: (i) relative to existing automated process decomposition methods in the field of process mining, the proposed method leads to stage-based decompositions that are closer to decompositions derived by human experts; and (ii) the proposed stage-based process discovery technique outperforms existing flat and divide-and-conquer discovery techniques with respect to well-accepted measures of accuracy and achieves comparable results in terms of model complexity.  相似文献   
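A simplified sketch of stage-based discovery, assuming the stage boundaries are already known (the article derives them automatically): each trace is split at boundary activities and a small directly-follows model is discovered per stage. The log, the boundary choice and the use of a directly-follows graph as the "model" are all illustrative simplifications.

```python
from collections import defaultdict

log = [
    ["Apply", "CheckDocs", "Interview", "Offer", "Sign"],
    ["Apply", "CheckDocs", "CheckDocs", "Interview", "Reject"],
]
# Assume the stage boundary is known (the paper derives boundaries automatically).
boundaries = {"Interview"}          # everything before -> stage 0, after -> stage 1

def split_stages(trace, boundaries):
    stages, current = [], []
    for act in trace:
        if act in boundaries and current:
            stages.append(current)
            current = []
        current.append(act)
    stages.append(current)
    return stages

def directly_follows(traces):
    dfg = defaultdict(int)
    for t in traces:
        for a, b in zip(t, t[1:]):
            dfg[(a, b)] += 1
    return dict(dfg)

stage_traces = defaultdict(list)
for trace in log:
    for i, sub in enumerate(split_stages(trace, boundaries)):
        stage_traces[i].append(sub)

for i, traces in stage_traces.items():
    print(f"stage {i}: {directly_follows(traces)}")   # one small model per stage
```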

8.
In cross-enterprise, cross-system environments, process data are usually recorded in separate event logs, which makes it impossible to mine the complete end-to-end execution process. This algorithm therefore merges logs using only event names and timestamp attributes. First, the process models of the two systems are obtained, together with a merged model derived from the cross-system "follows" dependencies between activities. Next, the process instances of the two systems are merged one-to-one and sorted by timestamp, keeping only the merged instances whose paths are consistent with the merged model. One-to-one instance pairs are then obtained from these instances, i.e., a unique main process instance can only be merged with a unique sub-process instance. Temporal constraints between activities are then mined from these instance pairs and used to merge the remaining logs. The last two steps are repeated until all logs are merged or no further one-to-one merging is possible. The algorithm was evaluated on real-life event logs, achieving satisfactory merging results with high precision and recall.  相似文献   
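The merge-and-validate step can be sketched for a single candidate pair of instances, using only activity names and timestamps as described above; the one-to-one pairing, the mined temporal constraints and the iteration over the remaining logs are not reproduced. All activity names and the allowed-follows relation are invented.

```python
# One case recorded separately in two systems; only activity names and
# timestamps are assumed to be available (as in the described algorithm).
main_trace = [("Order received", 1), ("Order approved", 4), ("Order closed", 9)]
sub_trace  = [("Pick items", 5), ("Ship items", 7)]

# Cross-system "follows" dependencies of the assumed merged model.
allowed = {("Order received", "Order approved"), ("Order approved", "Pick items"),
           ("Pick items", "Ship items"), ("Ship items", "Order closed")}

def merge_by_timestamp(a, b):
    return sorted(a + b, key=lambda ev: ev[1])

def consistent(merged, allowed):
    """Keep a merged instance only if every directly-follows pair is allowed."""
    return all((x[0], y[0]) in allowed for x, y in zip(merged, merged[1:]))

merged = merge_by_timestamp(main_trace, sub_trace)
print(merged)
print("consistent with merged model:", consistent(merged, allowed))
```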

9.
Nowadays, business processes are increasingly supported by IT services that produce massive amounts of event data during the execution of a process. These event data can be used to analyze the process using process mining techniques to discover the real process, measure conformance to a given process model, or to enhance existing models with performance information. Mapping the produced events to activities of a given process model is essential for conformance checking, annotation and understanding of process mining results. In order to accomplish this mapping with low manual effort, we developed a semi-automatic approach that maps events to activities using insights from behavioral analysis and label analysis. The approach extracts Declare constraints from both the log and the model to build matching constraints to efficiently reduce the number of possible mappings. These mappings are further reduced using techniques from natural language processing, which allow for a matching based on labels and external knowledge sources. The evaluation with synthetic and real-life data demonstrates the effectiveness of the approach and its robustness toward non-conforming execution logs.  相似文献   
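A fragment of the label-analysis side can be sketched with a crude string-similarity matcher; the behavioral pruning with Declare constraints and the use of external knowledge sources described above are not reproduced, and all event-class and activity names are invented.

```python
from difflib import SequenceMatcher

# Low-level event classes produced by the IT services vs. activities of the
# given process model (all names invented).
event_classes = ["createCustOrder", "custOrderApproved", "shipGoodsMsg"]
activities = ["Create Customer Order", "Approve Customer Order", "Ship Goods"]

def label_similarity(event, activity):
    """Crude label-based score; the described approach additionally prunes
    candidate mappings with Declare constraints extracted from log and model."""
    return SequenceMatcher(None, event.lower(),
                           activity.replace(" ", "").lower()).ratio()

for ev in event_classes:
    scores = sorted(((label_similarity(ev, a), a) for a in activities), reverse=True)
    best_score, best_activity = scores[0]
    print(f"{ev:20s} -> {best_activity:25s} (score {best_score:.2f})")
```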

10.
11.
Service processes, for example in transportation, telecommunications or the health sector, are the backbone of today's economies. Conceptual models of service processes enable operational analysis that supports, e.g., resource provisioning or delay prediction. In the presence of event logs containing recorded traces of process execution, such operational models can be mined automatically. In this work, we target the analysis of resource-driven, scheduled processes based on event logs. We focus on processes for which there exists a pre-defined assignment of activity instances to resources that execute activities. Specifically, we approach the questions of conformance checking (how to assess the conformance of the schedule and the actual process execution) and performance improvement (how to improve the operational process performance). The first question is addressed based on a queueing network for both the schedule and the actual process execution. Based on these models, we detect operational deviations and then apply statistical inference and similarity measures to validate the scheduling assumptions, thereby identifying root-causes for these deviations. These results are the starting point for our technique to improve the operational performance. It suggests adaptations of the scheduling policy of the service process to decrease the tardiness (non-punctuality) and lower the flow time. We demonstrate the value of our approach based on a real-world dataset comprising clinical pathways of an outpatient clinic that have been recorded by a real-time location system (RTLS). Our results indicate that the presented technique enables localization of operational bottlenecks along with their root-causes, while our improvement technique yields a decrease in median tardiness and flow time by more than 20%.  相似文献   
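The two performance measures targeted by the improvement technique can be illustrated with simplistic definitions (the paper's queueing-based analysis is not reproduced, and its exact definitions may differ): tardiness as the delay of the actual start against the schedule, flow time as the span from arrival to completion. All numbers are invented.

```python
from statistics import median

# Per appointment: arrival, scheduled start, actual start and actual end
# (minutes from the start of the day); all numbers are invented.
visits = [
    {"patient": "p1", "arrival": 55,  "scheduled": 60,  "start": 75,  "end": 110},
    {"patient": "p2", "arrival": 85,  "scheduled": 90,  "start": 112, "end": 140},
    {"patient": "p3", "arrival": 115, "scheduled": 120, "start": 118, "end": 150},
]

tardiness = [max(0, v["start"] - v["scheduled"]) for v in visits]  # non-punctuality
flow_time = [v["end"] - v["arrival"] for v in visits]              # arrival to completion

print("median tardiness:", median(tardiness), "min")   # 15 min
print("median flow time:", median(flow_time), "min")   # 55 min
```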

12.
A novel approach for process mining based on event types
Despite the omnipresence of event logs in transactional information systems (cf. WFM, ERP, CRM, SCM, and B2B systems), historic information is rarely used to analyze the underlying processes. Process mining aims at improving this by providing techniques and tools for discovering process, control, data, organizational, and social structures from event logs, i.e., the basic idea of process mining is to diagnose business processes by mining event logs for knowledge. Given its potential and challenges it is no surprise that recently process mining has become a vivid research area. In this paper, a novel approach for process mining based on two event types, i.e., START and COMPLETE, is proposed. Information about the start and completion of tasks can be used to explicitly detect parallelism. The algorithm presented in this paper overcomes some of the limitations of existing algorithms such as the α-algorithm (e.g., short-loops) and therefore enhances the applicability of process mining.
  相似文献   
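The key observation, that START/COMPLETE pairs turn activities into time intervals whose overlap reveals parallelism within a single trace, can be sketched as follows; this is only the underlying intuition, not the proposed mining algorithm.

```python
from itertools import combinations

# One case with lifecycle events: (activity, event type, timestamp).
trace = [("A", "START", 1), ("B", "START", 2), ("A", "COMPLETE", 4),
         ("B", "COMPLETE", 5), ("C", "START", 6), ("C", "COMPLETE", 8)]

def intervals(trace):
    """Pair each START with its COMPLETE to get one interval per activity instance."""
    open_, result = {}, []
    for act, kind, ts in trace:
        if kind == "START":
            open_[act] = ts
        else:
            result.append((act, open_.pop(act), ts))
    return result

def overlapping(trace):
    """Two activity instances that overlap in time are explicitly parallel."""
    iv = intervals(trace)
    return {(a, b) for (a, s1, e1), (b, s2, e2) in combinations(iv, 2)
            if s1 < e2 and s2 < e1}

print(overlapping(trace))   # {('A', 'B')} -> A and B ran concurrently
```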

13.
Process mining can be seen as the “missing link” between data mining and business process management. The lion's share of process mining research has been devoted to the discovery of procedural process models from event logs. However, often there are predefined constraints that (partially) describe the normative or expected process, e.g., “activity A should be followed by B” or “activities A and B should never be both executed”. A collection of such constraints is called a declarative process model. Although it is possible to discover such models based on event data, this paper focuses on aligning event logs and predefined declarative process models. Discrepancies between log and model are mediated such that observed log traces are related to paths in the model. The resulting alignments provide sophisticated diagnostics that pinpoint where deviations occur and how severe they are. Moreover, selected parts of the declarative process model can be used to clean and repair the event log before applying other process mining techniques. Our alignment-based approach for preprocessing and conformance checking using declarative process models has been implemented in ProM and has been evaluated using both synthetic logs and real-life logs from a Dutch hospital.  相似文献   
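Two typical Declare constraints and their evaluation on traces can be sketched as follows; note that the paper computes optimal alignments between traces and the declarative model rather than plain true/false checks, so this only illustrates the kind of constraints being aligned. The model and log are invented.

```python
def response(trace, a, b):
    """Declare 'response(a, b)': every occurrence of a is eventually followed by b."""
    return all(b in trace[i + 1:] for i, x in enumerate(trace) if x == a)

def not_coexistence(trace, a, b):
    """Declare 'not co-existence(a, b)': a and b never occur in the same trace."""
    return not (a in trace and b in trace)

# A small declarative model and a log (both invented for illustration).
model = [("response", "A", "B"), ("not_coexistence", "A", "C")]
checks = {"response": response, "not_coexistence": not_coexistence}

log = [["A", "D", "B"], ["A", "C", "B"], ["A", "D"]]
for trace in log:
    violated = [c for c in model if not checks[c[0]](trace, c[1], c[2])]
    print(trace, "->", "conforming" if not violated else f"violates {violated}")
```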

14.
Increasingly, business processes are being controlled and/or monitored by information systems. As a result, many business processes leave their “footprints” in transactional information systems, i.e., business events are recorded in so-called event logs. Process mining aims at improving this by providing techniques and tools for discovering process, control, data, organizational, and social structures from event logs, i.e., the basic idea of process mining is to diagnose business processes by mining event logs for knowledge. In this paper we focus on the potential use of process mining for measuring business alignment, i.e., comparing the real behavior of an information system or its users with the intended or expected behavior. We identify two ways to create and/or maintain the fit between business processes and supporting information systems: Delta analysis and conformance testing. Delta analysis compares the discovered model (i.e., an abstraction derived from the actual process) with some predefined process model (e.g., the workflow model or reference model used to configure the system). Conformance testing attempts to quantify the “fit” between the event log and some predefined process model. In this paper, we show that Delta analysis and conformance testing can be used to analyze business alignment as long as the actual events are logged and users have some control over the process.
  相似文献   
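A toy flavour of Delta analysis and conformance testing, assuming both the observed behavior and the predefined model are reduced to directly-follows pairs (a much cruder comparison than the techniques discussed above):

```python
def footprint(traces):
    """Directly-follows pairs observed in a set of traces."""
    df = set()
    for t in traces:
        df.update(zip(t, t[1:]))
    return df

# Observed behavior (event log) vs. the behavior allowed by a predefined model,
# here both reduced to directly-follows pairs for a crude comparison.
log = [["Register", "Check", "Decide"], ["Register", "Decide"]]
model_df = {("Register", "Check"), ("Check", "Decide")}

observed = footprint(log)
print("in log but not in model:", observed - model_df)   # unexpected behavior
print("in model but not in log:", model_df - observed)   # unused behavior
```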

15.
Context: The alignment degree existing between a business process and the supporting software systems strongly affects the performance of the business process execution. Methodologies and tools are needed for detecting the alignment level and keeping a business process aligned with the supporting software systems even when they evolve. Objective: This paper aims to provide an adequate support for managing such a kind of alignment and suggesting evolution actions if misalignment is detected. It proposes an approach including modeling and measuring activities for evaluating the alignment level and suggesting evolution activities, if needed. Method: The proposed approach is composed of three main phases. The first phase regards the modeling of business process and software systems supporting it by applying a modeling notation based on UML and adequately extended for representing business processes. The second phase concerns the evaluation of the alignment degree through the assessment of a set of metrics codifying the alignment concept. Finally, the last phase analyses the evaluation results for suggesting evolution activities if misalignment is detected. Results: The paper analyses the application of the proposed approach to a case study regarding a working business process and related software system. The obtained results provided useful suggestion for evolving the supporting software system and improving the alignment level existing between them and the supported business process. Conclusion: The approach contributes in all phases of the process and software system evolution, even if its improvement can be needed for identifying the impact of the changes. The proposed approach facilitates the understanding of business processes, software systems and related models. This favors the interaction of the software and business analysts, as it was possible to better formulate the interviews to be conducted with regard to the objectives and, thus, to collect the required data.  相似文献   

16.
In conformance checking for business process discovery, existing multi-perspective alignment methods between an event log and a process model can only obtain the optimal alignment of one trace with the process model at a time; moreover, the heuristic function used in computing optimal alignments is expensive to evaluate, so optimal alignments are computed inefficiently. To address this, a multi-perspective alignment method between batches of traces from an event log and a process model, based on the minimum edit distance of traces, is proposed. First, multiple traces are selected from the event log to form a batch, and a process mining algorithm is used to obtain a log model of the batch. The product model of the log model and the process model, together with its transition system, is then constructed as the search space for the batch of traces. Next, a heuristic function based on the minimum edit distance between the set of Petri-net transition sequences and the remaining trace is designed to speed up the A* algorithm. Finally, a multi-perspective cost function with adjustable weights for the data and resource perspectives is designed, and a multi-perspective optimal alignment method between each trace in the batch and the process model is proposed over the transition system of the product model. Simulation results show that, compared with existing work, the proposed method uses less memory and less running time when computing multi-perspective alignments between batches of traces and a process model. It speeds up the computation of the heuristic function for optimal alignments and can obtain all optimal alignments of a batch of traces at once, thereby improving the efficiency of multi-perspective alignment between event logs and process models.  相似文献   
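The adjustable multi-perspective cost can be sketched as a weighted edit distance between one trace and one linear model run; the actual method works on the transition system of a product model, uses an A* search with an edit-distance heuristic, and handles batches of traces, none of which is reproduced here. The event structure, the weights and the linear "model run" are assumptions for this example.

```python
# Minimal multi-perspective alignment cost between one trace and one model run
# (a linear sequence of expected steps), computed as a weighted edit distance.
trace = [{"act": "Register", "res": "clerk"}, {"act": "Pay", "res": "manager"}]
model_run = [{"act": "Register", "res": "clerk"}, {"act": "Pay", "res": "clerk"}]

W_DATA_RES = 0.5      # adjustable weight of the resource/data perspective
MOVE_COST = 1.0       # cost of a move on log / move on model only

def sync_cost(ev, step):
    if ev["act"] != step["act"]:
        return None                        # no synchronous move possible
    return W_DATA_RES * (ev["res"] != step["res"])

def align_cost(trace, run):
    n, m = len(trace), len(run)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * MOVE_COST
    for j in range(1, m + 1):
        d[0][j] = j * MOVE_COST
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            best = min(d[i - 1][j] + MOVE_COST,        # move on log only
                       d[i][j - 1] + MOVE_COST)        # move on model only
            sc = sync_cost(trace[i - 1], run[j - 1])
            if sc is not None:
                best = min(best, d[i - 1][j - 1] + sc)
            d[i][j] = best
    return d[n][m]

print(align_cost(trace, model_run))   # 0.5: activities match, one resource deviates
```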

17.
Traditional process mining techniques offer limited possibilities to analyze business processes working in low-predictable and dynamic environments. Recently, to close this gap, declarative process models have been introduced to represent process mining results since they allow for describing complex behaviors as a compact set of business rules. However, in this context, activities of a business process are still considered as atomic/instantaneous events. This is a strong limitation for these approaches because often, in realistic environments, process activities are not instantaneous but executed across a time interval and pass through a sequence of states of a lifecycle. This paper investigates how the existing techniques for the discovery of declarative process models can be adapted when the business process under analysis contains non-atomic activities. In particular, we base our proposed approach on the use of discriminative rule mining to determine how the characteristics of the activity lifecycles in a business process influence the validity of a business rule in that process. The approach has been implemented as a plug-in of the process mining tool ProM and validated on synthetic logs and on a real-life log recorded by an incident and problem management system called VINST in use at Volvo IT Belgium.  相似文献   

18.
With the promotion of Big Data technologies and applications, research on business process management (BPM) has gradually deepened to consider the impacts and challenges that big business data poses for existing BPM technologies. Recently, parallel business process mining (e.g. discovering business models from business visual data, integrating runtime business data with interactive business process monitoring visualisation systems, and summarising and visualising historical business data for further analysis) and multi-perspective business data analytics (e.g. pattern detection, decision-making and process behaviour prediction) have been intensively studied, given the steep increase in business data size and variety. However, comprehensive and in-depth testing is needed to ensure their quality. Testing based solely on existing business processes and their system logs is far from sufficient; large-scale randomly generated models and corresponding complete logs should be used in testing. To test parallel algorithms for discovering process models, different notions of log completeness and log generation algorithms have been proposed. However, they suffer either from state-space explosion or from incomplete coverage of task dependencies. Moreover, most existing generation algorithms rely on a random execution strategy, which leads to low and unstable efficiency. In this paper, we propose a novel log completeness type, #TAR completeness, together with its generation algorithm. Experimental results on a series of randomly generated process models show that #TAR complete logs outperform the state-of-the-art ones with smaller size, fuller coverage of task dependencies and higher generation efficiency.  相似文献   
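The underlying idea, generating traces until every transition adjacency relation (directly-follows pair) of the model is covered instead of enumerating all complete traces, can be sketched as below; the formal #TAR completeness definition and the paper's generation algorithm are not reproduced, and the toy model is invented.

```python
import random

# A toy model given as a directly-follows graph (TARs) with start/end nodes.
model = {
    "start": ["A"],
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D"],
    "D": ["end"],
}
all_tars = {(a, b) for a, succs in model.items() for b in succs
            if a != "start" and b != "end"}

def random_trace(model, rng):
    node, trace = "start", []
    while node != "end":
        node = rng.choice(model[node])
        if node != "end":
            trace.append(node)
    return trace

def generate_tar_complete_log(model, seed=0):
    """Generate traces until every directly-follows pair (TAR) of the model
    has been observed at least once."""
    rng, log, covered = random.Random(seed), [], set()
    while covered != all_tars:
        t = random_trace(model, rng)
        covered |= set(zip(t, t[1:]))
        log.append(t)
    return log

log = generate_tar_complete_log(model)
print(len(log), "traces cover all", len(all_tars), "TARs:", log)
```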

19.
Organizations increasingly rely on business process analysis to improve operations performance. Process Mining can be exploited to distill models from real process executions recorded in event logs, but existing techniques show some limitations when applied in complex domains, where human actors have a high degree of freedom in the execution of activities, thus generating highly variable process instances. This paper contributes to the research on Process Mining in highly variable domains, focusing on the generation of process instance models (in the form of instance graphs) from simple event logs. The novelty of the approach is in the exploitation of filtering Process Discovery (PD) techniques coupled with model repair, which makes it possible to obtain accurate models for any instance variant, even rare ones. It is argued that this provides the analyst with a more complete and faithful knowledge of a highly variable process, where no process execution can really be labelled as “wrong” and hence overlooked. The approach can also find application in more structured domains, in order to obtain accurate models of exceptional behaviors. The quality of the generated models is assessed by suitable metrics and measured in empirical experiments that highlight the advantages of the approach.  相似文献   

20.
Business processes leave trails in a variety of data sources (e.g., audit trails, databases, and transaction logs). Hence, every process instance can be described by a trace, i.e., a sequence of events. Process mining techniques are able to extract knowledge from such traces and provide a welcome extension to the repertoire of business process analysis techniques. Recently, process mining techniques have been adopted in various commercial BPM systems (e.g., BPM|one, Futura Reflect, ARIS PPM, Fujitsu Interstage, Businesscape, Iontas PDF, and QPR PA). Unfortunately, traditional process discovery algorithms have problems dealing with less structured processes. The resulting models are difficult to comprehend or even misleading. Therefore, we propose a new approach based on trace alignment. The goal is to align traces in such a way that event logs can be explored easily. Trace alignment can be used to explore the process in the early stages of analysis and to answer specific questions in later stages of analysis. Hence, it complements existing process mining techniques focusing on discovery and conformance checking. The proposed techniques have been implemented as plugins in the ProM framework. We report the results of trace alignment on one synthetic and two real-life event logs, and show that trace alignment has significant promise in process diagnostic efforts.  相似文献   
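Pairwise trace alignment, the building block behind aligning a whole log, can be sketched with a Needleman-Wunsch-style dynamic program that inserts gaps so identical activities line up; the multiple-trace alignment and scoring used in the reported approach are more involved.

```python
def align(t1, t2, gap="-"):
    """Pairwise trace alignment (Needleman-Wunsch style): insert gaps so that
    identical activities end up in the same column."""
    n, m = len(t1), len(t2)
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            hit = 1 if t1[i - 1] == t2[j - 1] else -1
            score[i][j] = max(score[i - 1][j - 1] + hit,
                              score[i - 1][j], score[i][j - 1])
    a1, a2, i, j = [], [], n, m                 # trace back to build gapped rows
    while i > 0 or j > 0:
        hit = 1 if i and j and t1[i - 1] == t2[j - 1] else -1
        if i and j and score[i][j] == score[i - 1][j - 1] + hit:
            a1.append(t1[i - 1]); a2.append(t2[j - 1]); i, j = i - 1, j - 1
        elif i and score[i][j] == score[i - 1][j]:
            a1.append(t1[i - 1]); a2.append(gap); i -= 1
        else:
            a1.append(gap); a2.append(t2[j - 1]); j -= 1
    return list(reversed(a1)), list(reversed(a2))

r1, r2 = align(["A", "B", "C", "D"], ["A", "C", "E", "D"])
print(" ".join(r1))   # A B C - D
print(" ".join(r2))   # A - C E D
```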
