Similar Documents
20 similar documents found (search time: 31 ms)
1.
This contribution analyses the state of the art in two parts. Based on general quasi-homogeneous and heterogeneous reactor models, their benefits and disadvantages as well as simplification options are discussed, taking catalyst deactivation into consideration. The article then focuses on the issue of model trimming for industrial applications. A distinction is made between process rationalisation and the development of new processes. In process rationalisation, models can be tested and adjusted on the basis of measured values from industrial reactors. This is not feasible when developing new processes. Thus, the major factors influencing the reliability of prediction, such as model assumptions, initial and boundary conditions, mathematical and computational restrictions, chemical kinetics, thermal and mass transport as well as flow distribution, are analysed first. Finally, this contribution proposes ways of model trimming and of resolving problems of scale transfer.

2.
The article focuses on the issue of model trimming for industrial applications. A distinction is made between process rationalisation and the development of new processes. In process rationalisation, models can be tested and adjusted on the basis of measured values from industrial reactors. This is not feasible when developing new processes. Thus, the major factors influencing the reliability of prediction, such as model assumptions, initial and boundary conditions, mathematical and computational restrictions, chemical kinetics, thermal and mass transport as well as flow distribution, are analysed first. Finally, this contribution proposes ways of model trimming and of resolving problems of scale transfer.

3.
In internal rubber-mixing processes, data-driven soft sensors have become increasingly important for providing online estimates of the Mooney viscosity. Nevertheless, the prediction uncertainty of such models has rarely been explored. Additionally, traditional viscosity prediction models are based on single models and thus may not be appropriate for complex processes with multiple recipes and shifting operating conditions. To address both problems simultaneously, we propose a new ensemble Gaussian process regression (EGPR)-based modeling method. First, several local Gaussian process regression (GPR) models were built with the training samples in each subclass. Then, the prediction uncertainty was adopted to evaluate the probabilistic relationship between the new test sample and the local GPR models. Moreover, the prediction value and the prediction variance were generated automatically with Bayesian inference. The prediction results in an industrial rubber-mixing process show the superiority of EGPR in terms of prediction accuracy and reliability. © 2014 Wiley Periodicals, Inc. J. Appl. Polym. Sci. 2015, 132, 41432.
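A minimal sketch of the ensemble idea described above, assuming scikit-learn is available; the function names, the subclass partition passed in as `subsets`, and the inverse-variance weighting (a simple stand-in for the Bayesian posterior weighting of the local models) are illustrative, not the authors' exact algorithm.

```python
# Sketch of an ensemble GPR (EGPR) soft sensor over pre-partitioned subclasses.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def fit_local_models(subsets):
    """Fit one local GPR model per training subclass; subsets = [(X, y), ...]."""
    models = []
    for X, y in subsets:
        gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
        models.append(gpr.fit(X, y))
    return models

def egpr_predict(models, x_new):
    """Combine local predictions, weighting each model by its inverse predictive variance."""
    x_new = np.atleast_2d(x_new)
    means, variances = [], []
    for m in models:
        mu, std = m.predict(x_new, return_std=True)
        means.append(mu[0])
        variances.append(std[0] ** 2 + 1e-12)
    w = 1.0 / np.array(variances)
    w /= w.sum()
    mean = float(np.dot(w, means))
    var = float(np.dot(w ** 2, variances))   # variance of the weighted combination
    return mean, var
```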

4.
Several data-driven soft sensors have been applied for online quality prediction in polymerization processes. However, industrial data samples often follow a non-Gaussian distribution and contain some outliers. Additionally, a single model is insufficient to capture all of the characteristics of multiple grades. In this study, a support vector clustering (SVC)-based outlier detection method was first used to better handle the nonlinearity and non-Gaussianity in the data samples. Then, SVC was integrated into the just-in-time Gaussian process regression (JGPR) modeling method to enhance the prediction reliability. A similar data set with fewer outliers was constructed to build a more reliable local SVC–JGPR prediction model. Moreover, an ensemble strategy was proposed to combine several local SVC–JGPR models through their prediction uncertainty. Finally, the historical data set was updated repeatedly in a systematic way. The prediction results in the industrial polymerization process show the superiority of the proposed method in terms of prediction accuracy and reliability. © 2015 Wiley Periodicals, Inc. J. Appl. Polym. Sci. 2015, 132, 41958.
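The just-in-time step can be sketched as follows, again assuming scikit-learn; OneClassSVM is used here only as a stand-in for the SVC-based outlier screening, and all names and parameter values are illustrative rather than the authors' implementation.

```python
# Sketch of just-in-time local GPR with outlier screening of the historical database.
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def jit_gpr_predict(X_hist, y_hist, x_query, k=50, nu=0.05):
    # 1) screen outliers in the historical database (+1 = inlier, -1 = outlier)
    mask = OneClassSVM(nu=nu, gamma="scale").fit_predict(X_hist) == 1
    X_clean, y_clean = X_hist[mask], y_hist[mask]
    # 2) select the k samples most similar to the query (Euclidean distance, x_query is 1-D)
    d = np.linalg.norm(X_clean - x_query, axis=1)
    idx = np.argsort(d)[:k]
    # 3) build a local GPR model just in time and predict with uncertainty
    gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
    gpr.fit(X_clean[idx], y_clean[idx])
    mu, std = gpr.predict(np.atleast_2d(x_query), return_std=True)
    return float(mu[0]), float(std[0])
```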

5.
Water networks with regeneration schemes (e.g., regeneration reuse, regeneration recycling) can reduce freshwater consumption further than water networks with direct reuse alone. Regeneration reuse, compared with regeneration recycling, also avoids the unwanted accumulation of contaminants. Owing to these features, process decomposition can help to reduce the freshwater usage and wastewater discharge of regeneration reuse water systems and achieve the results that the graphical method delivers. In this article, the effect of decomposition on a water-using process, and further on the regeneration reuse water system, is briefly analysed on the concentration-mass load diagram. A superstructure and three sequential mathematical models, which take process decomposition into account, are then developed to optimize single-contaminant regeneration reuse water systems. The reliability of the models is verified with several examples. Moreover, several decomposition strategies are summarized to realize the regeneration reuse water network that attains the targets obtained from the graphical method. The results indicate that the post-regeneration concentration has a major impact on the scheme of process decomposition. © 2009 American Institute of Chemical Engineers AIChE J, 2009
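For orientation, the single-contaminant relation underlying the concentration-mass load diagram can be stated as follows (notation mine, not taken from the article): a water-using process that picks up a contaminant mass load $\Delta m$ and is bounded by a maximum outlet concentration $C_{out}^{\max}$ requires at least

$$ F_{\min} = \frac{\Delta m}{C_{out}^{\max} - C_{in}} $$

of water at inlet concentration $C_{in}$, so lowering $C_{in}$ by regeneration, or decomposing the process into sub-processes with different concentration bounds, directly reduces the required freshwater flowrate.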

6.
The availability of predictive models for chemical processes is the basic prerequisite for offline process optimization. In cases where a predictive model is missing for a process unit within a larger process flowsheet, measured operating data can be used to set up such a model by combining physical knowledge and process data. In this contribution, the creation and integration of such gray-box models within the framework of a flowsheet simulator is presented. Results of optimization using different gray-box models are shown for a virtual cumene process.
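One common gray-box (hybrid) structure, sketched under assumptions: a simplified first-principles unit model is corrected by a data-driven term fitted to the residuals against plant measurements. The placeholder physical model, the ridge-regression correction and all names are illustrative, not the authors' formulation.

```python
# Sketch of a parallel gray-box model: physical prediction + data-driven residual correction.
import numpy as np
from sklearn.linear_model import Ridge

def physical_model(u):
    """Simplified first-principles prediction (placeholder), u is a 2-D input array."""
    return 0.8 * u[:, 0] - 0.1 * u[:, 1]

def fit_gray_box(u_plant, y_plant):
    """Fit a data-driven correction to the residuals of the physical model."""
    residual = y_plant - physical_model(u_plant)
    correction = Ridge(alpha=1.0).fit(u_plant, residual)
    return lambda u: physical_model(u) + correction.predict(u)
```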

7.
Ceramics International, 2023, 49(1): 613-624
Materials reliability analysis is one of the most important subjects in industrial manufacturing and practical application. Weibull statistics is a commonly used approach to evaluate reliability, especially for brittle materials. However, such an analysis is limited by the insufficient number of samples and the complexity of the analysis steps. Herein, a machine learning-assisted strategy for analysing the reliability of materials is proposed, with WC-Co-based cemented carbides taken as the target materials. The machine learning models couple feature engineering methods with advanced machine learning algorithms. Through an evaluation based on designed experiments, the artificial neural network algorithm is determined to be the best machine learning algorithm for accurately capturing the variation of the property data, identifying their distribution and automatically predicting the Weibull modulus for reliability evaluation. This study provides a novel approach to evaluating reliability accurately and shows its potential for designing the process parameters of other materials.
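For context, the quantity such a model is trained to predict can be obtained classically from a set of strength measurements via the linearized (median-rank) Weibull plot; the sketch below, assuming NumPy, shows that reference calculation, not the article's neural-network pipeline.

```python
# Classical estimate of the Weibull modulus m and characteristic strength sigma0
# from a sample of fracture strengths, using the median-rank estimator.
import numpy as np

def weibull_modulus(strengths):
    s = np.sort(np.asarray(strengths, dtype=float))
    n = s.size
    p = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median-rank failure probabilities
    x = np.log(s)
    y = np.log(-np.log(1.0 - p))
    m, c = np.polyfit(x, y, 1)                    # slope = Weibull modulus
    sigma0 = np.exp(-c / m)                       # characteristic strength
    return m, sigma0
```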

8.
Langmuir-Hinshelwood models with and without rate-controlling-step assumptions have been compared. Hydrogenations of toluene, cinnamaldehyde and a mixture of methyl esters of fatty acids were used as model systems. Integral data were obtained in semibatch reactors. The proposed models have been simplified by neglecting statistically insignificant parameters. The predicted mixture compositions and surface coverages have been compared and the reliability of the parameter estimates tested.
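As a reminder of the model family being compared (a generic textbook form, not the specific rate equations fitted in the article), a Langmuir-Hinshelwood expression for a bimolecular surface reaction with the surface reaction as the rate-controlling step is

$$ r = \frac{k\,K_A K_B\,c_A c_B}{\left(1 + K_A c_A + K_B c_B\right)^2}, $$

where $K_A$ and $K_B$ are adsorption equilibrium constants; dropping the rate-controlling-step assumption replaces such closed forms with coupled balances over the elementary adsorption, surface-reaction and desorption steps.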

9.
Storage tanks are important elements of a self-operating closed processing system. The reliability and availability of storage equipment depend essentially on the flow behaviour of particulate solids in storage containers. A wide residence time distribution (i.e. too long a storage time at rest) in silos, bunkers or transportation containers can lead to the hazardous problem of so-called "time consolidation" of particulate solids. During this hardening process, solid bridges form, with the result that the bulk material solidifies and acquires solid-like properties. In principle, there are four main physico-chemical effects in bulk material storage and handling which can produce solid bridging at the particle contacts: crystallisation, chemical reactions, solidification of highly viscous bonding agents and sintering. New adhesion-force-based models are presented to describe the consolidation kinetics of particulate solids. Preliminary solutions of the kinetic model equations are discussed and compared with new test results, and practical conclusions are drawn concerning the reliable processing, storage and transportation of bulk materials.

10.
11.
Computer-aided generation of chemical engineering process models. The availability of adequate process models is still the most striking bottleneck for routine application of model-based techniques in process design and operation. The development of knowledge-based software tools to support process modeling is considered an important contribution to a solution of this problem. After a brief summary of the modeling tools offered in current simulator architectures, a future knowledge-based modeling tool is sketched. In addition to the software engineering, the acquisition and formalization of the modeling knowledge – comprising meaningful submodels and generic modeling strategies – is essential for the feasibility of the approach and for the acceptance of a tool. Therefore, these two themes are discussed in considerable detail. Finally, first prototype developments towards a knowledge-based modeling tool are presented.

12.
The accurate prediction of the viscosity of emulsions is highly important for oil well exploitation. Commonly used models for predicting the viscosity of water-in-oil (W/O) emulsions, composed of two or three factors, cannot always fit the viscosity of W/O emulsions well, especially in the case of non-Newtonian W/O emulsions. An innovative and comprehensive method for predicting the viscosity of such emulsions was developed based on the Lederer, Arrhenius, and Einstein models, using experimental data. Compared with the commonly applied W/O emulsion viscosity models, the proposed method considers more factors, including temperature, volume fraction of water, shear rate, and the viscosities of the continuous (oil) and dispersed (water) phases. Numerous published data points were collected from the literature to verify the accuracy and reliability of the method. The calculation results prove the high accuracy of the model.
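For reference, two of the classical building blocks named above, written in their standard textbook forms (the article's final fitted correlation is not reproduced here):

$$ \eta = \eta_c\,(1 + 2.5\,\phi) \quad \text{(Einstein, dilute suspensions)}, \qquad \eta(T) = A\,\exp\!\left(\frac{E_a}{RT}\right) \quad \text{(Arrhenius-type temperature dependence)}, $$

where $\eta_c$ is the continuous-phase viscosity and $\phi$ the dispersed-phase volume fraction; the Lederer model combines the two phase viscosities through a weighted logarithmic mixing rule.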

13.
As part of the Industry 4.0 initiative, the level of digitization in the process industry is increasing. Through a close link between analytics and dynamic modeling, digitization enables optimization and model-based control in the processing industry. For this purpose, mathematical models of the process level that allow real-time simulation have to be available. In the field of solid-liquid separation, advancing digitization drives the development of numerical flow simulation methods and short-cut models to deepen process understanding and mathematical process modeling. Resolved flow simulations should be seen as a complement to experiments for developing suitable dynamic short-cut models. Based on these prerequisites, the development of dynamic process models and their application are described using the example of solid bowl centrifuges.

14.
This paper proposes a new class of integer-valued autoregressive models with a dynamic survival probability. The peculiarity of this class of models lies in the specification of the survival probability through a stochastic recurrence equation. The proposed models can effectively capture dependence that changes over time and enhance both the in-sample and out-of-sample performance of integer-valued autoregressive models. This point is illustrated through an empirical application to a real time series of crime reports. Additionally, this paper discusses the reliability of likelihood-based inference for this class of models. In particular, the study proves the consistency of the maximum likelihood estimator and of a plug-in estimator for the conditional probability mass function in a misspecified model setting.
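To fix ideas, a first-order integer-valued autoregression with binomial thinning can be written as

$$ X_t = \alpha_t \circ X_{t-1} + \varepsilon_t, \qquad \alpha_t \circ X_{t-1} \mid X_{t-1} \sim \mathrm{Binomial}(X_{t-1}, \alpha_t), $$

where each count "survives" to the next period with probability $\alpha_t$. The paper's defining feature is that $\alpha_t$ is itself updated through a stochastic recurrence; an illustrative specification (not necessarily the one used in the paper) is $\operatorname{logit}(\alpha_t) = \omega + a\,\operatorname{logit}(\alpha_{t-1}) + b\,X_{t-1}$.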

15.
Tight integration through material and energy recycling is essential to the energy efficiency and economic viability of process and energy systems. Equation-oriented (EO) steady-state process simulation and optimization are key enablers in the optimal design of integrated processes. A new process modeling and simulation concept based on pseudo-transient continuation is introduced. An algorithm is presented for reformulating the steady-state models of process unit operations as differential-algebraic equation systems that are statically equivalent to the original models. These pseudo-transient models improve the convergence of EO process flowsheet simulations by expanding the convergence basin. This concept is used to build a library of pseudo-transient models for common process unit operations, and the modeling concept integrates seamlessly with a previously developed time-relaxation optimization algorithm. Two design case studies are presented to validate the proposed framework. © 2014 American Institute of Chemical Engineers AIChE J 60: 4104–4123, 2014
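A minimal illustration of static equivalence, in notation of my own rather than the authors': given a steady-state unit model $0 = f(x, y)$ together with specifications $0 = g(x, y)$, a pseudo-transient reformulation replaces part of the algebraic system with differential equations in a pseudo-time $\tau$,

$$ \frac{dx}{d\tau} = f(x, y), \qquad 0 = g(x, y), $$

so that every stationary point ($dx/d\tau = 0$) of the resulting differential-algebraic system satisfies the original steady-state model; integrating in $\tau$ from a crude initial guess then acts as a robust alternative to solving the algebraic system directly with Newton-type methods.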

16.
Liquid membrane technology – A survey of associated phenomena, transport mechanisms, and models. Liquid membrane permeation is the name given to a simultaneous extraction/stripping process. A summary of related phenomena, mass transfer, and models is presented. A new model has been developed to describe mass transfer in liquid membranes, in which the spherical droplets serve as reaction centres. An account of present applications demonstrates the limitations of liquid membrane processes.

17.
This article outlines advances in molecular modeling and simulation using massively parallel high-performance computers (HPC). In the SkaSim project, partners from the HPC community collaborated with users from science and industry. The aim was to optimize the prediction of thermodynamic property data in terms of efficiency, quality and reliability using HPC methods. In this context, various topics were dealt with: atomistic simulation of homogeneous gas bubble formation, surface tension of classical fluids and ionic liquids, multicriteria optimization of molecular models, the development of the molecular simulation codes ls1 mardyn and ms2, atomistic simulation of gas separation processes, molecular membrane structure generators, transport resistors and the evaluation of predictive property data models based on specific mixture types.

18.
19.
The calculation of verification results in the automatic verification system of a flow standard device is analysed. It is found that, during the calculation, deviations in the choice of the mathematical model, in the acquisition of parameter signals and in the understanding of terms and definitions, as well as unreasonable sampling methods and differences between the formulas given in the verification regulations, can all affect the verification results, leading to erroneous results that mislead the organisations submitting instruments for verification. Corresponding solutions are proposed for these problems, with the aim of improving the accuracy and reliability of the verification results.

20.
Optical sensors for bioprocess control. Bioprocess techniques are an important field in biotechnology. To run processes with high product yields, to ensure product quality and to optimize processes, it is essential to acquire the relevant process and control parameters quickly and with high reliability. Therefore, various types of sensors are used; recently, besides chemosensors, new emphasis has been placed on biosensors for bioprocess control. At the same time, optical transducers are gaining in significance. One reason for the increasing interest in optical sensors is the ready availability of fiber-optic technologies. The combination of sensor technologies with fiber optics offers many possibilities through miniaturisation and remote control, and therefore meets the special requirements of bioprocess techniques. This review introduces various classes of optical chemo- and biosensors for bioprocess control.
