Sample records for process quality prediction

  1. Quality by control: Towards model predictive control of mammalian cell culture bioprocesses.

    PubMed

    Sommeregger, Wolfgang; Sissolak, Bernhard; Kandra, Kulwant; von Stosch, Moritz; Mayer, Martin; Striedner, Gerald

    2017-07-01

    The industrial production of complex biopharmaceuticals using recombinant mammalian cell lines is still mainly built on a quality by testing approach, which is represented by fixed process conditions and extensive testing of the end-product. In 2004 the FDA launched the process analytical technology initiative, aiming to guide the industry towards advanced process monitoring and a better understanding of how critical process parameters affect the critical quality attributes. Implementation of process analytical technology into the bio-production process enables moving from the quality by testing to a more flexible quality by design approach. The application of advanced sensor systems in combination with mathematical modelling techniques offers enhanced process understanding, allows on-line prediction of critical quality attributes and subsequently real-time product quality control. In this review, opportunities and unsolved issues on the road to a successful quality by design and dynamic control implementation are discussed. A major focus is placed on the preconditions for the application of model predictive control for mammalian cell culture bioprocesses. Design of experiments providing information about the process dynamics upon parameter change, dynamic process models, on-line process state predictions and powerful software environments seem to be prerequisites for quality by control realization. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Near infrared spectroscopy based monitoring of extraction processes of raw material with the help of dynamic predictive modeling

    NASA Astrophysics Data System (ADS)

    Wang, Haixia; Suo, Tongchuan; Wu, Xiaolin; Zhang, Yue; Wang, Chunhua; Yu, Heshui; Li, Zheng

    2018-03-01

    The control of batch-to-batch quality variations remains a challenging task for pharmaceutical industries, e.g., traditional Chinese medicine (TCM) manufacturing. One difficult problem is to produce pharmaceutical products with consistent quality from raw material of large quality variations. In this paper, an integrated methodology combining near infrared spectroscopy (NIRS) and dynamic predictive modeling is developed for the monitoring and control of the batch extraction process of licorice. With the spectral data in hand, the initial state of the process is first estimated with a state-space model to construct a process monitoring strategy for the early detection of variations induced by initial process inputs such as raw materials. Secondly, the quality property of the end product is predicted at mid-course during the extraction process with a partial least squares (PLS) model. The batch-end-time (BET) is then adjusted accordingly to minimize the quality variations. In conclusion, our study shows that with the help of dynamic predictive modeling, NIRS can offer past and future information about the process, which enables more accurate monitoring and control of process performance and product quality.
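
    To make the mid-course prediction step concrete, here is a minimal Python sketch, assuming hypothetical arrays: nir_mid holds mid-course NIR spectra of past batches and end_quality their measured end-product quality (both simulated below); the state-space monitoring and BET adjustment steps are not shown.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        nir_mid = rng.normal(size=(40, 200))       # 40 historical batches x 200 wavelengths (simulated)
        end_quality = nir_mid[:, :5].sum(axis=1)   # stand-in for the measured quality property

        pls = PLSRegression(n_components=5)        # latent dimension chosen by cross-validation in practice
        pls.fit(nir_mid, end_quality)
        running_batch = nir_mid[:1]                # spectrum of a batch still in progress
        print(pls.predict(running_batch))          # predicted end quality; large deviation -> adjust BET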

  3. Image processing system performance prediction and product quality evaluation

    NASA Technical Reports Server (NTRS)

    Stein, E. K.; Hammill, H. B. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.

  4. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.

  5. A system identification approach for developing model predictive controllers of antibody quality attributes in cell culture processes

    PubMed Central

    Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey

    2017-01-01

    As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes that are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated perfusion apparatus to systematically and efficiently generate predictive models using system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed‐batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, allowing the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers. Biotechnol. Prog., 33:1647–1661, 2017. PMID:28786215
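
    A minimal sketch of the system identification idea, assuming a first-order ARX structure and simulated stand-in signals (u: manipulated galactose setpoint, y: measured %galactosylation); the study itself fits richer models from perfusion step-change data, which this does not reproduce.

        import numpy as np

        rng = np.random.default_rng(1)
        u = np.repeat([0.0, 1.0, 0.5, 1.5], 50)    # serialized step changes in the input
        y = np.zeros_like(u)
        for k in range(1, len(u)):                 # simulated plant: y[k] = 0.9*y[k-1] + 0.5*u[k-1] + noise
            y[k] = 0.9 * y[k-1] + 0.5 * u[k-1] + rng.normal(scale=0.01)

        Phi = np.column_stack([y[:-1], u[:-1]])    # regressors for y[k] = a*y[k-1] + b*u[k-1]
        a_hat, b_hat = np.linalg.lstsq(Phi, y[1:], rcond=None)[0]
        print(a_hat, b_hat)                        # identified model, usable inside an MPC loop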

  6. A system identification approach for developing model predictive controllers of antibody quality attributes in cell culture processes.

    PubMed

    Downey, Brandon; Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey

    2017-11-01

    As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes that are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated perfusion apparatus to systematically and efficiently generate predictive models using system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed-batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, allowing the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers. Biotechnol. Prog., 33:1647-1661, 2017.

  7. Predicting non-stationary algal dynamics following changes in hydrometeorological conditions using data assimilation techniques

    NASA Astrophysics Data System (ADS)

    Kim, S.; Seo, D. J.

    2017-12-01

    When water temperature (TW) increases due to changes in hydrometeorological conditions, the overall ecological conditions change in the aquatic system. The changes can be harmful to human health and potentially fatal to fish habitat. Therefore, it is important to assess the impacts of thermal disturbances on in-stream processes of water quality variables and to be able to predict the effectiveness of possible actions that may be taken for water quality protection. For skillful prediction of in-stream water quality processes, it is necessary for the watershed water quality models to be able to reflect such changes. Most of the currently available models, however, assume static parameters for the biophysiochemical processes and hence are not able to capture nonstationarities seen in water quality observations. In this work, we assess the performance of the Hydrological Simulation Program-Fortran (HSPF) in predicting algal dynamics following TW increase. The study area is located in the Republic of Korea, where waterway change due to weir construction and drought concurrently occurred around 2012. In this work we use data assimilation (DA) techniques to update model parameters as well as the initial condition of selected state variables for in-stream processes relevant to algal growth. For assessment of model performance and characterization of temporal variability, various goodness-of-fit measures and wavelet analysis are used.

  8. Comparison of modelling accuracy with and without exploiting automated optical monitoring information in predicting the treated wastewater quality.

    PubMed

    Tomperi, Jani; Leiviskä, Kauko

    2018-06-01

    Traditionally, modelling in an activated sludge process has been based solely on the process measurements, but as interest in optically monitoring wastewater samples to characterize floc morphology has increased, the results of image analyses have in recent years been utilized more frequently to predict the characteristics of wastewater. This study shows that neither the traditional process measurements nor the automated optical monitoring variables by themselves are capable of producing the best predictive models for the treated wastewater quality in a full-scale wastewater treatment plant; by utilizing these variables together, optimal models, which show the level of and changes in the treated wastewater quality, are achieved. Through this early warning, process operation can be optimized to avoid environmental damage and economic losses. The study also shows that specific optical monitoring variables are important in modelling a certain quality parameter, regardless of the other input variables available.

  9. PREDICTING THE RELATIVE IMPACTS OF URBAN DEVELOPMENT POLICIES AND ON-ROAD VEHICLE TECHNOLOGIES ON AIR QUALITY IN THE UNITED STATES: MODELING AND ANALYSIS OF A CASE STUDY IN AUSTIN, TEXAS

    EPA Science Inventory

    Urban development results in changes to land use and land cover and, consequently, to biogenic and anthropogenic emissions, meteorological processes, and processes such as dry deposition that influence future predictions of air quality. This study examines the impacts of alter...

  10. Hadoop-Based Distributed System for Online Prediction of Air Pollution Based on Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Ghaemi, Z.; Farnaghi, M.; Alimohammadi, A.

    2015-12-01

    The critical impact of air pollution on human health and the environment on the one hand, and the complexity of pollutant concentration behavior on the other, lead scientists to look for advanced techniques for monitoring and predicting urban air quality. Additionally, recent developments in data measurement techniques have led to the collection of various types of data about air quality. Such data is extremely voluminous and, to be useful, it must be processed at high velocity. Due to the complexity of big data analysis, especially for dynamic applications, online forecasting of pollutant concentration trends within a reasonable processing time is still an open problem. The purpose of this paper is to present an online forecasting approach based on Support Vector Machine (SVM) to predict the air quality one day in advance. In order to overcome the computational requirements for large-scale data analysis, distributed computing based on the Hadoop platform has been employed to leverage the processing power of multiple processing units. The MapReduce programming model is adopted for massive parallel processing in this study. Based on the online algorithm and Hadoop framework, an online forecasting system is designed to predict the air pollution of Tehran for the next 24 hours. The results have been assessed on the basis of processing time and efficiency. Quite accurate predictions of air pollutant indicator levels within an acceptable processing time prove that the presented approach is very suitable for tackling large-scale air pollution prediction problems.
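
    A minimal sketch of the SVM regression core only, on simulated next-day prediction data; the Hadoop/MapReduce distribution layer described above is out of scope here, and all feature names and values are illustrative.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR

        rng = np.random.default_rng(2)
        X = rng.normal(size=(500, 6))              # today's pollutant and meteorology features (simulated)
        y = X @ [0.8, -0.3, 0.5, 0.0, 0.2, 0.1] + rng.normal(scale=0.1, size=500)  # tomorrow's index

        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.05))
        model.fit(X[:-1], y[:-1])
        print(model.predict(X[-1:]))               # one-day-ahead air quality prediction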

  11. Implementation of Cyber-Physical Production Systems for Quality Prediction and Operation Control in Metal Casting.

    PubMed

    Lee, JuneHyuck; Noh, Sang Do; Kim, Hyun-Jung; Kang, Yong-Shin

    2018-05-04

    The prediction of internal defects of metal casting immediately after the casting process saves unnecessary time and money by reducing the amount of inputs into the next stage, such as the machining process, and enables flexible scheduling. Cyber-physical production systems (CPPS) perfectly fulfill the aforementioned requirements. This study deals with the implementation of CPPS in a real factory for quality prediction and operation control in metal casting. First, a CPPS architecture framework for quality prediction and operation control in metal-casting production was designed. The framework describes collaboration among the internet of things (IoT), artificial intelligence, simulations, manufacturing execution systems, and advanced planning and scheduling systems. Subsequently, the implementation of the CPPS in actual plants is described. Temperature is a major factor that affects casting quality, and thus, temperature sensors and IoT communication devices were attached to casting machines. The well-known NoSQL database HBase and the high-speed processing/analysis tool Spark are used for the IoT repository and data pre-processing, respectively. Many machine learning algorithms, such as decision tree, random forest, artificial neural network, and support vector machine, were used for quality prediction and compared with R software. Finally, the operation of the entire system is demonstrated through a CPPS dashboard. In an era in which most CPPS-related studies are conducted on high-level abstract models, this study describes more specific architectural frameworks, use cases, usable software, and analytical methodologies. In addition, this study verifies the usefulness of CPPS by estimating quantitative effects. This is expected to contribute to the proliferation of CPPS in the industry.
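
    A minimal sketch of the quality-prediction step using one of the algorithms named above (random forest) on simulated temperature features; the IoT/HBase/Spark pipeline is not reproduced, and the defect rule is a toy assumption.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import accuracy_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(3)
        temps = rng.normal(700.0, 15.0, size=(1000, 8))    # 8 temperature sensor readings per casting
        defect = (temps.max(axis=1) > 725.0).astype(int)   # toy internal-defect rule (assumed)

        X_tr, X_te, y_tr, y_te = train_test_split(temps, defect, random_state=0)
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
        print(accuracy_score(y_te, clf.predict(X_te)))     # defect prediction right after casting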

  12. Profit through predictability: The MRF difference at optimax

    NASA Astrophysics Data System (ADS)

    Light, Brandon

    2007-05-01

    In the manufacturing business, there is one product that matters: money. Whether making shoelaces or aircraft carriers, a business that doesn't also make a profit doesn't stay around long. Being able to predict operational expenses is critical to determining a product's sale price. Priced too high, a product won't sell; priced too low, profit goes away. In the business of precision optics manufacturing, predictability has often been impossible or has come with large error bars. Manufacturing unpredictability made setting price a challenge. What if predictability could be improved by changing the polishing process? Would a predictable, deterministic process lead to profit? Optimax Systems has experienced exactly that. Incorporating Magnetorheological Finishing (MRF) into its finishing process, Optimax saw parts categorized financially as "high risk" become a routine product of higher quality, delivered on time and within budget. Using actual production figures, this presentation will show how much incorporating MRF reduced costs, improved output and increased quality, all at the same time.

  13. Offline modeling for product quality prediction of mineral processing using modeling error PDF shaping and entropy minimization.

    PubMed

    Ding, Jinliang; Chai, Tianyou; Wang, Hong

    2011-03-01

    This paper presents a novel offline modeling approach for product quality prediction of mineral processing, which consists of a number of unit processes in series. The prediction of the product quality of the whole mineral process (i.e., the mixed concentrate grade) plays an important role, and the establishment of its predictive model is a key issue for plantwide optimization. For this purpose, a hybrid modeling approach for mixed concentrate grade prediction is proposed, which consists of a linear model and a nonlinear model. The least-squares support vector machine is adopted to establish the nonlinear model. The inputs of the predictive model are the performance indices of each unit process, while the output is the mixed concentrate grade. In this paper, the model parameter selection is transformed into the shape control of the probability density function (PDF) of the modeling error. In this context, both PDF-control-based and minimum-entropy-based model parameter selection approaches are proposed. Indeed, this is the first time that the PDF shape control idea is used to deal with system modeling, where the key idea is to tune model parameters so that either the modeling error PDF is controlled to follow a target PDF or the modeling error entropy is minimized. The experimental results using the real plant data and the comparison of the two approaches are discussed. The results show the effectiveness of the proposed approaches.
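
    A minimal sketch of the minimum-entropy idea, with a plain linear model standing in for the paper's LS-SVM hybrid: the parameters are tuned so that the entropy of the modeling error, estimated via a kernel density, is minimized. All data are simulated.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(4)
        X = rng.normal(size=(200, 3))                        # unit-process performance indices (simulated)
        y = X @ [1.0, -0.5, 2.0] + 0.1 * rng.standard_t(3, size=200)  # non-Gaussian modeling noise

        def residual_entropy(theta):
            e = y - X @ theta
            p = gaussian_kde(e)(e)                           # kernel estimate of the error PDF
            return -np.mean(np.log(p + 1e-12))               # sample differential entropy

        theta0 = np.linalg.lstsq(X, y, rcond=None)[0]        # least-squares starting point
        theta = minimize(residual_entropy, theta0, method="Nelder-Mead").x
        print(theta)                                         # parameters giving a sharper error PDF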

  14. Modeling hydrodynamics, water quality, and benthic processes to predict ecological effects in Narragansett Bay

    EPA Science Inventory

    The environmental fluid dynamics code (EFDC) was used to study the three dimensional (3D) circulation, water quality, and ecology in Narragansett Bay, RI. Predictions of the Bay hydrodynamics included the behavior of the water surface elevation, currents, salinity, and temperatur...

  15. Single-step affinity purification of enzyme biotherapeutics: a platform methodology for accelerated process development.

    PubMed

    Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean

    2014-01-01

    Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT). © 2014 American Institute of Chemical Engineers.

  16. Hardwood pallet cant quality and pallet part yields

    Treesearch

    Hal L. Mitchell; Marshall White; Philip Araman; Peter Hamner

    2005-01-01

    Raw materials are the largest cost component in pallet manufacturing. The primary raw material used to produce pallet parts are pallet cants. Therefore, pallet cant quality directly impacts pallet part processing and material costs. By knowing the quality of the cants being processed, pallet manufacturers can predict these costs and improve manufacturing efficiency....

  17. Implementation of Cyber-Physical Production Systems for Quality Prediction and Operation Control in Metal Casting

    PubMed Central

    Lee, JuneHyuck; Noh, Sang Do; Kim, Hyun-Jung; Kang, Yong-Shin

    2018-01-01

    The prediction of internal defects of metal casting immediately after the casting process saves unnecessary time and money by reducing the amount of inputs into the next stage, such as the machining process, and enables flexible scheduling. Cyber-physical production systems (CPPS) perfectly fulfill the aforementioned requirements. This study deals with the implementation of CPPS in a real factory for quality prediction and operation control in metal casting. First, a CPPS architecture framework for quality prediction and operation control in metal-casting production was designed. The framework describes collaboration among the internet of things (IoT), artificial intelligence, simulations, manufacturing execution systems, and advanced planning and scheduling systems. Subsequently, the implementation of the CPPS in actual plants is described. Temperature is a major factor that affects casting quality, and thus, temperature sensors and IoT communication devices were attached to casting machines. The well-known NoSQL database HBase and the high-speed processing/analysis tool Spark are used for the IoT repository and data pre-processing, respectively. Many machine learning algorithms, such as decision tree, random forest, artificial neural network, and support vector machine, were used for quality prediction and compared with R software. Finally, the operation of the entire system is demonstrated through a CPPS dashboard. In an era in which most CPPS-related studies are conducted on high-level abstract models, this study describes more specific architectural frameworks, use cases, usable software, and analytical methodologies. In addition, this study verifies the usefulness of CPPS by estimating quantitative effects. This is expected to contribute to the proliferation of CPPS in the industry. PMID:29734699

  18. Parametric Optimization Of Gas Metal Arc Welding Process By Using Grey Based Taguchi Method On Aisi 409 Ferritic Stainless Steel

    NASA Astrophysics Data System (ADS)

    Ghosh, Nabendu; Kumar, Pradip; Nandi, Goutam

    2016-10-01

    Welding input process parameters play a very significant role in determining the quality of the welded joint. Only by properly controlling every element of the process can product quality be controlled. For better quality of MIG welding of ferritic stainless steel AISI 409, precise control of process parameters, parametric optimization of the process parameters, prediction and control of the desired responses (quality indices), and continued and elaborate experiments, analysis and modeling are needed. A knowledge base may thus be generated which may be utilized by practicing engineers and technicians to produce good quality welds more precisely, reliably and predictively. In the present work, an X-ray radiographic test has been conducted in order to detect surface and sub-surface defects of weld specimens made of ferritic stainless steel. The quality of the weld has been evaluated in terms of yield strength, ultimate tensile strength and percentage of elongation of the welded specimens. The observed data have been interpreted, discussed and analyzed by considering ultimate tensile strength, yield strength and percentage elongation combined with use of the Grey-Taguchi methodology.
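
    A minimal sketch of the grey relational step on made-up response data (UTS, yield strength, elongation for a few runs, all treated as larger-is-better); the Taguchi array design and the radiographic testing are not shown, and all numbers are illustrative.

        import numpy as np

        Y = np.array([[420., 310., 28.],            # one row per welding run (illustrative values)
                      [455., 325., 25.],
                      [440., 330., 30.],
                      [470., 340., 27.]])
        norm = (Y - Y.min(axis=0)) / (Y.max(axis=0) - Y.min(axis=0))  # larger-is-better normalisation
        delta = 1.0 - norm                                            # deviation from the ideal sequence
        zeta = 0.5                                                    # distinguishing coefficient
        grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
        grg = grc.mean(axis=1)                                        # grey relational grade per run
        print(grg, grg.argmax())                                      # run closest to the ideal overall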

  19. Binary classification of items of interest in a repeatable process

    DOEpatents

    Abell, Jeffrey A; Spicer, John Patrick; Wincek, Michael Anthony; Wang, Hui; Chakraborty, Debejyo

    2015-01-06

    A system includes host and learning machines. Each machine has a processor in electrical communication with at least one sensor. Instructions for predicting a binary quality status of an item of interest during a repeatable process are recorded in memory. The binary quality status includes passing and failing binary classes. The learning machine receives signals from the at least one sensor and identifies candidate features. Features are extracted from the candidate features, each more predictive of the binary quality status. The extracted features are mapped to a dimensional space having a number of dimensions proportional to the number of extracted features. The dimensional space includes most of the passing class and excludes at least 90 percent of the failing class. Received signals are compared to the boundaries of the recorded dimensional space to predict, in real time, the binary quality status of a subsequent item of interest.
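
    A minimal sketch of the mapped-boundary idea using a simple percentile box over two extracted features; the patent's actual boundary construction is richer, and all data here are simulated stand-ins.

        import numpy as np

        rng = np.random.default_rng(5)
        passing = rng.normal(0.0, 1.0, size=(500, 2))   # extracted features of passing items
        failing = rng.normal(3.0, 1.0, size=(100, 2))   # extracted features of failing items

        lo = np.percentile(passing, 1, axis=0)          # box that contains most of the passing class
        hi = np.percentile(passing, 99, axis=0)

        def predict_pass(x):                            # inside the box -> predicted to pass
            return np.all((x >= lo) & (x <= hi), axis=1)

        print(1.0 - predict_pass(failing).mean())       # fraction of the failing class excluded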

  20. Predicting indoor pollutant concentrations, and applications to air quality management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lorenzetti, David M.

    Because most people spend more than 90% of their time indoors, predicting exposure to airborne pollutants requires models that incorporate the effect of buildings. Buildings affect the exposure of their occupants in a number of ways, both by design (for example, filters in ventilation systems remove particles) and incidentally (for example, sorption on walls can reduce peak concentrations, but prolong exposure to semivolatile organic compounds). Furthermore, building materials and occupant activities can generate pollutants. Indoor air quality depends not only on outdoor air quality, but also on the design, maintenance, and use of the building. For example, "sick building" symptoms such as respiratory problems and headaches have been related to the presence of air-conditioning systems, to carpeting, to low ventilation rates, and to high occupant density (1). The physical processes of interest apply even in simple structures such as homes. Indoor air quality models simulate the processes, such as ventilation and filtration, that control pollutant concentrations in a building. Section 2 describes the modeling approach, and the important transport processes in buildings. Because advection usually dominates among the transport processes, Sections 3 and 4 describe methods for predicting airflows. The concluding section summarizes the application of these models.
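
    For a single well-mixed zone, the ventilation, filtration, deposition, and source processes described above reduce to the mass balance dC/dt = λ(C_out − C) − k·C + S/V. A minimal sketch with assumed parameter values:

        import numpy as np
        from scipy.integrate import solve_ivp

        lam, k_dep = 0.5, 0.2          # air-exchange and deposition rates, 1/h (assumed values)
        c_out, s_over_v = 12.0, 3.0    # outdoor concentration and indoor source per volume (assumed)

        def dcdt(t, c):
            return lam * (c_out - c) - k_dep * c + s_over_v

        sol = solve_ivp(dcdt, (0.0, 24.0), [0.0])
        print(sol.y[0, -1])            # tends to (lam*c_out + s_over_v) / (lam + k_dep)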

  1. Data governance in predictive toxicology: A review.

    PubMed

    Fu, Xin; Wojak, Anna; Neagu, Daniel; Ridley, Mick; Travis, Kim

    2011-07-13

    Due to recent advances in data storage and sharing for further data processing in predictive toxicology, there is an increasing need for flexible data representations, secure and consistent data curation and automated data quality checking. Toxicity prediction involves multidisciplinary data. There are hundreds of collections of chemical, biological and toxicological data that are widely dispersed, mostly in the open literature, professional research bodies and commercial companies. In order to better manage and make full use of such a large amount of toxicity data, there is a trend to develop functionalities aiming towards data governance in predictive toxicology to formalise a set of processes to guarantee high data quality and better data management. In this paper, data quality refers mainly to quality in a data storage sense (e.g. accuracy, completeness and integrity) and not in a toxicological sense (e.g. the quality of experimental results). This paper reviews seven widely used predictive toxicology data sources and applications, with a particular focus on their data governance aspects, including: data accuracy, data completeness, data integrity, metadata and its management, data availability and data authorisation. This review reveals the current problems (e.g. lack of systematic and standard measures of data quality) and desirable needs (e.g. better management and further use of captured metadata and the development of flexible multi-level user access authorisation schemas) of predictive toxicology data sources development. The analytical results will help to address a significant gap in toxicology data quality assessment and lead to the development of novel frameworks for predictive toxicology data and model governance. While the discussed public data sources are well developed, there nevertheless remain some gaps in the development of a data governance framework to support predictive toxicology. In this paper, data governance is identified as the new challenge in predictive toxicology, and a good use of it may provide a promising framework for developing high quality and easily accessible toxicity data repositories. This paper also identifies important research directions that require further investigation in this area.

  2. Data governance in predictive toxicology: A review

    PubMed Central

    2011-01-01

    Background Due to recent advances in data storage and sharing for further data processing in predictive toxicology, there is an increasing need for flexible data representations, secure and consistent data curation and automated data quality checking. Toxicity prediction involves multidisciplinary data. There are hundreds of collections of chemical, biological and toxicological data that are widely dispersed, mostly in the open literature, professional research bodies and commercial companies. In order to better manage and make full use of such a large amount of toxicity data, there is a trend to develop functionalities aiming towards data governance in predictive toxicology to formalise a set of processes to guarantee high data quality and better data management. In this paper, data quality refers mainly to quality in a data storage sense (e.g. accuracy, completeness and integrity) and not in a toxicological sense (e.g. the quality of experimental results). Results This paper reviews seven widely used predictive toxicology data sources and applications, with a particular focus on their data governance aspects, including: data accuracy, data completeness, data integrity, metadata and its management, data availability and data authorisation. This review reveals the current problems (e.g. lack of systematic and standard measures of data quality) and desirable needs (e.g. better management and further use of captured metadata and the development of flexible multi-level user access authorisation schemas) of predictive toxicology data sources development. The analytical results will help to address a significant gap in toxicology data quality assessment and lead to the development of novel frameworks for predictive toxicology data and model governance. Conclusions While the discussed public data sources are well developed, there nevertheless remain some gaps in the development of a data governance framework to support predictive toxicology. In this paper, data governance is identified as the new challenge in predictive toxicology, and a good use of it may provide a promising framework for developing high quality and easily accessible toxicity data repositories. This paper also identifies important research directions that require further investigation in this area. PMID:21752279

  3. Statistical quality control through overall vibration analysis

    NASA Astrophysics Data System (ADS)

    Carnero, M.ª Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos

    2010-05-01

    The present study introduces the concept of statistical quality control in automotive wheel bearings manufacturing processes. Defects on products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, the quality results of the finished parts under different combinations of process variables are assessed. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as the chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley and Kruskal-Wallis tests. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control. This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and the quality of the products obtained. Based on the result of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, controllable every cycle or connected to permanent vibration-monitoring systems able to adjust sensitive process variables identified by ANOVA, once the vibration readings exceed established quality limits.
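
    A minimal sketch of the statistical core: normality checks followed by one-way ANOVA on overall vibration readings grouped by a process variable. The three groups are simulated; the paper's full battery of tests is not reproduced.

        import numpy as np
        from scipy.stats import f_oneway, shapiro

        rng = np.random.default_rng(6)
        # overall vibration displacement under three settings of one grinding variable (simulated)
        v_low, v_mid, v_high = (rng.normal(m, 0.05, 30) for m in (1.00, 1.03, 1.12))

        for group in (v_low, v_mid, v_high):
            print(shapiro(group).pvalue)            # normality check before running ANOVA
        print(f_oneway(v_low, v_mid, v_high))       # does the setting shift the vibration level?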

  4. IR-thermography for Quality Prediction in Selective Laser Deburring

    NASA Astrophysics Data System (ADS)

    Möller, Mauritz; Conrad, Christian; Haimerl, Walter; Emmelmann, Claus

    Selective Laser Deburring (SLD) is an innovative edge-refinement process being developed at the Laser Zentrum Nord (LZN) in Hamburg. It offers wear-free processing of defined radii and bevels at the edges, as well as the possibility to deburr several materials with the same laser source. Sheet metal parts for various applications need to be post-processed to remove sharp edges and burrs remaining from the initial production process. Thus, SLD will provide an extended degree of automation for the next generation of manufacturing facilities. This paper investigates the dependence between the deburring result and the temperature field in- and post-process. To this end, the surface temperature near the deburred edge is monitored with IR thermography. Different strategies for using the IR information for quality assurance are discussed. Additional experiments are performed to rate the accuracy of the quality prediction method in different deburring applications.

  5. Differentiating Between Attachment Styles and Behaviors and their Association with Marital Quality.

    PubMed

    Sandberg, Jonathan G; Bradford, Angela B; Brown, Andrew P

    2017-06-01

    The purpose of this study was to distinguish between the influence of attachment styles and behaviors on marital quality for couples. Data were gathered from 680 married couples. Results showed that attachment style and behaviors predicted marital quality for both men and women, with higher levels of attachment related to greater quality. Attachment behaviors predicted more of the variance in quality than did styles. Specific implications regarding how therapists may wish to foster behaviors that promote attachment security in marriages are discussed. © 2015 Family Process Institute.

  6. Sibling influences on gender development in middle childhood and early adolescence: a longitudinal study.

    PubMed

    McHale, S M; Updegraff, K A; Helms-Erikson, H; Crouter, A C

    2001-01-01

    The development of gender role qualities (attitudes, personality, leisure activities) from middle childhood to early adolescence was studied to determine whether siblings' gender role qualities predicted those of their sisters and brothers. Participants were 198 firstborn and second-born siblings (Ms = 10 years 9 months and 8 years 3 months, respectively, in Year 1) and their parents. Families were interviewed annually for 3 years. Firstborn siblings' qualities in Year 1 predicted second-born children's qualities in Year 3 when both parent and child qualities in Year 1 were controlled, a pattern consistent with a social learning model of sibling influence. Parental influence was more evident and sibling influence less evident in predicting firstborns' qualities; for firstborns, sibling influences suggested a de-identification process.

  7. An In-Process Surface Roughness Recognition System in End Milling Operations

    ERIC Educational Resources Information Center

    Yang, Lieh-Dai; Chen, Joseph C.

    2004-01-01

    To develop an in-process quality control system, a sensor technique and a decision-making algorithm need to be applied during machining operations. Several sensor techniques have been used in the in-process prediction of quality characteristics in machining operations. For example, an accelerometer sensor can be used to monitor the vibration of…

  8. Development and status of data quality assurance program at NASA Langley research center: Toward national standards

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.

    1996-01-01

    As part of a continuing effort to re-engineer the wind tunnel testing process, a comprehensive data quality assurance program is being established at NASA Langley Research Center (LaRC). The ultimate goal of the program is the routine provision of tunnel-to-tunnel reproducibility with total uncertainty levels acceptable for test and evaluation of civilian transports. The operational elements for reaching such levels of reproducibility are: (1) statistical control, which provides long term measurement uncertainty predictability and a base for continuous improvement, (2) measurement uncertainty prediction, which provides test designs that can meet data quality expectations with the system's predictable variation, and (3) national standards, which provide a means for resolving tunnel-to-tunnel differences. The paper presents the LaRC design for the program and discusses the process of implementation.

  9. Advanced Water Quality Modelling in Marine Systems: Application to the Wadden Sea, the Netherlands

    NASA Astrophysics Data System (ADS)

    Boon, J.; Smits, J. G.

    2006-12-01

    There is an increasing demand for knowledge and models that arises from water management in relation to water quality, sediment quality (ecology) and sediment accumulation (ecomorphology). Recently, models for sediment diagenesis and erosion developed or incorporated by Delft Hydraulics integrate the relevant physical, (bio)chemical and biological processes for the sediment-water exchange of substances. The aim of the diagenesis models is the prediction of both sediment quality and the return fluxes of substances such as nutrients and micropollutants to the overlying water. The resulting so-called DELWAQ-G model is a new, generic version of the water and sediment quality model of the DELFT3D framework. One set of generic water quality process formulations is used to calculate process rates in both water and sediment compartments. DELWAQ-G involves the explicit simulation of sediment layers in the water quality model with state-of-the-art process kinetics. The local conditions in a water layer or sediment layer, such as the dissolved oxygen concentration, determine if and how individual processes come to expression. New processes were added for sulphate, sulphide, methane and the distribution of the electron-acceptor demand over dissolved oxygen, nitrate, sulphate and carbon dioxide. DELWAQ-G also includes the dispersive and advective transport processes in the sediment and across the sediment-water interface. DELWAQ-G has been applied to the Wadden Sea, a very dynamic tidal and ecologically active estuary with complex hydrodynamic behaviour, located in the north of the Netherlands. The predicted profiles in the sediment reflect the typical interactions of diagenesis processes.

  10. Evaluation of TIGGE Ensemble Forecasts of Precipitation in Distinct Climate Regions in Iran

    NASA Astrophysics Data System (ADS)

    Aminyavari, Saleh; Saghafian, Bahram; Delavar, Majid

    2018-04-01

    The application of numerical weather prediction (NWP) products is increasing dramatically. Existing reports indicate that ensemble predictions have better skill than deterministic forecasts. In this study, numerical ensemble precipitation forecasts in the TIGGE database were evaluated using deterministic, dichotomous (yes/no), and probabilistic techniques over Iran for the period 2008-16. Thirteen rain gauges spread over eight homogeneous precipitation regimes were selected for evaluation. The Inverse Distance Weighting and Kriging methods were adopted for interpolation of the prediction values, downscaled to the stations at lead times of one to three days. To enhance the forecast quality, NWP values were post-processed via Bayesian Model Averaging. The results showed that ECMWF had better scores than other products. However, products of all centers underestimated precipitation in high precipitation regions while overestimating precipitation in other regions. This points to a systematic bias in forecasts and demands application of bias correction techniques. Based on dichotomous evaluation, NCEP did better at most stations, although all centers overpredicted the number of precipitation events. Compared to those of ECMWF and NCEP, UKMO yielded higher scores in mountainous regions, but performed poorly at other selected stations. Furthermore, the evaluations showed that all centers had better skill in wet than in dry seasons. The quality of post-processed predictions was better than those of the raw predictions. In conclusion, the accuracy of the NWP predictions made by the selected centers could be classified as medium over Iran, while post-processing of predictions is recommended to improve the quality.

  11. Sugeno-Fuzzy Expert System Modeling for Quality Prediction of Non-Contact Machining Process

    NASA Astrophysics Data System (ADS)

    Sivaraos; Khalim, A. Z.; Salleh, M. S.; Sivakumar, D.; Kadirgama, K.

    2018-03-01

    Modeling can be categorised into four main domains: prediction, optimisation, estimation and calibration. In this paper, the Takagi-Sugeno-Kang (TSK) fuzzy logic method is examined as a prediction modelling method to investigate the taper quality of laser lathing, which seeks to replace traditional lathe machines with 3D laser lathing in order to achieve the desired cylindrical shape of stock materials. Three design parameters were selected: feed rate, cutting speed and depth of cut. A total of twenty-four experiments were conducted, with eight sequential runs replicated three times. The results showed a 99% accuracy rate for the TSK fuzzy predictive model, which suggests that the model is a suitable and practical method for the non-linear laser lathing process.
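
    A minimal sketch of zero-order Takagi-Sugeno-Kang inference over a single input (feed rate), with made-up membership centres and rule consequents; the paper fits rules over all three design parameters, which this does not attempt.

        import numpy as np

        def gauss(x, c, s):
            return np.exp(-0.5 * ((x - c) / s) ** 2)

        def tsk_taper(feed_rate):
            w = np.array([gauss(feed_rate, 100.0, 30.0),    # rule 1: feed rate is "low"
                          gauss(feed_rate, 220.0, 30.0)])   # rule 2: feed rate is "high"
            z = np.array([0.8, 1.4])                        # constant consequents (taper quality)
            return float(w @ z / w.sum())                   # weighted-average defuzzification

        print(tsk_taper(150.0))                             # predicted taper between the two rules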

  12. Predicting Software Suitability Using a Bayesian Belief Network

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.

    2005-01-01

    The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
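
    A minimal sketch of the causal idea with a two-parent toy network (team skill and process maturity driving quality), computed by direct enumeration; the paper's network structure and probabilities are richer, and all numbers here are invented.

        p_skill = {"hi": 0.6, "lo": 0.4}                       # prior on development team skill
        p_quality_hi = {("hi", "hi"): 0.9, ("hi", "lo"): 0.7,  # P(quality=hi | skill, maturity)
                        ("lo", "hi"): 0.6, ("lo", "lo"): 0.2}

        # P(quality=hi | maturity=hi): marginalize the unobserved skill node
        p = sum(p_skill[s] * p_quality_hi[(s, "hi")] for s in p_skill)
        print(p)                                               # 0.6*0.9 + 0.4*0.6 = 0.78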

  13. Decision Making Processes and Outcomes

    PubMed Central

    Hicks Patrick, Julie; Steele, Jenessa C.; Spencer, S. Melinda

    2013-01-01

    The primary aim of this study was to examine the contributions of individual characteristics and strategic processing to the prediction of decision quality. Data were provided by 176 adults, ages 18 to 93 years, who completed computerized decision-making vignettes and a battery of demographic and cognitive measures. We examined the relations among age, domain-specific experience, working memory, and three measures of strategic information search to the prediction of solution quality using a 4-step hierarchical linear regression analysis. Working memory and two measures of strategic processing uniquely contributed to the variance explained. Results are discussed in terms of potential advances to both theory and intervention efforts. PMID:24282638

  14. The relation of respiratory sinus arrhythmia to later shyness: Moderation by neighborhood quality.

    PubMed

    Zhang, Hui; Spinrad, Tracy L; Eisenberg, Nancy; Zhang, Linlin

    2018-05-21

    The purpose of the study was to predict young children's shyness from both internal/biological (i.e., resting respiratory sinus arrhythmia; RSA) and external (i.e., neighborhood quality) factors. Participants were 180 children at 42 (Time 1; T1), 72 (T2), and 84 (T3) months of age. RSA data were obtained at T1 during a neutral film in the laboratory. Mothers reported perceived neighborhood quality at T2 and children's dispositional shyness at T1 and T3. Path analyses indicated that resting RSA interacted with neighborhood quality to predict T3 shyness, even after controlling for earlier family income and T1 shyness. Specifically, high levels of resting RSA predicted low levels of shyness in the context of high neighborhood quality. When neighborhood quality was low, resting RSA was positively related to later shyness. These findings indicate that children's shyness is predicted by more than biological processes and that consideration of the broader context is critical to understanding children's social behavior. © 2018 Wiley Periodicals, Inc.

  15. Meteorological Processes Affecting Air Quality – Research and Model Development Needs

    EPA Science Inventory

    Meteorology modeling is an important component of air quality modeling systems that defines the physical and dynamical environment for atmospheric chemistry. The meteorology models used for air quality applications are based on numerical weather prediction models that were devel...

  16. Neighborhood Context and Financial Strain as Predictors of Marital Interaction and Marital Quality in African American Couples

    PubMed Central

    Cutrona, Carolyn E.; Russell, Daniel W.; Abraham, W. Todd; Gardner, Kelli A.; Melby, Janet N.; Bryant, Chalandra; Conger, Rand D.

    2007-01-01

    Demographic characteristics, family financial strain, neighborhood-level economic disadvantage, and state of residence were tested as predictors of observed warmth, hostility, and self-reported marital quality. Participants were 202 married African American couples who resided in a range of neighborhood contexts. Neighborhood-level economic disadvantage predicted lower warmth during marital interactions, as did residence in the rural south. Consistent with the family stress model (e.g., Conger & Elder, 1994), family financial strain predicted lower perceived marital quality. Unexpectedly, neighborhood-level economic disadvantage predicted higher marital quality. Social comparison processes and degree of exposure to racially based discrimination are considered as explanations for this unexpected result. The importance of context in relationship outcomes is highlighted. PMID:17955056

  17. Nonstarch polysaccharides in wheat flour wire-cut cookie making.

    PubMed

    Guttieri, Mary J; Souza, Edward J; Sneller, Clay

    2008-11-26

    Nonstarch polysaccharides in wheat flour have significant capacity to affect the processing quality of wheat flour dough and the finished quality of wheat flour products. Most research has focused on the effects of arabinoxylans (AX) in bread making. This study found that water-extractable AX and arabinogalactan peptides can predict variation in pastry wheat quality as captured by the wire-cut cookie model system. The sum of water-extractable AX plus arabinogalactan was highly predictive of cookie spread factor. The combination of cookie spread factor and the ratio of water-extractable arabinose to xylose predicted peak force of the three-point bend test of cookie texture.

  18. Nonlinear Multiobjective MPC-Based Optimal Operation of a High Consistency Refining System in Papermaking

    DOE PAGES

    Li, Mingjie; Zhou, Ping; Wang, Hong; ...

    2017-09-19

    As one of the most important units in the papermaking industry, the high consistency (HC) refining system is confronted with challenges such as improving pulp quality, energy saving, and emissions reduction in its operation processes. Here in this correspondence, an optimal operation of the HC refining system is presented using nonlinear multiobjective model predictive control strategies that aim at the set-point tracking objective of pulp quality, the economic objective, and the specific energy (SE) consumption objective, respectively. First, a set of input and output data at different times are employed to construct the subprocess model of the state process model for the HC refining system, and then the Wiener-type model can be obtained by combining the mechanism model of Canadian Standard Freeness and the state process model, whose structures are determined based on the Akaike information criterion. Second, a multiobjective optimization strategy that simultaneously optimizes both the set-point tracking objective of pulp quality and SE consumption is proposed, which uses the NSGA-II approach to obtain the Pareto optimal set. Furthermore, targeting the set-point tracking objective of pulp quality, the economic objective, and the SE consumption objective, the sequential quadratic programming method is utilized to produce the optimal predictive controllers. In conclusion, the simulation results demonstrate that the proposed methods enable the HC refining system to provide better set-point tracking performance of pulp quality when these predictive controllers are employed. In addition, when the optimal predictive controllers are oriented toward the comprehensive economic objective and the SE consumption objective, they significantly reduce energy consumption.
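
    A minimal sketch of the nondominated-set idea underlying NSGA-II, filtering candidate operating points by Pareto dominance over the two minimized objectives (pulp-quality tracking error and SE consumption); the values are illustrative and the full evolutionary loop is not shown.

        import numpy as np

        F = np.array([[0.10, 5.2], [0.08, 5.9], [0.15, 4.8],
                      [0.09, 5.1], [0.20, 6.0]])          # rows: candidates; cols: objectives (minimize)

        def pareto_mask(F):
            keep = np.ones(len(F), dtype=bool)
            for i in range(len(F)):                       # a point is kept if nothing dominates it
                dominated = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
                keep[i] = not dominated.any()
            return keep

        print(F[pareto_mask(F)])                          # the Pareto optimal set of operating points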

  19. Nonlinear Multiobjective MPC-Based Optimal Operation of a High Consistency Refining System in Papermaking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Mingjie; Zhou, Ping; Wang, Hong

    As one of the most important units in the papermaking industry, the high consistency (HC) refining system is confronted with challenges such as improving pulp quality, energy saving, and emissions reduction in its operation processes. Here in this correspondence, an optimal operation of the HC refining system is presented using nonlinear multiobjective model predictive control strategies that aim at the set-point tracking objective of pulp quality, the economic objective, and the specific energy (SE) consumption objective, respectively. First, a set of input and output data at different times are employed to construct the subprocess model of the state process model for the HC refining system, and then the Wiener-type model can be obtained by combining the mechanism model of Canadian Standard Freeness and the state process model, whose structures are determined based on the Akaike information criterion. Second, a multiobjective optimization strategy that simultaneously optimizes both the set-point tracking objective of pulp quality and SE consumption is proposed, which uses the NSGA-II approach to obtain the Pareto optimal set. Furthermore, targeting the set-point tracking objective of pulp quality, the economic objective, and the SE consumption objective, the sequential quadratic programming method is utilized to produce the optimal predictive controllers. In conclusion, the simulation results demonstrate that the proposed methods enable the HC refining system to provide better set-point tracking performance of pulp quality when these predictive controllers are employed. In addition, when the optimal predictive controllers are oriented toward the comprehensive economic objective and the SE consumption objective, they significantly reduce energy consumption.

  20. Application of response surface methodology to maximize the productivity of scalable automated human embryonic stem cell manufacture.

    PubMed

    Ratcliffe, Elizabeth; Hourd, Paul; Guijarro-Leach, Juan; Rayment, Erin; Williams, David J; Thomas, Robert J

    2013-01-01

    Commercial regenerative medicine will require large quantities of clinical-specification human cells. The cost and quality of manufacture is notoriously difficult to control due to highly complex processes with poorly defined tolerances. As a step to overcome this, we aimed to demonstrate the use of 'quality-by-design' tools to define the operating space for economic passage of a scalable human embryonic stem cell production method with minimal cell loss. Design of experiments response surface methodology was applied to generate empirical models to predict optimal operating conditions for a unit of manufacture of a previously developed automatable and scalable human embryonic stem cell production method. Two models were defined to predict cell yield and cell recovery rate postpassage, in terms of the predictor variables of media volume, cell seeding density, media exchange and length of passage. Predicted operating conditions for maximized productivity were successfully validated. Such 'quality-by-design' type approaches to process design and optimization will be essential to reduce the risk of product failure and patient harm, and to build regulatory confidence in cell therapy manufacturing processes.
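    A hedged sketch of the response-surface idea the record describes: fit a quadratic surface to designed-experiment data, then search it for the operating point that maximizes predicted cell yield. The two predictor variables, ranges, and synthetic response are illustrative assumptions; scikit-learn stands in for the DoE software actually used.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# columns: media volume (mL), cell seeding density (cells/cm^2) -- assumptions
X = rng.uniform([1.0, 1e4], [4.0, 1e5], size=(30, 2))
y = -(X[:, 0] - 2.5) ** 2 - ((X[:, 1] - 5e4) / 3e4) ** 2 + rng.normal(0, 0.05, 30)

# degree-2 polynomial regression = a classical quadratic response surface
rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)

# grid search the fitted surface for the predicted optimum operating point
v, d = np.meshgrid(np.linspace(1, 4, 50), np.linspace(1e4, 1e5, 50))
grid = np.column_stack([v.ravel(), d.ravel()])
best = grid[np.argmax(rsm.predict(grid))]
print("predicted optimum (volume, density):", best)
```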

  1. Reliability prediction of ontology-based service compositions using Petri net and time series models.

    PubMed

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets the quantitative quality requirement. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the Non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving-average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the input of the NMSPN model and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy.
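    A hedged sketch of the framework's first two steps: fit an ARMA model to a service component's historical response times and forecast ahead. The series and the (p, q) orders are illustrative assumptions; statsmodels' ARIMA with d=0 is used as the ARMA implementation, and the NMSPN mapping is out of scope here.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
# synthetic history of one component's response times in ms (assumption)
response_times = 100 + 0.1 * np.cumsum(rng.normal(0, 1, 200))

model = ARIMA(response_times, order=(2, 0, 1)).fit()  # ARMA(2,1)
forecast = model.forecast(steps=5)                    # predicted response times
firing_rates = 1.0 / forecast   # rates that would feed the NMSPN model input
print(firing_rates)
```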

  2. Video quality assessment using motion-compensated temporal filtering and manifold feature similarity

    PubMed Central

    Yu, Mei; Jiang, Gangyi; Shao, Feng; Peng, Zongju

    2017-01-01

    A well-performing video quality assessment (VQA) method should be consistent with the human visual system for better prediction accuracy. In this paper, we propose a VQA method using motion-compensated temporal filtering (MCTF) and manifold feature similarity. To be more specific, a group of frames (GoF) is first decomposed into a temporal high-pass component (HPC) and a temporal low-pass component (LPC) by MCTF. Following this, manifold feature learning (MFL) and phase congruency (PC) are used to predict the quality of the temporal LPC and the temporal HPC, respectively. The quality measures of the LPC and the HPC are then combined as the GoF quality. A temporal pooling strategy is subsequently used to integrate GoF qualities into an overall video quality. The proposed VQA method appropriately processes temporal information in video by MCTF and the temporal pooling strategy, and simulates human visual perception by MFL. Experiments on publicly available video quality databases showed that, in comparison with several state-of-the-art VQA methods, the proposed method achieves better consistency with subjective video quality and can predict video quality more accurately. PMID:28445489

  3. Modelling and control for laser based welding processes: modern methods of process control to improve quality of laser-based joining methods

    NASA Astrophysics Data System (ADS)

    Zäh, Ralf-Kilian; Mosbach, Benedikt; Hollwich, Jan; Faupel, Benedikt

    2017-02-01

    To ensure the competitiveness of manufacturing companies, it is indispensable to optimize their manufacturing processes. Slight variations of process parameters and machine settings should have only marginal effects on product quality; therefore, the largest possible processing window is required. Such parameters include, for example, the movement of the laser beam across the component in laser keyhole welding. It is therefore necessary to keep the formation of weld seams within specified limits. Today, the quality of laser welding processes is ensured using post-process methods, such as ultrasonic inspection, or special in-process methods. These in-process systems achieve only a simple evaluation that indicates whether the weld seam is acceptable or not. Furthermore, they provide no feedback for changing control variables such as laser speed or laser power. In this paper the research group presents current results in the research fields of online monitoring, online control, and model predictive control of laser welding processes to increase product quality. To record the characteristics of the welding process, tested online methods are used during the process. Based on the measurement data, a state-space model is identified that includes all the control variables of the system. Using simulation tools, the model predictive controller (MPC) is designed for the model and integrated into an NI real-time system.
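    A minimal sketch of one receding-horizon step of unconstrained linear MPC on a discrete state-space model, echoing the MPC-on-state-space idea above. The "weld" matrices, weights, and references are invented placeholders, not the paper's identified model.

```python
import numpy as np

A = np.array([[0.95, 0.1], [0.0, 0.9]])   # toy weld-pool dynamics (assumption)
B = np.array([[0.0], [0.1]])              # input: laser power adjustment
H, Q, R = 15, 1.0, 0.01                   # horizon and weights (assumptions)
x, x_ref = np.array([0.0, 0.0]), np.array([1.0, 0.0])

# Stack the predictions x_k = A^k x0 + sum_j A^(k-1-j) B u_j over the horizon,
# then solve the weighted least-squares problem for the input sequence.
F = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(H)])
G = np.zeros((2 * H, H))
for k in range(H):
    for j in range(k + 1):
        G[2 * k:2 * k + 2, j] = (np.linalg.matrix_power(A, k - j) @ B).ravel()
err = np.tile(x_ref, H) - F @ x
u = np.linalg.solve(Q * G.T @ G + R * np.eye(H), Q * G.T @ err)
print("applied input:", u[0])   # apply the first move, then re-solve next sample
```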

  4. Linked Hydrologic-Hydrodynamic Model Framework to Forecast Impacts of Rivers on Beach Water Quality

    NASA Astrophysics Data System (ADS)

    Anderson, E. J.; Fry, L. M.; Kramer, E.; Ritzenthaler, A.

    2014-12-01

    The goal of NOAA's beach quality forecasting program is to use a multi-faceted approach to aid in detection and prediction of bacteria in recreational waters. In particular, our focus has been on the connection between tributary loads and bacteria concentrations at nearby beaches. While there is a clear link between stormwater runoff and beach water quality, quantifying the contribution of river loadings to nearshore bacterial concentrations is complicated due to multiple processes that drive bacterial concentrations in rivers as well as those processes affecting the fate and transport of bacteria upon exiting the rivers. In order to forecast potential impacts of rivers on beach water quality, we developed a linked hydrologic-hydrodynamic water quality framework that simulates accumulation and washoff of bacteria from the landscape, and then predicts the fate and transport of washed off bacteria from the watershed to the coastal zone. The framework includes a watershed model (IHACRES) to predict fecal indicator bacteria (FIB) loadings to the coastal environment (accumulation, wash-off, die-off) as a function of effective rainfall. These loadings are input into a coastal hydrodynamic model (FVCOM), including a bacteria transport model (Lagrangian particle), to simulate 3D bacteria transport within the coastal environment. This modeling system provides predictive tools to assist local managers in decision-making to reduce human health threats.
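    A hedged sketch of the accumulation/wash-off/die-off loading step that such a framework couples to a coastal transport model. The exponential build-up and first-order wash-off form is a common textbook formulation; all rate constants and the rainfall series are illustrative assumptions, not calibrated IHACRES values.

```python
import numpy as np

days = 30
rain = np.zeros(days)
rain[[7, 8, 20]] = [12.0, 5.0, 25.0]      # mm/day, synthetic storm events

k_acc, k_die, k_wash = 1.0e9, 0.1, 0.05   # FIB accumulation/die-off/wash-off
surface, load = 0.0, np.zeros(days)       # FIB on landscape; daily load to coast
for t in range(days):
    surface += k_acc                      # dry-weather accumulation (CFU/day)
    surface *= np.exp(-k_die)             # first-order die-off
    washed = surface * (1 - np.exp(-k_wash * rain[t]))   # rainfall wash-off
    surface -= washed
    load[t] = washed                      # input to the coastal transport model
print("peak daily load:", load.max())
```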

  5. Sensitivity Analysis of the Sheet Metal Stamping Processes Based on Inverse Finite Element Modeling and Monte Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Yu, Maolin; Du, R.

    2005-08-01

    Sheet metal stamping is one of the most commonly used manufacturing processes, and hence much research has been carried out in pursuit of economic gains. Searching through the literature, however, reveals that many problems remain unsolved. For example, it is well known that for the same press, the same workpiece material, and the same set of dies, product quality may vary owing to a number of factors, such as inhomogeneity of the workpiece material, loading error, and lubrication. At present, few methods can predict this quality variation, let alone identify what contributes to it. As a result, trial-and-error is still needed on the shop floor, causing additional cost and time delay. This paper introduces a new approach to predict product quality variation and identify the sensitive design/process parameters. The new approach is based on a combination of inverse Finite Element Modeling (FEM) and Monte Carlo simulation (more specifically, the Latin Hypercube Sampling (LHS) approach). With acceptable accuracy, the inverse FEM (also called one-step FEM) requires much less computation than the usual incremental FEM and hence can be used to predict quality variations under various conditions. LHS is a statistical method through which sensitivity analysis can be carried out. The result of the sensitivity analysis has a clear physical meaning and can be used to optimize the die design and/or the process design. Two simulation examples are presented: drawing a rectangular box and drawing a two-step rectangular box.
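    A hedged sketch of the LHS part of this scheme: draw a Latin hypercube design over uncertain stamping inputs, run each sample through a solver, and correlate inputs with the quality response. The parameter names, ranges, and the toy stand-in for the inverse-FEM solver are assumptions.

```python
import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=0)
unit = sampler.random(n=200)                     # 200 samples in [0, 1)^3
# columns: hardening exponent, friction coefficient, blank thickness (assumed)
lo, hi = [0.18, 0.05, 0.95], [0.26, 0.15, 1.05]
samples = qmc.scale(unit, lo, hi)

def one_step_fem(p):          # toy stand-in for the inverse (one-step) FEM run
    n, mu, t = p
    return 0.3 * n - 0.5 * mu + 0.2 * (t - 1.0)   # toy thinning response

quality = np.array([one_step_fem(p) for p in samples])
# crude sensitivity: correlation of each input with the quality response
print(np.corrcoef(samples.T, quality)[:-1, -1])
```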

  6. Monitoring and predicting shrink potential and future processing quality of potato tubers

    USDA-ARS?s Scientific Manuscript database

    Long-term storage of potato tubers increases risks, which are often attributed to shrink and quality loss. To minimize shrink and ensure high quality tubers, producers must closely monitor the condition of the crop during storage and make necessary adjustments to management plans. Evaluation procedu...

  7. Assessment and management of the performance risk of a pilot reclaimed water disinfection process.

    PubMed

    Zhou, Guangyu; Zhao, Xinhua; Zhang, Lei; Wu, Qing

    2013-10-01

    Chlorination disinfection has been widely used in reclaimed water treatment plants to ensure water quality. In order to assess the downstream quality risk of a running reclaimed water disinfection process, a set of dynamic equations was developed to simulate reactions in the disinfection process for the variables of bacteria, chemical oxygen demand (COD), ammonia, and monochloramine. The model was calibrated against observations from a pilot disinfection process designed to simulate the actual process in a reclaimed water treatment plant. A Monte Carlo algorithm was applied to calculate the predictive effluent quality distributions, which were used in the established hierarchical assessment system for downstream quality risk, and the key factors affecting the downstream quality risk were identified using the Regional Sensitivity Analysis method. The results showed that seasonal upstream quality variation caused considerable downstream quality risk; the effluent ammonia was significantly influenced by its upstream concentration; the upstream COD was a key factor determining the process effluent risk for the bacteria, COD, and residual disinfectant indexes; and lower COD and ammonia concentrations in the influent would mean better downstream quality.
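    A hedged sketch of the Monte Carlo step: propagate uncertain influent quality through a simple first-order disinfectant decay model with Chick-Watson style inactivation to get an effluent distribution. All rates, distributions, and the 3-log target are illustrative assumptions, not the paper's calibrated pilot-plant equations.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
cod = rng.lognormal(np.log(30), 0.3, n)      # influent COD, mg/L (synthetic)
c0 = rng.normal(3.0, 0.3, n)                 # initial monochloramine dose, mg/L
t = 30.0                                     # contact time, min

k_decay = 0.01 + 0.0005 * cod                # COD accelerates disinfectant decay
ct = c0 / k_decay * (1 - np.exp(-k_decay * t))   # integrated C*t exposure
log_kill = 0.08 * ct                         # Chick-Watson style inactivation
risk = np.mean(log_kill < 3.0)               # fraction failing a 3-log target
print(f"probability of inadequate disinfection: {risk:.2%}")
```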

  8. Assessment and prediction of air quality using fuzzy logic and autoregressive models

    NASA Astrophysics Data System (ADS)

    Carbajal-Hernández, José Juan; Sánchez-Fernández, Luis P.; Carrasco-Ochoa, Jesús A.; Martínez-Trinidad, José Fco.

    2012-12-01

    In recent years, artificial intelligence methods have been used for the treatment of environmental problems. This work presents two models for assessment and prediction of air quality. First, we develop a new computational model for air quality assessment in order to evaluate toxic compounds that can harm sensitive people in urban areas, affecting their normal activities. In this model we propose to use a Sigma operator to statistically assess air quality parameters using their historical data and to determine their negative impact on air quality based on toxicity limits, frequency averages, and deviations of toxicological tests. We also introduce a fuzzy inference system to perform parameter classification using a reasoning process and to integrate the parameters into an air quality index describing the pollution levels in five stages: excellent, good, regular, bad, and danger. The second model predicts air quality concentrations using an autoregressive model, providing a predicted air quality index based on the fuzzy inference system previously developed. Using data from the Mexico City Atmospheric Monitoring System, we perform a comparison among air quality indices developed by environmental agencies and similar models. Our results show that our models are an appropriate tool for assessing site pollution and for providing guidance to improve contingency actions in urban areas.

  9. Artificial neural network modeling of the water quality index using land use areas as predictors.

    PubMed

    Gazzaz, Nabeel M; Yusoff, Mohd Kamil; Ramli, Mohammad Firuz; Juahir, Hafizan; Aris, Ahmad Zaharin

    2015-02-01

    This paper describes the design of an artificial neural network (ANN) model to predict the water quality index (WQI) using land use areas as predictors. Ten-year records of land use statistics and water quality data for Kinta River (Malaysia) were employed in the modeling process. The most accurate WQI predictions were obtained with the network architecture 7-23-1; the back propagation training algorithm; and a learning rate of 0.02. The WQI forecasts of this model had significant (p < 0.01), positive, very high correlation (ρs = 0.882) with the measured WQI values. Sensitivity analysis revealed that the relative importance of the land use classes to WQI predictions followed the order: mining > rubber > forest > logging > urban areas > agriculture > oil palm. These findings show that the ANNs are highly reliable means of relating water quality to land use, thus integrating land use development with river water quality management.
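    A hedged sketch of the reported 7-23-1 architecture as an sklearn MLPRegressor (7 land-use inputs, 23 hidden units, 1 WQI output, learning rate 0.02 as in the abstract). The training data here is synthetic, and sklearn's MLP is only a stand-in for the original ANN toolchain.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
# 7 predictors: mining, rubber, forest, logging, urban, agriculture, oil palm
X = rng.uniform(0, 100, size=(120, 7))
y = 90 - 0.3 * X[:, 0] - 0.1 * X[:, 4] + rng.normal(0, 2, 120)  # synthetic WQI

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(23,), learning_rate_init=0.02,
                 max_iter=5000, random_state=0),
).fit(X, y)
print("predicted WQI:", model.predict(X[:1])[0])
```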

  10. An Integrated Modeling Framework Forecasting Ecosystem Exposure-- A Systems Approach to the Cumulative Impacts of Multiple Stressors

    NASA Astrophysics Data System (ADS)

    Johnston, J. M.

    2013-12-01

    Freshwater habitats provide fishable, swimmable, and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities, and ecosystems. Climate change and land use change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease, in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework 'iemWatersheds' has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM), which automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES), which manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE), which provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water Assessment Tool (SWAT) predicts surface water and sediment runoff and associated contaminants; the Watershed Mercury Model (WMM) predicts mercury runoff and loading to streams; the Water quality Analysis and Simulation Program (WASP) predicts water quality within the stream channel; the Habitat Suitability Index (HSI) model scores physicochemical habitat quality for individual fish species; and the Bioaccumulation and Aquatic System Simulator (BASS) predicts fish growth, population dynamics, and bioaccumulation of toxic substances. The capability of the Framework to address cumulative impacts will be demonstrated for freshwater ecosystem services and mountaintop mining.

  11. [Development of whole process quality control and management system of traditional Chinese medicine decoction pieces based on traditional Chinese medicine quality tree].

    PubMed

    Yu, Wen-Kang; Dong, Ling; Pei, Wen-Xuan; Sun, Zhi-Rong; Dai, Jun-Dong; Wang, Yun

    2017-12-01

    The whole-process quality control and management of traditional Chinese medicine (TCM) decoction pieces is a systems engineering task, involving the base environment, seeds and seedlings, harvesting, processing, and other steps, so accurate identification of the factors in the TCM production process that may induce quality risk, as well as reasonable quality control measures, are very important. At present, the concept of quality risk is discussed mainly in terms of management and regulations; there is no comprehensive analysis of the possible risks in the quality control process of TCM decoction pieces, nor a summary of effective quality control schemes. A whole-process quality control and management system for TCM decoction pieces based on the TCM quality tree was proposed in this study. This system effectively combines the process analysis method of the TCM quality tree with quality risk management, and can help managers make real-time decisions while realizing whole-process quality control of TCM. By providing a personalized web interface, the system realizes user-oriented information feedback and makes it convenient for users to predict, evaluate, and control the quality of TCM. In application, the system can identify related quality factors such as base environment, cultivation, and pieces processing; extend and modify the existing scientific workflow according to an enterprise's own production conditions; and provide different enterprises with their own quality systems to achieve personalized service. As a new quality management model, this system can serve as a reference for improving the quality of Chinese medicine production and quality standardization. Copyright© by the Chinese Pharmaceutical Association.

  12. Characterization of a Saccharomyces cerevisiae fermentation process for production of a therapeutic recombinant protein using a multivariate Bayesian approach.

    PubMed

    Fu, Zhibiao; Baker, Daniel; Cheng, Aili; Leighton, Julie; Appelbaum, Edward; Aon, Juan

    2016-05-01

    The principle of quality by design (QbD) has been widely applied to biopharmaceutical manufacturing processes. Process characterization is an essential step in implementing the QbD concept to establish the design space and to define the proven acceptable ranges (PAR) for critical process parameters (CPPs). In this study, we present the characterization of a Saccharomyces cerevisiae fermentation process using risk assessment analysis, statistical design of experiments (DoE), and a multivariate Bayesian predictive approach. The critical quality attributes (CQAs) and CPPs were identified with a risk assessment. The statistical model for each attribute was established using the results from the DoE study, with consideration given to interactions between CPPs. Both the conventional overlapping contour plot and the multivariate Bayesian predictive approaches were used to establish the region of process operating conditions where all attributes met their specifications simultaneously. The quantitative Bayesian predictive approach was chosen to define the PARs for the CPPs, which apply to the manufacturing control strategy. Experience from the 10,000 L manufacturing scale process validation, including 64 continued process verification batches, indicates that the CPPs remain in a state of control and within the established PARs. The end product quality attributes were within their drug substance specifications. The probability generated with the Bayesian approach was also used as a tool to assess CPP deviations. This approach can be extended to characterize other production processes and to quantify a reliable operating region. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:799-812, 2016. © 2016 American Institute of Chemical Engineers.

  13. Quality prediction modeling for sintered ores based on mechanism models of sintering and extreme learning machine based error compensation

    NASA Astrophysics Data System (ADS)

    Tiebin, Wu; Yunlian, Liu; Xinjun, Li; Yi, Yu; Bin, Zhang

    2018-06-01

    To address the difficulty of quality prediction for sintered ores, a hybrid prediction model is established based on mechanism models of sintering and time-weighted error compensation using the extreme learning machine (ELM). First, mechanism models of drum index, total iron, and alkalinity are constructed according to the chemical reaction mechanisms and conservation of matter in the sintering process. Because the process is simplified in the mechanism models, these models cannot describe the high nonlinearity, and errors are inevitable. For this reason, a time-weighted ELM-based error compensation model is established. Simulation results verify that the hybrid model has high accuracy and can meet the requirements of industrial applications.
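    A hedged sketch of the hybrid idea: a minimal extreme learning machine (random hidden layer plus ridge-regression readout) learns the residual between a simplified mechanism model and measurements. The mechanism model and data are invented placeholders, and the paper's time weighting is omitted here.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.uniform(-1, 1, size=(300, 4))            # sintering process inputs
true = X @ [1.0, -0.5, 0.3, 0.8] + 0.3 * np.sin(3 * X[:, 0])
mechanism = X @ [1.0, -0.5, 0.3, 0.8]            # simplified linear mechanism
residual = true - mechanism                      # error the ELM must learn

W = rng.normal(size=(4, 50))                     # fixed random hidden layer
b = rng.normal(size=50)
H = np.tanh(X @ W + b)                           # hidden activations
beta = np.linalg.solve(H.T @ H + 1e-3 * np.eye(50), H.T @ residual)  # ridge readout

hybrid = mechanism + H @ beta                    # mechanism + ELM compensation
print("RMSE:", np.sqrt(np.mean((hybrid - true) ** 2)))
```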

  14. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    PubMed

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

    The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding, make systematic appraisals, and identify areas of improvement in a business process. Unified modelling language (UML) is a general-purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier to efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage (especially from the pilot phase), parallel processing of data, and correctly positioned process controls should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  15. Prediction of settled water turbidity and optimal coagulant dosage in drinking water treatment plant using a hybrid model of k-means clustering and adaptive neuro-fuzzy inference system

    NASA Astrophysics Data System (ADS)

    Kim, Chan Moon; Parnichkun, Manukid

    2017-11-01

    Coagulation is an important process in drinking water treatment to attain acceptable treated water quality. However, the determination of coagulant dosage is still a challenging task for operators, because coagulation is a nonlinear and complicated process. Feedback control to achieve the desired treated water quality is difficult due to the lengthy process time. In this research, a hybrid of k-means clustering and adaptive neuro-fuzzy inference system (k-means-ANFIS) is proposed for settled water turbidity prediction and optimal coagulant dosage determination using full-scale historical data. To build a model well adapted to the different process states of influent water, raw water quality data are classified into four clusters according to their properties by a k-means clustering technique. The sub-models are developed individually on the basis of each clustered data set. Results reveal that the sub-models constructed by the hybrid k-means-ANFIS perform better than not only a single ANFIS model but also seasonal models built with artificial neural networks (ANN). The final model, consisting of sub-models, shows more accurate and consistent prediction ability than a single ANFIS model and a single ANN model on all five evaluation indices. Therefore, the hybrid k-means-ANFIS model can be employed as a robust tool for managing both treated water quality and production costs simultaneously.
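    A hedged sketch of the cluster-then-model structure: partition raw-water conditions with k-means and fit one sub-model per cluster. A gradient-boosting regressor stands in for ANFIS, and the water-quality variables and data are synthetic assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(5)
# raw-water features: turbidity (NTU), temperature (C), pH -- assumptions
X = rng.uniform([5, 10, 6.0], [500, 25, 8.5], size=(400, 3))
y = 0.02 * X[:, 0] - 0.1 * (X[:, 2] - 7) ** 2 + rng.normal(0, 0.1, 400)

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
sub_models = {c: GradientBoostingRegressor().fit(X[km.labels_ == c],
                                                 y[km.labels_ == c])
              for c in range(4)}

x_new = np.array([[250.0, 18.0, 7.2]])     # incoming raw-water sample
c = km.predict(x_new)[0]                   # route to the matching sub-model
print("predicted settled-water turbidity:", sub_models[c].predict(x_new)[0])
```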

  16. CRN5EXP: Expert system for statistical quality control

    NASA Technical Reports Server (NTRS)

    Hentea, Mariana

    1991-01-01

    The purpose of the expert system CRN5EXP is to assist in checking the quality of the coils at two very important mills in a steel plant: hot rolling and cold rolling. The system interprets the statistical quality control charts, and diagnoses and predicts the quality of the steel. Measurements of process control variables are recorded in a database, and sample statistics such as the mean and the range are computed and plotted on a control chart. The chart is analyzed for patterns using the C Language Integrated Production System (CLIPS) and a forward-chaining technique to reach a conclusion about the causes of defects and to take management measures for the improvement of quality control techniques. The expert system combines the certainty factors associated with the process control variables to predict the quality of the steel. The paper presents the approach used to extract data from the database, the reasons for combining certainty factors, and the architecture and use of the expert system. The interpretation of control chart patterns requires the human expert's knowledge and lends itself well to expert system rules.
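    A hedged sketch of rule-based control-chart interpretation in plain Python, standing in for the CLIPS forward-chaining rules described. The two rules shown (a point beyond 3-sigma, and eight successive points on one side of the center line) are standard chart patterns, not the system's actual rule base.

```python
import numpy as np

def interpret_chart(means, center, sigma):
    """Return (sample index, diagnosis) alerts for an X-bar chart."""
    alerts = []
    for i, m in enumerate(means):
        if abs(m - center) > 3 * sigma:
            alerts.append((i, "point beyond 3-sigma control limit"))
    for i in range(len(means) - 7):
        window = means[i:i + 8] - center
        if np.all(window > 0) or np.all(window < 0):
            alerts.append((i, "run of 8 points on one side of center"))
    return alerts

samples = np.array([10.1, 9.9, 10.2, 10.4, 10.3, 10.5, 10.2, 10.6,
                    10.3, 10.4, 12.1])
print(interpret_chart(samples, center=10.0, sigma=0.5))
```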

  17. Q-marker based strategy for CMC research of Chinese medicine: A case study of Panax Notoginseng saponins.

    PubMed

    Zhong, Yi; Zhu, Jieqiang; Yang, Zhenzhong; Shao, Qing; Fan, Xiaohui; Cheng, Yiyu

    2018-01-31

    To ensure pharmaceutical quality, chemistry, manufacturing and control (CMC) research is essential. However, due to the inherent complexity of Chinese medicine (CM), CMC study of CM remains a great challenge for academia, industry, and regulatory agencies. Recently, the quality-marker (Q-marker) concept was proposed for establishing quality standards and quality analysis approaches for Chinese medicine, which sheds light on Chinese medicine's CMC study. Here the manufacturing process of Panax notoginseng saponins (PNS) is taken as a case study, and the present work establishes a Q-marker based research strategy for CMC of Chinese medicine. The Q-markers of PNS are selected and established by integrating the chemical profile with pharmacological activities. Then, the key processes of PNS manufacturing are identified by material flow analysis. Furthermore, modeling algorithms are employed to explore the relationship between Q-markers and the critical process parameters (CPPs) of the key processes. Finally, the CPPs of the key processes are optimized in order to improve process efficiency. Among the 97 identified compounds, notoginsenoside R1 and ginsenosides Rg1, Re, Rb1, and Rd are selected as the Q-markers of PNS. Our analysis of PNS manufacturing shows that the extraction process and the column chromatography process are the key processes. With the CPPs of each process as the inputs and the Q-markers' contents as the outputs, two process prediction models are built separately for the extraction process and the column chromatography process of Panax notoginseng, both of which possess good prediction ability. Based on the efficiency models of the extraction and column chromatography processes we constructed, the optimal CPPs of both processes are calculated. Our results show that the Q-marker derived CMC research strategy can be applied to analyze the manufacturing processes of Chinese medicine to assure product quality and promote the efficiency of key processes simultaneously. Copyright © 2018 Elsevier GmbH. All rights reserved.

  18. Influence of rainfall and catchment characteristics on urban stormwater quality.

    PubMed

    Liu, An; Egodawatta, Prasanna; Guan, Yuntao; Goonetilleke, Ashantha

    2013-02-01

    The accuracy and reliability of urban stormwater quality modelling outcomes are important for stormwater management decision making. The commonly adopted approach, where only a limited number of factors are used to predict urban stormwater quality, may not adequately represent the complexity of the quality response to a rainfall event or site-to-site differences to support efficient treatment design. This paper discusses an investigation into the influence of rainfall and catchment characteristics on urban stormwater quality in order to identify potential sources of error in current stormwater quality modelling practices. It was found that the influence of rainfall characteristics on pollutant wash-off is step-wise, based on specific thresholds. This means that a modelling approach where the wash-off process is predicted as a continuous function of rainfall intensity and duration is not appropriate. Additionally, beyond conventional catchment characteristics, namely land use and impervious surface fraction, other catchment characteristics such as impervious area layout, urban form, and site-specific characteristics have an important influence on both pollutant build-up and wash-off processes. Finally, the use of solids as a surrogate to estimate other pollutant species was found to be inappropriate. Individually considering build-up and wash-off processes for each pollutant species should be the preferred option. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. Mamdani-Fuzzy Modeling Approach for Quality Prediction of Non-Linear Laser Lathing Process

    NASA Astrophysics Data System (ADS)

    Sivaraos; Khalim, A. Z.; Salleh, M. S.; Sivakumar, D.; Kadirgama, K.

    2018-03-01

    Lathing is a process for fashioning stock materials into desired cylindrical shapes, usually performed on a traditional lathe machine. However, recent rapid advancements in engineering materials and precision demands pose a great challenge to the traditional method. The main drawback of the conventional lathe is its mechanical contact, which leads to undesirable tool wear, a heat-affected zone, and poor finishing and dimensional accuracy, especially taper quality, when machining stock with a high length-to-diameter ratio. Therefore, a novel approach has been devised to investigate transforming a 2D flatbed CO2 laser cutting machine into a 3D laser lathing capability as an alternative solution. Three significant design parameters were selected for this experiment, namely cutting speed, spinning speed, and depth of cut. A total of 24 experiments were performed, with eight (8) sequential runs replicated three (3) times. The experimental results were then used to establish a Mamdani-fuzzy predictive model, which yields an accuracy of more than 95%. Thus, the proposed Mamdani-fuzzy modelling approach is found to be very suitable and practical for quality prediction of the non-linear laser lathing process for cylindrical stocks of 10 mm diameter.
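    A hedged sketch of a two-rule Mamdani fuzzy predictor with triangular membership functions and centroid defuzzification, illustrating the modelling style named above. The rules, input scaling, and membership shapes are invented placeholders, not the paper's fitted laser-lathing model.

```python
import numpy as np

def tri(x, a, b, c):                       # triangular membership function
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0)

def predict_taper(cut_speed, depth):       # inputs assumed scaled to [0, 1]
    slow, fast = tri(cut_speed, -1, 0, 0.6), tri(cut_speed, 0.4, 1, 2)
    shallow, deep = tri(depth, -1, 0, 0.6), tri(depth, 0.4, 1, 2)
    out = np.linspace(0, 1, 201)           # taper-quality output universe
    good, poor = tri(out, -1, 0, 0.5), tri(out, 0.5, 1, 2)
    # Rule 1: slow cut AND shallow depth -> good taper (low taper index)
    # Rule 2: fast cut AND deep cut      -> poor taper (high taper index)
    agg = np.maximum(np.minimum(min(slow, shallow), good),
                     np.minimum(min(fast, deep), poor))
    return (out * agg).sum() / agg.sum()   # centroid defuzzification

print("predicted taper index:", predict_taper(0.3, 0.2))
```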

  20. Structure, Process, and Outcome Quality of Surgical Site Infection Surveillance in Switzerland.

    PubMed

    Kuster, Stefan P; Eisenring, Marie-Christine; Sax, Hugo; Troillet, Nicolas

    2017-10-01

    OBJECTIVE To assess the structure and quality of surveillance activities and to validate outcome detection in the Swiss national surgical site infection (SSI) surveillance program. DESIGN Countrywide survey of SSI surveillance quality. SETTING 147 hospitals or hospital units with surgical activities in Switzerland. METHODS Site visits were conducted with on-site structured interviews and review of a random sample of 15 patient records per hospital: 10 from the entire data set and 5 from a subset of patients with originally reported infection. Process and structure were rated in 9 domains with a weighted overall validation score, and sensitivity, specificity, positive predictive value, and negative predictive value were calculated for the identification of SSI. RESULTS Of 50 possible points, the median validation score was 35.5 (range, 16.25-48.5). Public hospitals (P<.001), hospitals in the Italian-speaking region of Switzerland (P=.021), and hospitals with longer participation in the surveillance (P=.018) had higher scores than others. Domains that contributed most to lower scores were quality of chart review and quality of data extraction. Of 49 infections, 15 (30.6%) had been overlooked in a random sample of 1,110 patient records, accounting for a sensitivity of 69.4% (95% confidence interval [CI], 54.6%-81.7%), a specificity of 99.9% (95% CI, 99.5%-100%), a positive predictive value of 97.1% (95% CI, 85.1%-99.9%), and a negative predictive value of 98.6% (95% CI, 97.7%-99.2%). CONCLUSIONS Irrespective of a well-defined surveillance methodology, there is a wide variation of SSI surveillance quality. The quality of chart review and the accuracy of data collection are the main areas for improvement. Infect Control Hosp Epidemiol 2017;38:1172-1181.

  1. Image quality prediction - An aid to the Viking lander imaging investigation on Mars

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Wall, S. D.

    1976-01-01

    Image quality criteria and image quality predictions are formulated for the multispectral panoramic cameras carried by the Viking Mars landers. Image quality predictions are based on expected camera performance, Mars surface radiance, and lighting and viewing geometry (fields of view, Mars lander shadows, solar day-night alternation), and are needed for diagnosing camera performance, for arriving at a preflight imaging strategy, and for revising that strategy should the need arise. Landing considerations, camera control instructions, camera control logic, aspects of the imaging process (spectral response, spatial response, sensitivity), and likely problems are discussed. Major concerns include: degradation of camera response by isotope radiation, uncertainties in lighting and viewing geometry and in landing site local topography, contamination of the camera window by dust abrasion, and initial errors in assigning camera dynamic ranges (gains and offsets).

  2. Reliability Prediction of Ontology-Based Service Compositions Using Petri Net and Time Series Models

    PubMed Central

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets the quantitative quality requirement. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the Non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving-average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the input of the NMSPN model and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy. PMID:24688429

  3. The Effects of Argument Quality and Involvement Type on Attitude Formation and Attitude Change: A Test of Dual-Process and Social Judgment Predictions

    ERIC Educational Resources Information Center

    Park, Hee Sun; Levine, Timothy R.; Kingsley Westerman, Catherine Y.; Orfgen, Tierney; Foregger, Sarah

    2007-01-01

    Involvement has long been theoretically specified as a crucial factor determining the persuasive impact of messages. In social judgment theory, ego-involvement makes people more resistant to persuasion, whereas in dual-process models, high-involvement people are susceptible to persuasion when argument quality is high. It is argued that these…

  4. The Relationship of Previous Training and Experience of Journal Peer Reviewers to Subsequent Review Quality

    PubMed Central

    Callaham, Michael L; Tercier, John

    2007-01-01

    Background Peer review is considered crucial to the selection and publication of quality science, but very little is known about the previous experiences and training that might identify high-quality peer reviewers. The reviewer selection processes of most journals, and thus the qualifications of their reviewers, are ill defined. More objective selection of peer reviewers might improve the journal peer review process and thus the quality of published science. Methods and Findings 306 experienced reviewers (71% of all those associated with a specialty journal) completed a survey of past training and experiences postulated to improve peer review skills. Reviewers performed 2,856 reviews of 1,484 separate manuscripts during a four-year study period, all prospectively rated on a standardized quality scale by editors. Multivariable analysis revealed that most variables, including academic rank, formal training in critical appraisal or statistics, or status as principal investigator of a grant, failed to predict performance of higher-quality reviews. The only significant predictors of quality were working in a university-operated hospital versus other teaching environment and relative youth (under ten years of experience after finishing training). Being on an editorial board and doing formal grant (study section) review were each predictors for only one of our two comparisons. However, the predictive power of all variables was weak. Conclusions Our study confirms that there are no easily identifiable types of formal training or experience that predict reviewer performance. Skill in scientific peer review may be as ill defined and hard to impart as is “common sense.” Without a better understanding of those skills, it seems unlikely journals and editors will be successful in systematically improving their selection of reviewers. This inability to predict performance makes it imperative that all but the smallest journals implement routine review ratings systems to routinely monitor the quality of their reviews (and thus the quality of the science they publish). PMID:17411314

  5. Sensory Information Processing

    DTIC Science & Technology

    1975-12-31

    [Abstract garbled in extraction; recoverable fragment and section titles:] "...system noise. To see how this is avoided, note that zeroes in the blur spectrum become sharp, spike-like negative impulses when the..." Sections: Synthetic Speech Quality Using Binaural Reverberation (Boll); Noise Suppression with Linear Prediction Filtering (Peterson); Speech Processing to Reduce Noise and Improve Intelligibility (Callahan); Linear Predictive Coding with a Glottal ...

  6. Total Quality Management in Higher Education: Applying Deming's Fourteen Points.

    ERIC Educational Resources Information Center

    Masters, Robert J.; Leiker, Linda

    1992-01-01

    This article presents guidelines to aid administrators of institutions of higher education in applying the 14 principles of Total Quality Management. The principles stress understanding process improvements, handling variation, fostering prediction, and using psychology to capitalize on human resources. (DB)

  7. A Case for a Process Approach: The Warwick Experience.

    ERIC Educational Resources Information Center

    Screen, P.

    1988-01-01

    Describes the cyclical nature of a problem-solving sequence produced from observing children involved in the process. Discusses the generic qualities of science: (1) observing; (2) inferring; (3) classifying; (4) predicting; (5) controlling variables; and (6) hypothesizing. Explains the processes in use and advantages of a process-led course. (RT)

  8. Integrating cognitive and peripheral factors in predicting hearing-aid processing effectiveness

    PubMed Central

    Kates, James M.; Arehart, Kathryn H.; Souza, Pamela E.

    2013-01-01

    Individual factors beyond the audiogram, such as age and cognitive abilities, can influence speech intelligibility and speech quality judgments. This paper develops a neural network framework for combining multiple subject factors into a single model that predicts speech intelligibility and quality for a nonlinear hearing-aid processing strategy. The nonlinear processing approach used in the paper is frequency compression, which is intended to improve the audibility of high-frequency speech sounds by shifting them to lower frequency regions where listeners with high-frequency loss have better hearing thresholds. An ensemble averaging approach is used for the neural network to avoid the problems associated with overfitting. Models are developed for two subject groups, one having nearly normal hearing and the other mild-to-moderate sloping losses. PMID:25669257

  9. Process evaluation of the project P.A.T.H.S. (secondary 2 program): findings based on the co-walker scheme.

    PubMed

    Shek, Daniel T L; Tam, Suet-yan

    2009-01-01

    To understand the implementation quality of the Tier 1 Program (Secondary 2 Curriculum) of the P.A.T.H.S. Project, process evaluation was carried out by co-walkers through classroom observation of 195 units in 131 schools. Results showed that the overall level of program adherence was generally high with an average of 84.55%, and different factors of the implementation process were evaluated as positive. Quality of program implementation and achievement of program objectives were predicted by students' participation and involvement, strategies to enhance students' motivation, opportunity for reflection, time management, and class preparation. Success in program implementation was predicted by students' participation and involvement, classroom control, interactive delivery method, strategies to enhance students' motivation, opportunity for reflection, and lesson preparation.

  10. Evaluating Cell Processes, Quality, and Biomarkers in Pluripotent Stem Cells Using Video Bioinformatics

    PubMed Central

    Lin, Sabrina C.; Bays, Brett C.; Omaiye, Esther; Bhanu, Bir; Talbot, Prue

    2016-01-01

    There is a foundational need for quality control tools in stem cell laboratories engaged in basic research, regenerative therapies, and toxicological studies. These tools require automated methods for evaluating cell processes and quality during in vitro passaging, expansion, maintenance, and differentiation. In this paper, an unbiased, automated high-content profiling toolkit, StemCellQC, is presented that non-invasively extracts information on cell quality and cellular processes from time-lapse phase-contrast videos. Twenty four (24) morphological and dynamic features were analyzed in healthy, unhealthy, and dying human embryonic stem cell (hESC) colonies to identify those features that were affected in each group. Multiple features differed in the healthy versus unhealthy/dying groups, and these features were linked to growth, motility, and death. Biomarkers were discovered that predicted cell processes before they were detectable by manual observation. StemCellQC distinguished healthy and unhealthy/dying hESC colonies with 96% accuracy by non-invasively measuring and tracking dynamic and morphological features over 48 hours. Changes in cellular processes can be monitored by StemCellQC and predictions can be made about the quality of pluripotent stem cell colonies. This toolkit reduced the time and resources required to track multiple pluripotent stem cell colonies and eliminated handling errors and false classifications due to human bias. StemCellQC provided both user-specified and classifier-determined analysis in cases where the affected features are not intuitive or anticipated. Video analysis algorithms allowed assessment of biological phenomena using automatic detection analysis, which can aid facilities where maintaining stem cell quality and/or monitoring changes in cellular processes are essential. In the future StemCellQC can be expanded to include other features, cell types, treatments, and differentiating cells. PMID:26848582

  11. Evaluating Cell Processes, Quality, and Biomarkers in Pluripotent Stem Cells Using Video Bioinformatics.

    PubMed

    Zahedi, Atena; On, Vincent; Lin, Sabrina C; Bays, Brett C; Omaiye, Esther; Bhanu, Bir; Talbot, Prue

    2016-01-01

    There is a foundational need for quality control tools in stem cell laboratories engaged in basic research, regenerative therapies, and toxicological studies. These tools require automated methods for evaluating cell processes and quality during in vitro passaging, expansion, maintenance, and differentiation. In this paper, an unbiased, automated high-content profiling toolkit, StemCellQC, is presented that non-invasively extracts information on cell quality and cellular processes from time-lapse phase-contrast videos. Twenty four (24) morphological and dynamic features were analyzed in healthy, unhealthy, and dying human embryonic stem cell (hESC) colonies to identify those features that were affected in each group. Multiple features differed in the healthy versus unhealthy/dying groups, and these features were linked to growth, motility, and death. Biomarkers were discovered that predicted cell processes before they were detectable by manual observation. StemCellQC distinguished healthy and unhealthy/dying hESC colonies with 96% accuracy by non-invasively measuring and tracking dynamic and morphological features over 48 hours. Changes in cellular processes can be monitored by StemCellQC and predictions can be made about the quality of pluripotent stem cell colonies. This toolkit reduced the time and resources required to track multiple pluripotent stem cell colonies and eliminated handling errors and false classifications due to human bias. StemCellQC provided both user-specified and classifier-determined analysis in cases where the affected features are not intuitive or anticipated. Video analysis algorithms allowed assessment of biological phenomena using automatic detection analysis, which can aid facilities where maintaining stem cell quality and/or monitoring changes in cellular processes are essential. In the future StemCellQC can be expanded to include other features, cell types, treatments, and differentiating cells.

  12. The role of context in preschool learning: a multilevel examination of the contribution of context-specific problem behaviors and classroom process quality to low-income children's approaches to learning.

    PubMed

    Domínguez, Ximena; Vitiello, Virginia E; Fuccillo, Janna M; Greenfield, Daryl B; Bulotsky-Shearer, Rebecca J

    2011-04-01

    Research suggests that promoting adaptive approaches to learning early in childhood may help close the gap between advantaged and disadvantaged children. Recent research has identified specific child-level and classroom-level variables that are significantly associated with preschoolers' approaches to learning. However, further research is needed to understand the interactive effects of these variables and determine whether classroom-level variables buffer the detrimental effects of child-level risk variables. Using a largely urban and minority sample (N=275) of preschool children, the present study examined the additive and interactive effects of children's context-specific problem behaviors and classroom process quality dimensions on children's approaches to learning. Teachers rated children's problem behavior and approaches to learning and independent assessors conducted classroom observations to assess process quality. Problem behaviors in structured learning situations and in peer and teacher interactions were found to negatively predict variance in approaches to learning. Classroom process quality domains did not independently predict variance in approaches to learning. Nonetheless, classroom process quality played an important role in these associations; high emotional support buffered the detrimental effects of problem behavior, whereas high instructional support exacerbated them. The findings of this study have important implications for classroom practices aimed at helping children who exhibit problem behaviors. Copyright © 2010 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  13. The Predictive Validity of Teacher Candidate Letters of Reference

    ERIC Educational Resources Information Center

    Mason, Richard W.; Schroeder, Mark P.

    2014-01-01

    Letters of reference are widely used as an essential part of the hiring process of newly licensed teachers. While the predictive validity of these letters of reference has been called into question it has never been empirically studied. The current study examined the predictive validity of the quality of letters of reference for forty-one student…

  14. Quality-by-design III: application of near-infrared spectroscopy to monitor roller compaction in-process and product quality attributes of immediate release tablets.

    PubMed

    Kona, Ravikanth; Fahmy, Raafat M; Claycamp, Gregg; Polli, James E; Martinez, Marilyn; Hoag, Stephen W

    2015-02-01

    The objective of this study was to use near-infrared spectroscopy (NIRS) coupled with multivariate chemometric models to monitor granule and tablet quality attributes in the formulation development and manufacturing of ciprofloxacin hydrochloride (CIP) immediate release tablets. Critical roller compaction process parameters, compression force (CFt), and formulation variables identified in our earlier studies were evaluated in more detail. Multivariate principal component analysis (PCA) and partial least squares (PLS) models were developed during the development stage and used as a control tool to predict the quality of granules and tablets. Validated models were used to monitor and control batches manufactured at different sites to assess their robustness to change. The results showed that roll pressure (RP) and CFt played a critical role in the quality of the granules and the finished product within the range tested. Replacing the binder source did not statistically influence the quality attributes of the granules and tablets; however, the lubricant type significantly impacted granule size. Blend uniformity, crushing force, and disintegration time during manufacturing were predicted using validated PLS regression models with acceptable standard error of prediction (SEP) values, whereas the models yielded higher SEP for batches obtained from a different manufacturing site. From this study, we were able to identify critical factors that could impact the quality attributes of the CIP IR tablets. In summary, we demonstrated the ability of near-infrared spectroscopy coupled with chemometrics to serve as a powerful tool for monitoring the critical quality attributes (CQA) identified during formulation development.
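    A hedged sketch of a PLS regression linking NIR spectra to a tablet quality attribute, with the standard error of prediction (SEP) computed on held-out samples. The spectra here are synthetic, and sklearn's PLSRegression stands in for the chemometric software used in the study.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
spectra = rng.normal(size=(80, 400))            # 80 samples x 400 wavelengths
crush_force = 2 * spectra[:, 120] + spectra[:, 300] + rng.normal(0, 0.1, 80)

X_tr, X_te, y_tr, y_te = train_test_split(spectra, crush_force,
                                          test_size=0.25, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)   # latent-variable model
resid = y_te - pls.predict(X_te).ravel()
print("SEP:", np.sqrt(np.mean(resid ** 2)))           # held-out prediction error
```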

  15. Applying Risk Prediction Models to Optimize Lung Cancer Screening: Current Knowledge, Challenges, and Future Directions.

    PubMed

    Sakoda, Lori C; Henderson, Louise M; Caverly, Tanner J; Wernli, Karen J; Katki, Hormuzd A

    2017-12-01

    Risk prediction models may be useful for facilitating effective and high-quality decision-making at critical steps in the lung cancer screening process. This review provides a current overview of published lung cancer risk prediction models and their applications to lung cancer screening and highlights both challenges and strategies for improving their predictive performance and use in clinical practice. Since the 2011 publication of the National Lung Screening Trial results, numerous prediction models have been proposed to estimate the probability of developing or dying from lung cancer or the probability that a pulmonary nodule is malignant. Respective models appear to exhibit high discriminatory accuracy in identifying individuals at highest risk of lung cancer or differentiating malignant from benign pulmonary nodules. However, validation and critical comparison of the performance of these models in independent populations are limited. Little is also known about the extent to which risk prediction models are being applied in clinical practice and influencing decision-making processes and outcomes related to lung cancer screening. Current evidence is insufficient to determine which lung cancer risk prediction models are most clinically useful and how to best implement their use to optimize screening effectiveness and quality. To address these knowledge gaps, future research should be directed toward validating and enhancing existing risk prediction models for lung cancer and evaluating the application of model-based risk calculators and its corresponding impact on screening processes and outcomes.

  16. Quantitative structure-activity relationship models that stand the test of time.

    PubMed

    Davis, Andrew M; Wood, David J

    2013-04-01

    The pharmaceutical industry is in a period of intense change. While this has many drivers, attrition through the development process continues to be an important pressure. The emerging definitions of "compound quality" that are based on retrospective analyses of developmental attrition have highlighted a new direction for medicinal chemistry and the paradigm of "quality at the point of design". The time has come for retrospective analyses to catalyze prospective action. Quality at the point of design places pressure on the quality of our predictive models. Empirical QSAR models when built with care provide true predictive control, but their accuracy and precision can be improved. Here we describe AstraZeneca's experience of automation in QSAR model building and validation, and how an informatics system can provide a step-change in predictive power to project design teams, if they choose to use it.

  17. Optical spectral signatures of liquids by means of fiber optic technology for product and quality parameter identification

    NASA Astrophysics Data System (ADS)

    Mignani, A. G.; Ciaccheri, L.; Mencaglia, A. A.; Diaz-Herrera, N.; Garcia-Allende, P. B.; Ottevaere, H.; Thienpont, H.; Attilio, C.; Cimato, A.; Francalanci, S.; Paccagnini, A.; Pavone, F. S.

    2009-01-01

    Absorption spectroscopy in the wide 200-1700 nm spectral range is carried out by means of optical fiber instrumentation to achieve a digital mapping of liquids for the prediction of important quality parameters. Extra virgin olive oils from Italy and lubricant oils from turbines with different degrees of degradation were considered as "case studies". The spectral data were processed by means of multivariate analysis so as to obtain a correlation to quality parameters. In practice, the wide range absorption spectra were considered as an optical signature of the liquids from which to extract product quality information. The optical signatures of extra virgin olive oils were used to predict the content of the most important fatty acids. The optical signatures of lubricant oils were used to predict the concentration of the most important parameters for indicating the oil's degree of degradation, such as TAN, JOAP anti-wear index, and water content.

  18. Tandem mass spectrometry data quality assessment by self-convolution.

    PubMed

    Choo, Keng Wah; Tham, Wai Mun

    2007-09-20

    Many algorithms have been developed for interpreting tandem mass spectrometry (MS) data sets. They fall essentially into two classes: the first searches theoretical mass spectrum databases, while the second relies on de novo sequencing from raw mass spectrometry data. The quality of the mass spectra significantly affects the protein identification process in both instances. This prompted the authors to explore ways to measure the quality of MS data sets before subjecting them to the protein identification algorithms, thus allowing for more meaningful searches and an increased confidence level in the proteins identified. The proposed method measures the quality of MS data sets based on the symmetry of the b- and y-ion peaks present in an MS spectrum. Self-convolution of the MS data with its time-reversed copy was employed: owing to the symmetric nature of b-ion and y-ion peaks, the self-convolution of a good spectrum produces its highest peak at the midpoint. To reduce processing time, the self-convolution was computed using the Fast Fourier Transform and its inverse transform, followed by removal of the "DC" (direct current) component and normalisation of the data set. The quality score was defined as the ratio of the intensity at the midpoint to that of the remaining peaks of the convolution result. The method was validated using both theoretical mass spectra, with various permutations, and several real MS data sets. The results were encouraging, revealing high positive prediction rates for spectra with good quality scores. We have demonstrated in this work a method for determining the quality of tandem MS data sets. By pre-determining the quality of tandem MS data before subjecting them to protein identification algorithms, spurious protein predictions due to poor tandem MS data are avoided, giving scientists greater confidence in the predicted results. We conclude that the algorithm performs well and could potentially be used as a pre-processing step for all mass spectrometry based protein identification tools.
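
    The scoring scheme is compact enough to sketch directly. The following is a minimal Python reading of the algorithm, assuming the spectrum has been binned onto a uniform m/z grid spanning zero to the precursor mass, so that complementary b-/y-ion pairs sum to the full length; treating the "remaining peaks" as the sum of absolute convolution values is our interpretation of the definition.

        import numpy as np
        from scipy.signal import fftconvolve

        def spectrum_quality_score(intensities):
            # Self-convolution (equivalently, correlation with the
            # time-reversed copy): complementary peaks at m and M - m all
            # contribute products at the midpoint of the full convolution.
            x = np.asarray(intensities, dtype=float)
            x = x - x.mean()              # remove the "DC" component
            n = np.linalg.norm(x)
            if n == 0.0:
                return 0.0
            x = x / n                     # normalise the data set
            conv = fftconvolve(x, x)      # FFT-based self-convolution
            mid = len(x) - 1              # midpoint of the full convolution
            rest = np.abs(np.delete(conv, mid)).sum()
            return conv[mid] / rest

        # Symmetric peak pairs (1,8) and (3,6) mimic b-/y-ion complementarity;
        # the scrambled spectrum should receive the lower score.
        good = np.array([0, 1, 0, 1, 0, 0, 1, 0, 1, 0], dtype=float)
        bad = np.array([0, 1, 1, 0, 0, 1, 0, 1, 0, 0], dtype=float)
        print(spectrum_quality_score(good), spectrum_quality_score(bad))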

  19. Intrinsic and extrinsic motivation in early adolescents' friendship development: friendship selection, influence, and prospective friendship quality.

    PubMed

    Ojanen, Tiina; Sijtsema, Jelle J; Hawley, Patricia H; Little, Todd D

    2010-12-01

    Friendships are essential for adolescent social development. However, they may be pursued for varying motives, which, in turn, may predict similarity in friendships via social selection or social influence processes, and likely help to explain friendship quality. We examined the effect of early adolescents' (N = 374, 12-14 years) intrinsic and extrinsic friendship motivation on friendship selection and social influence by utilizing social network modeling. In addition, longitudinal relations among motivation and friendship quality were estimated with structural equation modeling. Extrinsic motivation predicted activity in making friendship nominations during the sixth grade and lower friendship quality across time. Intrinsic motivation predicted inactivity in making friendship nominations during the sixth grade, popularity as a friend across the transition to middle school, and higher friendship quality across time. Social influence effects were observed for both motives, but were more pronounced for intrinsic motivation. Copyright © 2010 The Association for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.

  20. The relationship between health related quality of life and sensory deficits among patients with diabetes mellitus.

    PubMed

    Engel-Yeger, Batya; Darawsha Najjar, Sanaa; Darawsha, Mahmud

    2017-08-13

    (1) To profile sensory deficits, examined as the ability to process sensory information from the daily environment and to discriminate between tactile stimuli, among patients with controlled and uncontrolled diabetes mellitus. (2) To examine the relationship between these sensory deficits and patients' health-related quality of life. This study included 115 participants aged 33-55 with uncontrolled (n = 22) or controlled (n = 24) glycemic levels together with healthy subjects (n = 69). All participants completed the brief World Health Organization Quality of Life Questionnaire and the Adolescent/Adult Sensory Profile and performed the tactile discrimination test. Sensory deficits were more pronounced among patients with uncontrolled glycemic levels, expressed in difficulty registering sensory input, lower sensation seeking in daily environments and difficulty discriminating between tactile stimuli. They also reported the lowest physical and social quality of life compared with the other two groups. Better sensory seeking and registration predicted better quality of life. Disease control and duration contributed to these predictions. Difficulties in processing sensory information from the daily environment are particularly prevalent among patients with uncontrolled glycemic levels and significantly impact their quality of life. Clinicians should screen for sensory processing difficulties among patients with diabetes mellitus and understand their impacts on patients' quality of life. Implications for Rehabilitation: Patients with diabetes mellitus, and particularly those with uncontrolled glycemic levels, may have difficulties in processing sensory information from the daily environment. A multidisciplinary intervention approach is recommended: clinicians should screen for sensory processing deficits among patients with diabetes mellitus and understand their impacts on patients' daily life. By providing patients with environmental adaptations and coping strategies, clinicians may assist in optimizing sensory experiences in real-life contexts and elevate patients' quality of life. Relating to quality of life and emphasizing a multidisciplinary approach is of major importance in broadening our understanding of health conditions and providing holistic treatment for patients.

  1. Predictive displays for a process-control schematic interface.

    PubMed

    Yin, Shanqing; Wickens, Christopher D; Helander, Martin; Laberge, Jason C

    2015-02-01

    Our objective was to examine the extent to which increasing the precision of predictive (rate of change) information in process control improves performance on a simulated process-control task. Predictive displays have been found to be useful in process control (as well as in the aviation and maritime industries). However, authors of prior research have not examined the extent to which predictive value is increased by increasing predictor resolution, nor has such research tied potential improvements to changes in process control strategy. Fifty nonprofessional participants each controlled a simulated chemical mixing process (a honey mixer simulation) representative of the operations found in process control. Participants in each of five groups controlled the process with either no predictor or a predictor, with predictor resolution varying across groups. Increasing predictor resolution generally increased the benefit of prediction over the control condition, although not monotonically. The best overall performance, combining quality and predictive ability, was obtained by the display of intermediate resolution. The two displays with the lowest resolution were clearly inferior. Predictors with higher resolution are of value but may trade off enhanced sensitivity to variable change (lower-resolution discrete-state predictors) against smoother control action (higher-resolution continuous predictors). The research provides guidelines to the process-control industry regarding displays that can most improve operator performance.

  2. Temperature and vegetation effects on soil organic carbon quality along a forested mean annual temperature gradient in North America

    Treesearch

    Cinzia Fissore; Christian P. Giardina; Randall K. Kolka; Carl C. Trettin; Gary M. King; Martin F. Jurgensen; Christopher D. Barton; S. Douglas McDowell

    2008-01-01

    Both climate and plant species are hypothesized to influence soil organic carbon (SOC) quality, but accurate prediction of how SOC process rates respond to global change will require an improved understanding of how SOC quality varies with mean annual temperature (MAT) and forest type. We investigated SOC quality in paired hardwood and pine stands growing in coarse...

  3. Prediction of porosity of food materials during drying: Current challenges and directions.

    PubMed

    Joardder, Mohammad U H; Kumar, C; Karim, M A

    2017-07-18

    Pore formation in food samples is a common physical phenomenon observed during dehydration processes. The pore evolution during drying significantly affects the physical properties and quality of dried foods. Therefore, it should be taken into consideration when predicting transport processes in the drying sample. Characteristics of pore formation depend on the drying process parameters, product properties and processing time. Understanding the physics of pore formation and evolution during drying will assist in accurately predicting the drying kinetics and quality of food materials. Researchers have been trying to develop mathematical models to describe the pore formation and evolution during drying. In this study, existing porosity models are critically analysed and their limitations are identified. Better insight into the factors affecting porosity is provided, and suggestions are proposed to overcome the limitations. These include consideration of process parameters such as glass transition temperature, sample temperature, and variable material properties in the porosity models. Several researchers have proposed models for porosity prediction of food materials during drying. However, these models are either very simplistic or empirical in nature and fail to consider significant factors that influence porosity. An in-depth understanding of pore characteristics is required to develop a generic porosity model. A micro-level analysis of pore formation is presented for better understanding, which will help in developing an accurate and generic porosity model.

  4. Effects of temporal and spatial resolution of calibration data on integrated hydrologic water quality model identification

    NASA Astrophysics Data System (ADS)

    Jiang, Sanyuan; Jomaa, Seifeddine; Büttner, Olaf; Rode, Michael

    2014-05-01

    Hydrological water quality modeling is increasingly used for investigating runoff and nutrient transport processes as well as for watershed management, but it is mostly unclear how data availability determines model identification. In this study, the HYPE (HYdrological Predictions for the Environment) model, which is a process-based, semi-distributed hydrological water quality model, was applied in two different mesoscale catchments (Selke (463 km²) and Weida (99 km²)) located in central Germany to simulate discharge and inorganic nitrogen (IN) transport. PEST and DREAM(ZS) were combined with the HYPE model to conduct parameter calibration and uncertainty analysis. A split-sample test was used for model calibration (1994-1999) and validation (1999-2004). IN concentration and daily IN load were found to be highly correlated with discharge, indicating that IN leaching is mainly controlled by runoff. Both dynamics and balances of water and IN load were well captured, with NSE greater than 0.83 during the validation period. Multi-objective calibration (calibrating hydrological and water quality parameters simultaneously) was found to outperform step-wise calibration in terms of model robustness. Multi-site calibration was able to improve model performance at internal sites and to decrease both parameter posterior uncertainty and prediction uncertainty. Nitrogen-process parameters calibrated using continuous daily averages of nitrate-N concentration observations produced better and more robust simulations of IN concentration and load, with lower posterior parameter uncertainty and IN concentration prediction uncertainty, than calibration against discontinuous biweekly nitrate-N concentration measurements. Both PEST and DREAM(ZS) are efficient in parameter calibration. However, DREAM(ZS) is more sound in terms of parameter identification and uncertainty analysis than PEST because of its capability to evolve parameter posterior distributions and estimate prediction uncertainty based on global search and Bayesian inference schemes.
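
    For reference, the Nash-Sutcliffe efficiency (NSE) used above to judge the simulations has the standard definition sketched below; this is the textbook formula, not code from the study.

        import numpy as np

        def nse(observed, simulated):
            """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the
            model is no better than predicting the observation mean."""
            obs = np.asarray(observed, dtype=float)
            sim = np.asarray(simulated, dtype=float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        print(nse([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8]))  # close to 1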

  5. Effect of hot carcass weight on loin, ham, and belly quality from pigs sourced from a commercial processing facility

    USDA-ARS?s Scientific Manuscript database

    The objective was to determine the predictive abilities of HCW for loin, ham, and belly quality of 7,684 pigs with carcass weights ranging from 53.2 to 129.6 kg. Carcass composition, subjective loin quality, and ham face color were targeted on all carcasses, whereas in-plant instrumental loin color ...

  6. PREDICTIVE UNCERTAINTY IN HYDROLOGIC AND WATER QUALITY MODELING: APPROACHES, APPLICATION TO ENVIRONMENTAL MANAGEMENT, AND FUTURE CHALLENGES (PRESENTATION)

    EPA Science Inventory

    Extant process-based hydrologic and water quality models are indispensable to water resources planning and environmental management. However, models are only approximations of real systems and often calibrated with incomplete and uncertain data. Reliable estimates, or perhaps f...

  7. PREDICTIVE UNCERTAINTY IN HYDROLOGIC AND WATER QUALITY MODELING: APPROACHES, APPLICATION TO ENVIRONMENTAL MANAGEMENT, AND FUTURE CHALLENGES

    EPA Science Inventory

    Extant process-based hydrologic and water quality models are indispensable to water resources planning and environmental management. However, models are only approximations of real systems and often calibrated with incomplete and uncertain data. Reliable estimates, or perhaps f...

  8. A multilevel modeling approach to examining individual differences in skill acquisition for a computer-based task.

    PubMed

    Nair, Sankaran N; Czaja, Sara J; Sharit, Joseph

    2007-06-01

    This article explores the role of age, cognitive abilities, prior experience, and knowledge in skill acquisition for a computer-based simulated customer service task. Fifty-two participants aged 50-80 performed the task over 4 consecutive days following training. They also completed a battery that assessed prior computer experience and cognitive abilities. The data indicated that overall quality and efficiency of performance improved with practice. The predictors of initial level of performance and rate of change in performance varied according to the performance parameter assessed. Age and fluid intelligence predicted initial level and rate of improvement in overall quality, whereas crystallized intelligence and age predicted initial e-mail processing time, and crystallized intelligence predicted rate of change in e-mail processing time over days. We discuss the implications of these findings for the design of intervention strategies.

  9. The reliability-quality relationship for quality systems and quality risk management.

    PubMed

    Claycamp, H Gregg; Rahaman, Faiad; Urban, Jason M

    2012-01-01

    Engineering reliability typically refers to the probability that a system, or any of its components, will perform a required function for a stated period of time and under specified operating conditions. As such, reliability is inextricably linked with time-dependent quality concepts, such as maintaining a state of control and predicting the chances of losses from failures for quality risk management. Two popular current good manufacturing practice (cGMP) and quality risk management tools, failure mode and effects analysis (FMEA) and root cause analysis (RCA) are examples of engineering reliability evaluations that link reliability with quality and risk. Current concepts in pharmaceutical quality and quality management systems call for more predictive systems for maintaining quality; yet, the current pharmaceutical manufacturing literature and guidelines are curiously silent on engineering quality. This commentary discusses the meaning of engineering reliability while linking the concept to quality systems and quality risk management. The essay also discusses the difference between engineering reliability and statistical (assay) reliability. The assurance of quality in a pharmaceutical product is no longer measured only "after the fact" of manufacturing. Rather, concepts of quality systems and quality risk management call for designing quality assurance into all stages of the pharmaceutical product life cycle. Interestingly, most assays for quality are essentially static and inform product quality over the life cycle only by being repeated over time. Engineering process reliability is the fundamental concept that is meant to anticipate quality failures over the life cycle of the product. Reliability is a well-developed theory and practice for other types of manufactured products and manufacturing processes. Thus, it is well known to be an appropriate index of manufactured product quality. This essay discusses the meaning of reliability and its linkages with quality systems and quality risk management.

  10. [Quality by design approaches for pharmaceutical development and manufacturing of Chinese medicine].

    PubMed

    Xu, Bing; Shi, Xin-Yuan; Wu, Zhi-Sheng; Zhang, Yan-Ling; Wang, Yun; Qiao, Yan-Jiang

    2017-03-01

    Pharmaceutical quality is built in by design, formed during the manufacturing process, and improved over the product's lifecycle. Based on a comprehensive literature review of pharmaceutical quality by design (QbD), the essential ideas and implementation strategies of pharmaceutical QbD are interpreted. Considering the complex nature of Chinese medicine, the "4H" model was proposed for implementing QbD in the pharmaceutical development and industrial manufacture of Chinese medicine products. "4H" is an acronym for holistic design, holistic information analysis, holistic quality control, and holistic process optimization, which is consistent with the holistic concept of Chinese medicine theory. Holistic design aims at constructing both the quality problem space from the patient requirement and the quality solution space from multidisciplinary knowledge. Holistic information analysis emphasizes understanding the quality pattern of Chinese medicine by integrating and mining multisource data and information at a relatively high level. Batch-to-batch quality consistency and manufacturing system reliability can be realized by the comprehensive application of inspective, statistical, predictive and intelligent quality control strategies. Holistic process optimization serves to improve product quality and process capability during product lifecycle management. The implementation of QbD is useful to eliminate the ecosystem contradictions lying in the pharmaceutical development and manufacturing process of Chinese medicine products, and helps guarantee cost-effectiveness. Copyright© by the Chinese Pharmaceutical Association.

  11. MOCAT: A Metagenomics Assembly and Gene Prediction Toolkit

    PubMed Central

    Li, Junhua; Chen, Weineng; Chen, Hua; Mende, Daniel R.; Arumugam, Manimozhiyan; Pan, Qi; Liu, Binghang; Qin, Junjie; Wang, Jun; Bork, Peer

    2012-01-01

    MOCAT is a highly configurable, modular pipeline for fast, standardized processing of single or paired-end sequencing data generated by the Illumina platform. The pipeline uses state-of-the-art programs to quality control, map, and assemble reads from metagenomic samples sequenced at a depth of several billion base pairs, and predict protein-coding genes on assembled metagenomes. Mapping against reference databases allows for read extraction or removal, as well as abundance calculations. Relevant statistics for each processing step can be summarized into multi-sheet Excel documents and queryable SQL databases. MOCAT runs on UNIX machines and integrates seamlessly with the SGE and PBS queuing systems, commonly used to process large datasets. The open source code and modular architecture allow users to modify or exchange the programs that are utilized in the various processing steps. Individual processing steps and parameters were benchmarked and tested on artificial, real, and simulated metagenomes resulting in an improvement of selected quality metrics. MOCAT can be freely downloaded at http://www.bork.embl.de/mocat/. PMID:23082188

  12. MOCAT: a metagenomics assembly and gene prediction toolkit.

    PubMed

    Kultima, Jens Roat; Sunagawa, Shinichi; Li, Junhua; Chen, Weineng; Chen, Hua; Mende, Daniel R; Arumugam, Manimozhiyan; Pan, Qi; Liu, Binghang; Qin, Junjie; Wang, Jun; Bork, Peer

    2012-01-01

    MOCAT is a highly configurable, modular pipeline for fast, standardized processing of single or paired-end sequencing data generated by the Illumina platform. The pipeline uses state-of-the-art programs to quality control, map, and assemble reads from metagenomic samples sequenced at a depth of several billion base pairs, and predict protein-coding genes on assembled metagenomes. Mapping against reference databases allows for read extraction or removal, as well as abundance calculations. Relevant statistics for each processing step can be summarized into multi-sheet Excel documents and queryable SQL databases. MOCAT runs on UNIX machines and integrates seamlessly with the SGE and PBS queuing systems, commonly used to process large datasets. The open source code and modular architecture allow users to modify or exchange the programs that are utilized in the various processing steps. Individual processing steps and parameters were benchmarked and tested on artificial, real, and simulated metagenomes resulting in an improvement of selected quality metrics. MOCAT can be freely downloaded at http://www.bork.embl.de/mocat/.

  13. Fabrication of Thermoplastic Composite Laminates Having Film Interleaves By Automated Fiber Placement

    NASA Technical Reports Server (NTRS)

    Hulcher, A. B.; Tiwari, S. N.; Marchello, J. M.; Johnston, Norman J. (Technical Monitor)

    2001-01-01

    Experiments were carried out at the NASA Langley Research Center automated fiber placement facility to determine an optimal process for the fabrication of composite materials having polymer film interleaves. A series of experiments was conducted to determine an optimal process for the composite prior to investigation of a process to fabricate laminates with polymer films. The results of the composite tests indicated that a well-consolidated, void-free laminate could be attained. Preliminary interleaf processing trials were then conducted to establish some broad guidelines for film processing. The primary finding of these initial studies was that a two-stage process was necessary in order to process these materials adequately. A screening experiment was then performed to determine the relative influence of the process variables on the quality of the film interface as determined by the wedge peel test method. Parameters that were found to be of minor influence on specimen quality were subsequently held at fixed values, enabling a more rapid determination of an optimal process. Optimization studies were then performed by varying the remaining parameters at three film melt processing rates. The resulting peel data were fitted with quadratic response surfaces. Additional specimens were fabricated at levels of high peel strength as predicted by the regression models in an attempt to gage the accuracy of the predicted response and to assess the repeatability of the process. The overall results indicate that quality laminates having film interleaves can be successfully and repeatably fabricated by automated fiber placement.
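
    The quadratic response-surface step can be illustrated with a short sketch; the process variables and peel-strength values below are invented placeholders, not the study's data.

        import numpy as np
        from sklearn.preprocessing import PolynomialFeatures
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline

        # Hypothetical tryout data: two process variables (e.g. temperature
        # and placement rate) and a peel-strength response.
        X = np.array([[340, 2.0], [340, 4.0], [360, 2.0], [360, 4.0],
                      [350, 3.0], [345, 2.5], [355, 3.5], [350, 2.0],
                      [350, 4.0]])
        y = np.array([4.1, 3.6, 5.0, 4.4, 5.2, 4.6, 4.8, 4.9, 4.2])

        # Fit a full quadratic response surface and query a candidate point.
        surface = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
        surface.fit(X, y)
        print(surface.predict([[352, 2.8]]))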

  14. Comparing Binaural Pre-processing Strategies I: Instrumental Evaluation.

    PubMed

    Baumgärtel, Regina M; Krawczyk-Becker, Martin; Marquardt, Daniel; Völker, Christoph; Hu, Hongmei; Herzke, Tobias; Coleman, Graham; Adiloğlu, Kamil; Ernst, Stephan M A; Gerkmann, Timo; Doclo, Simon; Kollmeier, Birger; Hohmann, Volker; Dietz, Mathias

    2015-12-30

    In a collaborative research project, several monaural and binaural noise reduction algorithms have been comprehensively evaluated. In this article, eight selected noise reduction algorithms were assessed using instrumental measures, with a focus on the instrumental evaluation of speech intelligibility. Four distinct, reverberant scenarios were created to reflect everyday listening situations: a stationary speech-shaped noise, a multitalker babble noise, a single interfering talker, and a realistic cafeteria noise. Three instrumental measures were employed to assess predicted speech intelligibility and predicted sound quality: the intelligibility-weighted signal-to-noise ratio, the short-time objective intelligibility measure, and the perceptual evaluation of speech quality. The results show substantial improvements in predicted speech intelligibility as well as sound quality for the proposed algorithms. The evaluated coherence-based noise reduction algorithm was able to provide improvements in predicted audio signal quality. For the tested single-channel noise reduction algorithm, improvements in intelligibility-weighted signal-to-noise ratio were observed in all but the nonstationary cafeteria ambient noise scenario. Binaural minimum variance distortionless response beamforming algorithms performed particularly well in all noise scenarios. © The Author(s) 2015.
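
    Of the three instrumental measures, the intelligibility-weighted signal-to-noise ratio is the most direct to sketch: per-band SNRs are limited to a finite range and combined with band-importance weights. The octave bands, flat weights, and clipping limits below are illustrative assumptions; a faithful implementation would substitute a standardized band-importance function.

        import numpy as np
        from scipy.signal import butter, sosfilt

        def band_snr_db(speech, noise, fs, lo, hi):
            # SNR within one band, computed after band-pass filtering.
            sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
            s, n = sosfilt(sos, speech), sosfilt(sos, noise)
            return 10.0 * np.log10(np.sum(s ** 2) / np.sum(n ** 2))

        def intelligibility_weighted_snr(speech, noise, fs, bands, weights):
            snrs = np.array([band_snr_db(speech, noise, fs, lo, hi)
                             for lo, hi in bands])
            snrs = np.clip(snrs, -15.0, 15.0)   # assumed dynamic-range limits
            return float(np.dot(weights, snrs) / np.sum(weights))

        # Illustrative octave bands and flat weights on white-noise signals.
        fs = 16000
        bands = [(125, 250), (250, 500), (500, 1000), (1000, 2000), (2000, 4000)]
        weights = np.ones(len(bands))
        rng = np.random.default_rng(0)
        speech, noise = rng.normal(size=fs), 0.3 * rng.normal(size=fs)
        print(intelligibility_weighted_snr(speech, noise, fs, bands, weights))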

  15. Comparing Binaural Pre-processing Strategies I

    PubMed Central

    Krawczyk-Becker, Martin; Marquardt, Daniel; Völker, Christoph; Hu, Hongmei; Herzke, Tobias; Coleman, Graham; Adiloğlu, Kamil; Ernst, Stephan M. A.; Gerkmann, Timo; Doclo, Simon; Kollmeier, Birger; Hohmann, Volker; Dietz, Mathias

    2015-01-01

    In a collaborative research project, several monaural and binaural noise reduction algorithms have been comprehensively evaluated. In this article, eight selected noise reduction algorithms were assessed using instrumental measures, with a focus on the instrumental evaluation of speech intelligibility. Four distinct, reverberant scenarios were created to reflect everyday listening situations: a stationary speech-shaped noise, a multitalker babble noise, a single interfering talker, and a realistic cafeteria noise. Three instrumental measures were employed to assess predicted speech intelligibility and predicted sound quality: the intelligibility-weighted signal-to-noise ratio, the short-time objective intelligibility measure, and the perceptual evaluation of speech quality. The results show substantial improvements in predicted speech intelligibility as well as sound quality for the proposed algorithms. The evaluated coherence-based noise reduction algorithm was able to provide improvements in predicted audio signal quality. For the tested single-channel noise reduction algorithm, improvements in intelligibility-weighted signal-to-noise ratio were observed in all but the nonstationary cafeteria ambient noise scenario. Binaural minimum variance distortionless response beamforming algorithms performed particularly well in all noise scenarios. PMID:26721920

  16. Three-dimensional numerical modeling of water quality and sediment-associated processes in natural lakes

    USDA-ARS?s Scientific Manuscript database

    This chapter presents the development and application of a three-dimensional water quality model for predicting the distributions of nutrients, phytoplankton, dissolved oxygen, etc., in natural lakes. In this model, the computational domain was divided into two parts: the water column and the bed se...

  17. [Discussion on research and development of new traditional Chinese medicine preparation process based on idea of QbD].

    PubMed

    Feng, Yi; Hong, Yan-Long; Xian, Jie-Chen; Du, Ruo-Fei; Zhao, Li-Jie; Shen, Lan

    2014-09-01

    Traditional processes are mostly adopted in the production of traditional Chinese medicine (TCM) preparations, and product quality is mostly controlled by end-product testing. Potential problems in the production process are unpredictable and are, in most cases, handled on the basis of experience. Therefore, it is hard to find the key points affecting the preparation process and quality control. A pattern for the research and development of TCM preparation processes based on the idea of Quality by Design (QbD) is proposed here after introducing the latest research achievements. Basic theories of micromeritics and rheology were used to characterize the physical properties of TCM raw materials. The TCM preparation process was designed in a more scientific and rational way by studying the correlations among the physical properties of the raw material, the preparation process, and the product quality of the preparation. In this way, factors affecting the quality of TCM production can be identified, and problems that might occur during piloting can be predicted. This would provide a foundation for the R&D and production of TCM preparations, as well as support for gradually realizing "process control" of TCM in the future.

  18. Experience-based quality control of clinical intensity-modulated radiotherapy planning.

    PubMed

    Moore, Kevin L; Brame, R Scott; Low, Daniel A; Mutic, Sasa

    2011-10-01

    To incorporate a quality control tool, based on previous planning experience and patient-specific anatomic information, into the intensity-modulated radiotherapy (IMRT) plan generation process and to determine whether the tool improved treatment plan quality. A retrospective study of 42 IMRT plans demonstrated a correlation between the fraction of organs at risk (OARs) overlapping the planning target volume (PTV) and the mean dose. This yielded a model, predicted dose = prescription dose × (0.2 + 0.8 × [1 − exp(−3 × overlap volume/OAR volume)]), that predicted the achievable mean dose from the PTV overlap fraction and the prescription dose. The model was incorporated into the planning process by way of a user-executable script that reported the predicted dose for any OAR. The script was introduced to clinicians engaged in IMRT planning and deployed thereafter. The script's effect was evaluated by tracking δ = (mean dose − predicted dose)/predicted dose, the fraction by which the mean dose exceeded the model. All OARs under investigation (rectum and bladder in prostate cancer; parotid glands, esophagus, and larynx in head-and-neck cancer) exhibited both smaller δ and reduced variability after script implementation. These effects were substantial for the parotid glands, for which the previous δ = 0.28 ± 0.24 was reduced to δ = 0.13 ± 0.10. The clinical relevance was most evident in the subset of cases in which the parotid glands were potentially salvageable (predicted dose <30 Gy). Before script implementation, an average of 30.1 Gy was delivered to the salvageable cases, with an average predicted dose of 20.3 Gy. After implementation, an average of 18.7 Gy was delivered to salvageable cases, with an average predicted dose of 17.2 Gy. In the prostate cases, the rectum model excess was reduced from δ = 0.28 ± 0.20 to δ = 0.07 ± 0.15. On surveying dosimetrists at the end of the study, most reported that the script both improved their IMRT planning (8 of 10) and increased their efficiency (6 of 10). This tool proved successful in increasing normal tissue sparing and reducing interclinician variability, providing effective quality control of the IMRT plan development process. Copyright © 2011 Elsevier Inc. All rights reserved.
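
    Because the abstract states the overlap model explicitly, it can be reproduced directly; the function names below are ours, and the example numbers are invented.

        import math

        def predicted_mean_dose(prescription_dose, overlap_volume, oar_volume):
            # Achievable OAR mean dose from the overlap model quoted above.
            frac = overlap_volume / oar_volume
            return prescription_dose * (0.2 + 0.8 * (1.0 - math.exp(-3.0 * frac)))

        def delta(mean_dose, predicted_dose):
            # Fraction by which the achieved mean dose exceeds the prediction.
            return (mean_dose - predicted_dose) / predicted_dose

        # Example: 50 Gy prescription, 15% of a 50 cc parotid overlapping the PTV.
        pred = predicted_mean_dose(50.0, overlap_volume=7.5, oar_volume=50.0)
        print(f"predicted = {pred:.1f} Gy, delta = {delta(22.0, pred):+.2f}")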

  19. Using the Gamma-Poisson Model to Predict Library Circulations.

    ERIC Educational Resources Information Center

    Burrell, Quentin L.

    1990-01-01

    Argues that the gamma mixture of Poisson processes, for all its perceived defects, can be used to make predictions regarding future library book circulations of a quality adequate for general management requirements. The use of the model is extensively illustrated with data from two academic libraries. (Nine references) (CLB)
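
    The gamma mixture of Poisson processes has a negative binomial marginal distribution, so circulation predictions can be sketched with a method-of-moments fit; the counts below are invented for illustration, and the fit is valid only when the sample variance exceeds the mean.

        import numpy as np
        from scipy import stats

        # Hypothetical yearly circulation counts, one per title.
        counts = np.array([0, 0, 1, 2, 0, 3, 1, 0, 5, 2, 1, 0, 0, 4, 1])

        mean, var = counts.mean(), counts.var(ddof=1)
        # Negative binomial moments: mean = r(1-p)/p, var = r(1-p)/p^2,
        # so p = mean/var and r = mean^2/(var - mean); requires var > mean.
        p = mean / var
        r = mean ** 2 / (var - mean)

        # Predicted probability of k circulations next period, for k = 0..5.
        k = np.arange(6)
        print(dict(zip(k.tolist(), stats.nbinom.pmf(k, r, p).round(3))))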

  20. Modeling Benthic Sediment Processes to Predict Water Quality and Ecology in Narragansett Bay

    EPA Science Inventory

    The benthic sediment acts as a huge reservoir of particulate and dissolved material (within interstitial water) which can contribute to loading of contaminants and nutrients to the water column. A benthic sediment model is presented in this report to predict spatial and temporal ...

  1. Predictors of Handwriting in Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Hellinckx, Tinneke; Roeyers, Herbert; Van Waelvelde, Hilde

    2013-01-01

    During writing, perceptual, motor, and cognitive processes interact. This study explored the predictive value of several factors on handwriting quality as well as on speed in children with Autism Spectrum Disorder (ASD). Our results showed that, in this population, age, gender, and visual-motor integration significantly predicted handwriting…

  2. Real-time control of combined surface water quantity and quality: polder flushing.

    PubMed

    Xu, M; van Overloop, P J; van de Giesen, N C; Stelling, G S

    2010-01-01

    In open water systems, keeping both water depths and water quality at specified values is critical for maintaining a 'healthy' water system. Many systems still require manual operation, at least for water quality management. When applying real-time control, both quantity and quality standards need to be met. In this paper, an artificial polder flushing case is studied. Model Predictive Control (MPC) is developed to control the system. In addition to MPC, a 'forward estimation' procedure is used to acquire water quality predictions for the simplified model used in the MPC optimization. In order to illustrate the advantages of MPC, a classical Proportional-Integral (PI) controller was developed for comparison in the test case. The results show that both algorithms are able to control the polder flushing process, but MPC offers greater functionality and control flexibility.
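
    A toy receding-horizon loop conveys the structure of MPC as applied here: at each step an input sequence is optimized over a short horizon against a simplified prediction model, only the first move is applied, and the optimization is repeated. The one-equation polder model, weights, and bounds below are invented for illustration and are far simpler than the paper's models.

        import numpy as np
        from scipy.optimize import minimize

        # Toy polder model: water level h and pollutant concentration c
        # respond to the flushing inflow u (all dynamics are illustrative).
        def step(h, c, u):
            return h + 0.1 * (u - 0.5), c * (1.0 - 0.3 * u)

        def cost(u_seq, h, c, h_ref=2.0):
            # Penalize level deviation, residual pollutant, and control effort.
            J = 0.0
            for u in u_seq:
                h, c = step(h, c, u)
                J += (h - h_ref) ** 2 + 5.0 * c ** 2 + 0.01 * u ** 2
            return J

        h, c, horizon = 2.0, 1.0, 5
        for _ in range(20):   # receding-horizon loop
            res = minimize(cost, np.full(horizon, 0.5), args=(h, c),
                           bounds=[(0.0, 1.0)] * horizon)
            h, c = step(h, c, res.x[0])   # apply only the first move
        print(f"final level {h:.2f} m, final concentration {c:.3f}")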

  3. Binary classification of items of interest in a repeatable process

    DOEpatents

    Abell, Jeffrey A.; Spicer, John Patrick; Wincek, Michael Anthony; Wang, Hui; Chakraborty, Debejyo

    2014-06-24

    A system includes host and learning machines in electrical communication with sensors positioned with respect to an item of interest, e.g., a weld, and memory. The host executes instructions from memory to predict a binary quality status of the item. The learning machine receives signals from the sensor(s), identifies candidate features, and extracts features from the candidates that are more predictive of the binary quality status relative to other candidate features. The learning machine maps the extracted features to a dimensional space that includes most of the items from a passing binary class and excludes all or most of the items from a failing binary class. The host also compares the received signals for a subsequent item of interest to the dimensional space to thereby predict, in real time, the binary quality status of the subsequent item of interest.
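
    The pipeline claimed here, collecting candidate features, keeping the most predictive ones, and learning a boundary that separates passing from failing items, resembles standard feature selection followed by classification. The following is a generic sketch on synthetic data, not the patented implementation.

        import numpy as np
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.svm import SVC
        from sklearn.pipeline import make_pipeline

        # Hypothetical weld-sensor features and pass/fail labels.
        rng = np.random.default_rng(2)
        X = rng.normal(size=(300, 40))
        y = (X[:, 3] + X[:, 17] + 0.5 * rng.normal(size=300) > 0).astype(int)

        # Keep the candidate features most predictive of the binary quality
        # status, then learn a boundary separating passing from failing items.
        clf = make_pipeline(SelectKBest(f_classif, k=8), SVC(probability=True))
        clf.fit(X[:240], y[:240])
        print("held-out accuracy:", clf.score(X[240:], y[240:]))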

  4. Bonding prediction in friction stir consolidation of aluminum alloys: A preliminary study

    NASA Astrophysics Data System (ADS)

    Baffari, Dario; Reynolds, Anthony P.; Li, Xiao; Fratini, Livan

    2018-05-01

    Friction Stir Consolidation (FSC) is a solid-state process that consolidates metal powders or chips into a solid billet through severe plastic deformation and solid-state bonding phenomena. This process can be used both for primary production and for metal scrap recycling. During the FSC process, a rotating die is plunged into a hollow chamber containing the finely divided, unconsolidated material to be processed. In this paper, a FEM numerical model for the prediction of the quality of the consolidated billet is presented. In particular, a dedicated bonding criterion that takes into account the peculiar process mechanics of this innovative technology is proposed.

  5. Importance of good manufacturing practices in microbiological monitoring in processing human tissues for transplant.

    PubMed

    Pianigiani, Elisa; Ierardi, Francesca; Fimiani, Michele

    2013-12-01

    Skin allografts represent an important therapeutic resource in the treatment of severe skin loss. The risk associated with application of processed tissues in humans is very low, however, human material always carries the risk of disease transmission. To minimise the risk of contamination of grafts, processing is carried out in clean rooms where air quality is monitored. Procedures and quality control tests are performed to standardise the production process and to guarantee the final product for human use. Since we only validate and distribute aseptic tissues, we conducted a study to determine what type of quality controls for skin processing are the most suitable for detecting processing errors and intercurrent contamination, and for faithfully mapping the process without unduly increasing production costs. Two different methods for quality control were statistically compared using the Fisher exact test. On the basis of the current study we selected our quality control procedure based on pre- and post-processing tissue controls, operator and environmental controls. Evaluation of the predictability of our control methods showed that tissue control was the most reliable method of revealing microbial contamination of grafts. We obtained 100 % sensitivity by doubling tissue controls, while maintaining high specificity (77 %).

  6. Assessment of infant formula quality and composition using Vis-NIR, MIR and Raman process analytical technologies.

    PubMed

    Wang, Xiao; Esquerre, Carlos; Downey, Gerard; Henihan, Lisa; O'Callaghan, Donal; O'Donnell, Colm

    2018-06-01

    In this study, visible and near-infrared (Vis-NIR), mid-infrared (MIR) and Raman process analytical technologies were investigated for assessment of infant formula quality and compositional parameters, namely preheat temperature, storage temperature, storage time, fluorescence of advanced Maillard products and soluble tryptophan (FAST) index, soluble protein, fat and surface free fat (SFF) content. PLS-DA models developed using spectral data with appropriate data pre-treatment and significant variables selected using Martens' uncertainty test had good accuracy for the discrimination of preheat temperature (92.3-100%) and storage temperature (91.7-100%). The best PLS regression models developed yielded values of the ratio of standard deviation to prediction error (RPD) of 3.6-6.1, 2.1-2.7, 1.7-2.9, 1.6-2.6 and 2.5-3.0 for storage time, FAST index, soluble protein, fat and SFF content prediction, respectively. Vis-NIR, MIR and Raman were demonstrated to be potential PAT tools for process control and quality assurance applications in infant formula and dairy ingredient manufacture. Copyright © 2018 Elsevier B.V. All rights reserved.
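
    The RPD figure of merit quoted above is conventionally computed as the standard deviation of the reference values divided by the RMSEP; a minimal helper, assuming paired reference and predicted values:

        import numpy as np

        def rpd(y_reference, y_predicted):
            """Ratio of reference standard deviation to RMSEP; values around
            3 or more usually indicate a model fit for quality control."""
            y_ref = np.asarray(y_reference, dtype=float)
            y_hat = np.asarray(y_predicted, dtype=float)
            rmsep = np.sqrt(np.mean((y_ref - y_hat) ** 2))
            return y_ref.std(ddof=1) / rmsep

        print(rpd([1.0, 2.0, 3.0, 4.0], [1.1, 2.1, 2.8, 4.1]))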

  7. High-Level Prediction Signals in a Low-Level Area of the Macaque Face-Processing Hierarchy.

    PubMed

    Schwiedrzik, Caspar M; Freiwald, Winrich A

    2017-09-27

    Theories like predictive coding propose that lower-order brain areas compare their inputs to predictions derived from higher-order representations and signal their deviation as a prediction error. Here, we investigate whether the macaque face-processing system, a three-level hierarchy in the ventral stream, employs such a coding strategy. We show that after statistical learning of specific face sequences, the lower-level face area ML computes the deviation of actual from predicted stimuli. But these signals do not reflect the tuning characteristic of ML. Rather, they exhibit identity specificity and view invariance, the tuning properties of higher-level face areas AL and AM. Thus, learning appears to endow lower-level areas with the capability to test predictions at a higher level of abstraction than what is afforded by the feedforward sweep. These results provide evidence for computational architectures like predictive coding and suggest a new quality of functional organization of information-processing hierarchies beyond pure feedforward schemes. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Real-time product attribute control to manufacture antibodies with defined N-linked glycan levels.

    PubMed

    Zupke, Craig; Brady, Lowell J; Slade, Peter G; Clark, Philip; Caspary, R Guy; Livingston, Brittney; Taylor, Lisa; Bigham, Kyle; Morris, Arvia E; Bailey, Robert W

    2015-01-01

    Pressures for cost-effective new therapies and an increased emphasis on emerging markets require technological advancements and a flexible future manufacturing network for the production of biologic medicines. The safety and efficacy of a product is crucial, and consistent product quality is an essential feature of any therapeutic manufacturing process. The active control of product quality in a typical biologic process is challenging because of measurement lags and nonlinearities present in the system. The current study uses nonlinear model predictive control to maintain a critical product quality attribute at a predetermined value during pilot scale manufacturing operations. This approach to product quality control ensures a more consistent product for patients, enables greater manufacturing efficiency, and eliminates the need for extensive process characterization by providing direct measures of critical product quality attributes for real time release of drug product. © 2015 American Institute of Chemical Engineers.

  9. Process service quality evaluation based on Dempster-Shafer theory and support vector machine.

    PubMed

    Pei, Feng-Que; Li, Dong-Bo; Tong, Yi-Fei; He, Fei

    2017-01-01

    Human involvement influences traditional service quality evaluations, leading to low accuracy, poor reliability and weak predictive power. This paper proposes a method that employs support vector machines (SVMs) and Dempster-Shafer evidence theory to evaluate the service quality of a production process, handling a large number of input features with a small sample data set; the method is called SVMs-DS. Features that can affect production quality are extracted by a large number of sensors. Preprocessing steps such as feature simplification and normalization are reduced. Based on three individual SVM models, basic probability assignments (BPAs) are constructed, which support the evaluation in both a qualitative and a quantitative way. The process service quality evaluation results are validated by Dempster's rules; the decision threshold used to resolve conflicting results is generated from the three SVM models. A case study is presented to demonstrate the effectiveness of the SVMs-DS method.
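
    The fusion step can be sketched with Dempster's rule on the two-element frame {pass, fail}. How the study maps SVM outputs to basic probability assignments is not detailed in the abstract, so the BPAs below are invented for illustration.

        def combine(m1, m2):
            """Dempster's rule on the frame {pass, fail}; each mass function
            is a dict with keys 'pass', 'fail', 'theta' summing to 1, where
            'theta' carries the mass assigned to ignorance."""
            K = m1["pass"] * m2["fail"] + m1["fail"] * m2["pass"]  # conflict
            norm = 1.0 - K
            return {
                "pass": (m1["pass"] * m2["pass"] + m1["pass"] * m2["theta"]
                         + m1["theta"] * m2["pass"]) / norm,
                "fail": (m1["fail"] * m2["fail"] + m1["fail"] * m2["theta"]
                         + m1["theta"] * m2["fail"]) / norm,
                "theta": m1["theta"] * m2["theta"] / norm,
            }

        # Hypothetical BPAs derived from three SVMs' calibrated outputs.
        svms = [
            {"pass": 0.70, "fail": 0.10, "theta": 0.20},
            {"pass": 0.60, "fail": 0.25, "theta": 0.15},
            {"pass": 0.55, "fail": 0.20, "theta": 0.25},
        ]
        fused = svms[0]
        for m in svms[1:]:
            fused = combine(fused, m)
        print({k: round(v, 3) for k, v in fused.items()})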

  10. Visualization of the Invisible, Explanation of the Unknown, Ruggedization of the Unstable: Sensitivity Analysis, Virtual Tryout and Robust Design through Systematic Stochastic Simulation

    NASA Astrophysics Data System (ADS)

    Zwickl, Titus; Carleer, Bart; Kubli, Waldemar

    2005-08-01

    In the past decade, sheet metal forming simulation has become a well-established tool to predict the formability of parts. In the automotive industry, this has enabled significant reductions in the cost and time of vehicle design and development, and has helped to improve the quality and performance of vehicle parts. However, production stoppages for troubleshooting and unplanned die maintenance, as well as production quality fluctuations, continue to inflate manufacturing cost and time. The focus therefore has shifted in recent times beyond mere feasibility to the robustness of the product and process being engineered. Ensuring robustness is the next big challenge for virtual tryout / simulation technology. We introduce new methods, based on systematic stochastic simulations, to visualize the behavior of the part during the whole forming process, in simulation as well as in production. Sensitivity analysis explains the response of the part to changes in influencing parameters. Virtual tryout allows quick exploration of changed designs and conditions. Robust design and manufacturing guarantees quality and process capability for the production process. While conventional simulations helped to reduce development time and cost by ensuring feasible processes, robustness engineering tools have the potential for far greater cost and time savings. Through examples we illustrate how expected and unexpected behavior of deep drawing parts may be tracked down, identified and assigned to the influential parameters. With this knowledge, defects can be eliminated and springback compensated, for example; the response of the part to uncontrollable noise can be predicted and minimized. The newly introduced methods enable more reliable and predictable stamping processes in general.

  11. In the moral eye of the beholder: the interactive effects of leader and follower moral identity on perceptions of ethical leadership and LMX quality

    PubMed Central

    Giessner, Steffen R.; Van Quaquebeke, Niels; van Gils, Suzanne; van Knippenberg, Daan; Kollée, Janine A. J. M.

    2015-01-01

    Previous research indicated that leader moral identity (MI; i.e., leaders’ self-definition in terms of moral attributes) predicts to what extent followers perceive their leader as ethical (i.e., demonstrating and promoting ethical conduct in the organization). Leadership, however, is a relational process that involves leaders and followers. Building on this understanding, we hypothesized that follower and leader MI (a) interact in predicting whether followers will perceive their leaders as ethical and, as a result, (b) influence followers’ perceptions of leader–follower relationship quality. A dyadic field study (N = 101) shows that leader MI is a stronger predictor of followers’ perceptions of ethical leadership for followers who are high (vs. low) in MI. Perceptions of ethical leadership in turn predict how the quality of the relationship will be perceived. Hence, whether leader MI translates to perceptions of ethical leadership and of better relationship quality depends on the MI of followers. PMID:26300820

  12. In the moral eye of the beholder: the interactive effects of leader and follower moral identity on perceptions of ethical leadership and LMX quality.

    PubMed

    Giessner, Steffen R; Van Quaquebeke, Niels; van Gils, Suzanne; van Knippenberg, Daan; Kollée, Janine A J M

    2015-01-01

    Previous research indicated that leader moral identity (MI; i.e., leaders' self-definition in terms of moral attributes) predicts to what extent followers perceive their leader as ethical (i.e., demonstrating and promoting ethical conduct in the organization). Leadership, however, is a relational process that involves leaders and followers. Building on this understanding, we hypothesized that follower and leader MI (a) interact in predicting whether followers will perceive their leaders as ethical and, as a result, (b) influence followers' perceptions of leader-follower relationship quality. A dyadic field study (N = 101) shows that leader MI is a stronger predictor of followers' perceptions of ethical leadership for followers who are high (vs. low) in MI. Perceptions of ethical leadership in turn predict how the quality of the relationship will be perceived. Hence, whether leader MI translates to perceptions of ethical leadership and of better relationship quality depends on the MI of followers.

  13. Rapid monitoring of the fermentation process for Korean traditional rice wine 'Makgeolli' using FT-NIR spectroscopy

    NASA Astrophysics Data System (ADS)

    Kim, Dae-Yong; Cho, Byoung-Kwan

    2015-11-01

    The quality parameters of the Korean traditional rice wine "Makgeolli" were monitored using Fourier transform near-infrared (FT-NIR) spectroscopy with multivariate statistical analysis (MSA) during fermentation. Alcohol, reducing sugar, and titratable acid were the parameters assessed to determine the quality index of fermentation substrates and products. The acquired spectra were analyzed with partial least squares regression (PLSR). The best prediction model for alcohol was obtained with maximum normalization, showing a coefficient of determination (Rp²) of 0.973 and a standard error of prediction (SEP) of 0.760%. In addition, the best prediction model for reducing sugar was obtained with no data preprocessing, with an Rp² value of 0.945 and a SEP of 1.233%. The prediction of titratable acidity was best with mean normalization, showing an Rp² value of 0.882 and a SEP of 0.045%. These results demonstrate that FT-NIR spectroscopy can be used for rapid measurements of quality parameters during Makgeolli fermentation.

  14. Design of experiments-based monitoring of critical quality attributes for the spray-drying process of insulin by NIR spectroscopy.

    PubMed

    Maltesen, Morten Jonas; van de Weert, Marco; Grohganz, Holger

    2012-09-01

    Moisture content and aerodynamic particle size are critical quality attributes for spray-dried protein formulations. In this study, spray-dried insulin powders intended for pulmonary delivery were produced applying design of experiments methodology. Near infrared spectroscopy (NIR) in combination with preprocessing and multivariate analysis in the form of partial least squares projections to latent structures (PLS) were used to correlate the spectral data with moisture content and aerodynamic particle size measured by a time-of-flight principle. PLS models predicting the moisture content were based on the chemical information of the water molecules in the NIR spectrum. Models yielded prediction errors (RMSEP) between 0.39% and 0.48% with thermal gravimetric analysis used as reference method. The PLS models predicting the aerodynamic particle size were based on baseline offset in the NIR spectra and yielded prediction errors between 0.27 and 0.48 μm. The morphology of the spray-dried particles had a significant impact on the predictive ability of the models. Good predictive models could be obtained for spherical particles with a calibration error (RMSECV) of 0.22 μm, whereas wrinkled particles resulted in much less robust models with a Q² of 0.69. Based on the results in this study, NIR is a suitable tool for process analysis of the spray-drying process and for control of moisture content and particle size, in particular for smooth and spherical particles.

  15. Chapter 8: Acoustic Assessment of Wood Quality in Trees and Logs

    Treesearch

    Xiping Wang; Peter Carter

    2015-01-01

    Assessing the quality of raw wood materials has become a crucial issue in the operational value chain as forestry and the wood processing industry are increasingly under economic pressure to maximize extracted value. A significant effort has been devoted toward developing robust nondestructive evaluation (NDE) technologies capable of predicting the intrinsic wood...

  16. DYNAMIC EVALUATION OF REGIONAL AIR QUALITY MODELS: ASSESSING CHANGES TO O 3 STEMMING FROM CHANGES IN EMISSIONS AND METEOROLOGY

    EPA Science Inventory

    Regional-scale air quality models are used to estimate the response of air pollutants to potential emission control strategies as part of the decision-making process. Traditionally, the model predicted pollutant concentrations are evaluated for the “base case” to assess a model’s...

  17. An Artificial Intelligence System to Predict Quality of Service in Banking Organizations

    PubMed Central

    Popovič, Aleš

    2016-01-01

    Quality of service, that is, the waiting time that customers must endure in order to receive a service, is a critical performance aspect in private and public service organizations. Providing good service quality is particularly important in highly competitive sectors where similar services exist. In this paper, focusing on the banking sector, we propose an artificial intelligence system for building a model for the prediction of service quality. While the traditional approach used for building analytical models relies on theories and assumptions about the problem at hand, we propose a novel approach for learning models from actual data. Thus, the proposed approach is not biased by the knowledge that experts may have about the problem, but it is completely based on the available data. The system is based on a recently defined variant of genetic programming that allows practitioners to include the concept of semantics in the search process. This will have beneficial effects on the search process and will produce analytical models that are based only on the data and not on domain-dependent knowledge. PMID:27313604

  18. An Artificial Intelligence System to Predict Quality of Service in Banking Organizations.

    PubMed

    Castelli, Mauro; Manzoni, Luca; Popovič, Aleš

    2016-01-01

    Quality of service, that is, the waiting time that customers must endure in order to receive a service, is a critical performance aspect in private and public service organizations. Providing good service quality is particularly important in highly competitive sectors where similar services exist. In this paper, focusing on the banking sector, we propose an artificial intelligence system for building a model for the prediction of service quality. While the traditional approach used for building analytical models relies on theories and assumptions about the problem at hand, we propose a novel approach for learning models from actual data. Thus, the proposed approach is not biased by the knowledge that experts may have about the problem, but it is completely based on the available data. The system is based on a recently defined variant of genetic programming that allows practitioners to include the concept of semantics in the search process. This will have beneficial effects on the search process and will produce analytical models that are based only on the data and not on domain-dependent knowledge.

  19. Real-time determination of critical quality attributes using near-infrared spectroscopy: a contribution for Process Analytical Technology (PAT).

    PubMed

    Rosas, Juan G; Blanco, Marcel; González, Josep M; Alcalà, Manel

    2012-08-15

    Process Analytical Technology (PAT) is playing a central role in current regulations on pharmaceutical production processes. Proper understanding of all operations and variables connecting the raw materials to end products is one of the keys to ensuring the quality of the products and continuous improvement in their production. Near infrared spectroscopy (NIRS) has been successfully used to develop faster, non-invasive quantitative methods for real-time prediction of critical quality attributes (CQAs) of pharmaceutical granulates (API content, pH, moisture, flowability, angle of repose and particle size). NIR spectra were acquired from the bin blender after the granulation process in a non-classified area without the need for sample withdrawal. The methodology used for data acquisition, calibration modelling and method application in this context is relatively inexpensive and can be easily implemented by most pharmaceutical laboratories. For this purpose, the Partial Least-Squares (PLS) algorithm was used to calculate multivariate calibration models, which provided acceptable Root Mean Square Error of Prediction (RMSEP) values (RMSEP(API)=1.0 mg/g; RMSEP(pH)=0.1; RMSEP(Moisture)=0.1%; RMSEP(Flowability)=0.6 g/s; RMSEP(Angle of repose)=1.7° and RMSEP(Particle size)=2.5%), allowing application to routine analyses of production batches. The proposed method affords quality assessment of end products and the determination of important parameters with a view to understanding production processes used by the pharmaceutical industry. As shown here, the NIRS technique is a highly suitable tool for Process Analytical Technology. Copyright © 2012 Elsevier B.V. All rights reserved.

  20. No-reference video quality measurement: added value of machine learning

    NASA Astrophysics Data System (ADS)

    Mocanu, Decebal Constantin; Pokhrel, Jeevan; Garella, Juan Pablo; Seppänen, Janne; Liotou, Eirini; Narwaria, Manish

    2015-11-01

    Video quality measurement is an important component in the end-to-end video delivery chain. Video quality is, however, subjective, and thus there will always be interobserver differences in the subjective opinion about the visual quality of the same video. Despite this, most existing works on objective quality measurement typically focus only on predicting a single score and evaluate their prediction accuracies based on how close they are to the mean opinion scores (or similar average-based ratings). Clearly, such an approach ignores the underlying diversities in the subjective scoring process and, as a result, does not allow further analysis of how reliable the objective prediction is in terms of subjective variability. Consequently, the aim of this paper is to analyze this issue and present a machine-learning based solution to address it. We demonstrate the utility of our ideas by considering the practical scenario of video broadcast transmissions, with a focus on digital terrestrial television (DTT), and proposing a no-reference objective video quality estimator for this application. We conducted meaningful verification studies on different video content (including video clips recorded from real DTT broadcast transmissions) in order to verify the performance of the proposed solution.

  1. Application of Fourier transform near-infrared spectroscopy to optimization of green tea steaming process conditions.

    PubMed

    Ono, Daiki; Bamba, Takeshi; Oku, Yuichi; Yonetani, Tsutomu; Fukusaki, Eiichiro

    2011-09-01

    In this study, we constructed prediction models by metabolic fingerprinting of fresh green tea leaves, using Fourier transform near-infrared (FT-NIR) spectroscopy and partial least squares (PLS) regression analysis, to objectively optimize the steaming process conditions in green tea manufacture. The steaming process is the most important step in manufacturing high quality green tea products. However, the parameter setting of the steamer is currently determined subjectively by the manufacturer. Therefore, a simple and robust system that can be used to objectively set the steaming process parameters is necessary. We focused on FT-NIR spectroscopy because of its simple operation, quick measurement, and low running costs. After removal of noise in the spectral data by principal component analysis (PCA), PLS regression analysis was performed using the spectral information as independent variables and the steaming parameters set by experienced manufacturers as dependent variables. The prediction models were successfully constructed with satisfactory accuracy. Moreover, the results of the demonstration experiment suggested that the green tea steaming process parameters could be predicted on a larger manufacturing scale. This technique will contribute to improving the quality and productivity of green tea, because it can objectively optimize the complicated green tea steaming process, and will be suitable for practical use in green tea manufacture. Copyright © 2011 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  2. Optimization and Characterization of the Friction Stir Welded Sheets of AA 5754-H111: Monitoring of the Quality of Joints with Thermographic Techniques.

    PubMed

    De Filippis, Luigi Alberto Ciro; Serio, Livia Maria; Palumbo, Davide; De Finis, Rosa; Galietti, Umberto

    2017-10-11

    Friction Stir Welding (FSW) is a solid-state welding process, based on frictional and stirring phenomena, that offers many advantages over traditional welding methods. However, several parameters can affect the quality of the produced joints. In this work, an experimental approach has been used to study and optimize the FSW process applied to 5754-H111 aluminum plates. In particular, the thermal behavior of the material during the process has been investigated, and two thermal indexes correlated with the frictional power input, the maximum temperature and the heating rate of the material, were examined for different configurations of the process parameters (tool travel and rotation speeds). Moreover, other techniques (micrographs, macrographs and destructive tensile tests) were used to support the analysis of the quality of the welded joints in a quantitative way. The potential of the thermographic technique has been demonstrated both for monitoring the FSW process and for predicting the quality of joints in terms of tensile strength.

  3. Anxiety, social skills, friendship quality, and peer victimization: an integrated model.

    PubMed

    Crawford, A Melissa; Manassis, Katharina

    2011-10-01

    This cross-sectional study investigated whether anxiety and social functioning interact in their prediction of peer victimization. A structural equation model linking anxiety, social skills, and friendship quality to victimization was tested separately for children with anxiety disorders and normal comparison children to explore whether the processes involved in victimization differ between these groups. Participants were 8- to 14-year-old children: 55 (34 boys, 21 girls) diagnosed with an anxiety disorder and 85 (37 boys, 48 girls) normal comparison children. The final models for both groups yielded two independent pathways to victimization: (a) anxiety independently predicted being victimized; and (b) poor social skills predicted lower friendship quality, which, in turn, placed a child at risk for victimization. These findings have important implications for the treatment of childhood anxiety disorders and for school-based anti-bullying interventions, but replication with larger samples is indicated. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. Evaluation of coffee roasting degree by using electronic nose and artificial neural network for off-line quality control.

    PubMed

    Romani, Santina; Cevoli, Chiara; Fabbri, Angelo; Alessandrini, Laura; Dalla Rosa, Marco

    2012-09-01

    An electronic nose (EN) based on an array of 10 metal oxide semiconductor sensors was used, jointly with an artificial neural network (ANN), to predict coffee roasting degree. The evolution of flavor release and the main physicochemical modifications (weight loss, density, moisture content, and surface color: L*, a*) during the coffee roasting process were monitored at different roasting times (0, 6, 8, 10, 14, 19 min). Principal component analysis (PCA) was used to reduce the dimensionality of the sensor data set (600 values per sensor). The selected PCs were used as ANN input variables. Two types of ANN (multilayer perceptron [MLP] and general regression neural network [GRNN]) were used to model the EN signals: for both networks, the inputs were the scores of the selected PCs of the sensor data set, while the outputs were the quality parameters at the different roasting times. Both ANNs predicted coffee roasting degree well, giving good results for both roasting time and coffee quality parameters. In particular, GRNN showed the highest prediction reliability. At present, the evaluation of coffee roasting degree is mainly a manual operation, based largely on empirical observation of the final color, and it therefore requires well-trained operators with long professional experience. The coupling of the e-nose and artificial neural networks (ANNs) may represent an effective route to roasting process automation and to a more reproducible procedure for final coffee bean quality characterization. © 2012 Institute of Food Technologists®
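
    A minimal sketch of the PCA-then-ANN pipeline described above, with scikit-learn's MLPRegressor standing in for the MLP (GRNN has no direct scikit-learn equivalent); the data shapes and hyperparameters are illustrative assumptions, not the study's settings.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # X: e-nose responses (n_samples x 6000 = 10 sensors x 600 readings each);
        # y: roasting time in minutes. Synthetic data as a placeholder.
        rng = np.random.default_rng(1)
        X = rng.normal(size=(60, 6000))
        y = rng.uniform(0, 19, size=60)

        model = make_pipeline(
            StandardScaler(),
            PCA(n_components=5),  # compress the sensor array to a few scores
            MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=1),
        )
        model.fit(X, y)
        print(model.predict(X[:3]))  # predicted roasting times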

  5. Statistical process control methods allow the analysis and improvement of anesthesia care.

    PubMed

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and to exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3% of cases, mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems, with the following results: inadequate plexus anesthesia: stable process, but unacceptably high failure rate; difficult emergence: unstable process, because of quality improvement efforts; intubation difficulties: stable process, rate acceptable; medication errors: methodology not suited because of the low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine whether a process is stable, whether an intervention is required, and whether quality improvement efforts have the desired effect.
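
    A p-chart places three-sigma control limits around the average event proportion and flags subgroups whose rates fall outside the predictable variation; a minimal sketch with assumed monthly counts (not the study's data):

        import numpy as np

        # events[i] adverse events out of n[i] anesthetics in month i (illustrative numbers)
        events = np.array([55, 61, 48, 70, 52, 66])
        n = np.array([310, 350, 295, 360, 300, 340])

        p = events / n
        p_bar = events.sum() / n.sum()             # centre line: overall proportion
        sigma = np.sqrt(p_bar * (1 - p_bar) / n)   # binomial standard error per subgroup
        ucl = p_bar + 3 * sigma                    # upper control limit
        lcl = np.clip(p_bar - 3 * sigma, 0, None)  # lower control limit, floored at zero

        out_of_control = (p > ucl) | (p < lcl)     # True where special-cause variation is signalled
        print(p_bar, out_of_control)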

  6. Predicting plot soil loss by empirical and process-oriented approaches: A review

    USDA-ARS?s Scientific Manuscript database

    Soil erosion directly affects the quality of the soil, its agricultural productivity and its biological diversity. Many mathematical models have been developed to estimate plot soil erosion at different temporal scales. At present, empirical soil loss equations and process-oriented models are consid...

  7. Relationship Among Signal Fidelity, Hearing Loss, and Working Memory for Digital Noise Suppression.

    PubMed

    Arehart, Kathryn; Souza, Pamela; Kates, James; Lunner, Thomas; Pedersen, Michael Syskind

    2015-01-01

    This study considered speech modified by additive babble combined with noise-suppression processing. The purpose was to determine the relative importance of the signal modifications, individual peripheral hearing loss, and individual cognitive capacity for speech intelligibility and speech quality. The participant group consisted of 31 individuals with moderate high-frequency hearing loss, ranging in age from 51 to 89 years (mean = 69.6 years). Speech intelligibility and speech quality were measured using low-context sentences presented in babble at several signal-to-noise ratios. Speech stimuli were processed with a binary mask noise-suppression strategy with systematic manipulations of two parameters (error rate and attenuation values). The cumulative effects of signal modification produced by babble and signal processing were quantified using an envelope-distortion metric. Working memory capacity was assessed with a reading span test. Analysis of variance was used to determine the effects of signal processing parameters on perceptual scores. Hierarchical linear modeling was used to determine the role of degree of hearing loss and working memory capacity in individual listener response to the processed noisy speech. The model also considered improvements in envelope fidelity caused by the binary mask and the degradations to the envelope caused by error and noise. The participants showed significant benefits in intelligibility scores and quality ratings for noisy speech processed by the ideal binary mask noise-suppression strategy. This benefit was observed across a range of signal-to-noise ratios and persisted when up to a 30% error rate was introduced into the processing. Average intelligibility scores and average quality ratings were well predicted by an objective metric of envelope fidelity. Degree of hearing loss and working memory capacity were significant factors in explaining individual listeners' intelligibility scores for binary mask processing applied to speech in babble. Degree of hearing loss and working memory capacity did not predict listeners' quality ratings. The results indicate that envelope fidelity is a primary factor in determining the combined effects of noise and binary mask processing on the intelligibility and quality of speech presented in babble noise. Degree of hearing loss and working memory capacity are significant factors in explaining variability in listeners' speech intelligibility scores but not in quality ratings.

  8. On the long-term stability of terrestrial reference frame solutions based on Kalman filtering

    NASA Astrophysics Data System (ADS)

    Soja, Benedikt; Gross, Richard S.; Abbondanza, Claudio; Chin, Toshio M.; Heflin, Michael B.; Parker, Jay W.; Wu, Xiaoping; Nilsson, Tobias; Glaser, Susanne; Balidakis, Kyriakos; Heinkelmann, Robert; Schuh, Harald

    2018-06-01

    The Global Geodetic Observing System requirement for the long-term stability of the International Terrestrial Reference Frame is 0.1 mm/year, motivated by rigorous sea level studies. Furthermore, high-quality station velocities are of great importance for the prediction of future station coordinates, which are fundamental for several geodetic applications. In this study, we investigate the performance of predictions from very long baseline interferometry (VLBI) terrestrial reference frames (TRFs) based on Kalman filtering. The predictions are computed by extrapolating the deterministic part of the coordinate model. As observational data, we used over 4000 VLBI sessions between 1980 and the middle of 2016. In order to study the predictions, we computed VLBI TRF solutions only from the data until the end of 2013. The period of 2014 until 2016.5 was used to validate the predictions of the TRF solutions against the measured VLBI station coordinates. To assess the quality, we computed average WRMS values from the coordinate differences as well as from estimated Helmert transformation parameters, in particular, the scale. We found that the results significantly depend on the level of process noise used in the filter. While larger values of process noise allow the TRF station coordinates to more closely follow the input data (decrease in WRMS of about 45%), the TRF predictions exhibit larger deviations from the VLBI station coordinates after 2014 (WRMS increase of about 15%). On the other hand, lower levels of process noise improve the predictions, making them more similar to those of solutions without process noise. Furthermore, our investigations show that additionally estimating annual signals in the coordinates does not significantly impact the results. Finally, we computed TRF solutions mimicking a potential real-time TRF and found significant improvements over the other investigated solutions, all of which rely on extrapolating the coordinate model for their predictions, with WRMS reductions of almost 50%.
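
    For reference, the WRMS agreement metric used above has the standard weighted form (our restatement in symbols, not a formula quoted from the paper), with coordinate differences d_i weighted by their inverse variances:

        \mathrm{WRMS} = \sqrt{\frac{\sum_i w_i d_i^2}{\sum_i w_i}}, \qquad w_i = \frac{1}{\sigma_i^2}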

  9. The Dynamics of Narrative Writing in Primary Grade Children: Writing Process Factors Predict Story Quality

    ERIC Educational Resources Information Center

    von Koss Torkildsen, Janne; Morken, Frøydis; Helland, Wenche A.; Helland, Turid

    2016-01-01

    In this study of third grade school children, we investigated the association between writing process measures recorded with key stroke logging and the final written product. Moreover, we examined the cognitive predictors of writing process and product measures. Analyses of key strokes showed that while most children spontaneously made local…

  10. Psychophysical Laws and the Superorganism.

    PubMed

    Reina, Andreagiovanni; Bose, Thomas; Trianni, Vito; Marshall, James A R

    2018-03-12

    Through theoretical analysis, we show how a superorganism may react to stimulus variations according to psychophysical laws observed in humans and other animals. We investigate an empirically-motivated honeybee house-hunting model, which describes a value-sensitive decision process over potential nest-sites, at the level of the colony. In this study, we show how colony decision time increases with the number of available nests, in agreement with the Hick-Hyman law of psychophysics, and decreases with mean nest quality, in agreement with Piéron's law. We also show that colony error rate depends on mean nest quality, and difference in quality, in agreement with Weber's law. Psychophysical laws, particularly Weber's law, have been found in diverse species, including unicellular organisms. Our theoretical results predict that superorganisms may also exhibit such behaviour, suggesting that these laws arise from fundamental mechanisms of information processing and decision-making. Finally, we propose a combined psychophysical law which unifies Hick-Hyman's law and Piéron's law, traditionally studied independently; this unified law makes predictions that can be empirically tested.
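
    For reference, the classical forms of the three laws that the colony-level model reproduces are as follows (standard psychophysics textbook statements, not equations taken from this paper): decision time T grows logarithmically with the number of options n (Hick-Hyman), reaction time falls as a power of stimulus intensity I (Piéron), and the just-noticeable difference scales with the baseline intensity (Weber):

        T = a + b \log_2 n \quad \text{(Hick--Hyman)}
        T = T_0 + k I^{-\beta} \quad \text{(Pi\'eron)}
        \Delta I / I = c \quad \text{(Weber)}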

  11. Application of the quality by design approach to the drug substance manufacturing process of an Fc fusion protein: towards a global multi-step design space.

    PubMed

    Eon-duval, Alex; Valax, Pascal; Solacroup, Thomas; Broly, Hervé; Gleixner, Ralf; Strat, Claire L E; Sutter, James

    2012-10-01

    The article describes how Quality by Design principles can be applied to the drug substance manufacturing process of an Fc fusion protein. First, the quality attributes of the product were evaluated for their potential impact on safety and efficacy using risk management tools. Similarly, process parameters that have a potential impact on critical quality attributes (CQAs) were also identified through a risk assessment. Critical process parameters were then evaluated for their impact on CQAs, individually and in interaction with each other, using multivariate design of experiment techniques during the process characterisation phase. The global multi-step Design Space, defining operational limits for the entire drug substance manufacturing process so as to ensure that the drug substance quality targets are met, was devised using predictive statistical models developed during the characterisation study. The validity of the global multi-step Design Space was then confirmed by performing the entire process, from cell bank thawing to final drug substance, at its limits during the robustness study: the quality of the final drug substance produced under different conditions was verified against predefined targets. An adaptive strategy was devised whereby the Design Space can be adjusted to the quality of the input material to ensure reliable drug substance quality. Finally, all the data obtained during the process described above, together with data generated during additional validation studies as well as manufacturing data, were used to define the control strategy for the drug substance manufacturing process using a risk assessment methodology. Copyright © 2012 Wiley-Liss, Inc.

  12. Acoustic assessment of wood quality of raw forest materials : a path to increased profitability

    Treesearch

    Xiping Wang; Peter Carter; Robert J. Ross; Brian K. Brashaw

    2007-01-01

    Assessment of the quality of raw wood materials has become a crucial issue in the operational value chain as forestry and the wood processing industry are increasingly under economic pressure to maximize extracted value. A significant effort has been devoted toward developing robust nondestructive evaluation (NDE) technologies capable of predicting the intrinsic wood...

  13. Intrinsic and Extrinsic Motivation in Early Adolescents' Friendship Development: Friendship Selection, Influence, and Prospective Friendship Quality

    ERIC Educational Resources Information Center

    Ojanen, Tiina; Sijtsema, Jelle J.; Hawley, Patricia H.; Little, Todd D.

    2010-01-01

    Friendships are essential for adolescent social development. However, they may be pursued for varying motives, which, in turn, may predict similarity in friendships via social selection or social influence processes, and likely help to explain friendship quality. We examined the effect of early adolescents' (N = 374, 12-14 years) intrinsic and…

  14. A Model-Based Approach to Predicting Graduate-Level Performance Using Indicators of Undergraduate-Level Performance

    ERIC Educational Resources Information Center

    Zimmermann, Judith; Brodersen, Kay H.; Heinimann, Hans R.; Buhmann, Joachim M.

    2015-01-01

    The graduate admissions process is crucial for controlling the quality of higher education, yet, rules-of-thumb and domain-specific experiences often dominate evidence-based approaches. The goal of the present study is to dissect the predictive power of undergraduate performance indicators and their aggregates. We analyze 81 variables in 171…

  15. Machine learning and predictive data analytics enabling metrology and process control in IC fabrication

    NASA Astrophysics Data System (ADS)

    Rana, Narender; Zhang, Yunlin; Wall, Donald; Dirahoui, Bachir; Bailey, Todd C.

    2015-03-01

    Integrated circuit (IC) technology is going through multiple changes in terms of patterning techniques (multiple patterning, EUV and DSA), device architectures (FinFET, nanowire, graphene) and patterning scale (a few nanometers). These changes require tight controls on processes and measurements to achieve the required device performance, and they challenge metrology and process control in terms of capability and quality. Multivariate data with complex nonlinear trends and correlations generally cannot be described well by mathematical or parametric models, but can be learned relatively easily by computing machines and used to predict or extrapolate. This paper introduces the predictive metrology approach, which has been applied to three different applications. Machine learning and predictive analytics have been used to accurately predict dimensions of EUV resist patterns down to 18 nm half pitch by exploiting resist shrinkage patterns; these patterns could not be directly and accurately measured due to metrology tool limitations. Machine learning has also been applied to predict electrical performance early in the process pipeline for deep trench capacitance and metal line resistance. As a wafer goes through various processes, its associated cost multiplies, and it may take days to weeks to get the electrical performance readout. Predicting the electrical performance early on can therefore be very valuable in enabling timely actionable decisions, such as rework or scrap, and in feeding forward or feeding back the predicted information (or information derived from it) to improve or monitor processes. This paper provides a general overview of machine learning and advanced analytics applications in advanced semiconductor development and manufacturing.

  16. Validating a model that predicts daily growth and feed quality of New Zealand dairy pastures.

    PubMed

    Woodward, S J

    2001-09-01

    The Pasture Quality (PQ) model is a simple, mechanistic, dynamical system model that was designed to capture the essential biological processes in grazed grass-clover pasture, and to be optimised to derive improved grazing strategies for New Zealand dairy farms. While the individual processes represented in the model (photosynthesis, tissue growth, flowering, leaf death, decomposition, worms) were based on experimental data, this did not guarantee that the assembled model would accurately predict the behaviour of the system as a whole (i.e., pasture growth and quality). Validation of the whole model was thus a priority, since any strategy derived from the model could impact a farm business in the order of thousands of dollars per annum if adopted. This paper describes the process of defining performance criteria for the model, obtaining suitable data to test the model, and carrying out the validation analysis. The validation process highlighted a number of weaknesses in the model, which will lead to the model being improved. As a result, the model's utility will be enhanced. Furthermore, validation was found to have an unexpected additional benefit, in that despite the model's poor initial performance, support was generated for the model among field scientists involved in the wider project.

  17. Development of VIS/NIR spectroscopic system for real-time prediction of fresh pork quality

    NASA Astrophysics Data System (ADS)

    Zhang, Haiyun; Peng, Yankun; Zhao, Songwei; Sasao, Akira

    2013-05-01

    Quality attributes of fresh meat influence its nutritional value and consumers' purchasing decisions. The aim of this research was to develop a prototype for real-time detection of meat quality, consisting of a hardware system and a software system. A VIS/NIR spectrograph covering the range of 350 to 1100 nm was used to collect the spectral data. To capture more information from the sample, an optical fiber multiplexer was used, and a portable cylindrical device was designed and fabricated to hold the optical fibers from the multiplexer. A high-power tungsten halogen lamp was selected as the light source. The spectral data were acquired from the surface of the sample, with an exposure time of 2.17 ms, by pressing the trigger switch on the self-developed system. The system could automatically acquire, process, display and save the data; moreover, the quality could be predicted on-line. A total of 55 fresh pork samples were used to develop the prediction model for real-time detection. The spectral data were pretreated with the standard normal variate (SNV) transformation, and partial least squares regression (PLSR) was used to develop the prediction model. The correlation coefficients and root mean square errors of the validation set were 0.810 and 0.653 for water content, and 0.803 and 0.098 for pH, respectively. The research shows that a real-time non-destructive detection system based on VIS/NIR spectroscopy can efficiently predict the quality of fresh meat.
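
    The SNV pretreatment mentioned above centres and scales each spectrum by its own mean and standard deviation, removing multiplicative scatter effects; a minimal sketch (synthetic spectra, not the pork data):

        import numpy as np

        def snv(spectra: np.ndarray) -> np.ndarray:
            """Standard normal variate: centre and scale each spectrum individually."""
            mean = spectra.mean(axis=1, keepdims=True)
            std = spectra.std(axis=1, keepdims=True)
            return (spectra - mean) / std

        # spectra: n_samples x n_wavelengths (e.g., a 350-1100 nm grid at 1 nm steps)
        spectra = np.random.default_rng(2).normal(loc=1.0, scale=0.2, size=(55, 751))
        print(snv(spectra).std(axis=1)[:3])  # each row now has unit standard deviation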

  18. Variables that Predict Serve Efficacy in Elite Men’s Volleyball with Different Quality of Opposition Sets

    PubMed Central

    Valhondo, Álvaro; Fernández-Echeverría, Carmen; González-Silva, Jara; Claver, Fernando; Moreno, M. Perla

    2018-01-01

    The objective of this study was to determine the variables that predicted serve efficacy in elite men’s volleyball, in sets with different quality of opposition. A total of 3292 serve actions were analysed, of which 2254 were carried out in high quality of opposition sets and 1038 in low quality of opposition sets, corresponding to 24 matches played during the Men’s European Volleyball Championships held in 2011. The independent variables considered in this study were the serve zone, serve type, serving player, serve direction, reception zone, receiving player and reception type; the dependent variable was serve efficacy, and the situational variable was quality of opposition sets. The variables that acted as predictors in both high and low quality of opposition sets were the serving player, reception zone and reception type. The serve type variable acted as a predictor only in high quality of opposition sets, while the serve zone variable acted as a predictor only in low quality of opposition sets. These results may provide important guidance in men’s volleyball training processes. PMID:29599869

  19. Inhalable Ipratropium Bromide Particle Engineering with Multicriteria Optimization.

    PubMed

    Vinjamuri, Bhavani Prasad; Haware, Rahul V; Stagner, William C

    2017-08-01

    Spray-dried ipratropium bromide (IPB) microspheres for oral inhalation were engineered using Quality by Design. Interrogating the interplay of material properties, process parameters, and critical product quality attributes enabled rational product design. A 2^(7-3) screening design revealed the Maillard reaction between L-leucine (LL) and lactose at the studied outlet temperatures (OT) >130°C. A response surface custom design was used in conjunction with multicriteria optimization to determine the operating design space needed to achieve inhalable microparticles. Statistically significant predictive models were developed for volume median diameter (p = 0.0001, adjusted R² = 0.9938), span (p = 0.0278, adjusted R² = 0.7912), yield (p = 0.0020, adjusted R² = 0.9320), and OT (p = 0.0082, adjusted R² = 0.8768). An independent verification batch confirmed the models' predictive capability, with predicted and actual values in good agreement. Particle size and span were 3.32 ± 0.09 μm and 1.71 ± 0.18, which were 4.7 and 5.3% higher than the predicted values, respectively. The process yield was 50.3%, compared to the predicted value of 65.3%. The OT was 100°C versus the predicted value of 105°C. The label strength of the IPB microparticles was 99.0 to 105.9% w/w, suggesting that enrichment occurred during the spray-drying process. The present study can be used to initiate the design of the first commercial IPB dry powder inhaler.

  20. NOAA's National Air Quality Prediction and Development of Aerosol and Atmospheric Composition Prediction Components for NGGPS

    NASA Astrophysics Data System (ADS)

    Stajner, I.; McQueen, J.; Lee, P.; Stein, A. F.; Wilczak, J. M.; Upadhayay, S.; daSilva, A.; Lu, C. H.; Grell, G. A.; Pierce, R. B.

    2017-12-01

    NOAA's operational air quality predictions of ozone, fine particulate matter (PM2.5) and wildfire smoke over the United States, and of airborne dust over the contiguous 48 states, are distributed at http://airquality.weather.gov. The National Air Quality Forecast Capability (NAQFC) providing these predictions was updated in June 2017. Ozone and PM2.5 predictions are now produced using the system linking the Community Multiscale Air Quality model (CMAQ) version 5.0.2 with meteorological inputs from the North American Mesoscale Forecast System (NAM) version 4. Predictions of PM2.5 include intermittent dust emissions and wildfire emissions from an updated version of the BlueSky system. For the latter, the CMAQ system is initialized by rerunning it over the previous 24 hours to include wildfire emissions at the time when they were observed from the satellites. Post-processing to reduce the bias in PM2.5 predictions was updated using the Kalman filter analog (KFAN) technique. Dust-related aerosol species at the CMAQ domain lateral boundaries now come from the NEMS Global Aerosol Component (NGAC) v2 predictions. Further development of the NAQFC includes extending CMAQ predictions to 72 hours, incorporating Canadian fire emissions data from Environment and Climate Change Canada (ECCC), and applying the KFAN technique to reduce bias in ozone predictions. NOAA is developing the Next Generation Global Prediction System (NGGPS) with an aerosol and gaseous atmospheric composition component to improve and integrate aerosol and ozone predictions and evaluate their impacts on physics, data assimilation and weather prediction. Efforts are underway to improve cloud microphysics, investigate aerosol effects and include representations of atmospheric composition of varying complexity into NGGPS: from the operational ozone parameterization and GOCART aerosols with simplified ozone chemistry, to CMAQ chemistry with aerosol modules. We will present progress on community building, planning and development of NGGPS.

  1. Development and evaluation of a dimensionless mechanistic pan coating model for the prediction of coated tablet appearance.

    PubMed

    Niblett, Daniel; Porter, Stuart; Reynolds, Gavin; Morgan, Tomos; Greenamoyer, Jennifer; Hach, Ronald; Sido, Stephanie; Karan, Kapish; Gabbott, Ian

    2017-08-07

    A mathematical, mechanistic tablet film-coating model has been developed for pharmaceutical pan coating systems based on the mechanisms of atomisation, tablet bed movement and droplet drying with the main purpose of predicting tablet appearance quality. Two dimensionless quantities were used to characterise the product properties and operating parameters: the dimensionless Spray Flux (relating to area coverage of the spray droplets) and the Niblett Number (relating to the time available for drying of coating droplets). The Niblett Number is the ratio between the time a droplet needs to dry under given thermodynamic conditions and the time available for the droplet while on the surface of the tablet bed. The time available for drying on the tablet bed surface is critical for appearance quality. These two dimensionless quantities were used to select process parameters for a set of 22 coating experiments, performed over a wide range of multivariate process parameters. The dimensionless Regime Map created can be used to visualise the effect of interacting process parameters on overall tablet appearance quality and defects such as picking and logo bridging. Copyright © 2017 Elsevier B.V. All rights reserved.
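
    Restating the dimensionless group in symbols (our notation; the abstract defines it in words): the Niblett Number compares the drying time a droplet requires with the time it actually has on the tablet-bed surface,

        \mathrm{Ni} = \frac{t_{\mathrm{dry}}}{t_{\mathrm{avail}}}

    so, broadly, values well above one indicate droplets that cannot finish drying before the bed carries them away, the regime associated with wet-surface defects.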

  2. Comparative evaluation of the impact of GRAPES and MM5 meteorology on CMAQ prediction over Pearl River Delta, China

    NASA Astrophysics Data System (ADS)

    Deng, T.; Chen, Y.; Wan, Q.

    2017-12-01

    The Community Multiscale Air Quality (CMAQ) model was used to forecast air quality over the Pearl River Delta (PRD) region from December 2013 to January 2014. The pollution forecasting performance of CMAQ coupled with two different meteorological models, the Global/Regional Assimilation and Prediction System (GRAPES) and the 5th-generation Mesoscale Model (MM5), was assessed against observational data. The effect of meteorological factors and physical-chemical processes on the forecast results was examined through process analysis. The results showed that both systems performed similarly well, with GRAPES-CMAQ performing better. GRAPES was superior in predicting the overall tendencies of the meteorological elements but showed large deviations in atmospheric pressure and wind speed. This contributed to higher correlation coefficients for the pollutants with GRAPES-CMAQ, but also to greater deviation. The underestimation of nitrate and ammonium salt contributed to the underestimation of particulate matter (PM) and extinction coefficients. Surface-layer SO2, CO and NO source emissions made the sole positive contribution. O3 originated mainly from horizontal and vertical transport, while chemical processes were its main sink.

  3. Estimation of antioxidant components of tomato using VIS-NIR reflectance data by handheld portable spectrometer

    NASA Astrophysics Data System (ADS)

    Szuvandzsiev, Péter; Helyes, Lajos; Lugasi, Andrea; Szántó, Csongor; Baranowski, Piotr; Pék, Zoltán

    2014-10-01

    Processing tomato production represents an important part of the total production of processed vegetables in the world. The quality characteristics of processing tomato that matter to the food industry are the soluble solids content and the antioxidant content (such as lycopene and polyphenols) of the fruit. Analytical quantification of these components is destructive, time-consuming and labour-intensive, which is why researchers are trying to develop rapid, non-destructive methods to assess these quality parameters. The present study reports the suitability of a portable handheld visible/near-infrared spectrometer for predicting the soluble solids, lycopene and polyphenol content of tomato fruit puree. Spectra in the range of 500-1000 nm were acquired directly on fruit puree of five different tomato varieties using a FieldSpec HandHeld 2™ Portable Spectroradiometer. Immediately after the spectral measurement, each fruit sample was analysed to determine its soluble solids, lycopene and polyphenol content. Partial least squares regressions were carried out to create prediction models between the spectral data and the values obtained from the analytical results. The accuracy of the predictions was assessed using the coefficient of determination (R²) and the root mean square errors of calibration and cross-validation.

  4. Poor sleep quality predicts deficient emotion information processing over time in early adolescence.

    PubMed

    Soffer-Dudek, Nirit; Sadeh, Avi; Dahl, Ronald E; Rosenblat-Stein, Shiran

    2011-11-01

    There is deepening understanding of the effects of sleep on emotional information processing. Emotion information processing is a key aspect of social competence, which undergoes important maturational and developmental changes in adolescence; however, most research in this area has focused on adults. Our aim was to test the links between sleep and emotion information processing during early adolescence. Sleep and facial information processing were assessed objectively during 3 assessment waves, separated by 1-year lags. Data were obtained in natural environments: sleep was assessed in home settings, and facial information processing was assessed at school. Participants were 94 healthy children (53 girls, 41 boys), aged 10 years at Time 1. Facial information processing was tested under neutral (gender identification) and emotional (emotional expression identification) conditions. Sleep was assessed in home settings using actigraphy for 7 nights at each assessment wave, with waking > 5 min considered a night awakening. Using multilevel modeling, elevated night awakenings and decreased sleep efficiency significantly predicted poor performance only in the emotional information processing condition (e.g., b = -1.79, SD = 0.52, confidence interval: lower boundary = -2.82, upper boundary = -0.076, t(416.94) = -3.42, P = 0.001). Poor sleep quality is associated with compromised emotional information processing during early adolescence, a sensitive period in socio-emotional development.

  5. Near-infrared chemical imaging (NIR-CI) as a process monitoring solution for a production line of roll compaction and tableting.

    PubMed

    Khorasani, Milad; Amigo, José M; Sun, Changquan Calvin; Bertelsen, Poul; Rantanen, Jukka

    2015-06-01

    In the present study, the application of near-infrared chemical imaging (NIR-CI), supported by chemometric modeling, as a non-destructive tool for monitoring and assessing roller compaction and tableting processes was investigated. Based on a preliminary risk assessment, discussions with experts, and current work from the literature, the critical process parameters (roll pressure and roll speed) and critical quality attributes (ribbon porosity, granule size, amount of fines, tablet tensile strength) were identified, and a design space was established. Five experimental runs with different process settings were carried out, which yielded intermediates (ribbons, granules) and final products (tablets) with different properties. A principal component analysis (PCA) based model of the NIR images was applied to map the ribbon porosity distribution. The ribbon porosity distribution gained from the PCA-based NIR-CI was then used to develop predictive models for the granule size fractions; predictive models with acceptable R² values could be used to predict the granule particle size. A partial least squares regression (PLS-R) based model of the NIR-CI was used to map and predict the chemical distribution and content of the active compound for both the roller compacted ribbons and the corresponding tablets. In order to select the optimal process setting, the standard deviations of tablet tensile strength and tablet weight for each tablet batch were considered. Strong linear correlations were established between tablet tensile strength and the amount of fines, and between tablet tensile strength and granule size. These approaches are considered to have a potentially large impact on the quality monitoring and control of continuously operating manufacturing lines, such as roller compaction and tableting processes. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. A portable device for detecting fruit quality by diffuse reflectance Vis/NIR spectroscopy

    NASA Astrophysics Data System (ADS)

    Sun, Hongwei; Peng, Yankun; Li, Peng; Wang, Wenxiu

    2017-05-01

    Soluble solids content (SSC) is a major quality parameter of fruit, influencing its flavor and texture. Several studies on the on-line, non-invasive detection of fruit quality have been published; currently, however, consumers desire portable devices. This study aimed to develop a portable device for accurate, real-time and nondestructive determination of fruit quality factors based on diffuse reflectance Vis/NIR spectroscopy (520-950 nm). The hardware of the device consisted of four units: a light source unit, a spectral acquisition unit, a central processing unit, and a display unit. A halogen lamp was chosen as the light source. When operating, the hand-held probe was placed in contact with the surface of the fruit sample, forming a dark environment that shielded out interfering ambient light. Diffusely reflected light was collected and measured by a spectrometer (USB4000). An ARM (Advanced RISC Machines) processor, as the central processing unit, controlled all parts of the device and analyzed the spectral data, and a liquid crystal display (LCD) touch screen served as the user interface. To validate the device's reliability and stability, 63 apples were tested, 47 of which were chosen as the calibration set, while the others formed the prediction set. Their SSC reference values were measured with a refractometer. The samples' spectral data acquired by the portable device were processed with the standard normal variate (SNV) transformation and a Savitzky-Golay (S-G) filter to reduce spectral noise. Partial least squares regression (PLSR) was then applied to build prediction models, and the best prediction results were achieved with a correlation coefficient (r) of 0.855 and a standard error of 0.6033 °Brix. The results demonstrated that this device can quantitatively analyze the soluble solids content of apples.
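
    The S-G smoothing step mentioned above is available directly in SciPy; a minimal sketch with an assumed window length and polynomial order (the abstract does not state the settings used):

        import numpy as np
        from scipy.signal import savgol_filter

        # One diffuse-reflectance spectrum on the 520-950 nm grid (synthetic stand-in).
        wavelengths = np.arange(520, 951)
        spectrum = np.exp(-((wavelengths - 700) / 80.0) ** 2) \
                   + np.random.default_rng(3).normal(scale=0.02, size=wavelengths.size)

        # Fit a cubic polynomial in a sliding 15-point window to suppress noise.
        smoothed = savgol_filter(spectrum, window_length=15, polyorder=3)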

  7. Comparative assessment of several post-processing methods for correcting evapotranspiration forecasts derived from TIGGE datasets.

    NASA Astrophysics Data System (ADS)

    Tian, D.; Medina, H.

    2017-12-01

    Post-processing of medium-range reference evapotranspiration (ETo) forecasts based on numerical weather prediction (NWP) models has the potential to improve the quality and utility of these forecasts. This work compares the performance of several post-processing methods for correcting ETo forecasts over the continental U.S. generated from The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) database, using data from Europe (EC), the United Kingdom (MO), and the United States (NCEP). The post-processing techniques considered are: simple bias correction, the use of multimodels, Ensemble Model Output Statistics (EMOS; Gneiting et al., 2005) and Bayesian Model Averaging (BMA; Raftery et al., 2005). ETo estimates based on quality-controlled U.S. Regional Climate Reference Network measurements, computed with the FAO 56 Penman-Monteith equation, are adopted as the baseline. EMOS and BMA are generally the most efficient post-processing techniques for the ETo forecasts. Nevertheless, a simple bias correction of the best model is commonly much more rewarding than using multimodel raw forecasts. Our results demonstrate the potential of different forecasting and post-processing frameworks in operational evapotranspiration and irrigation advisory systems at the national scale.
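
    Of the techniques compared, the simple bias correction is the easiest to state; a sketch of the additive variant, which subtracts the mean forecast error learned over a training window (illustrative numbers, not TIGGE data):

        import numpy as np

        # Paired training data: raw ETo forecasts and observation-based estimates (mm/day).
        f_train = np.array([4.1, 5.0, 3.8, 4.6, 5.2])
        o_train = np.array([3.7, 4.5, 3.5, 4.1, 4.8])

        bias = np.mean(f_train - o_train)  # mean additive bias of the raw model
        f_new = np.array([4.9, 4.2])       # new raw forecasts
        f_corrected = f_new - bias         # bias-corrected forecasts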

  8. Predicting meat quality traits of ovine m. semimembranosus, both fresh and following freezing and thawing, using a hand held Raman spectroscopic device.

    PubMed

    Fowler, Stephanie M; Schmidt, Heinar; van de Ven, Remy; Wynn, Peter; Hopkins, David L

    2015-10-01

    Complementary studies were conducted to determine the potential of a hand-held Raman spectroscopic device to predict meat quality traits of fresh lamb m. semimembranosus (topside) after ageing and after freezing/thawing. Spectra were collected from 80 fresh muscles at 24 h and 5 d post mortem (PM); another 80 muscles were measured at 24 h, at 5 d, and following freezing/thawing. Shear force, cooking loss, sarcomere length, colour, particle size, collagen content, pH24, pHu, purge and thaw loss were also measured. Results indicated a potential to predict pHu (R²cv = 0.59), pH24 (R²cv = 0.48) and purge (R²cv = 0.42) using spectra collected 24 h PM. L* could be predicted using spectra collected at 24 h (R²cv = 0.33) or 5 d PM (R²cv = 0.33). This suggests that Raman spectroscopy is suited to identifying carcases that deviate from normal metabolic processes and the related meat quality traits. Copyright © 2015. Published by Elsevier Ltd.

  9. Predicting water quality by relating secchi-disk transparency and chlorophyll a measurements to Landsat satellite imagery for Michigan inland lakes, 2001-2006

    USGS Publications Warehouse

    Fuller, L.M.; Minnerick, R.J.

    2007-01-01

    The State of Michigan has more than 11,000 inland lakes; approximately 3,500 of these lakes are greater than 25 acres. The USGS, in cooperation with the Michigan Department of Environmental Quality (MDEQ), has been monitoring the quality of inland lakes in Michigan through the Lake Water Quality Assessment monitoring program, under which approximately 100 inland lakes are to be sampled per year from 2001 to 2015. Volunteers coordinated by the MDEQ started sampling lakes in 1974 and continue to sample approximately 250 inland lakes each year through the Cooperative Lakes Monitoring Program (CLMP), Michigan’s volunteer lakes monitoring program. Despite this sampling effort, it is still impossible to physically collect the necessary water-quality measurements for all 3,500 Michigan inland lakes. Therefore, the USGS, in cooperation with the MDEQ, used a technique modeled after Olmanson and others (2001) that uses satellite remote sensing to predict water quality in unsampled inland lakes greater than 25 acres. Water-quality characteristics associated with water clarity can be predicted for Michigan inland lakes by relating sampled measurements of secchi-disk transparency (SDT) and chlorophyll a concentrations (Chl-a) to satellite imagery. The trophic state index (TSI), an indicator of biological productivity, can be calculated from SDT measurements, Chl-a concentrations, and total phosphorus (TP) concentrations measured near the lake’s surface. Through this process, unsampled inland lakes within the fourteen Landsat satellite scenes encompassing Michigan can be assigned estimated TSI values from either predicted SDT or predicted Chl-a (fig. 1).
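
    Assuming the report follows the widely used Carlson (1977) trophic state index, the standard conversions are as follows (SDT in metres, Chl-a and TP in µg/L; stated here for reference, not quoted from the report):

        \mathrm{TSI}(\mathrm{SDT}) = 60 - 14.41\,\ln(\mathrm{SDT})
        \mathrm{TSI}(\mathrm{Chl\text{-}a}) = 9.81\,\ln(\mathrm{Chl\text{-}a}) + 30.6
        \mathrm{TSI}(\mathrm{TP}) = 14.42\,\ln(\mathrm{TP}) + 4.15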

  10. Prediction of canned black bean texture (Phaseolus vulgaris L.) from intact dry seeds using visible/near-infrared spectroscopy and hyperspectral imaging data

    USDA-ARS?s Scientific Manuscript database

    BACKGROUND: Texture is a major quality parameter for the acceptability of canned whole beans. Prior knowledge of this quality trait before processing would be useful to guide variety development by bean breeders and optimize handling protocols by processors. The objective of this study was to evalua...

  11. Process optimization of rolling for zincked sheet technology using response surface methodology and genetic algorithm

    NASA Astrophysics Data System (ADS)

    Ji, Liang-Bo; Chen, Fang

    2017-07-01

    Numerical simulation and intelligent optimization techniques were adopted for the rolling and extrusion of zincked sheet. An efficient optimization of the process parameters for rolling of zincked sheet was investigated by response surface methodology (RSM), a genetic algorithm (GA) and data processing technology. First, the influence of the roller gap, rolling speed and friction factor on the reduction rate and plate shortening rate was analyzed. A predictive response surface model for the comprehensive quality index of the part was then created using RSM, and simulated and predicted values were compared. The optimal process parameters for rolling were then solved for with the genetic algorithm and verified, yielding the optimum rolling process parameters. The approach is feasible and effective.
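
    A minimal sketch of the RSM-then-GA idea: fit (or, here, assume) a smooth response surface for the quality index, then let a simple real-coded GA search it. The surface coefficients, parameter bounds and GA settings below are made-up illustrations, not the paper's fitted model.

        import numpy as np

        rng = np.random.default_rng(4)

        def quality_index(x):
            # Illustrative quadratic response surface in (roller gap, rolling
            # speed, friction factor); lower is better in this toy setup.
            gap, speed, mu = x
            return (gap - 0.8) ** 2 + 0.5 * (speed - 1.2) ** 2 + 2.0 * (mu - 0.15) ** 2

        lo = np.array([0.5, 0.5, 0.05])   # lower bounds of the three parameters
        hi = np.array([1.5, 2.0, 0.30])   # upper bounds
        pop = rng.uniform(lo, hi, size=(40, 3))

        for _ in range(100):  # generations
            fit = np.apply_along_axis(quality_index, 1, pop)
            parents = pop[np.argsort(fit)[:20]]               # keep the best half
            i, j = rng.integers(0, 20, size=(2, 20))          # random parent pairs
            a = rng.uniform(size=(20, 1))
            children = a * parents[i] + (1 - a) * parents[j]  # arithmetic crossover
            children += rng.normal(scale=0.01, size=children.shape)  # Gaussian mutation
            pop = np.clip(np.vstack([parents, children]), lo, hi)

        best = pop[np.argmin(np.apply_along_axis(quality_index, 1, pop))]
        print(best)  # approaches (0.8, 1.2, 0.15), the optimum of the toy surface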

  12. Construction of quality-assured infant feeding process of care data repositories: Construction of the perinatal repository (Part 2).

    PubMed

    García-de-León-Chocano, Ricardo; Muñoz-Soler, Verónica; Sáez, Carlos; García-de-León-González, Ricardo; García-Gómez, Juan M

    2016-04-01

    This is the second in a series of two papers regarding the construction of data quality (DQ) assured repositories, based on population data from Electronic Health Records (EHR), for the reuse of information on infant feeding from birth until the age of two. This second paper describes the application of the computational process of constructing the first quality-assured repository for the reuse of information on infant feeding in the perinatal period, with the aim of studying relevant questions from the Baby Friendly Hospital Initiative (BFHI) and monitoring its deployment in our hospital. The construction of the repository was carried out using 13 semi-automated procedures to assess, recover or discard clinical data. The initial information consisted of perinatal forms from EHR related to 2048 births (Facts of Study, FoS) between 2009 and 2011, with a total of 433,308 observations of 223 variables. DQ was measured before and after the procedures using metrics related to eight quality dimensions: predictive value, correctness, duplication, consistency, completeness, contextualization, temporal stability, and spatial stability. Once the predictive variables were selected and DQ was assured, the final repository consisted of 1925 births, 107,529 observations and 73 quality-assured variables. The discarded observations mainly correspond to observations of non-predictive variables (52.90%) and to the impact of the de-duplication process (20.58%), with respect to the total input data. Seven out of thirteen procedures achieved 100% valid births, observations and variables. Moreover, 89% of births and ~98% of observations were consistent according to the experts' criteria. A multidisciplinary approach, along with the quantification of DQ, has allowed us to construct the first repository on infant feeding in the perinatal period based on EHR population data. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. What Predicts Skill in Lecture Note Taking?

    ERIC Educational Resources Information Center

    Peverly, Stephen T.; Ramaswamy, Vivek; Brown, Cindy; Sumowski, James; Alidoost, Moona; Garner, Joanna

    2007-01-01

    Despite the importance of good lecture notes to test performance, very little is known about the cognitive processes that underlie effective lecture note taking. The primary purpose of the 2 studies reported (a pilot study and Study 1) was to investigate 3 processes hypothesized to be significantly related to quality of notes: transcription…

  14. Adaptive Data Processing Technique for Lidar-Assisted Control to Bridge the Gap between Lidar Systems and Wind Turbines: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schlipf, David; Raach, Steffen; Haizmann, Florian

    2015-12-14

    This paper presents first steps toward an adaptive lidar data processing technique crucial for lidar-assisted control in wind turbines. The prediction time and the quality of the wind preview from lidar measurements depend on several factors and are not constant. If the data processing is not continually adjusted, the benefit of lidar-assisted control cannot be fully exploited, and harmful control action can even result. An online analysis of the lidar and turbine data is necessary to continually reassess the prediction time and lidar data quality. In this work, a structured process to develop an analysis tool for the prediction time and a new hardware setup for lidar-assisted control are presented. The tool consists of an online estimation of the rotor-effective wind speed from lidar and turbine data and the implementation of an online cross-correlation to determine the time shift between both signals. Further, initial results are presented from an ongoing campaign in which this system was employed to provide lidar preview for feed-forward pitch control.
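
    The core of the online cross-correlation step, estimating the time shift between the lidar-based and turbine-based estimates of the rotor-effective wind speed, can be sketched as follows (synthetic signals and an assumed sample rate; the real tool runs this continually on streaming data):

        import numpy as np

        fs = 1.0                  # sample rate in Hz (assumed)
        t = np.arange(600) / fs
        true_shift = 12           # samples of preview the lidar provides

        rng = np.random.default_rng(5)
        turbine = np.sin(0.05 * t) + 0.1 * rng.normal(size=t.size)
        lidar = np.roll(turbine, -true_shift)   # the lidar "sees" the wind earlier

        # Normalized cross-correlation over all lags; the peak gives the shift.
        a = (lidar - lidar.mean()) / lidar.std()
        b = (turbine - turbine.mean()) / turbine.std()
        xcorr = np.correlate(a, b, mode="full") / t.size
        lags = np.arange(-t.size + 1, t.size)

        prediction_time = -lags[np.argmax(xcorr)] / fs
        print(prediction_time)    # approximately true_shift / fs seconds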

  15. Optimization and Characterization of the Friction Stir Welded Sheets of AA 5754-H111: Monitoring of the Quality of Joints with Thermographic Techniques

    PubMed Central

    De Filippis, Luigi Alberto Ciro; Serio, Livia Maria; Galietti, Umberto

    2017-01-01

    Friction Stir Welding (FSW) is a solid-state welding process, based on frictional and stirring phenomena, that offers many advantages over traditional welding methods. However, several parameters can affect the quality of the produced joints. In this work, an experimental approach has been used to study and optimize the FSW process applied to 5754-H111 aluminum plates. In particular, the thermal behavior of the material during the process has been investigated, and two thermal indexes correlated with the frictional power input, the maximum temperature and the heating rate of the material, were examined for different configurations of the process parameters (tool travel and rotation speeds). Moreover, other techniques (micrographs, macrographs and destructive tensile tests) were used to support the analysis of the quality of the welded joints in a quantitative way. The potential of the thermographic technique has been demonstrated both for monitoring the FSW process and for predicting the quality of joints in terms of tensile strength. PMID:29019948

  16. Working alliance, real relationship, session quality, and client improvement in psychodynamic psychotherapy: A longitudinal actor partner interdependence model.

    PubMed

    Kivlighan, Dennis M; Hill, Clara E; Gelso, Charles J; Baumann, Ellen

    2016-03-01

    We used the Actor Partner Interdependence Model (APIM; Kashy & Kenny, 2000) to examine the dyadic associations of 74 clients and 23 therapists in their evaluations of working alliance, real relationship, session quality, and client improvement over time in ongoing psychodynamic or interpersonal psychotherapy. There were significant actor effects for both therapists and clients, with each participant's own ratings of working alliance and real relationship independently predicting their own evaluations of session quality. There were significant client partner effects, with clients' working alliance and real relationship independently predicting their therapists' evaluations of session quality. The client partner real relationship effect was stronger in later sessions than in earlier sessions. Therapists' real relationship ratings (partner effect) were a stronger predictor of clients' session quality ratings in later sessions than in earlier sessions. Therapists' working alliance ratings (partner effect) were a stronger predictor of clients' session quality ratings when clients made greater improvement than when clients made lesser improvement. For clients' session outcome ratings, there were complex three-way interactions, such that both client real relationship and working alliance interacted with client improvement and time in treatment to predict clients' session quality. These findings strongly suggest both individual and partner effects when clients and therapists evaluate psychotherapy process and outcome. Implications for research and practice are discussed. (c) 2016 APA, all rights reserved.

  17. A statistical model for water quality predictions from a river discharge using coastal observations

    NASA Astrophysics Data System (ADS)

    Kim, S.; Terrill, E. J.

    2007-12-01

    Understanding and predicting coastal ocean water quality has benefits for reducing human health risks, protecting the environment, and improving local economies that depend on clean beaches. Continuous observations of coastal physical oceanography increase understanding of the processes that control the fate and transport of a riverine plume, which potentially contains high levels of contaminants from the upstream watershed. A data-driven model of the fate and transport of plume water from the Tijuana River has been developed using surface current observations provided by a network of HF radars operated as part of a local coastal observatory in place since 2002. The model outputs are compared with water-quality sampling of shoreline indicator bacteria, and the skill of an alarm for low water quality is evaluated using the receiver operating characteristic (ROC) curve. In addition, a statistical analysis of beach closures in relation to environmental variables is also discussed.
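
    A minimal sketch of scoring such an alarm with an ROC curve, using scikit-learn; the labels and model scores below are illustrative placeholders, not the study's data.

        import numpy as np
        from sklearn.metrics import roc_auc_score, roc_curve

        # 1 = shoreline sample exceeded the indicator-bacteria standard, 0 = it did not;
        # score = the model's predicted plume impact at that site and time.
        exceeded = np.array([0, 0, 1, 0, 1, 1, 0, 1, 0, 0])
        score = np.array([0.1, 0.3, 0.8, 0.2, 0.7, 0.9, 0.4, 0.6, 0.2, 0.1])

        fpr, tpr, thresholds = roc_curve(exceeded, score)  # one point per alarm threshold
        print(roc_auc_score(exceeded, score))              # area under the ROC curve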

  18. Early Discharge Planning and Improved Care Transitions: Pre-Admission Assessment for Readmission Risk in an Elective Orthopedic and Cardiovascular Surgical Population

    PubMed Central

    Mola, Ana; Rosenfeld, Peri; Ford, Shauna

    2016-01-01

    Background/Methods: Readmission prevention is a marker of patient care quality and requires comprehensive, early discharge planning for safe hospital transitions. Effectively performed, this process supports patient satisfaction, efficient resource utilization, and care integration. This study developed and tested the utility of a predictive early discharge risk assessment with 366 elective orthopedic/cardiovascular surgery patients. Quality improvement cycles were undertaken to refine the design and inform the analytic plan. An 8-item questionnaire, which includes patient self-reported health, was integrated into care managers’ telephonic pre-admission assessments during a 12-month period. Results: Regression models found the questionnaire to be predictive of readmission (p ≤ .005; R² = .334) and length-of-stay (p ≤ .001; R² = .314). The independent variables “lives-alone” and “self-rated health” were statistically significant for increased readmission odds, as was “self-rated health” for increased length-of-stay. Quality measures, patient experience and increased rates of discharges-to-home further supported the benefit of embedding these questions into the pro-active planning process. Conclusion: The pilot discharge risk assessment was predictive of readmission risk and length-of-stay for elective orthopedic/cardiovascular patients. Given the usability of the questionnaire in advance of elective admissions, it can facilitate pro-active discharge planning essential for producing quality outcomes and addressing new reimbursement methodologies for continuum-based episodes of care. PMID:27616965

  19. Measuring Up: Implementing a Dental Quality Measure in the Electronic Health Record Context

    PubMed Central

    Bhardwaj, Aarti; Ramoni, Rachel; Kalenderian, Elsbeth; Neumann, Ana; Hebballi, Nutan B; White, Joel M; McClellan, Lyle; Walji, Muhammad F

    2015-01-01

    Background Quality improvement requires quality measures that are validly implementable. In this work, we assessed the feasibility and performance of an automated electronic Meaningful Use dental clinical quality measure (percentage of children who received fluoride varnish). Methods We defined how to implement the automated measure queries in a dental electronic health record (EHR). Within records identified through automated query, we manually reviewed a subsample to assess the performance of the query. Results The automated query found 71.0% of patients to have had fluoride varnish compared to 77.6% found using the manual chart review. The automated quality measure performance was 90.5% sensitivity, 90.8% specificity, 96.9% positive predictive value, and 75.2% negative predictive value. Conclusions Our findings support the feasibility of automated dental quality measure queries in the context of sufficient structured data. Information noted only in the free text rather than in structured data would require natural language processing approaches to effectively query. Practical Implications To participate in self-directed quality improvement, dental clinicians must embrace the accountability era. Commitment to quality will require enhanced documentation in order to support near-term automated calculation of quality measures. PMID:26562736
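
    The four performance figures follow directly from the 2x2 table of automated query results against manual chart review; a minimal computation (the counts below are invented for illustration, since the paper reports only the rates):

    ```python
    def query_performance(tp: int, fp: int, fn: int, tn: int) -> dict:
        """Performance of an automated query judged against manual review."""
        return {
            "sensitivity": tp / (tp + fn),   # true positives the query finds
            "specificity": tn / (tn + fp),   # true negatives the query rejects
            "ppv": tp / (tp + fp),           # positive predictive value
            "npv": tn / (tn + fn),           # negative predictive value
        }

    print(query_performance(tp=181, fp=6, fn=19, tn=59))
    ```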

  20. Postprocessing for Air Quality Predictions

    NASA Astrophysics Data System (ADS)

    Delle Monache, L.

    2017-12-01

    In recent years, air quality (AQ) forecasting has made significant progress towards better predictions, with the goal of protecting the public from harmful pollutants. This progress is the result of improvements in weather and chemical transport models, their coupling, and more accurate emission inventories (e.g., with the development of new algorithms to account for fires in near real-time). Nevertheless, AQ predictions are still affected at times by significant biases which stem from limitations in both weather and chemistry transport models. Those are the result of numerical approximations and the poor representation (and understanding) of important physical and chemical processes. Moreover, although the quality of emission inventories has improved significantly, they are still one of the main sources of uncertainty in AQ predictions. For operational real-time AQ forecasting, a significant portion of these biases can be reduced with the implementation of postprocessing methods. We will review some of the techniques that have been proposed to reduce both systematic and random errors of AQ predictions, and to improve the correlation between predictions and observations of ground-level ozone and surface particulate matter less than 2.5 µm in diameter (PM2.5). These methods, which can be applied to both deterministic and probabilistic predictions, include simple bias-correction techniques, corrections inspired by the Kalman filter, regression methods, and the more recently developed analog-based algorithms. These approaches will be compared and contrasted, and the strengths and weaknesses of each will be discussed.
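
    Of the families listed, the Kalman-filter-inspired correction is the easiest to sketch: the forecast bias is treated as a hidden state and updated recursively from each new forecast-observation pair. A minimal version follows; the variance ratio is an assumed tuning constant, not a published value.

    ```python
    import numpy as np

    def kalman_bias_correction(forecasts, observations, ratio=0.1):
        """Recursively estimate and remove forecast bias; `ratio` is the assumed
        process-to-observation error variance ratio."""
        bias, p = 0.0, 1.0          # bias estimate and its variance
        corrected = []
        for f, o in zip(forecasts, observations):
            corrected.append(f - bias)              # correct with the current bias
            k = (p + ratio) / (p + ratio + 1.0)     # Kalman gain
            bias += k * ((f - o) - bias)            # update from the latest error
            p = (1.0 - k) * (p + ratio)
        return np.array(corrected)

    ozone_fc = np.array([52.0, 55.0, 61.0, 58.0, 63.0])   # synthetic forecasts (ppb)
    ozone_ob = np.array([45.0, 50.0, 54.0, 50.0, 55.0])   # synthetic observations
    print(kalman_bias_correction(ozone_fc, ozone_ob))
    ```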

  1. Tools for studying dry-cured ham processing by using computed tomography.

    PubMed

    Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena

    2012-01-11

    An accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a(w) during processing is of special interest. In this paper, predictive models for salt content (R(2) = 0.960 and RMSECV = 0.393), water content (R(2) = 0.912 and RMSECV = 1.751), and a(w) (R(2) = 0.906 and RMSECV = 0.008), which comprise the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a(w) in terms of content but also distribution throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a(w) predictions.
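
    Figures of merit such as RMSECV are obtained by cross-validating the calibration model; a generic sketch with synthetic data (a PLS model stands in here, since the abstract does not prescribe the regression type, and the feature dimensions are invented):

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    X = rng.random((100, 50))                       # e.g., CT attenuation features per ham
    y = 3.0 * X[:, 0] + rng.normal(0, 0.2, 100)     # e.g., salt content (%)

    y_cv = cross_val_predict(PLSRegression(n_components=5), X, y, cv=10).ravel()
    rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
    r2 = 1 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
    print(f"RMSECV = {rmsecv:.3f}, R2 = {r2:.3f}")
    ```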

  2. Quality status display for a vibration welding process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spicer, John Patrick; Abell, Jeffrey A.; Wincek, Michael Anthony

    A method includes receiving, during a vibration welding process, a set of sensory signals from a collection of sensors positioned with respect to a work piece during formation of a weld on or within the work piece. The method also includes receiving control signals from a welding controller during the process, with the control signals causing the welding horn to vibrate at a calibrated frequency, and processing the received sensory and control signals using a host machine. Additionally, the method includes displaying a predicted weld quality status on a surface of the work piece using a status projector. The method may include identifying and displaying a quality status of suspect welds. The laser projector may project a laser beam directly onto or immediately adjacent to the suspect welds, e.g., as a red, green, blue laser or a gas laser having a switched color filter.

  3. Static and fatigue testing of full-scale fuselage panels fabricated using a Therm-X(R) process

    NASA Technical Reports Server (NTRS)

    Dinicola, Albert J.; Kassapoglou, Christos; Chou, Jack C.

    1992-01-01

    Large, curved, integrally stiffened composite panels representative of an aircraft fuselage structure were fabricated using a Therm-X process, an alternative concept to conventional two-sided hard tooling and contour vacuum bagging. Panels subsequently were tested under pure shear loading in both static and fatigue regimes to assess the adequacy of the manufacturing process, the effectiveness of damage tolerant design features co-cured with the structure, and the accuracy of finite element and closed-form predictions of postbuckling capability and failure load. Test results indicated the process yielded panels of high quality and increased damage tolerance through suppression of common failure modes such as skin-stiffener separation and frame-stiffener corner failure. Finite element analyses generally produced good predictions of postbuckled shape, and a global-local modelling technique yielded failure load predictions that were within 7% of the experimental mean.

  4. NIR spectroscopy for the quality control of Moringa oleifera (Lam.) leaf powders: Prediction of minerals, protein and moisture contents.

    PubMed

    Rébufa, Catherine; Pany, Inès; Bombarda, Isabelle

    2018-09-30

    A rapid methodology was developed to simultaneously predict water content and water activity values (a_w) of Moringa oleifera leaf powders (MOLP) using near infrared (NIR) signatures and experimental sorption isotherms. NIR spectra of MOLP samples (n = 181) were recorded. A Partial Least Squares Regression model (PLS2) was obtained with low standard errors of prediction (SEP of 1.8% and 0.07 for water content and a_w, respectively). Experimental sorption isotherms obtained at 20, 30 and 40 °C showed similar profiles. This result is particularly important for the use of MOLP in the food industry: a temperature variation of the drying process will not affect the available water content (shelf life). Nutrient contents based on protein and selected minerals (Ca, Fe, K) were also predicted from PLS1 models. Protein contents were well predicted (SEP of 2.3%). This methodology allowed for an improvement in MOLP safety, quality control and traceability. Published by Elsevier Ltd.
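
    A PLS2 model is simply a PLS regression fitted to several responses at once; a minimal sketch of the water-content/a_w case with synthetic spectra (the sample count matches the abstract, but the wavelengths and coefficients are invented):

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.random((181, 700))                             # NIR spectra (samples x wavelengths)
    Y = np.column_stack([
        5 + 2 * X[:, 100] + rng.normal(0, 0.1, 181),       # water content (%)
        0.3 + 0.1 * X[:, 300] + rng.normal(0, 0.01, 181),  # water activity a_w
    ])

    X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.3, random_state=0)
    pls2 = PLSRegression(n_components=8).fit(X_tr, Y_tr)   # one model, two responses
    sep = np.sqrt(np.mean((Y_te - pls2.predict(X_te)) ** 2, axis=0))
    print("SEP (water %, a_w):", np.round(sep, 3))
    ```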

  5. Taking into Account the Quality of the Relationship in HIV Disclosure.

    PubMed

    Smith, Charlotte; Cook, Rachel; Rohleder, Poul

    2017-01-01

    Despite growing interest in HIV disclosure, most theoretical frameworks and empirical studies focus on individual and social factors affecting the process, leaving the contribution of interpersonal factors relatively unexplored. HIV transmission and disclosure often occur within a couple, however, and this is where disclosure has the most scope as an HIV transmission intervention. With this in mind, this study explores whether perceived relationship quality influences HIV disclosure outcomes. Ninety-five UK individuals with HIV participated in a cross-sectional survey. Retrospective data were collected on their perceived relationship quality prior to disclosing their HIV positive status, and on disclosure outcomes. Perceived relationship quality was found to significantly affect disclosure outcomes. Positive qualities in the relationship were associated with positive outcomes, whereas negative qualities were associated with negative outcomes. Results further confirmed that this association was not merely correlational, but demonstrated predictive power. Relationship quality might act as either a risk or a resilience factor in the disclosure process, and thus warrants greater attention in future research.

  6. Predictive Model for the Meniscus-Guided Coating of High-Quality Organic Single-Crystalline Thin Films.

    PubMed

    Janneck, Robby; Vercesi, Federico; Heremans, Paul; Genoe, Jan; Rolin, Cedric

    2016-09-01

    A model that describes solvent evaporation dynamics in meniscus-guided coating techniques is developed. In combination with a single fitting parameter, it is shown that this formula can accurately predict a processing window for various coating conditions. Organic thin-film transistors (OTFTs), fabricated by a zone-casting setup, indeed show the best performance at the predicted coating speeds, with mobilities reaching 7 cm² V⁻¹ s⁻¹. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Age effects on sensory-processing abilities and their impact on handwriting.

    PubMed

    Engel-Yeger, Batya; Hus, Sari; Rosenblum, Sara

    2012-12-01

    Sensory-processing abilities are known to deteriorate in the elderly. As a result, daily activities such as handwriting may be impaired. Yet, knowledge about the involvement of sensory processing in handwriting characteristics among older persons is limited. This study examined how age influences sensory-processing abilities and their impact on handwriting as a daily activity. The participants were 118 healthy, independently functioning adults divided into four age groups: 31-45, 46-60, 61-75 and 76+ years. All participants completed the Adolescent/Adult Sensory Profile (AASP). The handwriting process was documented using the Computerized Handwriting Penmanship Evaluation Tool (ComPET). Age significantly affected sensory processing and handwriting pressure, as well as temporal and spatial measures. Both handwriting time and the spatial organization of the written product were predicted by sensory seeking. When examining the contribution of age to the prediction of handwriting by sensory processing, sensory seeking showed a tendency to predict handwriting pressure (p = .06), while sensory sensitivity significantly predicted handwriting velocity. Age appears to influence sensory-processing abilities and to affect daily performance tasks, such as handwriting, for which sensitivity and sensation seeking are essential. Clinicians' awareness of sensory-processing deficits among older adults, and examination of their impact on broader daily activities, are essential to improve daily performance and quality of life.

  8. A quality by design approach to investigate the effect of mannitol and dicalcium phosphate qualities on roll compaction.

    PubMed

    Souihi, Nabil; Dumarey, Melanie; Wikström, Håkan; Tajarobi, Pirjo; Fransson, Magnus; Svensson, Olof; Josefson, Mats; Trygg, Johan

    2013-04-15

    Roll compaction is a continuous process for solid dosage form manufacturing that is increasingly popular within the pharmaceutical industry. Although roll compaction has become an established technique for dry granulation, the influence of material properties is still not fully understood. In this study, a quality by design (QbD) approach was utilized, not only to understand the influence of different qualities of mannitol and dicalcium phosphate (DCP), but also to predict critical quality attributes of the drug product based solely on the material properties of the filler. By describing each filler quality in terms of several representative physical properties, orthogonal projections to latent structures (OPLS) was used to understand and predict how those properties affected drug product intermediates as well as critical quality attributes of the final drug product. These models were then validated by predicting product attributes for filler qualities not used in the model construction. The results of this study confirmed that the tensile strength reduction known to affect plastic materials when roll compacted is not prominent when using brittle materials. Some qualities of these fillers actually demonstrated improved compactability following roll compaction. While direct compression qualities are frequently used for roll compacted drug products because of their excellent flowability and good compaction properties, this study revealed that granules from these qualities flowed more poorly than the corresponding powder blends, which was not seen for granules from traditional qualities. The QbD approach used in this study could be extended beyond fillers. Thus any new compound or ingredient would first be characterized, and then suitable formulation characteristics could be determined in silico, without running any additional experiments. Copyright © 2013 Elsevier B.V. All rights reserved.

  9. Utility of NCEP Operational and Emerging Meteorological Models for Driving Air Quality Prediction

    NASA Astrophysics Data System (ADS)

    McQueen, J.; Huang, J.; Huang, H. C.; Shafran, P.; Lee, P.; Pan, L.; Sleinkofer, A. M.; Stajner, I.; Upadhayay, S.; Tallapragada, V.

    2017-12-01

    Operational air quality predictions for the United States (U.S.) are provided at NOAA by the National Air Quality Forecasting Capability (NAQFC). NAQFC provides nationwide operational predictions of ozone and particulate matter twice per day (at the 06 and 12 UTC cycles) at 12 km resolution and 1-hour time intervals through 48 hours, distributed at http://airquality.weather.gov. The NOAA National Centers for Environmental Prediction (NCEP) operational North American Mesoscale (NAM) 12 km weather prediction model is used to drive the Community Multiscale Air Quality (CMAQ) model. In 2017, the NAM was upgraded (V4), in part to reduce a warm 2 m temperature bias in summer. At the same time, CMAQ was updated to V5.0.2. Both versions of the models were run in parallel for several months; therefore, the impact of improvements in the atmospheric chemistry model could be assessed separately from upgrades to the weather prediction model. Improvements in CMAQ predictions were related to reductions in the NAM 2 m temperature bias: increasing the opacity of clouds and reducing downward shortwave radiation resulted in reduced ozone photolysis. Higher resolution operational NWP models have recently been introduced as part of the NCEP modeling suite. These include the NAM CONUS Nest (3 km horizontal resolution), run four times per day through 60 hours, and the High Resolution Rapid Refresh (HRRR, 3 km), run hourly out to 18 hours. In addition, NCEP, with other NOAA labs, has begun to develop and test the Next Generation Global Prediction System (NGGPS) based on the FV3 global model. This presentation also overviews recent developments in operational numerical weather prediction and evaluates the ability of these models to predict low-level temperatures, clouds, and boundary layer processes important for driving air quality prediction in complex terrain. The assessed meteorological model errors could help determine the magnitude of possible pollutant errors from CMAQ if these models were used as its driving meteorology. The NWP models will be evaluated against standard and mesonet fields averaged over various regions during summer 2017. An evaluation of meteorological fields important to air quality modeling (e.g., near-surface winds, temperatures, moisture, boundary layer heights, and cloud cover) will be reported.

  10. Quality Control Analysis of Selected Aspects of Programs Administered by the Bureau of Student Financial Assistance. Task 1 and Quality Control Sample; Error-Prone Modeling Analysis Plan.

    ERIC Educational Resources Information Center

    Saavedra, Pedro; And Others

    Parameters and procedures for developing an error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications are introduced. Specifications to adapt these general parameters to secondary data analysis of the Validation, Edits, and Applications Processing Systems…

  11. Modelling sewer sediment deposition, erosion, and transport processes to predict acute influent and reduce combined sewer overflows and CO(2) emissions.

    PubMed

    Mouri, Goro; Oki, Taikan

    2010-01-01

    Understanding of solids deposition, erosion, and transport processes in sewer systems has improved considerably in the past decade. This has provided guidance for controlling sewer solids and associated acute pollutants to protect the environment and improve the operation of wastewater systems. Although measures to decrease combined sewer overflow (CSO) events have reduced the amount of discharged pollution, overflows continue to occur during rainy weather in combined sewer systems. The solution lies in the amount of water allotted to various processes in an effluent treatment system, in evaluating the impact on water quality and the associated prediction technology, and in the development of a control technology. Extremely contaminated inflow has been a serious research subject, especially in connection with the influence of rainy weather on nitrogen and organic matter removal efficiency in wastewater treatment plants (WWTP). An intensive investigation of an extremely polluted inflow load to a WWTP during rainy weather was conducted in the city of Matsuyama, the region used for the present research on total suspended solid (TSS) concentration. Since the inflow during rainy weather can be as much as 400 times that in dry weather, almost all sewers are unsettled and overflowing when a rain event is more than moderate. Another concern is the energy consumed by wastewater treatment; this problem has become important from the viewpoint of reducing CO(2) emissions and overall costs. Therefore, while establishing a prediction technology for the inflow water quality characteristics of a sewage disposal plant is an important priority, the development of a management and control method for an effluent treatment system that minimises energy consumption and CO(2) emissions due to water disposal is also a pressing research topic with regard to the quality of treated water. The procedure to improve water quality must make use of not only water quality and biotic criteria, but also modelling systems that enable the user to link the effect of changes in urban sewage systems with specific quality, energy consumption, CO(2) emission, and ecological improvements of the receiving water.

  12. Parenting and youth sexual risk in context: The role of community factors.

    PubMed

    Goodrum, Nada M; Armistead, Lisa P; Tully, Erin C; Cook, Sarah L; Skinner, Donald

    2017-06-01

    Black South African youth are disproportionately affected by HIV, and risky sexual behaviors increase youths' vulnerability to infection. U.S.-based research has highlighted several contextual influences on sexual risk, but these processes have not been examined in a South African context. In a convenience sample of Black South African caregivers and their 10-14-year-old youth (mean age = 11.7, SD = 1.4; 52.5% female), we examined the relation between parenting and youth sexual risk within the context of community-level processes, including neighborhood quality and maternal social support. Hypotheses were evaluated using structural equation modeling. Results revealed that better neighborhood quality and more social support predicted positive parenting, which in turn predicted less youth sexual risk. There was a significant indirect effect from neighborhood quality to youth sexual risk via parenting. Results highlight the importance of the community context in parenting and youth sexual risk in this understudied sample. HIV prevention interventions should be informed by these contextual factors. Copyright © 2017 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.

  13. Optimization of porthole die geometrical variables by Taguchi method

    NASA Astrophysics Data System (ADS)

    Gagliardi, F.; Ciancio, C.; Ambrogio, G.; Filice, L.

    2017-10-01

    Porthole die extrusion is commonly used to manufacture hollow profiles made of lightweight alloys for numerous industrial applications. The reliability of extruded parts is strongly affected by the quality of the longitudinal and transversal seam welds. Accordingly, the die geometry must be designed correctly and the process parameters must be selected properly to achieve the desired product quality. In this study, numerical 3D simulations have been created and run to investigate the role of various geometrical variables on punch load and maximum pressure inside the welding chamber. These are important outputs to take into account, affecting, respectively, the necessary capacity of the extrusion press and the quality of the welding lines. The Taguchi technique has been used to reduce the number of numerical simulations necessary for considering the influence of twelve different geometric variables. Moreover, analysis of variance (ANOVA) has been implemented to analyze the effect of each input parameter on the two responses individually. The methodology has then been utilized to determine the optimal process configuration, optimizing each of the two investigated process outputs individually. Finally, the responses at the optimized parameters have been verified through finite element simulations, which approximated the predicted values closely. This study shows the feasibility of the Taguchi technique for performance prediction and optimization, and therefore for improving the design of a porthole extrusion process.
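
    The core of the Taguchi approach is running a small orthogonal subset of the full factorial and estimating each factor's main effect from level averages. A toy sketch follows, with a full 2^3 factorial (8 runs) standing in for the 12-variable orthogonal array; the responses are invented:

    ```python
    import itertools
    import numpy as np

    # Full 2^3 design; for 12 factors a true orthogonal array (e.g., an L16)
    # would be used instead to keep the number of simulations small.
    design = np.array(list(itertools.product([0, 1], repeat=3)))
    punch_load = np.array([10.0, 12.1, 9.5, 11.4, 14.2, 16.0, 13.8, 15.5])  # toy response

    for j in range(design.shape[1]):
        low = punch_load[design[:, j] == 0].mean()
        high = punch_load[design[:, j] == 1].mean()
        print(f"factor {j}: main effect = {high - low:+.2f}")
    # The factor with the largest |main effect| dominates the response; ANOVA
    # formalizes this by partitioning the response variance among the factors.
    ```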

  14. A Practical Framework Toward Prediction of Breaking Force and Disintegration of Tablet Formulations Using Machine Learning Tools.

    PubMed

    Akseli, Ilgaz; Xie, Jingjin; Schultz, Leon; Ladyzhynsky, Nadia; Bramante, Tommasina; He, Xiaorong; Deanne, Rich; Horspool, Keith R; Schwabe, Robert

    2017-01-01

    Enabling the paradigm of quality by design requires the ability to quantitatively correlate material properties and process variables to measurable product performance attributes. Conventional, quality-by-test methods for determining tablet breaking force and disintegration time usually involve destructive tests, which consume a significant amount of time and labor and provide limited information. Recent advances in material characterization, statistical analysis, and machine learning have provided multiple tools that have the potential to develop nondestructive, fast, and accurate approaches in drug product development. In this work, a methodology to predict the breaking force and disintegration time of tablet formulations using nondestructive ultrasonics and machine learning tools was developed. The input variables to the model include intrinsic properties of the formulation and extrinsic process variables influencing the tablet during manufacturing. The model has been applied to predict breaking force and disintegration time using small quantities of active pharmaceutical ingredient and prototype formulation designs. The novel approach presented is a step forward toward rational design of a robust drug product based on insight into the performance of common materials during formulation and process development. It may also help expedite the drug product development timeline and reduce active pharmaceutical ingredient usage while improving the efficiency of the overall process. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
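
    As a hedged illustration of the modeling step (the paper's exact features and learner are not reproduced here), breaking force can be regressed on formulation and process descriptors plus a nondestructive ultrasonic measurement; all values below are synthetic:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    n = 300
    X = np.column_stack([
        rng.uniform(5, 30, n),        # compaction force (kN), assumed range
        rng.uniform(0.05, 0.20, n),   # tablet porosity
        rng.uniform(1000, 3000, n),   # ultrasonic wave speed (m/s), nondestructive input
    ])
    y = 8 * X[:, 0] - 400 * X[:, 1] + 0.02 * X[:, 2] + rng.normal(0, 5, n)  # breaking force (N)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("MAE (N):", round(mean_absolute_error(y_te, model.predict(X_te)), 2))
    ```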

  15. Product Quality Modelling Based on Incremental Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Wang, J.; Zhang, W.; Qin, B.; Shi, W.

    2012-05-01

    Incremental support vector machine (ISVM) is a learning method developed in recent years on the foundations of statistical learning theory. It is suitable for problems with sequentially arriving field data and has been widely used for product quality prediction and production process optimization. However, traditional ISVM learning does not consider the quality of the incremental data, which may contain noise and redundant samples; this affects the learning speed and accuracy to a great extent. In order to improve SVM training speed and accuracy, a modified incremental support vector machine (MISVM) is proposed in this paper. Firstly, the margin vectors are extracted according to the Karush-Kuhn-Tucker (KKT) condition; then the distance from the margin vectors to the final decision hyperplane is calculated to evaluate the importance of the margin vectors, and margin vectors whose distance exceeds a specified value are removed; finally, the original SVs and remaining margin vectors are used to update the SVM. The proposed MISVM can not only eliminate unimportant samples such as noise samples, but also preserve the important ones. The MISVM has been tested on two public datasets and one field dataset of zinc coating weight in strip hot-dip galvanizing, and the results show that the proposed method can improve prediction accuracy and training speed effectively. Furthermore, it can provide the necessary decision support and analysis tools for automatic control of product quality, and can also be extended to other process industries, such as chemical and manufacturing processes.
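
    The pruning step described above can be sketched as follows: keep the current support vectors, score the remaining near-margin samples by their distance to the decision hyperplane, discard those beyond a cutoff, and retrain on what is left. The cutoff value and the unnormalized distance below are simplifying assumptions, not the paper's exact procedure:

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=400, n_features=10, random_state=0)
    svm = SVC(kernel="linear").fit(X, y)

    # |decision_function| is the (unnormalized) distance to the hyperplane.
    dist = np.abs(svm.decision_function(X))

    sv_mask = np.zeros(len(X), dtype=bool)
    sv_mask[svm.support_] = True
    margin_mask = (dist <= 1.2) & ~sv_mask    # near-margin candidates; 1.2 is an assumed cutoff

    keep = sv_mask | margin_mask              # original SVs plus retained margin vectors
    svm_updated = SVC(kernel="linear").fit(X[keep], y[keep])
    print("training set reduced from", len(X), "to", int(keep.sum()), "samples")
    ```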

  16. Benchmarking performance: Environmental impact statements in Egypt

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Badr, El-Sayed A., E-mail: ebadr@mans.edu.e; Zahran, Ashraf A., E-mail: ashraf_zahran@yahoo.co; Cashmore, Matthew, E-mail: m.cashmore@uea.ac.u

    Environmental impact assessment (EIA) was formally introduced in Egypt in 1994. This short paper evaluates 'how well' the EIA process is working in practice in Egypt, by reviewing the quality of 45 environmental impact statements (EISs) produced between 2000 and 2007 for a variety of project types. The Lee and Colley review package was used to assess the quality of the selected EISs. About 69% of the EISs sampled were found to be of a satisfactory quality. An assessment of the performance of different elements of the EIA process indicates that descriptive tasks tend to be performed better than scientific tasks. The quality of core elements of EIA (e.g., impact prediction, significance evaluation, scoping and consideration of alternatives) appears to be particularly problematic. Variables that influence the quality of EISs are identified and a number of broad recommendations are made for improving the effectiveness of the EIA system.

  17. Advanced multivariate data analysis to determine the root cause of trisulfide bond formation in a novel antibody–peptide fusion

    PubMed Central

    Goldrick, Stephen; Holmes, William; Bond, Nicholas J.; Lewis, Gareth; Kuiper, Marcel; Turner, Richard

    2017-01-01

    Product quality heterogeneities, such as a trisulfide bond (TSB) formation, can be influenced by multiple interacting process parameters. Identifying their root cause is a major challenge in biopharmaceutical production. To address this issue, this paper describes the novel application of advanced multivariate data analysis (MVDA) techniques to identify the process parameters influencing TSB formation in a novel recombinant antibody–peptide fusion expressed in mammalian cell culture. The screening dataset was generated with a high-throughput (HT) micro-bioreactor system (Ambr™ 15) using a design of experiments (DoE) approach. The complex dataset was firstly analyzed through the development of a multiple linear regression model focusing solely on the DoE inputs and identified the temperature, pH and initial nutrient feed day as important process parameters influencing this quality attribute. To further scrutinize the dataset, a partial least squares model was subsequently built incorporating both on-line and off-line process parameters and enabled accurate predictions of the TSB concentration at harvest. Process parameters identified by the models to promote and suppress TSB formation were implemented on five 7 L bioreactors and the resultant TSB concentrations were comparable to the model predictions. This study demonstrates the ability of MVDA to enable predictions of the key performance drivers influencing TSB formation that are valid also upon scale-up. Biotechnol. Bioeng. 2017;114: 2222–2234. © 2017 The Authors. Biotechnology and Bioengineering Published by Wiley Periodicals, Inc. PMID:28500668

  18. Thermo-Mechanical Calculations of Hybrid Rotary Friction Welding at Equal Diameter Copper Bars and Effects of Essential Parameters on Dependent Special Variables

    NASA Astrophysics Data System (ADS)

    Parsa, M. H.; Davari, H.; Hadian, A. M.; Ahmadabadi, M. Nili

    2007-05-01

    Hybrid rotary friction welding is a modified type of the common rotary friction welding processes. In this welding method, parameters such as pressure, angular velocity, and welding time control the temperature, stress, strain, and their variations. These dependent factors play an important role in defining optimum combinations of process parameters, in order to improve the design and manufacturing of welding machines and the quality of welded parts. A thermo-mechanical simulation of friction welding has been carried out, and it has been shown that simulation is an important tool for predicting the heat and strain generated at the weld interface, and can be used for predicting the microstructure and evaluating the quality of welds. For the simulation of hybrid rotary friction welding, a commercial finite element program has been used, and the effects of pressure and rotational velocity of the rotating part on temperature and strain variations have been investigated.

  19. Applications of artificial neural networks (ANNs) in food science.

    PubMed

    Huang, Yiqun; Kangas, Lars J; Rasco, Barbara A

    2007-01-01

    Artificial neural networks (ANNs) have been applied in almost every aspect of food science over the past two decades, although most applications are in the development stage. ANNs are useful tools for food safety and quality analyses, which include modeling of microbial growth and from this predicting food safety, interpreting spectroscopic data, and predicting physical, chemical, functional and sensory properties of various food products during processing and distribution. ANNs hold a great deal of promise for modeling complex tasks in process control and simulation and in applications of machine perception including machine vision and electronic nose for food safety and quality control. This review discusses the basic theory of the ANN technology and its applications in food science, providing food scientists and the research community an overview of the current research and future trend of the applications of ANN technology in the field.

  20. Assessment of motor and process skills: assessing client work performance in Belgium.

    PubMed

    Vandamme, Dirk

    2010-01-01

    The aim of this study is to establish whether the Assessment of Motor and Process Skills (AMPS) is an appropriate tool to evaluate the quality of work performance by comparing clients' results on the AMPS with the quality of the skills that they demonstrate on the shop floor. A convenience sample of chronically unemployed (vocationally disabled) participants (N=139) with no formal training who were seeking unskilled work through Jobcentrum West-Vlaanderen (West Flanders Job Centre, Belgium) was used. Results demonstrated that in 75.2% of cases the prediction of employment outcome was correct; it is suggested that an AMPS motor score < 2.5 and a process score < 1.2 is insufficient for regular employment, while a motor score > 3.1 and process score > 1.5 indicates that regular employment is a realistic goal. The quality of the motor skills measured by the AMPS and measured on the shop floor are comparable, but little similarity was found in the measurement of process skills.

  1. Does Class Size in First Grade Relate to Children's Academic and Social Performance or Observed Classroom Processes?

    ERIC Educational Resources Information Center

    Allhusen, Virginia; Belsky, Jay; Booth-LaForce, Cathryn L.; Bradley, Robert; Brownwell, Celia A; Burchinal, Margaret; Campbell, Susan B.; Clarke-Stewart, K. Alison; Cox, Martha; Friedman, Sarah L.; Hirsh-Pasek, Kathryn; Houts, Renate M.; Huston, Aletha; Jaeger, Elizabeth; Johnson, Deborah J.; Kelly, Jean F.; Knoke, Bonnie; Marshall, Nancy; McCartney, Kathleen; Morrison, Frederick J.; O'Brien, Marion; Tresch Owen, Margaret; Payne, Chris; Phillips, Deborah; Pianta, Robert; Randolph, Suzanne M.; Robeson, Wendy W.; Spieker, Susan; Lowe Vandell, Deborah; Weinraub, Marsha

    2004-01-01

    This study evaluated the extent to which first-grade class size predicted child outcomes and observed classroom processes for 651 children (in separate classrooms). Analyses examined observed child-adult ratios and teacher-reported class sizes. Smaller classrooms showed higher quality instructional and emotional support, although children were…

  2. Application of Optical Coherence Tomography Freeze-Drying Microscopy for Designing Lyophilization Process and Its Impact on Process Efficiency and Product Quality.

    PubMed

    Korang-Yeboah, Maxwell; Srinivasan, Charudharshini; Siddiqui, Akhtar; Awotwe-Otoo, David; Cruz, Celia N; Muhammad, Ashraf

    2018-01-01

    Optical coherence tomography freeze-drying microscopy (OCT-FDM) is a novel technique that allows three-dimensional imaging of a drug product during the entire lyophilization process. OCT-FDM consists of a single-vial freeze dryer (SVFD) fitted with an optical coherence tomography (OCT) imaging system. Unlike conventional techniques for predicting the product collapse temperature (Tc), such as modulated differential scanning calorimetry (mDSC) and light transmission freeze-drying microscopy, the OCT-FDM approach seeks to mimic the actual product and process conditions during lyophilization. However, there is limited understanding of the application of this emerging technique to the design of the lyophilization process. In this study, we investigated the suitability of the OCT-FDM technique for designing a lyophilization process. Moreover, we compared the product quality attributes of the resulting lyophilized product manufactured using the Tc, a critical process control parameter, as determined by OCT-FDM versus as estimated by mDSC. OCT-FDM analysis revealed the absence of collapse even for the low protein concentration (5 mg/ml) and low solid content formulation (1% w/v) studied. This was confirmed by lab-scale lyophilization. In addition, lyophilization cycles designed using Tc values obtained from OCT-FDM were more efficient, with a higher sublimation rate and mass flux than the conventional cycles, since drying was conducted at a higher shelf temperature. Finally, the quality attributes of the products lyophilized using Tc determined by OCT-FDM and mDSC were similar, and product shrinkage and cracks were observed in all batches of freeze-dried products irrespective of the technique employed to predict Tc.

  3. Modelling and analysis of ozone concentration by artificial intelligent techniques for estimating air quality

    NASA Astrophysics Data System (ADS)

    Taylan, Osman

    2017-02-01

    High ozone concentration is an important cause of air pollution, mainly due to its role in greenhouse gas emission. Ozone is produced by photochemical processes involving nitrogen oxides and volatile organic compounds in the lower atmosphere. Monitoring and controlling the quality of air in the urban environment is therefore very important for public health. However, air quality prediction is a highly complex and non-linear process, and usually several attributes have to be considered. Artificial intelligence (AI) techniques can be employed to monitor and evaluate the ozone concentration level. The aim of this study is to develop an Adaptive Neuro-Fuzzy Inference System (ANFIS) approach to determine the influence of peripheral factors on air quality and pollution, a growing problem due to the ozone level in Jeddah city. The concentration of ozone was considered as a factor to predict the Air Quality (AQ) under the prevailing atmospheric conditions. Using the Air Quality Standards of Saudi Arabia, the ozone concentration level was modelled using factors such as nitrogen oxides (NOx), atmospheric pressure, temperature, and relative humidity. Hence, an ANFIS model was developed to observe the ozone concentration level, and the model performance was assessed using testing data obtained from the monitoring stations established by the General Authority of Meteorology and Environment Protection of the Kingdom of Saudi Arabia. The outcomes of the ANFIS model were re-assessed with fuzzy quality charts using quality specification and control limits based on US-EPA air quality standards. The results of the present study show that the ANFIS model is a comprehensive approach to the estimation and assessment of ozone levels and a reliable approach to produce trustworthy outcomes.

  4. Ecophysiology of avian migration in the face of current global hazards

    PubMed Central

    Klaassen, Marcel; Hoye, Bethany J.; Nolet, Bart A.; Buttemer, William A.

    2012-01-01

    Long-distance migratory birds are often considered extreme athletes, possessing a range of traits that approach the physiological limits of vertebrate design. In addition, their movements must be carefully timed to ensure that they obtain resources of sufficient quantity and quality to satisfy their high-energy needs. Migratory birds may therefore be particularly vulnerable to global change processes that are projected to alter the quality and quantity of resource availability. Because long-distance flight requires high and sustained aerobic capacity, even minor decreases in vitality can have large negative consequences for migrants. In the light of this, we assess how current global change processes may affect the ability of birds to meet the physiological demands of migration, and suggest areas where avian physiologists may help to identify potential hazards. Predicting the consequences of global change scenarios on migrant species requires (i) reconciliation of empirical and theoretical studies of avian flight physiology; (ii) an understanding of the effects of food quality, toxicants and disease on migrant performance; and (iii) mechanistic models that integrate abiotic and biotic factors to predict migratory behaviour. Critically, a multi-dimensional concept of vitality would greatly facilitate evaluation of the impact of various global change processes on the population dynamics of migratory birds. PMID:22566678

  5. Quality assessment of butter cookies applying multispectral imaging

    PubMed Central

    Andresen, Mette S; Dissing, Bjørn S; Løje, Hanne

    2013-01-01

    A method for characterization of butter cookie quality by assessing the surface browning and water content using multispectral images is presented. Based on evaluations of the browning of butter cookies, cookies were manually divided into groups. From this categorization, reference values were calculated for a statistical prediction model correlating multispectral images with a browning score. The browning score is calculated as a function of oven temperature and baking time. It is presented as a quadratic response surface. The investigated process window was the intervals 4–16 min and 160–200°C in a forced convection electrically heated oven. In addition to the browning score, a model for predicting the average water content based on the same images is presented. This shows how multispectral images of butter cookies may be used for the assessment of different quality parameters. Statistical analysis showed that the most significant wavelengths for browning predictions were in the interval 400–700 nm and the wavelengths significant for water prediction were primarily located in the near-infrared spectrum. The water prediction model was found to correctly estimate the average water content with an absolute error of 0.22%. From the images it was also possible to follow the browning and drying propagation from the cookie edge toward the center. PMID:24804036
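
    The browning-score surface is a standard full quadratic in temperature T and time t; a minimal least-squares fit over the stated process window (the data and coefficients below are synthetic, not the paper's):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    T = rng.uniform(160, 200, 60)    # oven temperature (deg C)
    t = rng.uniform(4, 16, 60)       # baking time (min)
    score = 0.002 * (T - 160) * t + 0.01 * t**2 + rng.normal(0, 0.1, 60)  # synthetic browning score

    # Design matrix of the full quadratic model: 1, T, t, T^2, t^2, T*t
    A = np.column_stack([np.ones_like(T), T, t, T**2, t**2, T * t])
    coef, *_ = np.linalg.lstsq(A, score, rcond=None)
    print("response-surface coefficients:", np.round(coef, 5))
    ```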

  6. Adjusted hospital death rates: a potential screen for quality of medical care.

    PubMed

    Dubois, R W; Brook, R H; Rogers, W H

    1987-09-01

    Increased economic pressure on hospitals has accelerated the need to develop a screening tool for identifying hospitals that potentially provide poor quality care. Based upon data from 93 hospitals and 205,000 admissions, we used a multiple regression model to adjust the hospitals' crude death rates. The adjustment process used age, origin of the patient from the emergency department or nursing home, and a hospital case mix index based on DRGs (diagnosis related groups). Before adjustment, hospital death rates ranged from 0.3 to 5.8 per 100 admissions. After adjustment, hospital death ratios (actual death rate divided by predicted death rate) ranged from 0.36 to 1.36. Eleven hospitals (12 per cent) were identified where the actual death rate exceeded the predicted death rate by more than two standard deviations. In nine hospitals (10 per cent), the predicted death rate exceeded the actual death rate by a similar statistical margin. The 11 hospitals with higher than predicted death rates may provide inadequate quality of care or have uniquely ill patient populations. The adjusted death rate model needs to be validated and generalized before it can be used routinely to screen hospitals. However, the remaining large differences in observed versus predicted death rates lead us to believe that important differences in hospital performance may exist.
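
    A minimal version of the adjustment: fit a regression of death on the case-mix variables, sum the predicted deaths per hospital, and flag hospitals whose actual-to-expected ratio departs far from 1. A logistic model on synthetic data stands in here for the paper's multiple regression:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n = 20000
    adm = pd.DataFrame({
        "age": rng.uniform(20, 90, n),
        "from_ed": rng.integers(0, 2, n),
        "from_nursing_home": rng.integers(0, 2, n),
        "case_mix": rng.normal(1.0, 0.2, n),
        "hospital": rng.integers(0, 93, n),
    })
    lp = (-7 + 0.05 * adm["age"] + 0.5 * adm["from_ed"]
          + 0.7 * adm["from_nursing_home"] + adm["case_mix"])
    adm["died"] = (rng.random(n) < 1 / (1 + np.exp(-lp))).astype(int)

    fit = smf.logit("died ~ age + from_ed + from_nursing_home + case_mix", data=adm).fit()
    adm["p_death"] = fit.predict(adm)

    grp = adm.groupby("hospital").agg(actual=("died", "sum"), expected=("p_death", "sum"))
    grp["ratio"] = grp["actual"] / grp["expected"]   # ratios well above 1 flag outliers
    print(grp.sort_values("ratio", ascending=False).head())
    ```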

  7. Construction of quality-assured infant feeding process of care data repositories: definition and design (Part 1).

    PubMed

    García-de-León-Chocano, Ricardo; Sáez, Carlos; Muñoz-Soler, Verónica; García-de-León-González, Ricardo; García-Gómez, Juan M

    2015-12-01

    This is the first paper of a series of two regarding the construction of data quality (DQ) assured repositories for the reuse of information on infant feeding from birth until two years old. This first paper justifies the need for such repositories and describes the design of a process to construct them from Electronic Health Records (EHR). As a result, Part 1 proposes a computational process to obtain quality-assured datasets represented by a canonical structure extracted from raw data from multiple EHR. For this, 13 steps were defined to ensure the harmonization, standardization, completion, de-duplication, and consistency of the dataset content. Moreover, the quality of the input and output data for each of these steps is controlled according to eight DQ dimensions: predictive value, correctness, duplication, consistency, completeness, contextualization, temporal-stability and spatial-stability. The second paper of the series will describe the application of this computational process to construct the first quality-assured repository for the reuse of information on infant feeding in the perinatal period aimed at the monitoring of clinical activities and research. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Modelling of beef sensory quality for a better prediction of palatability.

    PubMed

    Hocquette, Jean-François; Van Wezemael, Lynn; Chriki, Sghaier; Legrand, Isabelle; Verbeke, Wim; Farmer, Linda; Scollan, Nigel D; Polkinghorne, Rod; Rødbotten, Rune; Allen, Paul; Pethick, David W

    2014-07-01

    Despite efforts by the industry to control the eating quality of beef, there remains a high level of variability in palatability, which is one reason for consumer dissatisfaction. In Europe, there is still no reliable on-line tool to predict beef quality and deliver consistent quality beef to consumers. Beef quality traits depend in part on the physical and chemical properties of the muscles. The determination of these properties (known as muscle profiling) will allow for more informed decisions to be made in the selection of individual muscles for the production of value-added products. Therefore, scientists and professional partners of the ProSafeBeef project have brought together all the data they have accumulated over 20 years. The resulting BIF-Beef (Integrated and Functional Biology of Beef) data warehouse contains available data on animal growth, carcass composition, muscle tissue characteristics and beef quality traits. This database is useful for determining the most important muscle characteristics associated with high tenderness, high flavour or generally high quality. Another, more consumer-driven modelling tool was developed in Australia: the Meat Standards Australia (MSA) grading scheme, which predicts beef quality for each individual muscle × specific cooking method combination using various information on the corresponding animals and post-slaughter processing factors. This system also has the potential to detect variability in quality within muscles. The MSA system proved to be effective in predicting beef palatability not only in Australia but also in many other countries. The results of the work conducted in Europe within the ProSafeBeef project indicate that it would be possible to manage a grading system in Europe similar to the MSA system. The combination of the different modelling approaches (namely muscle biochemistry and an MSA-like meat grading system adapted to the European market) is a promising area of research to improve the prediction of beef quality. In both approaches, the volume of data available not only provides statistically sound correlations between various factors and beef quality traits but also a better understanding of the variability of beef quality according to various criteria (breed, age, sex, pH, marbling, etc.). © 2013 The American Meat Science Association. All rights reserved.

  9. Hyperspectral Imaging for Predicting the Internal Quality of Kiwifruits Based on Variable Selection Algorithms and Chemometric Models.

    PubMed

    Zhu, Hongyan; Chu, Bingquan; Fan, Yangyang; Tao, Xiaoya; Yin, Wenxin; He, Yong

    2017-08-10

    We investigated the feasibility and potentiality of determining firmness, soluble solids content (SSC), and pH in kiwifruits using hyperspectral imaging, combined with variable selection methods and calibration models. The images were acquired by a push-broom hyperspectral reflectance imaging system covering two spectral ranges. Weighted regression coefficients (BW), the successive projections algorithm (SPA) and genetic algorithm-partial least squares (GAPLS) were compared and evaluated for the selection of effective wavelengths. Moreover, multiple linear regression (MLR), partial least squares regression and least squares support vector machine (LS-SVM) models were developed to predict quality attributes quantitatively using the effective wavelengths. The established models, particularly SPA-MLR, SPA-LS-SVM and GAPLS-LS-SVM, performed well. The SPA-MLR models for firmness (R_pre = 0.9812, RPD = 5.17) and SSC (R_pre = 0.9523, RPD = 3.26) at 380-1023 nm showed excellent performance, whereas GAPLS-LS-SVM was the optimal model at 874-1734 nm for predicting pH (R_pre = 0.9070, RPD = 2.60). Image processing algorithms were developed to apply the predictive model to every pixel, generating prediction maps that visualize the spatial distribution of firmness and SSC. Hence, the results clearly demonstrated that hyperspectral imaging has the potential as a fast and non-invasive method to predict the quality attributes of kiwifruits.

  10. Automated calibration of the Suomi National Polar-Orbiting Partnership (S-NPP) Visible Infrared Imaging Radiometer Suite (VIIRS) reflective solar bands

    NASA Astrophysics Data System (ADS)

    Rausch, Kameron; Houchin, Scott; Cardema, Jason; Moy, Gabriel; Haas, Evan; De Luccia, Frank J.

    2013-12-01

    Suomi National Polar-Orbiting Partnership (S-NPP) Visible Infrared Imaging Radiometer Suite (VIIRS) reflective bands are currently calibrated via weekly updates to look-up tables (LUTs) utilized by operational ground processing in the Joint Polar Satellite System Interface Data Processing Segment (IDPS). The parameters in these LUTs must be predicted 2 weeks ahead and cannot adequately track the dynamically varying response characteristics of the instrument. As a result, spurious "predict-ahead" calibration errors of the order of 0.1% or greater are routinely introduced into the calibrated reflectances and radiances produced by IDPS in sensor data records (SDRs). Spurious calibration errors of this magnitude adversely impact the quality of downstream environmental data records (EDRs) derived from VIIRS SDRs, such as Ocean Color/Chlorophyll, and cause increased striping and band-to-band radiometric calibration uncertainty of SDR products. A novel algorithm that fully automates reflective band calibration has been developed for implementation in IDPS in late 2013. Automating the reflective solar band (RSB) calibration is extremely challenging and represents a significant advancement over the manner in which RSB calibration has traditionally been performed in heritage instruments such as the Moderate Resolution Imaging Spectroradiometer. The automated algorithm applies calibration data almost immediately after their acquisition by the instrument from views of space and onboard calibration sources, thereby eliminating the predict-ahead errors associated with the current offline calibration process. This new algorithm, when implemented, will significantly improve the quality of VIIRS reflective band SDRs and consequently the quality of EDRs produced from these SDRs.

  11. Motivational processes and well-being in cardiac rehabilitation: a self-determination theory perspective.

    PubMed

    Rahman, Rachel Jane; Hudson, Joanne; Thøgersen-Ntoumani, Cecilie; Doust, Jonathan H

    2015-01-01

    This research examined the processes underpinning changes in psychological well-being and behavioural regulation in cardiac rehabilitation (CR) patients using self-determination theory (SDT). A repeated measures design was used to identify the longitudinal relationships between SDT variables, psychological well-being and exercise behaviour during and following a structured CR programme. Participants were 389 cardiac patients (aged 36-84 years; M(age) = 64 ± 9 years; 34.3% female) referred to a 12-week supervised CR programme. Psychological need satisfaction, behavioural regulation, health-related quality of life, physical self-worth, anxiety and depression were measured at programme entry, exit and six months post-programme. During the programme, increases in autonomy satisfaction predicted positive changes in behavioural regulation, and improvements in competence and relatedness satisfaction predicted improvements in behavioural regulation and well-being. Competence satisfaction also positively predicted habitual physical activity. Decreases in external regulation and increases in intrinsic motivation predicted improvements in physical self-worth and physical well-being, respectively. Significant longitudinal relationships were identified whereby changes during the programme predicted changes in habitual physical activity and the mental quality of life from exit to six-month follow-up. The findings provide insight into the factors explaining the psychological changes seen during CR. They highlight the importance of increasing patients' perceptions of psychological need satisfaction and self-determined motivation to improve well-being during the structured component of a CR programme and longer-term physical activity.

  12. On-line prediction of yield grade, longissimus muscle area, preliminary yield grade, adjusted preliminary yield grade, and marbling score using the MARC beef carcass image analysis system.

    PubMed

    Shackelford, S D; Wheeler, T L; Koohmaraie, M

    2003-01-01

    The present experiment was conducted to evaluate the ability of the U.S. Meat Animal Research Center's beef carcass image analysis system to predict calculated yield grade, longissimus muscle area, preliminary yield grade, adjusted preliminary yield grade, and marbling score under commercial beef processing conditions. In two commercial beef-processing facilities, image analysis was conducted on 800 carcasses on the beef-grading chain immediately after the conventional USDA beef quality and yield grades were applied. Carcasses were blocked by plant and observed calculated yield grade. The carcasses were then separated, with 400 carcasses assigned to a calibration data set that was used to develop regression equations, and the remaining 400 carcasses assigned to a prediction data set used to validate the regression equations. Prediction equations, which included image analysis variables and hot carcass weight, accounted for 90, 88, 90, 88, and 76% of the variation in calculated yield grade, longissimus muscle area, preliminary yield grade, adjusted preliminary yield grade, and marbling score, respectively, in the prediction data set. In comparison, the official USDA yield grade as applied by online graders accounted for 73% of the variation in calculated yield grade. The technology described herein could be used by the beef industry to more accurately determine beef yield grades; however, this system does not provide an accurate enough prediction of marbling score to be used without USDA grader interaction for USDA quality grading.

  13. PICASSO: an end-to-end image simulation tool for space and airborne imaging systems

    NASA Astrophysics Data System (ADS)

    Cota, Steve A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Chris J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.

    2008-08-01

    The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.

  14. PICASSO: an end-to-end image simulation tool for space and airborne imaging systems

    NASA Astrophysics Data System (ADS)

    Cota, Stephen A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Christopher J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.

    2010-06-01

    The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.

  15. [The structure of interaction in romantic relationships: hierarchical data analysis of inter-subjectivity between partners].

    PubMed

    Shimizu, Hiroshi; Daibo, Ikuo

    2008-02-01

    A hierarchical data analysis was conducted using data from couples to examine how self-reports of interactions between partners in romantic relationships predict the quality of the relationships. Whereas social exchange theory has explained relationship quality at the individual level of subjectivity, this study focused on the structure of interactions between the partners (i.e., their frequency, strength, and diversity) through a process of inter-subjectivity at the couple level. A multilevel covariance structure analysis of 194 university students involved in romantic relationships revealed that relationship quality was mainly related to the strength and diversity of interactions at the couple level, rather than the strength of interactions at the individual level. These results indicate that the inter-subjective process in romantic relationships may primarily explain the quality of relationships.

  16. Monitoring the quality of welding based on welding current and STE analysis

    NASA Astrophysics Data System (ADS)

    Mazlan, Afidatusshimah; Daniyal, Hamdan; Izzani Mohamed, Amir; Ishak, Mahadzir; Hadi, Amran Abdul

    2017-10-01

    Weld quality plays an important part in industry, especially in the manufacturing field. Post-weld non-destructive testing is one of the key processes for ensuring weld quality, but it is time-consuming and costly. To reduce the chance of defects, online monitoring has been utilized to continuously sense selected welding parameters and predict welding quality. One of these parameters is the welding current, which is rich in information, yet few studies have focused on extracting that information at the signal-analysis level. This paper presents an analysis of welding current using Short Time Energy (STE) signal processing to quantify the pattern of the current. A GMAW setup with carbon steel specimens was used in this experimental study, with a high-bandwidth, high-sampling-rate oscilloscope capturing the welding current. The results indicate that welding-current signatures correlate strongly with the welding process. In the subsequent STE analysis, STE values below 5000 indicated good welds, while values above 6000 indicated welds containing defects.
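
    A minimal sketch of the short-time energy (STE) computation: square the sampled welding current and sum it over a sliding window. The window length and the signal itself are hypothetical; only the 5000/6000 decision thresholds come from the abstract.

```python
import numpy as np

def short_time_energy(signal, win=256, hop=128):
    """Sum of squared samples in each (possibly overlapping) window."""
    return np.array([np.sum(signal[i:i + win] ** 2)
                     for i in range(0, len(signal) - win + 1, hop)])

current = np.random.default_rng(1).normal(scale=50.0, size=10_000)  # stand-in for sampled current
ste = short_time_energy(current)
label = np.where(ste < 5000, "good", np.where(ste > 6000, "defect", "uncertain"))
```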

  17. Regional Modelling of Air Quality in the Canadian Arctic: Impact of marine shipping and North American wild fire emissions

    NASA Astrophysics Data System (ADS)

    Gong, W.; Beagley, S. R.; Zhang, J.; Cousineau, S.; Sassi, M.; Munoz-Alpizar, R.; Racine, J.; Menard, S.; Chen, J.

    2015-12-01

    Arctic atmospheric composition is strongly influenced by long-range transport from mid-latitudes as well as by processes occurring locally in the Arctic. Using the on-line air quality prediction model GEM-MACH, simulations were carried out for the 2010 northern shipping season (April - October) over a regional Arctic domain. North American wildfire emissions and Arctic shipping emissions were represented, along with other anthropogenic and biogenic emissions. Sensitivity studies were carried out to investigate the principal sources and processes affecting air quality in the Canadian Northern and Arctic regions. In this paper, we present an analysis of the effects of sources, transport, and removal processes on the ambient concentrations and atmospheric loading of various pollutants with air quality and climate implications, such as O3, NOx, SO2, CO, and aerosols (sulfate, black carbon, and organic carbon components). Preliminary results from a model simulation of a recent summertime Arctic field campaign will also be presented.

  18. Near-infrared hyperspectral imaging for quality analysis of agricultural and food products

    NASA Astrophysics Data System (ADS)

    Singh, C. B.; Jayas, D. S.; Paliwal, J.; White, N. D. G.

    2010-04-01

    Agricultural and food processing industries are always looking to implement real-time quality monitoring techniques as a part of good manufacturing practices (GMPs) to ensure the high quality and safety of their products. Near-infrared (NIR) hyperspectral imaging is gaining popularity as a powerful non-destructive tool for quality analysis of several agricultural and food products. This technique can analyse spectral data in a spatially resolved manner (i.e., each pixel in the image has its own spectrum) by applying both conventional image processing and the chemometric tools used in spectral analyses. Hyperspectral imaging has demonstrated potential in detecting defects and contaminants in meats, fruits, cereals, and processed food products. This paper discusses the methodology of hyperspectral imaging in terms of hardware, software, calibration, data acquisition and compression, and the development of prediction and classification algorithms, and it presents a thorough review of the current applications of hyperspectral imaging in the analyses of agricultural and food products.
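
    The "spectrum per pixel" idea can be sketched as follows: unfold a hypercube of shape (rows, cols, bands) into a pixel-by-band matrix, apply a standard chemometric step (PCA here, as one common choice), and fold the scores back into images. The cube is synthetic; real data would come from an NIR camera.

```python
import numpy as np
from sklearn.decomposition import PCA

cube = np.random.default_rng(2).random((64, 64, 120))   # rows x cols x bands
pixels = cube.reshape(-1, cube.shape[2])                # one spectrum per row

scores = PCA(n_components=3).fit_transform(pixels)      # chemometrics on the spectra
score_images = scores.reshape(64, 64, 3)                # spatially resolved scores
```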

  19. Mechanisms of deterioration of nutrients. [retention of flavor during freeze drying

    NASA Technical Reports Server (NTRS)

    Karel, M.; Flink, J. M.

    1975-01-01

    The retention of flavor during freeze drying was studied with model systems. Mechanisms explaining the flavor retention phenomena were developed, and process conditions were specified so that flavor retention is optimized. The literature is reviewed, and the results of studies of the flavor retention behavior of a number of real food products, including both liquid and solid foods, are evaluated. Process parameters predicted by the mechanisms to be of greatest significance are freezing rate, initial solids content, and conditions that maintain sample structure. Flavor quality for the real foods showed the same behavior relative to process conditions as predicted by the mechanisms based on model-system studies.

  20. Robust PLS approach for KPI-related prediction and diagnosis against outliers and missing data

    NASA Astrophysics Data System (ADS)

    Yin, Shen; Wang, Guang; Yang, Xu

    2014-07-01

    In practical industrial applications, key performance indicator (KPI)-related prediction and diagnosis are quite important for product quality and economic benefits. To meet these requirements, many advanced prediction and monitoring approaches have been developed, which can be classified into model-based and data-driven techniques. Among these approaches, partial least squares (PLS) is one of the most popular data-driven methods due to its simplicity and easy implementation in large-scale industrial processes. As PLS is based entirely on measured process data, the characteristics of the process data are critical to its success. Outliers and missing values are two common characteristics of measured data that can severely affect the effectiveness of PLS. To ensure the applicability of PLS in practical industrial applications, this paper introduces a robust version of PLS that deals with outliers and missing values simultaneously. The effectiveness of the proposed method is demonstrated by application results for KPI-related prediction and diagnosis on the industrial benchmark of the Tennessee Eastman process.
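
    The paper's robust PLS algorithm is not reproduced here; as a point of contrast, the sketch below shows the naive baseline it improves upon, i.e. mean-imputing missing values and clipping outliers before an ordinary PLS fit. All data are synthetic.

```python
# Naive baseline only -- NOT the robust PLS of Yin et al.
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 10))
y = X[:, :3].sum(axis=1, keepdims=True) + rng.normal(scale=0.1, size=(200, 1))
X[rng.random(X.shape) < 0.05] = np.nan                 # inject missing values

X_filled = SimpleImputer(strategy="mean").fit_transform(X)
X_clipped = np.clip(X_filled, *np.percentile(X_filled, [1, 99]))  # crude outlier damping
pls = PLSRegression(n_components=3).fit(X_clipped, y)
```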

  1. Molecular pathological predictive diagnostics in a patient with non-small cell lung cancer treated with crizotinib therapy: A case report.

    PubMed

    Stanek, Libor; Springer, Drahomira; Konopasek, Bohuslav; Vocka, Michal; Tesarova, Petra; Syrucek, Martin; Petruzelka, Lubos; Vicha, Ales; Musil, Zdenek

    2017-12-01

    Lung cancer is one of the most common malignant cancers among men in the Czech Republic, with the highest mortality rate of all malignant diseases. The development of biological treatment has opened the study of novel personalized treatment options. This type of treatment is typically of high quality but places heavy demands on predictive and biopsy diagnostics, which depend on the quality of the collected material and close cooperation among the departments involved. The present study describes the complete biopsy and predictive examinations performed in a male patient with lung adenocarcinoma, with an emphasis on the logistics of the whole process and the application of the tyrosine kinase inhibitors crizotinib and LDK378. The patient experienced a long overall survival time of 28 months from diagnosis.

  2. Natural language processing in an intelligent writing strategy tutoring system.

    PubMed

    McNamara, Danielle S; Crossley, Scott A; Roscoe, Rod

    2013-06-01

    The Writing Pal is an intelligent tutoring system that provides writing strategy training. A large part of its artificial intelligence resides in the natural language processing algorithms to assess essay quality and guide feedback to students. Because writing is often highly nuanced and subjective, the development of these algorithms must consider a broad array of linguistic, rhetorical, and contextual features. This study assesses the potential for computational indices to predict human ratings of essay quality. Past studies have demonstrated that linguistic indices related to lexical diversity, word frequency, and syntactic complexity are significant predictors of human judgments of essay quality but that indices of cohesion are not. The present study extends prior work by including a larger data sample and an expanded set of indices to assess new lexical, syntactic, cohesion, rhetorical, and reading ease indices. Three models were assessed. The model reported by McNamara, Crossley, and McCarthy (Written Communication 27:57-86, 2010) including three indices of lexical diversity, word frequency, and syntactic complexity accounted for only 6% of the variance in the larger data set. A regression model including the full set of indices examined in prior studies of writing predicted 38% of the variance in human scores of essay quality with 91% adjacent accuracy (i.e., within 1 point). A regression model that also included new indices related to rhetoric and cohesion predicted 44% of the variance with 94% adjacent accuracy. The new indices increased accuracy but, more importantly, afford the means to provide more meaningful feedback in the context of a writing tutoring system.
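
    A minimal sketch of the evaluation described above: regress human essay scores on linguistic indices, then report R² and adjacent accuracy (predictions within 1 point). The index matrix and score scale are synthetic stand-ins for Coh-Metrix-style features.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
indices = rng.normal(size=(300, 8))          # lexical/syntactic/cohesion indices (synthetic)
scores = np.clip(np.round(3 + indices[:, 0] + rng.normal(scale=0.8, size=300)), 1, 6)

model = LinearRegression().fit(indices, scores)
pred = model.predict(indices)
print("R^2:", model.score(indices, scores))
print("adjacent accuracy:", np.mean(np.abs(np.round(pred) - scores) <= 1))
```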

  3. Processes Understanding of Decadal Climate Variability

    NASA Astrophysics Data System (ADS)

    Prömmel, Kerstin; Cubasch, Ulrich

    2016-04-01

    The realistic representation of decadal climate variability in models is essential for the quality of decadal climate predictions. Therefore, the understanding of the processes leading to decadal climate variability needs to be improved. Several of these processes are already included in climate models, but their importance has not yet been fully clarified. Simulating other processes sometimes requires a higher model resolution or an extension with additional subsystems. This is addressed within one module of the German research program "MiKlip II - Decadal Climate Predictions" (http://www.fona-miklip.de/en/), with a focus on the following processes. Stratospheric processes and their impact on the troposphere are analysed regarding the climate response to aerosol perturbations caused by volcanic eruptions and the stratospheric decadal variability due to solar forcing, climate change and ozone recovery. To account for the interaction between changing ozone concentrations and climate, a computationally efficient ozone chemistry module is developed and implemented in the MiKlip prediction system. The ocean variability and air-sea interaction are analysed with a special focus on the reduction of the North Atlantic cold bias. In addition, the predictability of the oceanic carbon uptake, with a special emphasis on the underlying mechanism, is investigated. This addresses a combination of physical, biological and chemical processes.

  4. The Interaction of Spacecraft Cabin Atmospheric Quality and Water Processing System Performance

    NASA Technical Reports Server (NTRS)

    Perry, Jay L.; Croomes, Scott D. (Technical Monitor)

    2002-01-01

    Although designed to remove organic contaminants from a variety of waste water streams, the planned U.S.- and present Russian-provided water processing systems onboard the International Space Station (ISS) have capacity limits for some of the more common volatile cleaning solvents used for housekeeping purposes. Using large quantities of volatile cleaning solvents during the ground processing and in-flight operational phases of a crewed spacecraft such as the ISS can lead to significant challenges to the water processing systems. To understand the challenges facing the management of water processing capacity, the relationship between cabin atmospheric quality and humidity condensate loading is presented. This relationship is developed as a tool to determine the cabin atmospheric loading that may compromise water processing system performance. A comparison of cabin atmospheric loading with volatile cleaning solvents from ISS, Mir, and Shuttle are presented to predict acceptable limits to maintain optimal water processing system performance.

  5. Exploiting mAb structure characteristics for a directed QbD implementation in early process development.

    PubMed

    Karlberg, Micael; von Stosch, Moritz; Glassey, Jarka

    2018-03-07

    In today's biopharmaceutical industries, the lead time to develop and produce a new monoclonal antibody takes years before it can be launched commercially. The reasons lie in the complexity of monoclonal antibodies and the need for high product quality to ensure clinical safety, both of which have a significant impact on process development time. Frameworks such as quality by design are becoming widely used by the pharmaceutical industries as they introduce a systematic approach for building quality into the product. However, full implementation of quality by design has still not been achieved, mainly due to limited risk assessment of product properties as well as the large number of process factors affecting product quality that need to be investigated during process development. This has introduced a need for better methods and tools for early risk assessment and prediction of critical product properties and process factors to enhance process development and reduce costs. In this review, we investigate how the quantitative structure-activity relationships framework can be applied to an existing process development framework such as quality by design in order to increase product understanding based on the protein structure of monoclonal antibodies. Compared to quality by design, where the effects of process parameters on the drug product are explored, quantitative structure-activity relationships gives a reversed perspective, investigating how the protein structure can affect performance in different unit operations. This provides valuable information that can be used during the early process development of new drug products, where limited process understanding is available. Thus, the quantitative structure-activity relationships methodology is explored and explained in detail, and we investigate the means of directly linking the structural properties of monoclonal antibodies to process data. The resulting information, used as a decision tool, can help to enhance risk assessment and thereby better aid process development, overcoming some of the limitations and challenges present in QbD implementation today.

  6. Transforming nanomedicine manufacturing toward Quality by Design and microfluidics.

    PubMed

    Colombo, Stefano; Beck-Broichsitter, Moritz; Bøtker, Johan Peter; Malmsten, Martin; Rantanen, Jukka; Bohr, Adam

    2018-04-05

    Nanopharmaceuticals aim at translating the unique features of nano-scale materials into therapeutic products and consequently their development relies critically on the progression in manufacturing technology to allow scalable processes complying with process economy and quality assurance. The relatively high failure rate in translational nanopharmaceutical research and development, with respect to new products on the market, is at least partly due to immature bottom-up manufacturing development and resulting sub-optimal control of quality attributes in nanopharmaceuticals. Recently, quality-oriented manufacturing of pharmaceuticals has undergone an unprecedented change toward process and product development interaction. In this context, Quality by Design (QbD) aims to integrate product and process development resulting in an increased number of product applications to regulatory agencies and stronger proprietary defense strategies of process-based products. Although QbD can be applied to essentially any production approach, microfluidic production offers particular opportunities for QbD-based manufacturing of nanopharmaceuticals. Microfluidics provides unique design flexibility, process control and parameter predictability, and also offers ample opportunities for modular production setups, allowing process feedback for continuously operating production and process control. The present review aims at outlining emerging opportunities in the synergistic implementation of QbD strategies and microfluidic production in contemporary development and manufacturing of nanopharmaceuticals. In doing so, aspects of design and development, but also technology management, are reviewed, as is the strategic role of these tools for aligning nanopharmaceutical innovation, development, and advanced industrialization in the broader pharmaceutical field. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Real-time parameter optimization based on neural network for smart injection molding

    NASA Astrophysics Data System (ADS)

    Lee, H.; Liau, Y.; Ryu, K.

    2018-03-01

    The manufacturing industry has been facing several challenges, including sustainability, performance and quality of production. Manufacturers attempt to enhance the competitiveness of companies by implementing CPS (Cyber-Physical Systems) through the convergence of IoT (Internet of Things) and ICT (Information & Communication Technology) at the manufacturing process level. The injection molding process has a short cycle time and high productivity, features that make it suitable for mass production. In addition, this process is used to produce precise parts in various industry fields such as automobiles, optics and medical devices. Injection molding involves a mixture of discrete and continuous variables, and to optimize quality, the variables generated in the injection molding process must be considered. Furthermore, optimal parameter setting is time-consuming work, since process parameters cannot be easily corrected during process execution. In this research, we propose a neural-network-based real-time process parameter optimization methodology that sets optimal process parameters by using mold data, molding machine data, and response data. This paper is expected to make an academic contribution as a novel study of parameter optimization during production, in contrast to the pre-production parameter optimization of typical studies.
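
    The core loop can be sketched as a surrogate-model search: train a neural network mapping process settings to a quality measure, then scan candidate settings for the best predicted quality. The parameter names and ranges below are hypothetical; the paper's system also ingests mold and machine data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
lo, hi = [180, 40, 10], [240, 120, 60]                 # melt T, pack pressure, cool time (assumed)
settings = rng.uniform(lo, hi, size=(500, 3))
quality = (-(settings[:, 0] - 210) ** 2 / 100
           - (settings[:, 1] - 80) ** 2 / 50
           + rng.normal(scale=0.5, size=500))          # synthetic quality response

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000).fit(settings, quality)
candidates = rng.uniform(lo, hi, size=(10_000, 3))
best = candidates[np.argmax(surrogate.predict(candidates))]   # recommended setting
```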

  8. Statistical modeling methods to analyze the impacts of multiunit process variability on critical quality attributes of Chinese herbal medicine tablets.

    PubMed

    Sun, Fei; Xu, Bing; Zhang, Yi; Dai, Shengyun; Yang, Chan; Cui, Xianglong; Shi, Xinyuan; Qiao, Yanjiang

    2016-01-01

    The quality of Chinese herbal medicine tablets suffers from batch-to-batch variability due to a lack of manufacturing process understanding. In this paper, the Panax notoginseng saponins (PNS) immediate release tablet was taken as the research subject. By defining the dissolution of five active pharmaceutical ingredients and the tablet tensile strength as critical quality attributes (CQAs), influences of both the manipulated process parameters introduced by an orthogonal experiment design and the intermediate granules' properties on the CQAs were fully investigated by different chemometric methods, such as the partial least squares, the orthogonal projection to latent structures, and the multiblock partial least squares (MBPLS). By analyzing the loadings plots and variable importance in the projection indexes, the granule particle sizes and the minimal punch tip separation distance in tableting were identified as critical process parameters. Additionally, the MBPLS model suggested that the lubrication time in the final blending was also important in predicting tablet quality attributes. From the calculated block importance in the projection indexes, the tableting unit was confirmed to be the critical process unit of the manufacturing line. The results demonstrated that the combinatorial use of different multivariate modeling methods could help in understanding the complex process relationships as a whole. The output of this study can then be used to define a control strategy to improve the quality of the PNS immediate release tablet.
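
    The variable importance in the projection (VIP) index mentioned above is typically computed from the weights, scores, and y-loadings of a fitted PLS model, as in this sketch (synthetic data; not the paper's code):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def vip_scores(pls):
    """Variable importance in projection for a fitted PLSRegression."""
    t, w, q = pls.x_scores_, pls.x_weights_, pls.y_loadings_
    p, _ = w.shape
    ssy = np.sum(t ** 2, axis=0) * q.ravel() ** 2      # y-variance explained per component
    wnorm = w / np.linalg.norm(w, axis=0)
    return np.sqrt(p * (wnorm ** 2 @ ssy) / ssy.sum())

rng = np.random.default_rng(6)
X = rng.normal(size=(40, 12))                          # granule/process descriptors (synthetic)
y = X[:, [0, 5]].sum(axis=1, keepdims=True) + rng.normal(scale=0.2, size=(40, 1))
pls = PLSRegression(n_components=3).fit(X, y)
print(vip_scores(pls))                                 # VIP > 1 is a common importance cut-off
```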

  9. High Throughput Multispectral Image Processing with Applications in Food Science.

    PubMed

    Tsakanikas, Panagiotis; Pavlidis, Dimitris; Nychas, George-John

    2015-01-01

    Recently, machine vision has been gaining attention in food science as well as in the food industry concerning food quality assessment and monitoring. Within the framework of implementing Process Analytical Technology (PAT) in the food industry, image processing can be used not only in the estimation and even prediction of food quality but also in the detection of adulteration. Towards these applications in food science, we present here a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème and table olives, so as to increase objectivity and data reproducibility and to enable low-cost information extraction and faster quality assessment without human intervention. The outcome of the image processing is propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection for segmentation process optimization. Through the evaluation we demonstrate its efficiency and robustness against the currently available semi-manual software, showing that the developed method is a high-throughput approach appropriate for massive data extraction from food samples.
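
    A minimal sketch of GMM-based segmentation of a multispectral image: cluster the pixel spectra with a Gaussian mixture and reshape the labels into a segmentation map. The paper's unsupervised band-selection scheme is omitted; a fixed band subset stands in for it.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

cube = np.random.default_rng(7).random((100, 100, 18))     # synthetic 18-band image
bands = [2, 7, 11]                                         # stand-in for the selected bands
pixels = cube[:, :, bands].reshape(-1, len(bands))

labels = GaussianMixture(n_components=3, random_state=0).fit_predict(pixels)
segmentation = labels.reshape(100, 100)                    # e.g. background / product / defect
```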

  10. Use of a Process Analysis Tool for Diagnostic Study on Fine Particulate Matter Predictions in the U.S.-Part II: Analysis and Sensitivity Simulations

    EPA Science Inventory

    Following the Part I paper that described an application of the U.S. EPA Models-3/Community Multiscale Air Quality (CMAQ) modeling system to the 1999 Southern Oxidants Study episode, this paper presents results from process analysis (PA) using the PA tool embedded in CMAQ and s...

  11. Prediction of the Vickers Microhardness and Ultimate Tensile Strength of AA5754 H111 Friction Stir Welding Butt Joints Using Artificial Neural Network.

    PubMed

    De Filippis, Luigi Alberto Ciro; Serio, Livia Maria; Facchini, Francesco; Mummolo, Giovanni; Ludovico, Antonio Domenico

    2016-11-10

    A simulation model was developed for the monitoring, controlling and optimization of the Friction Stir Welding (FSW) process. This approach, using the FSW technique, allows identifying the correlation between the process parameters (input variables) and the mechanical properties (output responses) of the welded AA5754 H111 aluminum plates. The optimization of technological parameters is a basic requirement for increasing the seam quality, since it promotes a stable and defect-free process. The tool rotation speed and travel speed, the position of the samples extracted from the weld bead, and the thermal data detected with thermographic techniques for on-line control of the joints were varied to build the experimental plans. The quality of the joints was evaluated through destructive and non-destructive tests (visual tests, macrographic analysis, tensile tests, Vickers indentation hardness tests and thermographic controls). The simulation model was based on Artificial Neural Networks (ANNs) with a back-propagation learning algorithm and different types of architecture, which were able to predict with good reliability the mechanical properties of the AA5754 H111 aluminum plates welded in butt-joint configuration.

  12. Contaminant Permeation in the Ionomer-Membrane Water Processor (IWP) System

    NASA Technical Reports Server (NTRS)

    Kelsey, Laura K.; Finger, Barry W.; Pasadilla, Patrick; Perry, Jay

    2016-01-01

    The Ionomer-membrane Water Processor (IWP) is a patented membrane-distillation based urine brine water recovery system. The unique properties of the IWP membrane pair limit contaminant permeation from the brine to the recovered water and purge gas. A paper study was conducted to predict volatile trace contaminant permeation in the IWP system. Testing of a large-scale IWP Engineering Development Unit (EDU) with urine brine pretreated with the International Space Station (ISS) pretreatment formulation was then conducted to collect air and water samples for quality analysis. Distillate water quality and purge air GC-MS results are presented and compared to predictions, along with implications for the IWP brine processing system.

  13. The Effect of Service Quality on Patient loyalty: a Study of Private Hospitals in Tehran, Iran.

    PubMed

    Arab, M; Tabatabaei, Sm Ghazi; Rashidian, A; Forushani, A Rahimi; Zarei, E

    2012-01-01

    Service quality is perceived as an important factor in developing patient loyalty. The aim of this study was to determine hospital service quality from the patients' viewpoints and the relative importance of quality dimensions in predicting patient loyalty. A cross-sectional study was conducted in 2010. The study sample was composed of 943 patients selected from eight private general hospitals in Tehran. The survey instrument was a questionnaire that included 24 items about service quality and 3 items about patient loyalty. Exploratory factor analysis was employed to extract the dimensions of service quality. Regression analysis was then performed to determine the relative importance of the service quality dimensions in predicting patient loyalty. The mean scores of service quality and patient loyalty were 3.99 and 4.16 out of 5, respectively. About 29% of the loyalty variance was explained by the service quality dimensions. Four quality dimensions (Costing, Process Quality, Interaction Quality and Environment Quality) were found to be key determinants of patient loyalty in the private hospitals of Tehran. Patients' experience of private hospitals' services has a strong impact on outcome variables such as willingness to return to the same hospital, reuse of its services, and recommending them to others. The relationship between service quality and patient loyalty proves the strategic importance of improving service quality for attracting and retaining patients and expanding market share.

  14. GIS Based Distributed Runoff Predictions in Variable Source Area Watersheds Employing the SCS-Curve Number

    NASA Astrophysics Data System (ADS)

    Steenhuis, T. S.; Mendoza, G.; Lyon, S. W.; Gerard Marchant, P.; Walter, M. T.; Schneiderman, E.

    2003-04-01

    Because the traditional Soil Conservation Service Curve Number (SCS-CN) approach continues to be ubiquitously used in GIS-based water quality models, new application methods are needed that are consistent with variable source area (VSA) hydrological processes in the landscape. We developed, within an integrated GIS modeling environment, a distributed approach for applying the traditional SCS-CN equation to watersheds where VSA hydrology is a dominant process. Spatial representation of hydrologic processes is important for watershed planning because restricting potentially polluting activities from runoff source areas is fundamental to controlling non-point source pollution. The methodology presented here uses the traditional SCS-CN method to predict runoff volume and the spatial extent of saturated areas, and uses a topographic index to distribute runoff source areas through watersheds. The resulting distributed CN-VSA method was incorporated in an existing GWLF water quality model and applied to sub-watersheds of the Delaware basin in the Catskill Mountains region of New York State. We found that the distributed CN-VSA approach provides a physically based method that gives realistic results for watersheds with VSA hydrology.
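
    For reference, the traditional SCS-CN runoff equation that the method distributes over the watershed is Q = (P - Ia)² / (P - Ia + S) for P > Ia, with storage S = 1000/CN - 10 (inches) and initial abstraction commonly taken as Ia = 0.2S:

```python
def scs_runoff(p_in, cn):
    """Runoff depth (inches) from rainfall p_in (inches) and curve number cn."""
    s = 1000.0 / cn - 10.0    # potential maximum retention
    ia = 0.2 * s              # common initial-abstraction assumption
    return 0.0 if p_in <= ia else (p_in - ia) ** 2 / (p_in - ia + s)

print(scs_runoff(3.0, 75))    # ~0.96 in of runoff from a 3 in storm at CN = 75
```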

  15. Modeling and optimization of joint quality for laser transmission joint of thermoplastic using an artificial neural network and a genetic algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Xiao; Zhang, Cheng; Li, Pin; Wang, Kai; Hu, Yang; Zhang, Peng; Liu, Huixia

    2012-11-01

    A central composite rotatable experimental design (CCRD) was used to design experiments for laser transmission joining of a thermoplastic, polycarbonate (PC). An artificial neural network was used to establish the relationships between the laser transmission joining process parameters (laser power, velocity, clamp pressure, and number of scans) and the joint strength and joint seam width. The developed mathematical models were tested by the analysis of variance (ANOVA) method to check their adequacy, and the effects of the process parameters on the responses, as well as the interaction effects of key process parameters on quality, are analyzed and discussed. Finally, the desirability function coupled with a genetic algorithm is used to carry out the optimization of the joint strength and joint width. The results show that the predicted results of the optimization are in good agreement with the experimental results, so this study provides an effective method to enhance joint quality.
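
    The desirability step can be sketched as follows: map each response onto [0, 1] (larger-is-better for joint strength, smaller-is-better for seam width) and combine them by geometric mean; a genetic algorithm would then maximize the composite over the process parameters. The response bounds below are hypothetical.

```python
import numpy as np

def d_larger(y, lo, hi):   # larger-is-better desirability
    return np.clip((y - lo) / (hi - lo), 0.0, 1.0)

def d_smaller(y, lo, hi):  # smaller-is-better desirability
    return np.clip((hi - y) / (hi - lo), 0.0, 1.0)

def overall(strength, width):
    """Composite desirability with assumed bounds (strength in N, width in mm)."""
    return np.sqrt(d_larger(strength, 200, 600) * d_smaller(width, 0.8, 2.0))

print(overall(520.0, 1.1))  # composite desirability of one candidate joint
```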

  16. Redox Conditions in Selected Principal Aquifers of the United States

    USGS Publications Warehouse

    McMahon, P.B.; Cowdery, T.K.; Chapelle, F.H.; Jurgens, B.C.

    2009-01-01

    Reduction/oxidation (redox) processes affect the quality of groundwater in all aquifer systems. Redox processes can alternately mobilize or immobilize potentially toxic metals associated with naturally occurring aquifer materials, contribute to the degradation or preservation of anthropogenic contaminants, and generate undesirable byproducts, such as dissolved manganese (Mn2+), ferrous iron (Fe2+), hydrogen sulfide (H2S), and methane (CH4). Determining the kinds of redox processes that occur in an aquifer system, documenting their spatial distribution, and understanding how they affect concentrations of natural or anthropogenic contaminants are central to assessing and predicting the chemical quality of groundwater. This Fact Sheet extends the analysis of U.S. Geological Survey authors to additional principal aquifer systems by applying a framework developed by the USGS to a larger set of water-quality data from the USGS national water databases. For a detailed explanation, see the 'Introduction' in the Fact Sheet.

  17. Scientific and Regulatory Considerations in Solid Oral Modified Release Drug Product Development.

    PubMed

    Li, Min; Sander, Sanna; Duan, John; Rosencrance, Susan; Miksinski, Sarah Pope; Yu, Lawrence; Seo, Paul; Rege, Bhagwant

    2016-11-01

    This review presents scientific and regulatory considerations for the development of solid oral modified release (MR) drug products. It includes a rationale for patient-focused development based on Quality-by-Design (QbD) principles. Product and process understanding of MR products includes identification and risk-based evaluation of critical material attributes (CMAs), critical process parameters (CPPs), and their impact on critical quality attributes (CQAs) that affect the clinical performance. The use of various biopharmaceutics tools that link the CQAs to a predictable and reproducible clinical performance for patient benefit is emphasized. Product and process understanding lead to a more comprehensive control strategy that can maintain product quality through the shelf life and the lifecycle of the drug product. The overall goal is to develop MR products that consistently meet the clinical objectives while mitigating the risks to patients by reducing the probability and increasing the detectability of CQA failures.

  18. Use of a continuous twin screw granulation and drying system during formulation development and process optimization.

    PubMed

    Vercruysse, J; Peeters, E; Fonteyne, M; Cappuyns, P; Delaet, U; Van Assche, I; De Beer, T; Remon, J P; Vervaet, C

    2015-01-01

    Since small scale is key for the successful introduction of continuous techniques in the pharmaceutical industry, allowing their use during formulation development and process optimization, it is essential to determine whether product quality is similar when small quantities of material are processed compared with the continuous processing of larger quantities. Therefore, the aim of this study was to investigate whether material processed in a single cell of the six-segmented fluid bed dryer of the ConsiGma™-25 system (a continuous twin screw granulation and drying system introduced by GEA Pharma Systems, Collette™, Wommelgem, Belgium) is predictive of granule and tablet quality during full-scale manufacturing when all drying cells are filled. Furthermore, the performance of the ConsiGma™-1 system (a mobile laboratory unit) was evaluated and compared to the ConsiGma™-25 system. A premix of two active ingredients, powdered cellulose, maize starch, pregelatinized starch and sodium starch glycolate was granulated with distilled water. After drying and milling (1000 μm, 800 rpm), granules were blended with magnesium stearate and compressed using a Modul™ P tablet press (tablet weight: 430 mg, main compression force: 12 kN). Single-cell experiments using the ConsiGma™-25 system and ConsiGma™-1 system were performed in triplicate. Additionally, a 1 h continuous run using the ConsiGma™-25 system was executed. Process outcomes (torque, barrel wall temperature, product temperature during drying) and granule (residual moisture content, particle size distribution, bulk and tapped density, Hausner ratio, friability) as well as tablet (hardness, friability, disintegration time and dissolution) quality attributes were evaluated. During the 1 h continuous run, it was detected that a stabilization period was needed for torque and barrel wall temperature due to initial layering of the screws and the screw chamber walls with material. Consequently, slightly deviating granule and tablet quality attributes were obtained during the start-up phase of the 1 h run. For the single-cell runs, granule and tablet properties were comparable with results obtained during the second part of the 1 h run (after start-up). Although deviating granule quality (particle size distribution and Hausner ratio) was observed due to the divergent designs of the ConsiGma™-1 unit and the ConsiGma™-25 system (horizontal set-up) used in this study, the quality of tablets produced from granules processed with the ConsiGma™-1 system was predictive of the tablet quality obtained during continuous production using the ConsiGma™-25 system. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Thermoplastic matrix composite processing model

    NASA Technical Reports Server (NTRS)

    Dara, P. H.; Loos, A. C.

    1985-01-01

    The effects that the processing parameters pressure, temperature, and time have on the quality of continuous graphite fiber reinforced thermoplastic matrix composites were quantitatively assessed by defining the extent to which intimate contact and bond formation have occurred at successive ply interfaces. Two models are presented predicting the extents to which the ply interfaces have achieved intimate contact and cohesive strength. The models are based on experimental observation of compression molded laminates and neat resin conditions, respectively. The theory of autohesion (or self-diffusion) is identified as the mechanism explaining how the plies bond to themselves. Theoretical predictions from reptation theory relating autohesive strength to contact time are used to explain the effects of the processing parameters on the observed experimental strengths. The application of a time-temperature relationship for autohesive strength predictions is evaluated. A viscoelastic compression molding model of a tow was developed to explain the phenomenon by which the prepreg ply interfaces develop intimate contact.

  20. Computational Simulation of Thermal and Spattering Phenomena and Microstructure in Selective Laser Melting of Inconel 625

    NASA Astrophysics Data System (ADS)

    Özel, Tuğrul; Arısoy, Yiğit M.; Criales, Luis E.

    Computational modelling of Laser Powder Bed Fusion (L-PBF) processes such as Selective Laser Melting (SLM) can reveal information that is hard to obtain, or unobtainable, by in-situ experimental measurements. A 3D thermal field that is not visible to the thermal camera can be obtained by solving the 3D heat transfer problem. Furthermore, microstructural modelling can be used to predict the quality and mechanical properties of the product. In this paper, a nonlinear 3D Finite Element Method based computational code is developed to simulate the SLM process with different process parameters such as laser power and scan velocity. The code is further improved by utilizing an in-situ thermal camera recording to predict spattering, which is in turn included as a stochastic heat loss. Thermal gradients extracted from the simulations are then applied to predict growth directions in the resulting microstructure.
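
    As a toy stand-in for the thermal sub-problem, the sketch below integrates 2D heat diffusion with a moving Gaussian surface source using an explicit finite-difference scheme. The paper's model is a nonlinear 3D FEM with temperature-dependent properties and stochastic spatter losses; every number here is illustrative.

```python
import numpy as np

nx = ny = 100
dx, dt, alpha = 2e-5, 1e-6, 5e-6           # grid step (m), time step (s), diffusivity (m^2/s)
assert alpha * dt / dx**2 <= 0.25          # explicit-scheme stability limit in 2D
yy, xx = np.mgrid[0:ny, 0:nx]
T = np.full((ny, nx), 300.0)               # initial temperature field (K)

for step in range(400):
    x0 = 10 + 0.2 * step                   # laser spot moving along x (grid units)
    source = 4e7 * np.exp(-((xx - x0)**2 + (yy - ny / 2)**2) / 10.0)   # heating rate (K/s)
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / dx**2       # periodic edges, toy only
    T += dt * (alpha * lap + source)       # explicit Euler update of the heat equation
```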

  1. Analytical Modelling and Optimization of the Temperature-Dependent Dynamic Mechanical Properties of Fused Deposition Fabricated Parts Made of PC-ABS.

    PubMed

    Mohamed, Omar Ahmed; Masood, Syed Hasan; Bhowmik, Jahar Lal

    2016-11-04

    Fused deposition modeling (FDM) additive manufacturing has been intensively used for many industrial applications due to its attractive advantages over traditional manufacturing processes. The process parameters used in FDM have a significant influence on part quality and properties. The process produces plastic parts through complex mechanisms and involves complex relationships between the manufacturing conditions and the quality of the processed part. In the present study, the influence of multi-level manufacturing parameters on the temperature-dependent dynamic mechanical properties of FDM-processed parts was investigated using IV-optimality response surface methodology (RSM) and multilayer feed-forward neural networks (MFNNs). The process parameters considered for optimization and investigation are slice thickness, raster-to-raster air gap, deposition angle, part print direction, bead width, and number of perimeters. Storage compliance and loss compliance were considered as response variables. The effect of each process parameter was investigated using the developed regression models and multiple regression analysis. The surface characteristics were studied using a scanning electron microscope (SEM). Furthermore, the performance of the optimum conditions was determined and validated by conducting a confirmation experiment. The comparison between the experimental values and the values predicted by IV-optimal RSM and MFNN was conducted for each experimental run, and the results indicate that the MFNN provides better predictions than IV-optimal RSM.

  2. Analytical Modelling and Optimization of the Temperature-Dependent Dynamic Mechanical Properties of Fused Deposition Fabricated Parts Made of PC-ABS

    PubMed Central

    Mohamed, Omar Ahmed; Masood, Syed Hasan; Bhowmik, Jahar Lal

    2016-01-01

    Fused deposition modeling (FDM) additive manufacturing has been intensively used for many industrial applications due to its attractive advantages over traditional manufacturing processes. The process parameters used in FDM have a significant influence on part quality and properties. The process produces plastic parts through complex mechanisms and involves complex relationships between the manufacturing conditions and the quality of the processed part. In the present study, the influence of multi-level manufacturing parameters on the temperature-dependent dynamic mechanical properties of FDM-processed parts was investigated using IV-optimality response surface methodology (RSM) and multilayer feed-forward neural networks (MFNNs). The process parameters considered for optimization and investigation are slice thickness, raster-to-raster air gap, deposition angle, part print direction, bead width, and number of perimeters. Storage compliance and loss compliance were considered as response variables. The effect of each process parameter was investigated using the developed regression models and multiple regression analysis. The surface characteristics were studied using a scanning electron microscope (SEM). Furthermore, the performance of the optimum conditions was determined and validated by conducting a confirmation experiment. The comparison between the experimental values and the values predicted by IV-optimal RSM and MFNN was conducted for each experimental run, and the results indicate that the MFNN provides better predictions than IV-optimal RSM. PMID:28774019

  3. Soft sensor modeling based on variable partition ensemble method for nonlinear batch processes

    NASA Astrophysics Data System (ADS)

    Wang, Li; Chen, Xiangguang; Yang, Kai; Jin, Huaiping

    2017-01-01

    Batch processes are always characterized by nonlinear and uncertain system properties; therefore, a conventional single model may be ill-suited. A local-learning soft sensor based on a variable partition ensemble method is developed for the quality prediction of nonlinear and non-Gaussian batch processes. A set of input variable subsets is obtained by bootstrapping and the PMI criterion. Then, multiple local GPR models are developed, one for each local input variable subset. When a new test sample arrives, the posterior probability of each best-performing local model is estimated by Bayesian inference and used to combine the local GPR models into the final prediction result. The proposed soft sensor is demonstrated by application to an industrial fed-batch chlortetracycline fermentation process.
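
    A minimal sketch of the ensemble idea: train local GPR models on bootstrapped samples and input-variable subsets, then weight each model's prediction by its own confidence at the query point, a crude stand-in for the Bayesian posterior weighting in the paper (the PMI-based variable selection is omitted).

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(8)
X = rng.normal(size=(150, 6))                                  # process measurements (synthetic)
y = np.sin(X[:, 0]) + 0.5 * X[:, 3] + rng.normal(scale=0.05, size=150)   # quality variable

models = []
for _ in range(5):
    vars_ = rng.choice(6, size=3, replace=False)               # an input-variable subset
    rows = rng.choice(150, size=150, replace=True)             # bootstrap resample of batches
    gpr = GaussianProcessRegressor(normalize_y=True).fit(X[rows][:, vars_], y[rows])
    models.append((vars_, gpr))

def predict(x_new):
    """Combine local predictions, weighting each model by its own confidence."""
    mu, sd = zip(*(g.predict(x_new[v][None, :], return_std=True) for v, g in models))
    w = 1.0 / (np.array(sd).ravel() ** 2 + 1e-9)               # stand-in for posterior weights
    return np.average(np.array(mu).ravel(), weights=w)

print(predict(rng.normal(size=6)))
```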

  4. [Imaging center - optimization of the imaging process].

    PubMed

    Busch, H-P

    2013-04-01

    Hospitals around the world are under increasing pressure to optimize the economic efficiency of treatment processes. Imaging is responsible for a great part of the success, but also of the costs, of treatment. In routine work, an excessive supply of imaging methods leads to an "as well as" strategy up to the limit of capacity, without critical reflection. Exams that have no predictable influence on the clinical outcome are an unjustified burden for the patient; they are useless and threaten the financial situation and existence of the hospital. In recent years the focus of process optimization has been exclusively on the quality and efficiency of individual examinations. In the future, critical discussion of the effectiveness of single exams in relation to the clinical outcome will be more important. Unnecessary exams can be avoided only if, in addition to the optimization of single exams (efficiency), there is an optimization strategy for the total imaging process (efficiency and effectiveness). This requires a new definition of processes (Imaging Pathway), new organizational structures (Imaging Center) and a new way of thinking on the part of the medical staff. Motivation has to be shifted from gratification for performed exams to gratification for process quality (medical quality, service quality, economics), including the avoidance of additional (unnecessary) exams. © Georg Thieme Verlag KG Stuttgart · New York.

  5. [Fast Detection of Camellia Sinensis Growth Process and Tea Quality Informations with Spectral Technology: A Review].

    PubMed

    Peng, Ji-yu; Song, Xing-lin; Liu, Fei; Bao, Yi-dan; He, Yong

    2016-03-01

    The research achievements and trends of spectral technology for fast detection of Camellia sinensis growth-process information and tea quality information are reviewed. Spectral technology is a fast, nondestructive, and efficient detection technology that mainly comprises infrared spectroscopy, fluorescence spectroscopy, Raman spectroscopy, and mass spectrometry. Rapid detection of Camellia sinensis growth-process information and tea quality helps realize the informatization and automation of tea production and ensures tea quality and safety. This paper reviews its applications, including the detection of tea (Camellia sinensis) growing status (nitrogen, chlorophyll, diseases, and insect pests), the discrimination of tea varieties, the grade discrimination of tea, the detection of tea internal quality (catechins, total polyphenols, caffeine, amino acids, pesticide residues, and so on), the quality evaluation of tea beverages and tea by-products, and machinery for tea quality determination and discrimination. The paper also briefly introduces trends in the technology for determining tea growth-process information, sensors, and industrial applications. In conclusion, spectral technology shows high potential for detecting Camellia sinensis growth-process information, predicting tea internal quality, and classifying tea varieties and grades. Suitable chemometrics and preprocessing methods help improve model performance and remove redundancy, which opens the possibility of developing portable instruments. Developing portable instruments and on-line detection systems is recommended as future work to broaden the application. The application and research achievements of spectral technology concerning tea are outlined in this paper for the first time, covering Camellia sinensis growth, tea production, the quality and safety of tea and by-products, and so on, as well as some problems to be solved and the future applicability in the modern tea industry.

  6. Multi-response optimization of T300/epoxy prepreg tape-wound cylinder by grey relational analysis coupled with the response surface method

    NASA Astrophysics Data System (ADS)

    Kang, Chao; Shi, Yaoyao; He, Xiaodong; Yu, Tao; Deng, Bo; Zhang, Hongji; Sun, Pengcheng; Zhang, Wenbin

    2017-09-01

    This study investigates the multi-objective optimization of quality characteristics for a T300/epoxy prepreg tape-wound cylinder. The method integrates the Taguchi method, grey relational analysis (GRA) and response surface methodology, and is adopted to improve tensile strength and reduce residual stress. In the winding process, the main process parameters involving winding tension, pressure, temperature and speed are selected to evaluate the parametric influences on tensile strength and residual stress. Experiments are conducted using the Box-Behnken design. Based on principal component analysis, the grey relational grades are properly established to convert multi-responses into an individual objective problem. Then the response surface method is used to build a second-order model of grey relational grade and predict the optimum parameters. The predictive accuracy of the developed model is proved by two test experiments with a low prediction error of less than 7%. The following process parameters, namely winding tension 124.29 N, pressure 2000 N, temperature 40 °C and speed 10.65 rpm, have the highest grey relational grade and give better quality characteristics in terms of tensile strength and residual stress. The confirmation experiment shows that better results are obtained with GRA improved by the proposed method than with ordinary GRA. The proposed method is proved to be feasible and can be applied to optimize the multi-objective problem in the filament winding process.
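
    The grey relational step can be sketched as follows: normalize each response (larger-is-better tensile strength, smaller-is-better residual stress), compute grey relational coefficients against the ideal sequence, and average them into a grade per run. The PCA-derived weights used in the paper are replaced by equal weights, and all run data are hypothetical.

```python
import numpy as np

def grey_relational_grade(strength, stress, zeta=0.5):
    """Equal-weight grey relational grade for two responses."""
    s_norm = (strength - strength.min()) / np.ptp(strength)   # larger-is-better
    r_norm = (stress.max() - stress) / np.ptp(stress)         # smaller-is-better
    seq = np.column_stack([s_norm, r_norm])
    delta = np.abs(1.0 - seq)                                 # deviation from the ideal sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)                                 # one grade per experimental run

strength = np.array([820.0, 860.0, 910.0, 880.0])             # hypothetical tensile strengths (MPa)
stress = np.array([140.0, 120.0, 95.0, 110.0])                # hypothetical residual stresses (MPa)
print(grey_relational_grade(strength, stress))                # the run with the highest grade wins
```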

  7. Advanced multivariate data analysis to determine the root cause of trisulfide bond formation in a novel antibody-peptide fusion.

    PubMed

    Goldrick, Stephen; Holmes, William; Bond, Nicholas J; Lewis, Gareth; Kuiper, Marcel; Turner, Richard; Farid, Suzanne S

    2017-10-01

    Product quality heterogeneities, such as trisulfide bond (TSB) formation, can be influenced by multiple interacting process parameters. Identifying their root cause is a major challenge in biopharmaceutical production. To address this issue, this paper describes the novel application of advanced multivariate data analysis (MVDA) techniques to identify the process parameters influencing TSB formation in a novel recombinant antibody-peptide fusion expressed in mammalian cell culture. The screening dataset was generated with a high-throughput (HT) micro-bioreactor system (Ambr™ 15) using a design of experiments (DoE) approach. The complex dataset was first analyzed through the development of a multiple linear regression model focusing solely on the DoE inputs, which identified the temperature, pH and initial nutrient feed day as important process parameters influencing this quality attribute. To further scrutinize the dataset, a partial least squares model was subsequently built incorporating both on-line and off-line process parameters, enabling accurate predictions of the TSB concentration at harvest. Process parameters identified by the models to promote and suppress TSB formation were implemented on five 7 L bioreactors, and the resultant TSB concentrations were comparable to the model predictions. This study demonstrates the ability of MVDA to enable predictions of the key performance drivers influencing TSB formation that remain valid upon scale-up. Biotechnol. Bioeng. 2017;114: 2222-2234. © 2017 The Authors. Biotechnology and Bioengineering Published by Wiley Periodicals, Inc.

  8. Evaluation of the Community Multiscale Air Quality Model for Simulating Winter Ozone Formation in the Uinta Basin with Intensive Oil and Gas Production

    NASA Astrophysics Data System (ADS)

    Matichuk, R.; Tonnesen, G.; Luecken, D.; Roselle, S. J.; Napelenok, S. L.; Baker, K. R.; Gilliam, R. C.; Misenis, C.; Murphy, B.; Schwede, D. B.

    2015-12-01

    The western United States is an important source of domestic energy resources. One of the primary environmental impacts associated with oil and natural gas production is related to air emission releases of a number of air pollutants. Some of these pollutants are important precursors to the formation of ground-level ozone. To better understand ozone impacts and other air quality issues, photochemical air quality models are used to simulate the changes in pollutant concentrations in the atmosphere on local, regional, and national spatial scales. These models are important for air quality management because they assist in identifying source contributions to air quality problems and designing effective strategies to reduce harmful air pollutants. The success of predicting oil and natural gas air quality impacts depends on the accuracy of the input information, including emissions inventories, meteorological information, and boundary conditions. The treatment of chemical and physical processes within these models is equally important. However, given the limited amount of data collected for oil and natural gas production emissions in the past and the complex terrain and meteorological conditions in western states, the ability of these models to accurately predict pollution concentrations from these sources is uncertain. Therefore, this presentation will focus on understanding the Community Multiscale Air Quality (CMAQ) model's ability to predict air quality impacts associated with oil and natural gas production and its sensitivity to input uncertainties. The results will focus on winter ozone issues in the Uinta Basin, Utah and identify the factors contributing to model performance issues. The results of this study will help support future air quality model development, policy and regulatory decisions for the oil and gas sector.

  9. Quality by Design approach for studying the impact of formulation and process variables on product quality of oral disintegrating films.

    PubMed

    Mazumder, Sonal; Pavurala, Naresh; Manda, Prashanth; Xu, Xiaoming; Cruz, Celia N; Krishnaiah, Yellela S R

    2017-07-15

    The present investigation was carried out to understand the impact of formulation and process variables on the quality of oral disintegrating films (ODF) using Quality by Design (QbD) approach. Lamotrigine (LMT) was used as a model drug. Formulation variable was plasticizer to film former ratio and process variables were drying temperature, air flow rate in the drying chamber, drying time and wet coat thickness of the film. A Definitive Screening Design of Experiments (DoE) was used to identify and classify the critical formulation and process variables impacting critical quality attributes (CQA). A total of 14 laboratory-scale DoE formulations were prepared and evaluated for mechanical properties (%elongation at break, yield stress, Young's modulus, folding endurance) and other CQA (dry thickness, disintegration time, dissolution rate, moisture content, moisture uptake, drug assay and drug content uniformity). The main factors affecting mechanical properties were plasticizer to film former ratio and drying temperature. Dissolution rate was found to be sensitive to air flow rate during drying and plasticizer to film former ratio. Data were analyzed for elucidating interactions between different variables, rank ordering the critical materials attributes (CMA) and critical process parameters (CPP), and for providing a predictive model for the process. Results suggested that plasticizer to film former ratio and process controls on drying are critical to manufacture LMT ODF with the desired CQA. Published by Elsevier B.V.

  10. Applying data mining techniques for increasing implantation rate by selecting best sperms for intra-cytoplasmic sperm injection treatment.

    PubMed

    Mirroshandel, Seyed Abolghasem; Ghasemian, Fatemeh; Monji-Azad, Sara

    2016-12-01

    Aspiration of a good-quality sperm during intracytoplasmic sperm injection (ICSI) is one of the main concerns. Understanding the influence of individual sperm morphology on fertilization, embryo quality, and pregnancy probability is one of the most important subjects in male factor infertility. Embryologists need to decide on the best sperm for injection in real time during an ICSI cycle. Our objective is to predict the quality of the zygote, the embryo, and the implantation outcome before injection of each sperm in an ICSI cycle for male factor infertility, with the aim of providing a decision support system for sperm selection. The information was collected from 219 patients with male factor infertility at the infertility therapy center of Alzahra hospital in Rasht from 2012 through 2014. The prepared dataset included the quality of zygote, embryo, and implantation outcome for 1544 sperms injected into the related oocytes. In our study, embryo transfer was performed at day 3. Each sperm was represented by thirteen clinical features. Data preprocessing was the first step in the proposed data mining algorithm. After applying more than 30 classifiers, 9 successful classifiers were selected and evaluated by the 10-fold cross-validation technique using precision, recall, F1, and AUC measures. Another important experiment measured the effect of each feature in the prediction process. In zygote and embryo quality prediction, the IBk and RandomCommittee models provided 79.2% and 83.8% F1, respectively. In implantation outcome prediction, the KStar model achieved 95.9% F1, which is even better than the predictions of human experts. All these predictions can be done in real time. A machine-learning-based decision support system would be helpful in the sperm selection phase of an ICSI cycle to improve the success rate of ICSI treatment. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
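
    A minimal sketch of the evaluation loop: several classifiers scored by 10-fold cross-validated F1 on sperm feature vectors. scikit-learn analogues stand in for the Weka learners named above (k-nearest neighbours for IBk, a random forest in place of RandomCommittee); all features and labels are synthetic.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(9)
X = rng.normal(size=(1544, 13))                      # 13 clinical features per sperm
y = (X[:, 0] + rng.normal(scale=1.0, size=1544) > 0).astype(int)   # e.g. implantation outcome

for clf in (KNeighborsClassifier(), RandomForestClassifier(random_state=0)):
    f1 = cross_val_score(clf, X, y, cv=10, scoring="f1")
    print(type(clf).__name__, f1.mean().round(3))
```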

  11. Numerical modeling of overland flow due to rainfall-runoff

    USDA-ARS?s Scientific Manuscript database

    Runoff is a basic hydrologic process that can be influenced by management activities in agricultural watersheds. Better description of runoff patterns through modeling will help to understand and predict watershed sediment transport and water quality. Normally, runoff is studied with kinematic wave ...

  12. QUANTIFYING SPATIAL POSITION OF WETLANDS FOR STREAM HABITAT QUALITY PREDICTION

    EPA Science Inventory

    A watershed's capacity to store and filter water, and the resulting effects on the hydrologic regime, is a key forcing function for instream processes and community structure. However, methods for describing wetland position have traditionally been qualitative. A Geographic Info...

  13. Statistical procedures for determination and verification of minimum reporting levels for drinking water methods.

    PubMed

    Winslow, Stephen D; Pepich, Barry V; Martin, John J; Hallberg, George R; Munch, David J; Frebis, Christopher P; Hedrick, Elizabeth J; Krop, Richard A

    2006-01-01

    The United States Environmental Protection Agency's Office of Ground Water and Drinking Water has developed a single-laboratory quantitation procedure: the lowest concentration minimum reporting level (LCMRL). The LCMRL is the lowest true concentration for which future recovery is predicted to fall, with high confidence (99%), between 50% and 150%. The procedure takes into account precision and accuracy. Multiple concentration replicates are processed through the entire analytical method and the data are plotted as measured sample concentration (y-axis) versus true concentration (x-axis). If the data support an assumption of constant variance over the concentration range, an ordinary least-squares regression line is drawn; otherwise, a variance-weighted least-squares regression is used. Prediction interval lines of 99% confidence are drawn about the regression. At the points where the prediction interval lines intersect with data quality objective lines of 50% and 150% recovery, lines are dropped to the x-axis. The higher of the two values is the LCMRL. The LCMRL procedure is flexible because the data quality objectives (50-150%) and the prediction interval confidence (99%) can be varied to suit program needs. The LCMRL determination is performed during method development only. A simpler procedure for verification of data quality objectives at a given minimum reporting level (MRL) is also presented. The verification procedure requires a single set of seven samples taken through the entire method procedure. If the calculated prediction interval is contained within data quality recovery limits (50-150%), the laboratory performance at the MRL is verified.
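
    A minimal sketch of the LCMRL geometry follows, assuming the constant-variance (ordinary least-squares) branch of the procedure: fit measured versus true concentration, construct 99% prediction intervals, and locate where the intervals stay inside the 50-150% recovery lines. The spiking levels and replicate data are invented for illustration.

      # Hedged sketch of the LCMRL geometry: fit measured vs. true
      # concentration, draw 99% prediction intervals, and find where they
      # enter the 50%/150% recovery band. Data below are illustrative.
      import numpy as np
      import statsmodels.api as sm

      true = np.repeat([0.5, 1, 2, 4, 8, 16], 4)        # spiked levels, 4 replicates
      meas = true * np.random.default_rng(1).normal(1.0, 0.08, true.size)

      X = sm.add_constant(true)
      fit = sm.OLS(meas, X).fit()                       # assumes constant variance

      grid = np.linspace(true.min(), true.max(), 500)
      pred = fit.get_prediction(sm.add_constant(grid))
      lo, hi = pred.conf_int(obs=True, alpha=0.01).T    # 99% prediction interval

      ok = (lo >= 0.5 * grid) & (hi <= 1.5 * grid)      # interval inside 50-150%
      lcmrl = grid[ok].min() if ok.any() else None      # above both intersections
      print("approximate LCMRL:", lcmrl)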

  14. Development of a multi-ensemble Prediction Model for China

    NASA Astrophysics Data System (ADS)

    Brasseur, G. P.; Bouarar, I.; Petersen, A. K.

    2016-12-01

    As part of the EU-sponsored Panda and MarcoPolo projects, a multi-model prediction system comprising seven models has been developed. Most regional models use global air quality predictions provided by the Copernicus Atmosphere Monitoring Service and downscale the forecasts at relatively high spatial resolution over eastern China. The paper will describe the forecast system and show examples of forecasts produced for several Chinese urban areas and displayed on a website developed by the Dutch meteorological service. A discussion of the accuracy of the predictions, based on a detailed validation against surface measurements from the Chinese monitoring network, will be presented.

  15. Effect of roll compaction on granule size distribution of microcrystalline cellulose–mannitol mixtures: computational intelligence modeling and parametric analysis

    PubMed Central

    Kazemi, Pezhman; Khalid, Mohammad Hassan; Pérez Gago, Ana; Kleinebudde, Peter; Jachowicz, Renata; Szlęk, Jakub; Mendyk, Aleksander

    2017-01-01

    Dry granulation using roll compaction is a typical unit operation for producing solid dosage forms in the pharmaceutical industry. Dry granulation is commonly used if the powder mixture is sensitive to heat and moisture and has poor flow properties. The output of roll compaction is compacted ribbons that exhibit different properties based on the adjusted process parameters. These ribbons are then milled into granules and finally compressed into tablets. The properties of the ribbons directly affect the granule size distribution (GSD) and the quality of final products; thus, it is imperative to study the effect of roll compaction process parameters on GSD. The understanding of how the roll compactor process parameters and material properties interact with each other will allow accurate control of the process, leading to the implementation of quality by design practices. Computational intelligence (CI) methods have a great potential for being used within the scope of quality by design approach. The main objective of this study was to show how the computational intelligence techniques can be useful to predict the GSD by using different process conditions of roll compaction and material properties. Different techniques such as multiple linear regression, artificial neural networks, random forest, Cubist and k-nearest neighbors algorithm assisted by sevenfold cross-validation were used to present generalized models for the prediction of GSD based on roll compaction process setting and material properties. The normalized root-mean-squared error and the coefficient of determination (R2) were used for model assessment. The best fit was obtained by the Cubist model (normalized root-mean-squared error = 3.22%, R2 = 0.95). Based on the results, it was confirmed that the material properties (true density) followed by compaction force have the most significant effect on GSD. PMID:28176905

  16. Effect of roll compaction on granule size distribution of microcrystalline cellulose-mannitol mixtures: computational intelligence modeling and parametric analysis.

    PubMed

    Kazemi, Pezhman; Khalid, Mohammad Hassan; Pérez Gago, Ana; Kleinebudde, Peter; Jachowicz, Renata; Szlęk, Jakub; Mendyk, Aleksander

    2017-01-01

    Dry granulation using roll compaction is a typical unit operation for producing solid dosage forms in the pharmaceutical industry. Dry granulation is commonly used if the powder mixture is sensitive to heat and moisture and has poor flow properties. The output of roll compaction is compacted ribbons that exhibit different properties based on the adjusted process parameters. These ribbons are then milled into granules and finally compressed into tablets. The properties of the ribbons directly affect the granule size distribution (GSD) and the quality of final products; thus, it is imperative to study the effect of roll compaction process parameters on GSD. The understanding of how the roll compactor process parameters and material properties interact with each other will allow accurate control of the process, leading to the implementation of quality by design practices. Computational intelligence (CI) methods have a great potential for being used within the scope of quality by design approach. The main objective of this study was to show how the computational intelligence techniques can be useful to predict the GSD by using different process conditions of roll compaction and material properties. Different techniques such as multiple linear regression, artificial neural networks, random forest, Cubist and k-nearest neighbors algorithm assisted by sevenfold cross-validation were used to present generalized models for the prediction of GSD based on roll compaction process setting and material properties. The normalized root-mean-squared error and the coefficient of determination (R2) were used for model assessment. The best fit was obtained by the Cubist model (normalized root-mean-squared error = 3.22%, R2 = 0.95). Based on the results, it was confirmed that the material properties (true density) followed by compaction force have the most significant effect on GSD.
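
    The sketch below mirrors the model-comparison step under stated assumptions: Cubist has no standard Python implementation, so a random forest stands in, evaluated by sevenfold cross-validation with the same figures of merit (normalized RMSE and R2). The process-setting matrix and GSD response are synthetic placeholders.

      # Hedged sketch: sevenfold CV assessment of a regressor for a
      # GSD-type response; X (process settings + material properties)
      # and y are synthetic placeholders, and RandomForest stands in
      # for Cubist.
      import numpy as np
      from sklearn.datasets import make_regression
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import KFold, cross_val_predict
      from sklearn.metrics import r2_score, mean_squared_error

      X, y = make_regression(n_samples=120, n_features=6, noise=5.0, random_state=0)
      cv = KFold(n_splits=7, shuffle=True, random_state=0)

      model = RandomForestRegressor(random_state=0)
      pred = cross_val_predict(model, X, y, cv=cv)

      nrmse = 100 * np.sqrt(mean_squared_error(y, pred)) / (y.max() - y.min())
      print(f"NRMSE = {nrmse:.2f}%, R2 = {r2_score(y, pred):.2f}")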

  17. GenePRIMP: A Gene Prediction Improvement Pipeline For Prokaryotic Genomes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kyrpides, Nikos C.; Ivanova, Natalia N.; Pati, Amrita

    2010-07-08

    GenePRIMP (Gene Prediction Improvement Pipeline, http://geneprimp.jgi-psf.org) is a computational process that performs evidence-based evaluation of gene models in prokaryotic genomes and reports anomalies including inconsistent start sites, missing genes, and split genes. We show that manual curation of gene models using the anomaly reports generated by GenePRIMP improves their quality, and we demonstrate the applicability of GenePRIMP in improving finishing quality and in comparing different genome sequencing and annotation technologies. Keywords in context: gene model, quality control, translation start sites, automatic correction. Hardware requirements: PC, Mac. Operating system: UNIX/Linux. Compiler/version: Perl 5.8.5 or higher. Special requirements: NCBI BLAST and nr installation. File types: source code, executable module(s), sample problem input data, installation instructions, other, programmer documentation. Location/transmission: http://geneprimp.jgi-psf.org/gp.tar.gz

  18. Development and validation of an APCI-MS/GC–MS approach for the classification and prediction of Cheddar cheese maturity

    PubMed Central

    Gan, Heng Hui; Yan, Bingnan; Linforth, Robert S.T.; Fisk, Ian D.

    2016-01-01

    Headspace techniques have been extensively employed in food analysis to measure volatile compounds, which play a central role in the perceived quality of food. In this study atmospheric pressure chemical ionisation-mass spectrometry (APCI-MS), coupled with gas chromatography–mass spectrometry (GC–MS), was used to investigate the complex mix of volatile compounds present in Cheddar cheeses of different maturity, processing and recipes to enable characterisation of the cheeses based on their ripening stages. Partial least squares-linear discriminant analysis (PLS-DA) provided a 70% success rate in correct prediction of the age of the cheeses based on their key headspace volatile profiles. In addition to predicting maturity, the analytical results coupled with chemometrics offered a rapid and detailed profiling of the volatile component of Cheddar cheeses, which could offer a new tool for quality assessment and accelerate product development. PMID:26212994

  19. Atmospheric Boundary Layer Wind Data During the Period January 1, 1998 Through January 31, 1999 at the Dallas-Fort Worth Airport. Volume 1; Quality Assessment

    NASA Technical Reports Server (NTRS)

    Zak, J. Allen; Rodgers, William G., Jr.

    2000-01-01

    The quality of the Aircraft Vortex Spacing System (AVOSS) is critically dependent on representative wind profiles in the atmospheric boundary layer. These winds, observed by a number of sensor systems around the Dallas-Fort Worth airport, were combined into single vertical wind profiles by an algorithm developed and implemented by MIT Lincoln Laboratory. This process, called the AVOSS Winds Analysis System (AWAS), is used by AVOSS for wake corridor predictions. During times when AWAS solutions were available, the quality of the resultant wind profiles and their variance was judged from a series of plots combining all sensor observations and AWAS profiles during the period 1200 to 0400 UTC daily. First, input data were evaluated for continuity and consistency against established criteria. Next, the degree of agreement among all wind sensor systems was noted and cases of disagreement were identified. Finally, the resultant AWAS solution was compared to the quality-assessed input data. When profiles differed by a specified amount from the valid sensor consensus winds, the times and altitudes were flagged. Volume one documents the process and the quality of the input sensor data. Volume two documents the data processing/sorting process and provides the resultant flagged files.

  20. Development of evaluation technique of GMAW welding quality based on statistical analysis

    NASA Astrophysics Data System (ADS)

    Feng, Shengqiang; Terasaki, Hidenri; Komizo, Yuichi; Hu, Shengsun; Chen, Donggao; Ma, Zhihua

    2014-11-01

    Nondestructive techniques for appraising gas metal arc welding (GMAW) faults play a very important role in on-line quality controllability and prediction of the GMAW process. Existing approaches to on-line welding quality controllability and prediction have several disadvantages, such as high cost, low efficiency, complexity, and strong sensitivity to the environment. An enhanced, efficient technique for evaluating welding faults based on the Mahalanobis distance (MD) and the normal distribution is presented. In addition, a new piece of equipment, designated the weld quality tester (WQT), is developed based on the proposed evaluation technique. MD is superior to other multidimensional distances such as the Euclidean distance because the covariance matrix used for calculating MD takes into account correlations in the data and scaling. The values of MD obtained from welding current and arc voltage are assumed to follow a normal distribution, characterized by the mean µ and standard deviation σ of the data. In the proposed evaluation technique used by the WQT, values of MD in the range from zero to µ + 3σ are regarded as "good". Two experiments, one changing the flow of shielding gas and one smearing paint on the surface of the substrate, were conducted to verify the sensitivity of the proposed evaluation technique and the feasibility of using the WQT. The experimental results demonstrate the usefulness of the WQT for evaluating welding quality. The proposed technique can be applied to implement on-line welding quality controllability and prediction, which is of great importance for the design of novel equipment for weld quality detection.
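
    A minimal sketch of the decision rule described above: compute the Mahalanobis distance of (current, voltage) samples from a reference distribution of good welds and flag values above µ + 3σ. The welding signals and their covariance are simulated stand-ins.

      # Hedged sketch of the WQT rule as described: Mahalanobis distance
      # of (current, voltage) samples from a reference weld, with values
      # above mu + 3*sigma flagged. Signals are simulated stand-ins.
      import numpy as np

      rng = np.random.default_rng(0)
      ref = rng.multivariate_normal([220, 24], [[25, 3], [3, 1]], 500)   # good welds
      mean, cov_inv = ref.mean(axis=0), np.linalg.inv(np.cov(ref.T))

      def mahalanobis(x):
          d = x - mean
          return np.sqrt(np.einsum("ij,jk,ik->i", d, cov_inv, d))

      md_ref = mahalanobis(ref)
      limit = md_ref.mean() + 3 * md_ref.std()                           # mu + 3 sigma

      test = rng.multivariate_normal([230, 27], [[25, 3], [3, 1]], 50)   # disturbed weld
      print("flagged samples:", int((mahalanobis(test) > limit).sum()), "of", len(test))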

  1. Temporal performance assessment of wastewater treatment plants by using multivariate statistical analysis.

    PubMed

    Ebrahimi, Milad; Gerber, Erin L; Rockaway, Thomas D

    2017-05-15

    For most water treatment plants, a significant number of performance data variables are attained on a time series basis. Due to the interconnectedness of the variables, it is often difficult to assess over-arching trends and quantify operational performance. The objective of this study was to establish simple and reliable predictive models to correlate target variables with specific measured parameters. This study presents a multivariate analysis of the physicochemical parameters of municipal wastewater. Fifteen quality and quantity parameters were analyzed using data recorded from 2010 to 2016. To determine the overall quality condition of raw and treated wastewater, a Wastewater Quality Index (WWQI) was developed. The index summarizes a large amount of measured quality parameters into a single water quality term by considering pre-established quality limitation standards. To identify treatment process performance, the interdependencies between the variables were determined by using Principal Component Analysis (PCA). The five extracted components from the 15 variables accounted for 75.25% of total dataset information and adequately represented the organic, nutrient, oxygen demanding, and ion activity loadings of influent and effluent streams. The study also utilized the model to predict quality parameters such as Biological Oxygen Demand (BOD), Total Phosphorus (TP), and WWQI. High accuracies ranging from 71% to 97% were achieved for fitting the models with the training dataset and relative prediction percentage errors less than 9% were achieved for the testing dataset. The presented techniques and procedures in this paper provide an assessment framework for the wastewater treatment monitoring programs. Copyright © 2017 Elsevier Ltd. All rights reserved.
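
    As a sketch of the component-extraction step, the fragment below runs PCA on standardized quality parameters and counts how many components are needed to reach the ~75% variance coverage reported; the 15-variable data matrix is random filler rather than plant records.

      # Hedged sketch: PCA on standardized quality parameters to count the
      # components covering ~75% of the variance; the 15-variable matrix
      # is a random placeholder, not wastewater data.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      X = np.random.default_rng(0).normal(size=(300, 15))   # placeholder measurements
      Z = StandardScaler().fit_transform(X)

      pca = PCA().fit(Z)
      cum = np.cumsum(pca.explained_variance_ratio_)
      print("components for >=75% variance:", int(np.searchsorted(cum, 0.75) + 1))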

  2. From in situ coal to the final coal product: A case study of the Danville Coal Member (Indiana)

    USGS Publications Warehouse

    Mastalerz, Maria; Padgett, P.L.

    1999-01-01

    A surface coal mine operation and preparation plant in southwestern Indiana was sampled to examine variations in coal quality and coal petrography parameters for the Danville Coal Member of the Dugger Formation (Pennsylvanian-Desmoinesian, Westphalian D). Representative samples from in situ coal, preparation plant feeds, and a final coal product were collected in order to compare coal quality, coal petrography, trace element concentrations, and ash chemistry of the coal to those of the product. Coal quality parameters of the in situ samples and various feeds, coarse refuse, and final product were variable. The quality of the final coal product was best predicted by the coal quality of the clean coal feed (from the middle portions of the seam). Some trace element contents, especially lead and arsenic, varied between the coal feeds and the product. Lead contents increased in the feeds and product compared to the channel sample of the raw coal, possibly due to contamination in the handling process.

  3. Development and validation of a predictive model for the influences of selected product and process variables on ascorbic acid degradation in simulated fruit juice.

    PubMed

    Gabriel, Alonzo A; Cayabyab, Jochelle Elysse C; Tan, Athalie Kaye L; Corook, Mark Lester F; Ables, Errol John O; Tiangson-Bayaga, Cecile Leah P

    2015-06-15

    A predictive response surface model for the influences of product (soluble solids and titratable acidity) and process (temperature and heating time) parameters on the degradation of ascorbic acid (AA) in heated simulated fruit juices (SFJs) was established. Physicochemical property ranges of freshly squeezed and processed juices, and previously established decimal reduction times of Escherichia coli O157:H7 at different heating temperatures, were used in establishing a Central Composite Design of Experiment that determined the combinations of product and process variables used in the model building. Only the individual linear effects of temperature and heating time significantly (P<0.05) affected AA reduction (%AAr). Validating systems either over- or underestimated actual %AAr, with bias factors of 0.80-1.20. However, all validating systems still resulted in acceptable predictive efficacy, with accuracy factors of 1.00-1.26. The model may be useful in establishing unique process schedules for specific products, for the simultaneous control and improvement of food safety and quality. Copyright © 2015 Elsevier Ltd. All rights reserved.
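
    The validation indices quoted (bias factor and accuracy factor, in the sense of Ross) can be computed directly; the paired predicted/observed %AAr values below are invented for illustration.

      # Hedged sketch of Ross-type bias and accuracy factors for predicted
      # vs. observed %AA reduction; the paired values are illustrative.
      import numpy as np

      pred = np.array([12.0, 25.5, 40.2, 55.0, 71.3])
      obs  = np.array([11.1, 27.0, 38.5, 58.2, 69.0])

      ratio = np.log10(pred / obs)
      bias_factor = 10 ** ratio.mean()            # ~1 means unbiased
      accuracy_factor = 10 ** np.abs(ratio).mean()
      print(f"Bf = {bias_factor:.2f}, Af = {accuracy_factor:.2f}")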

  4. A neuromathematical model of human information processing and its application to science content acquisition

    NASA Astrophysics Data System (ADS)

    Anderson, O. Roger

    The rate of information processing during science learning and the efficiency of the learner in mobilizing relevant information in long-term memory as an aid in transmitting newly acquired information to stable storage in long-term memory are fundamental aspects of science content acquisition. These cognitive processes, moreover, may be substantially related in tempo and quality of organization to the efficiency of higher thought processes such as divergent thinking and problem-solving ability that characterize scientific thought. As a contribution to our quantitative understanding of these fundamental information processes, a mathematical model of information acquisition is presented and empirically evaluated in comparison to evidence obtained from experimental studies of science content acquisition. Computer-based models are used to simulate variations in learning parameters and to generate the theoretical predictions to be empirically tested. The initial tests of the predictive accuracy of the model show close agreement between predicted and actual mean recall scores in short-term learning tasks. Implications of the model for human information acquisition and possible future research are discussed in the context of the unique theoretical framework of the model.

  5. Mathematical simulation of the drying of suspensions and colloidal solutions by their depressurization

    NASA Astrophysics Data System (ADS)

    Lashkov, V. A.; Levashko, E. I.; Safin, R. G.

    2006-05-01

    Heat and mass transfer in the drying of high-humidity materials by depressurization has been investigated. The results of experimental investigation and mathematical simulation of the process are presented; they make it possible to determine the regularities of the process and to predict the quality of the finished product. A technological scheme and an engineering procedure for calculating the drying of the liquid base of a soap are presented.

  6. Application of a quality by design approach to the cell culture process of monoclonal antibody production, resulting in the establishment of a design space.

    PubMed

    Nagashima, Hiroaki; Watari, Akiko; Shinoda, Yasuharu; Okamoto, Hiroshi; Takuma, Shinya

    2013-12-01

    This case study describes the application of Quality by Design elements to the process of culturing Chinese hamster ovary cells in the production of a monoclonal antibody. All steps in the cell culture process and all process parameters in each step were identified by using a cause-and-effect diagram. Prospective risk assessment using failure mode and effects analysis identified the following four potential critical process parameters in the production culture step: initial viable cell density, culture duration, pH, and temperature. These parameters and lot-to-lot variability in raw material were then evaluated by process characterization utilizing a design of experiments approach consisting of a face-centered central composite design integrated with a full factorial design. Process characterization was conducted using a scaled down model that had been qualified by comparison with large-scale production data. Multivariate regression analysis was used to establish statistical prediction models for performance indicators and quality attributes; with these, we constructed contour plots and conducted Monte Carlo simulation to clarify the design space. The statistical analyses, especially for raw materials, identified set point values, which were most robust with respect to the lot-to-lot variability of raw materials while keeping the product quality within the acceptance criteria. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.
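
    A minimal sketch of the Monte Carlo step: propagate assumed set-point variability through a fitted response surface and estimate the probability that a quality attribute stays within acceptance limits. The quadratic surface, set points, and specification limits below are all hypothetical.

      # Hedged sketch of design-space screening by Monte Carlo: propagate
      # parameter variation through a made-up quadratic response model and
      # estimate the probability a CQA stays in specification.
      import numpy as np

      rng = np.random.default_rng(0)

      def cqa(ph, temp):                       # illustrative fitted surface
          return 95 + 4*(ph - 7.0) - 3*(temp - 36.5) - 6*(ph - 7.0)**2

      ph   = rng.normal(7.0, 0.05, 100_000)    # variability around set points
      temp = rng.normal(36.5, 0.3, 100_000)

      v = cqa(ph, temp)
      in_spec = (v >= 90) & (v <= 100)
      print("P(CQA in spec) ~", in_spec.mean())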

  7. Predicting lignin depolymerization yields from quantifiable properties using fractionated biorefinery lignins

    USDA-ARS?s Scientific Manuscript database

    Lignin depolymerization to aromatic monomers with high yields and selectivity is essential for the economic feasibility of many lignin-valorization strategies within integrated biorefining processes. Importantly, the quality and properties of the lignin source play an essential role in impacting the...

  8. The Evolution of Marketing in Education.

    ERIC Educational Resources Information Center

    Knight, Brent

    1982-01-01

    Looks at the progression of educational institutions through the five stages of Kotler's marketing process. Identifies anticipated changes and three activities critical for meeting the marketing challenge: research to predict consumer habits, attitudes, and needs; material and research development changes; and strict quality control and relevance…

  9. Extending the cost-benefit model of thermoregulation: high-temperature environments.

    PubMed

    Vickers, Mathew; Manicom, Carryn; Schwarzkopf, Lin

    2011-04-01

    The classic cost-benefit model of ectothermic thermoregulation compares energetic costs and benefits, providing a critical framework for understanding this process (Huey and Slatkin 1976). It considers the case where environmental temperature (T(e)) is less than the selected temperature of the organism (T(sel)), and it predicts that, to minimize increasing energetic costs of thermoregulation as habitat thermal quality declines, thermoregulatory effort should decrease until the lizard thermoconforms. We extended this model to include the case where T(e) exceeds T(sel), and we redefine costs and benefits in terms of fitness to include effects of body temperature (T(b)) on performance and survival. Our extended model predicts that lizards will increase thermoregulatory effort as habitat thermal quality declines, gaining the fitness benefits of optimal T(b) and maximizing the net benefit of activity. Further, to offset the disproportionately high fitness costs of high T(e) compared with low T(e), we predicted that lizards would thermoregulate more effectively at high values of T(e) than at low ones. We tested our predictions on three sympatric skink species (Carlia rostralis, Carlia rubrigularis, and Carlia storri) in hot savanna woodlands and found that thermoregulatory effort increased as thermal quality declined and that lizards thermoregulated most effectively at high values of T(e).

  10. Bayesian assurance and sample size determination in the process validation life-cycle.

    PubMed

    Faya, Paul; Seaman, John W; Stamey, James D

    2017-01-01

    Validation of pharmaceutical manufacturing processes is a regulatory requirement and plays a key role in the assurance of drug quality, safety, and efficacy. The FDA guidance on process validation recommends a life-cycle approach which involves process design, qualification, and verification. The European Medicines Agency makes similar recommendations. The main purpose of process validation is to establish scientific evidence that a process is capable of consistently delivering a quality product. A major challenge faced by manufacturers is the determination of the number of batches to be used for the qualification stage. In this article, we present a Bayesian assurance and sample size determination approach where prior process knowledge and data are used to determine the number of batches. An example is presented in which potency uniformity data is evaluated using a process capability metric. By using the posterior predictive distribution, we simulate qualification data and make a decision on the number of batches required for a desired level of assurance.
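
    The following sketch illustrates the assurance calculation under a simplified normal model (not the paper's capability metric): batch data are simulated from a posterior predictive distribution, and the probability that all n qualification batches meet specification is estimated. All numerical settings are assumptions.

      # Hedged sketch of Bayesian assurance: draw qualification-batch data
      # from a posterior predictive (simplified normal model) and estimate
      # the probability that a pass criterion holds for n batches.
      import numpy as np

      rng = np.random.default_rng(0)
      mu_post, sd_post = 100.0, 1.5     # assumed posterior for batch mean potency
      sigma = 2.0                       # assumed within-batch variability
      spec_lo, spec_hi = 95.0, 105.0    # assumed specification limits

      def assurance(n_batches, sims=20_000):
          mu = rng.normal(mu_post, sd_post, (sims, 1))
          batches = rng.normal(mu, sigma, (sims, n_batches))
          return ((batches > spec_lo) & (batches < spec_hi)).all(axis=1).mean()

      for n in (3, 5, 10):
          print(n, "batches -> assurance ~", round(assurance(n), 3))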

  11. Statistical modeling methods to analyze the impacts of multiunit process variability on critical quality attributes of Chinese herbal medicine tablets

    PubMed Central

    Sun, Fei; Xu, Bing; Zhang, Yi; Dai, Shengyun; Yang, Chan; Cui, Xianglong; Shi, Xinyuan; Qiao, Yanjiang

    2016-01-01

    The quality of Chinese herbal medicine tablets suffers from batch-to-batch variability due to a lack of manufacturing process understanding. In this paper, the Panax notoginseng saponins (PNS) immediate release tablet was taken as the research subject. By defining the dissolution of five active pharmaceutical ingredients and the tablet tensile strength as critical quality attributes (CQAs), influences of both the manipulated process parameters introduced by an orthogonal experiment design and the intermediate granules’ properties on the CQAs were fully investigated by different chemometric methods, such as the partial least squares, the orthogonal projection to latent structures, and the multiblock partial least squares (MBPLS). By analyzing the loadings plots and variable importance in the projection indexes, the granule particle sizes and the minimal punch tip separation distance in tableting were identified as critical process parameters. Additionally, the MBPLS model suggested that the lubrication time in the final blending was also important in predicting tablet quality attributes. From the calculated block importance in the projection indexes, the tableting unit was confirmed to be the critical process unit of the manufacturing line. The results demonstrated that the combinatorial use of different multivariate modeling methods could help in understanding the complex process relationships as a whole. The output of this study can then be used to define a control strategy to improve the quality of the PNS immediate release tablet. PMID:27932865

  12. The Effect of Service Quality on Patient loyalty: a Study of Private Hospitals in Tehran, Iran

    PubMed Central

    Arab, M; Tabatabaei, SM Ghazi; Rashidian, A; Forushani, A Rahimi; Zarei, E

    2012-01-01

    Background: Service quality is perceived as an important factor for developing patient loyalty. The aim of this study was to determine hospital service quality from the patients' viewpoints and the relative importance of quality dimensions in predicting patient loyalty. Methods: A cross-sectional study was conducted in 2010. The study sample was composed of 943 patients selected from eight private general hospitals in Tehran. The survey instrument was a questionnaire that included 24 items about service quality and 3 items about patient loyalty. Exploratory factor analysis was employed to extract the dimensions of service quality. Regression analysis was then performed to determine the relative importance of the service quality dimensions in predicting patient loyalty. Results: The mean scores of service quality and patient loyalty were 3.99 and 4.16 out of 5, respectively. About 29% of the loyalty variance was explained by the service quality dimensions. Four quality dimensions (Costing, Process Quality, Interaction Quality and Environment Quality) were found to be key determinants of patient loyalty in the private hospitals of Tehran. Conclusion: The patients' experience of private hospitals' services has a strong impact on outcome variables such as willingness to return to the same hospital and reuse its services or recommend them to others. The relationship between service quality and patient loyalty proves the strategic importance of improving service quality for attracting and retaining patients and expanding market share. PMID:23193509

  13. Statistical process control: separating signal from noise in emergency department operations.

    PubMed

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
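
    A minimal sketch of the individuals/moving-range (XmR) chart advocated above, using the standard limits mean ± 2.66 × (mean moving range); the daily throughput times are invented, with one deliberately shifted day.

      # Hedged sketch of an individuals/moving-range (XmR) chart, the SPC
      # tool the review recommends for ED operations; the daily
      # door-to-doctor times below are invented.
      import numpy as np

      x = np.array([42, 41, 43, 40, 42, 44, 41, 43, 42, 40, 75, 42], float)
      mr = np.abs(np.diff(x))                  # moving ranges of successive points
      mr_bar = mr.mean()

      ucl = x.mean() + 2.66 * mr_bar           # 2.66 = 3 / d2, with d2 = 1.128 for n=2
      lcl = x.mean() - 2.66 * mr_bar

      signals = np.where((x > ucl) | (x < lcl))[0]
      print("special-cause points at index:", signals)   # expect index 10 (the 75-minute day)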

  14. Experimental Errors in QSAR Modeling Sets: What We Can Do and What We Cannot Do.

    PubMed

    Zhao, Linlin; Wang, Wenyi; Sedykh, Alexander; Zhu, Hao

    2017-06-30

    Numerous chemical data sets have become available for quantitative structure-activity relationship (QSAR) modeling studies. However, the quality of different data sources may be different based on the nature of experimental protocols. Therefore, potential experimental errors in the modeling sets may lead to the development of poor QSAR models and further affect the predictions of new compounds. In this study, we explored the relationship between the ratio of questionable data in the modeling sets, which was obtained by simulating experimental errors, and the QSAR modeling performance. To this end, we used eight data sets (four continuous endpoints and four categorical endpoints) that have been extensively curated both in-house and by our collaborators to create over 1800 various QSAR models. Each data set was duplicated to create several new modeling sets with different ratios of simulated experimental errors (i.e., randomizing the activities of part of the compounds) in the modeling process. A fivefold cross-validation process was used to evaluate the modeling performance, which deteriorates when the ratio of experimental errors increases. All of the resulting models were also used to predict external sets of new compounds, which were excluded at the beginning of the modeling process. The modeling results showed that the compounds with relatively large prediction errors in cross-validation processes are likely to be those with simulated experimental errors. However, after removing a certain number of compounds with large prediction errors in the cross-validation process, the external predictions of new compounds did not show improvement. Our conclusion is that the QSAR predictions, especially consensus predictions, can identify compounds with potential experimental errors. But removing those compounds by the cross-validation procedure is not a reasonable means to improve model predictivity due to overfitting.

  15. Experimental Errors in QSAR Modeling Sets: What We Can Do and What We Cannot Do

    PubMed Central

    2017-01-01

    Numerous chemical data sets have become available for quantitative structure–activity relationship (QSAR) modeling studies. However, the quality of different data sources may be different based on the nature of experimental protocols. Therefore, potential experimental errors in the modeling sets may lead to the development of poor QSAR models and further affect the predictions of new compounds. In this study, we explored the relationship between the ratio of questionable data in the modeling sets, which was obtained by simulating experimental errors, and the QSAR modeling performance. To this end, we used eight data sets (four continuous endpoints and four categorical endpoints) that have been extensively curated both in-house and by our collaborators to create over 1800 various QSAR models. Each data set was duplicated to create several new modeling sets with different ratios of simulated experimental errors (i.e., randomizing the activities of part of the compounds) in the modeling process. A fivefold cross-validation process was used to evaluate the modeling performance, which deteriorates when the ratio of experimental errors increases. All of the resulting models were also used to predict external sets of new compounds, which were excluded at the beginning of the modeling process. The modeling results showed that the compounds with relatively large prediction errors in cross-validation processes are likely to be those with simulated experimental errors. However, after removing a certain number of compounds with large prediction errors in the cross-validation process, the external predictions of new compounds did not show improvement. Our conclusion is that the QSAR predictions, especially consensus predictions, can identify compounds with potential experimental errors. But removing those compounds by the cross-validation procedure is not a reasonable means to improve model predictivity due to overfitting. PMID:28691113

  16. Prediction of Shrinkage Porosity Defect in Sand Casting Process of LM25

    NASA Astrophysics Data System (ADS)

    Rathod, Hardik; Dhulia, Jay K.; Maniar, Nirav P.

    2017-08-01

    In the present worldwide competitive environment, foundries need to perform productively, with the fewest rejections, and produce castings in the shortest lead time. It has become extremely difficult for foundries to meet demands for defect-free castings while meeting strict delivery schedules. The process of casting solidification is complex in nature. Prediction of shrinkage defects in metal casting is a critical concern in foundries and is one of the potential research areas in casting. Because of increasing pressure to improve quality and reduce cost, it is essential to upgrade the methodology currently used in foundries. In the present research work, a methodology for predicting shrinkage porosity defects in the sand casting process of LM25, using experimentation and ANSYS, is proposed. The objectives successfully achieved are the prediction of the shrinkage porosity distribution in Al-Si casting and the determination of the effectiveness of the investigated function for predicting shrinkage porosity, by correlating the results of the simulation studies with those obtained experimentally. The practical relevance of the research is reflected in the fact that the experiments were performed on nine different Y-junctions at a foundry, and the practical data obtained from these experiments were used for the simulations.

  17. Formation of the predicted training parameters in the form of a discrete information stream

    NASA Astrophysics Data System (ADS)

    Smolentseva, T. E.; Sumin, V. I.; Zolnikov, V. K.; Lavlinsky, V. V.

    2018-03-01

    In this work, the process of training is considered as a discrete information stream. At each stage of the process, the portions of training information and the quality of their assimilation are analyzed. Individual characteristics of the trainee, and the trainee's response to each portion of information in the corresponding sections, are defined. A control algorithm for training with a predicted number of control checks is considered, which makes it possible to determine what control action should be formed for the trainee. On the basis of this algorithm, a vector of probabilities of ignorance of elements of the training information is obtained. As a result of the conducted research, an algorithm for forming the predicted training parameters is developed. The task of comparing the experimentally obtained duration of training with the predicted duration is solved, and a conclusion is drawn on the efficiency of forming the predicted training parameters. A program complex is developed, based on the values of individual parameters obtained in experiments on each trainee, which makes it possible to calculate individual characteristics, to form ratings, and to monitor the change of training parameters over time.

  18. Hyperspectral imaging using near infrared spectroscopy to monitor coat thickness uniformity in the manufacture of a transdermal drug delivery system.

    PubMed

    Pavurala, Naresh; Xu, Xiaoming; Krishnaiah, Yellela S R

    2017-05-15

    Hyperspectral imaging using near infrared spectroscopy (NIRS) integrates spectroscopy and conventional imaging to obtain both spectral and spatial information about materials. The non-invasive and rapid nature of hyperspectral imaging using NIRS makes it a valuable process analytical technology (PAT) tool for in-process monitoring and control of the manufacturing process for transdermal drug delivery systems (TDS). The focus of this investigation was to develop and validate the use of near infrared (NIR) hyperspectral imaging to monitor coat thickness uniformity, a critical quality attribute (CQA) of TDS. Chemometric analysis was used to process the hyperspectral image, and a partial least squares (PLS) model was developed to predict the coat thickness of the TDS. The goodness of model fit and prediction were 0.9933 and 0.9933, respectively, indicating an excellent fit to the training data and also good predictability. The % prediction error (%PE) for internal and external validation samples was less than 5%, confirming the accuracy of the PLS model developed in the present study. The feasibility of hyperspectral imaging as a real-time process analytical tool for continuous processing was also investigated. When the PLS model was applied to detect deliberate variation in coating thickness, it was able to predict both the small and large variations as well as identify coating defects such as non-uniform regions and the presence of air bubbles. Published by Elsevier B.V.
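
    A sketch of the chemometric core, under stated assumptions: a PLS regression maps spectra to coat thickness and is scored by R2 and mean % prediction error (%PE). The "spectra" are simulated with a thickness-proportional signal rather than taken from a hyperspectral camera.

      # Hedged sketch: PLS regression mapping NIR-like spectra to coat
      # thickness, in the spirit of the chemometric model described;
      # the spectra are simulated, not hyperspectral-camera data.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import r2_score

      rng = np.random.default_rng(0)
      thickness = rng.uniform(50, 150, 200)                 # micrometres (made up)
      channels = np.linspace(0, 1, 256)
      spectra = np.outer(thickness, channels) + rng.normal(0, 2.0, (200, 256))

      X_tr, X_te, y_tr, y_te = train_test_split(spectra, thickness, random_state=0)
      pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
      y_hat = pls.predict(X_te).ravel()
      print("R2 =", round(r2_score(y_te, y_hat), 3),
            "%PE =", round(100 * np.abs((y_hat - y_te) / y_te).mean(), 2))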

  19. Process Parameter Optimization for Wobbling Laser Spot Welding of Ti6Al4V Alloy

    NASA Astrophysics Data System (ADS)

    Vakili-Farahani, F.; Lungershausen, J.; Wasmer, K.

    Laser beam welding (LBW) coupled with the "wobble effect" (fast oscillation of the laser beam) is very promising for the high-precision micro-joining industry. For this process, as in conventional LBW, the welding process parameters play a very significant role in determining the quality of a weld joint. Consequently, four process parameters (laser power, wobble frequency, number of rotations within a single laser pulse, and focus position) and five responses (penetration, width, area of the fusion zone, area of the heat-affected zone (HAZ), and hardness) were investigated for spot welding of Ti6Al4V alloy (grade 5) using a design of experiments (DoE) approach. This paper presents experimental results showing the effects of varying the most important process parameters on the spot weld quality of Ti6Al4V alloy. Semi-empirical mathematical models were developed to correlate the laser welding parameters to each of the measured weld responses. The adequacy of the models was then examined by methods such as ANOVA. These models not only allow a better understanding of the wobble laser welding process and prediction of the process performance, but also determine optimal process parameters. Finally, the optimal combination of process parameters was determined against a set of quality criteria.

  20. The dynamics of narrative writing in primary grade children: writing process factors predict story quality.

    PubMed

    von Koss Torkildsen, Janne; Morken, Frøydis; Helland, Wenche A; Helland, Turid

    In this study of third grade school children, we investigated the association between writing process measures recorded with key stroke logging and the final written product. Moreover, we examined the cognitive predictors of writing process and product measures. Analyses of key strokes showed that while most children spontaneously made local online revisions while writing, few revised previously written text. Children with good reading and spelling abilities made more online revisions than their peers. Two process factors, transcription fluency and online revision activity, contributed to explaining variance in narrative macrostructural quality and story length. As for cognitive predictors, spelling was the only factor that gave a unique contribution to explaining variance in writing process factors. Better spelling was associated with more revisions and faster transcription. The results show that developing writers' ability to make online revisions in creative writing tasks is related to both the quality of the final written product and to individual literacy skills. More generally, the findings indicate that investigations of the dynamics of the writing process may provide insights into the factors that contribute to creative writing during early stages of literacy.

  1. Predicting Acceptance of Diversity in Pre-Kindergarten Classrooms

    ERIC Educational Resources Information Center

    Sanders, Kay; Downer, Jason

    2012-01-01

    This study examined classroom-level contributors to an acceptance of diversity in publicly supported pre-kindergarten classrooms across 11 states. Classroom composition, process quality, and teacher characteristics were examined as predictors of diversity-promoting practices as measured by the ECERS-R, acceptance of diversity construct. Findings…

  2. Improving the accuracy of electronic moisture meters for runner-type peanuts

    USDA-ARS?s Scientific Manuscript database

    Runner-type peanut kernel moisture content (MC) is measured periodically during curing and post harvest processing with electronic moisture meters for marketing and quality control. MC is predicted for 250 g samples of kernels with a mathematical function from measurements of various physical prope...

  3. Scalable Predictive Analysis in Critically Ill Patients Using a Visual Open Data Analysis Platform

    PubMed Central

    Poucke, Sven Van; Zhang, Zhongheng; Schmitz, Martin; Vukicevic, Milan; Laenen, Margot Vander; Celi, Leo Anthony; Deyne, Cathy De

    2016-01-01

    With the accumulation of large amounts of health related data, predictive analytics could stimulate the transformation of reactive medicine towards Predictive, Preventive and Personalized (PPPM) Medicine, ultimately affecting both cost and quality of care. However, the high dimensionality and high complexity of the data involved prevent data-driven methods from easy translation into clinically relevant models. Additionally, the application of cutting-edge predictive methods and data manipulation requires substantial programming skills, limiting direct exploitation by medical domain experts. This leaves a gap between potential and actual data usage. In this study, the authors address this problem by focusing on open, visual environments suited to be applied by the medical community. Moreover, we review code-free applications of big data technologies. As a showcase, a framework was developed for the meaningful use of data from critical care patients by integrating the MIMIC-II database in a data mining environment (RapidMiner) supporting scalable predictive analytics using visual tools (RapidMiner's Radoop extension). Guided by the CRoss-Industry Standard Process for Data Mining (CRISP-DM), the ETL process (Extract, Transform, Load) was initiated by retrieving data from the MIMIC-II tables of interest. As a use case, the correlation of platelet count and ICU survival was quantitatively assessed. Using visual tools for ETL on Hadoop and predictive modeling in RapidMiner, we developed robust processes for the automatic building, parameter optimization, and evaluation of various predictive models under different feature selection schemes. Because these processes can be easily adopted in other projects, this environment is attractive for scalable predictive analytics in health research. PMID:26731286

  4. Scalable Predictive Analysis in Critically Ill Patients Using a Visual Open Data Analysis Platform.

    PubMed

    Van Poucke, Sven; Zhang, Zhongheng; Schmitz, Martin; Vukicevic, Milan; Laenen, Margot Vander; Celi, Leo Anthony; De Deyne, Cathy

    2016-01-01

    With the accumulation of large amounts of health related data, predictive analytics could stimulate the transformation of reactive medicine towards Predictive, Preventive and Personalized (PPPM) Medicine, ultimately affecting both cost and quality of care. However, the high dimensionality and high complexity of the data involved prevent data-driven methods from easy translation into clinically relevant models. Additionally, the application of cutting-edge predictive methods and data manipulation requires substantial programming skills, limiting direct exploitation by medical domain experts. This leaves a gap between potential and actual data usage. In this study, the authors address this problem by focusing on open, visual environments suited to be applied by the medical community. Moreover, we review code-free applications of big data technologies. As a showcase, a framework was developed for the meaningful use of data from critical care patients by integrating the MIMIC-II database in a data mining environment (RapidMiner) supporting scalable predictive analytics using visual tools (RapidMiner's Radoop extension). Guided by the CRoss-Industry Standard Process for Data Mining (CRISP-DM), the ETL process (Extract, Transform, Load) was initiated by retrieving data from the MIMIC-II tables of interest. As a use case, the correlation of platelet count and ICU survival was quantitatively assessed. Using visual tools for ETL on Hadoop and predictive modeling in RapidMiner, we developed robust processes for the automatic building, parameter optimization, and evaluation of various predictive models under different feature selection schemes. Because these processes can be easily adopted in other projects, this environment is attractive for scalable predictive analytics in health research.

  5. Investigation of the emissions and profiles of a wide range of VOCs during the Clean air for London project

    NASA Astrophysics Data System (ADS)

    Holmes, Rachel; Lidster, Richard; Hamilton, Jacqueline; Lee, James; Hopkins, James; Whalley, Lisa; Lewis, Alistair

    2014-05-01

    The majority of the world's population lives in polluted urbanized areas. Poor air quality is shortening the life expectancy of people in the UK by an average of 7-8 months and costs society around £20 billion per year.[1] Despite this, our understanding of atmospheric processing in urban environments and its effect on air quality is incomplete. Air quality models are used to predict how air quality changes given different concentrations of pollution precursors, such as volatile organic compounds (VOCs). The urban environments of megacities pose a unique challenge for air quality measurements and modelling, due to high population densities, pollution levels and complex infrastructure. For over 60 years the air quality in London has been monitored; however, the existing measurements are limited to a small group of compounds. In order to fully understand the chemical and physical processes that occur in London, more intensive and comprehensive measurements should be made. The Clean air for London (ClearfLo) project was conducted to investigate the air quality, in particular the boundary layer pollution, of London. A relatively new technique, comprehensive two-dimensional gas chromatography (GC×GC),[2] was combined with a well-established dual channel GC (DC-GC) system [3] to provide a more comprehensive measurement of VOCs. A total of 78 individual VOCs (36 aliphatics, 19 monoaromatics, 21 oxygenated and 2 halogenated) and 10 groups of VOCs (8 aliphatic, 1 monoaromatic and 1 monoterpene) from C1-C13+ were quantified. Seasonal and diurnal profiles of these VOCs have been found which show the influence of emission source and chemical processing. Including these extra VOCs should enhance the prediction capability of air quality models, thus informing policy makers on how to potentially improve air quality in megacities. References 1. House of Commons Environmental Audit Committee, Air Quality: A follow-up report, Ninth Report of session 2012-12. 2. Lidster, R.T., J.F. Hamilton, and A.C. Lewis, The application of two total transfer valve modulators for comprehensive two-dimensional gas chromatography of volatile organic compounds. Journal of Separation Science, 2011. 34(7): p. 812-821. 3. Hopkins, J.R., C.E. Jones, and A.C. Lewis, A dual channel gas chromatograph for atmospheric analysis of volatile organic compounds including oxygenated and monoterpene compounds. Journal of Environmental Monitoring, 2011. 13(8): p. 2268-2276.

  6. Prediction of the Vickers Microhardness and Ultimate Tensile Strength of AA5754 H111 Friction Stir Welding Butt Joints Using Artificial Neural Network

    PubMed Central

    De Filippis, Luigi Alberto Ciro; Serio, Livia Maria; Facchini, Francesco; Mummolo, Giovanni; Ludovico, Antonio Domenico

    2016-01-01

    A simulation model was developed for the monitoring, control, and optimization of the Friction Stir Welding (FSW) process. This approach makes it possible to identify correlations between the process parameters (input variables) and the mechanical properties (output responses) of welded AA5754 H111 aluminum plates. The optimization of technological parameters is a basic requirement for increasing seam quality, since it promotes a stable and defect-free process. The tool rotation and travel speed, the position of the samples extracted from the weld bead, and the thermal data detected with thermographic techniques for on-line control of the joints were varied to build the experimental plans. The quality of the joints was evaluated through destructive and non-destructive tests (visual tests, macrographic analysis, tensile tests, Vickers indentation hardness tests, and thermographic controls). The simulation model was based on Artificial Neural Networks (ANNs) with a back-propagation learning algorithm and different types of architecture, which were able to predict with good reliability the FSW process parameters for the welding of AA5754 H111 aluminum plates in a butt-joint configuration. PMID:28774035
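
    As a sketch of the modeling idea (not the authors' network), a small back-propagation regressor below maps two FSW process parameters to an invented tensile-strength response; the parameter ranges and the response surface are assumptions for illustration only.

      # Hedged sketch: a small back-propagation network mapping FSW process
      # parameters to a mechanical response; data and ranges are synthetic.
      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(0)
      rot = rng.uniform(800, 1600, 150)        # tool rotation, rpm (assumed range)
      speed = rng.uniform(100, 400, 150)       # travel speed, mm/min (assumed range)
      uts = (220 - 1e-4 * (rot - 1200)**2 - 0.05 * np.abs(speed - 250)
             + rng.normal(0, 2, 150))          # invented response surface

      X = np.column_stack([rot, speed])
      model = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                                         random_state=0))
      model.fit(X, uts)
      print("predicted UTS at (1200 rpm, 250 mm/min):",
            model.predict([[1200, 250]])[0].round(1))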

  7. On-line monitoring of extraction process of Flos Lonicerae Japonicae using near infrared spectroscopy combined with synergy interval PLS and genetic algorithm

    NASA Astrophysics Data System (ADS)

    Yang, Yue; Wang, Lei; Wu, Yongjiang; Liu, Xuesong; Bi, Yuan; Xiao, Wei; Chen, Yong

    2017-07-01

    There is a growing need for effective on-line process monitoring during the manufacture of traditional Chinese medicine (TCM) to ensure quality consistency. In this study, the potential of the near infrared (NIR) spectroscopy technique to monitor the extraction process of Flos Lonicerae Japonicae was investigated. A new algorithm, synergy interval PLS with genetic algorithm (Si-GA-PLS), was proposed for modeling. Four different PLS models, namely Full-PLS, Si-PLS, GA-PLS, and Si-GA-PLS, were established, and their performances in predicting two quality parameters (viz. total acid and soluble solids contents) were compared. The Si-GA-PLS model gave the best results, owing to the combination of the strengths of Si-PLS and GA. For Si-GA-PLS, the determination coefficient (Rp2) and the root-mean-square error of prediction (RMSEP) were 0.9561 and 147.6544 µg/ml for total acid, and 0.9062 and 0.1078% for soluble solids content, respectively. The overall results demonstrated that the NIR spectroscopy technique combined with Si-GA-PLS calibration is a reliable and non-destructive alternative method for on-line monitoring of the extraction process of TCM on the production scale.

  8. How good are the Garvey-Kelson predictions of nuclear masses?

    NASA Astrophysics Data System (ADS)

    Morales, Irving O.; López Vieyra, J. C.; Hirsch, J. G.; Frank, A.

    2009-09-01

    The Garvey-Kelson relations are used in an iterative process to predict nuclear masses in the neighborhood of nuclei with measured masses. Average errors in the predicted masses for the first three iteration shells are smaller than those obtained with the best nuclear mass models. Their quality is comparable with that of the Audi-Wapstra extrapolations, offering a simple and reproducible procedure for short-range mass predictions. A systematic study of the way the error grows as a function of the iteration and of the distance to the region of known masses shows that a correlation exists between the error and the residual neutron-proton interaction, produced mainly by the implicit assumption that V varies smoothly along the nuclear landscape.
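
    For reference, one commonly quoted (transverse) Garvey-Kelson relation links six neighbouring masses:

      M(N-2, Z+2) - M(N, Z) + M(N-1, Z) - M(N-2, Z+1) + M(N, Z+1) - M(N-1, Z+2) ≈ 0

    This relation holds exactly for any mass surface of the form M = f(N) + g(Z) + h(N+Z), so deviations probe residual (largely neutron-proton) interactions; solving it for the single unknown mass among the six, given five measured neighbours, is what drives the iterative extrapolation described above.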

  9. Optimization of thermal processing of canned mussels.

    PubMed

    Ansorena, M R; Salvadori, V O

    2011-10-01

    The design and optimization of thermal processing of solid-liquid food mixtures, such as canned mussels, requires knowledge of the thermal history at the slowest heating point. In general, this point does not coincide with the geometrical center of the can, and the results show that it is located along the axial axis at a height that depends on the brine content. In this study, a mathematical model for the prediction of the temperature at this point was developed using the discrete transfer function approach. Transfer function coefficients were experimentally obtained, and prediction equations were fitted to account for other can dimensions and sampling intervals. This model was coupled with an optimization routine in order to search for retort temperature profiles that maximize a quality index. Both constant retort temperature (CRT) and variable retort temperature (VRT; discrete step-wise and exponential) profiles were considered. In the CRT process, the optimal retort temperature was always between 134 °C and 137 °C, and high values of thiamine retention were achieved. A significant improvement in the surface quality index was obtained for optimal VRT profiles compared to optimal CRT. The optimization procedure shown in this study produces results that justify its utilization in the industry.

  10. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold: we describe an experimental methodology using a data structure called the debugging graph, and we apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further, we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  11. Implementation of a GPS-RO data processing system for the KIAPS-LETKF data assimilation system

    NASA Astrophysics Data System (ADS)

    Kwon, H.; Kang, J.-S.; Jo, Y.; Kang, J. H.

    2014-11-01

    The Korea Institute of Atmospheric Prediction Systems (KIAPS) has been developing a new global numerical weather prediction model and an advanced data assimilation system. As part of the KIAPS Package for Observation Processing (KPOP) system for data assimilation, preprocessing and quality control modules for bending angle measurements of global positioning system radio occultation (GPS-RO) data have been implemented and examined. The GPS-RO data processing system is composed of several steps that check observation locations, missing values, and physical values for the Earth radius of curvature and geoid undulation. An observation-minus-background check is implemented by use of a one-dimensional observational bending angle operator, and tangent point drift is also considered in the quality control process. We have tested GPS-RO observations utilized by the Korean Meteorological Administration (KMA) within KPOP, based on both the KMA global model and the National Center for Atmospheric Research (NCAR) Community Atmosphere Model-Spectral Element (CAM-SE) as a model background. Background fields from the CAM-SE model are incorporated for the preparation of assimilation experiments with the KIAPS-LETKF data assimilation system, which has been successfully implemented for a cubed-sphere model with fully unstructured quadrilateral meshes. As a result of data processing, the bending angle departure statistics between observation and background show significant improvement. Also, the first experiment in assimilating GPS-RO bending angles from KPOP within KIAPS-LETKF shows encouraging results.

  12. YouTube as a Potential Training Resource for Laparoscopic Fundoplication.

    PubMed

    Frongia, Giovanni; Mehrabi, Arianeb; Fonouni, Hamidreza; Rennert, Helga; Golriz, Mohammad; Günther, Patrick

    To analyze the surgical proficiency and educational quality of YouTube videos demonstrating laparoscopic fundoplication (LF). In this cross-sectional study, a search was performed on YouTube for videos demonstrating the LF procedure. Surgical and educational proficiency was evaluated using the objective component rating scale, the educational quality rating score, and the total video quality score. Statistical significance was determined by analysis of variance, receiver operating characteristic curve, and odds ratio analysis. A total of 71 videos were included in the study; 28 (39.4%) videos were evaluated as good, 23 (32.4%) as moderate, and 20 (28.2%) as poor. Good-rated videos were significantly longer (good, 22.0 ± 5.2 min; moderate, 7.8 ± 0.9 min; poor, 8.5 ± 1.0 min; p = 0.007) and video duration was predictive of good quality (AUC, 0.672 ± 0.067; 95% CI: 0.541-0.802; p = 0.015). For good quality, the cut-off video duration was 7:42 minutes. This cut-off value had a sensitivity of 67.9%, a specificity of 60.5%, and an odds ratio of 3.23 (95% CI: 1.19-8.79; p = 0.022) in predicting good quality. Videos uploaded from industrial sources and with a higher views/days-online ratio had higher objective component rating scale and total video quality scores. In contrast, the likes/dislikes ratio was not predictive of video quality. Many videos showing the LF procedure have been uploaded to YouTube with varying degrees of quality. A process for filtering LF videos with high surgical and educational quality is feasible by evaluating the video duration, uploading source, and views/days-online ratio. However, alternative video platforms aimed at professionals should also be considered for educational purposes. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  13. Predicting cyanobacterial abundance, microcystin, and geosmin in a eutrophic drinking-water reservoir using a 14-year dataset

    USGS Publications Warehouse

    Harris, Ted D.; Graham, Jennifer L.

    2017-01-01

    Cyanobacterial blooms degrade water quality in drinking water supply reservoirs by producing toxic and taste-and-odor causing secondary metabolites, which ultimately cause public health concerns and lead to increased treatment costs for water utilities. There have been numerous attempts to create models that predict cyanobacteria and their secondary metabolites, most using linear models; however, linear models are limited by assumptions about the data and have had limited success as predictive tools. Thus, lake and reservoir managers need improved modeling techniques that can accurately predict large bloom events that have the highest impact on recreational activities and drinking-water treatment processes. In this study, we compared 12 unique linear and nonlinear regression modeling techniques to predict cyanobacterial abundance and the cyanobacterial secondary metabolites microcystin and geosmin using 14 years of physicochemical water quality data collected from Cheney Reservoir, Kansas. Support vector machine (SVM), random forest (RF), boosted tree (BT), and Cubist modeling techniques were the most predictive of the compared modeling approaches. SVM, RF, and BT modeling techniques were able to successfully predict cyanobacterial abundance, microcystin, and geosmin concentrations <60,000 cells/mL, 2.5 µg/L, and 20 ng/L, respectively. Only Cubist modeling predicted maximum concentrations of cyanobacteria and geosmin; no modeling technique was able to predict maximum microcystin concentrations. Because maximum concentrations are a primary concern for lake and reservoir managers, Cubist modeling may help predict the largest and most noxious concentrations of cyanobacteria and their secondary metabolites.
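
    A minimal sketch of the random-forest style regression compared in the study is given below; the predictor names, thresholds, and data are hypothetical, not the Cheney Reservoir record.

```python
# Illustrative nonlinear regression of a taste-and-odor compound on
# physicochemical predictors, in the spirit of the RF approach above.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
# Columns: water temperature [deg C], total phosphorus [ug/L], pH, turbidity [NTU]
X = rng.uniform([5, 10, 7.0, 1], [30, 200, 9.5, 50], size=(500, 4))
# Synthetic geosmin response [ng/L] with a nonlinear pH interaction
geosmin = 2 + 0.4 * X[:, 0] + 0.05 * X[:, 1] * (X[:, 2] > 8.5) \
          + rng.normal(0, 1, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, geosmin, random_state=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out data:", rf.score(X_te, y_te))
```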

  14. Statistical and engineering methods for model enhancement

    NASA Astrophysics Data System (ADS)

    Chang, Chia-Jung

    Models which describe the performance of physical processes are essential for quality prediction, experimental planning, process control and optimization. Engineering models developed from the underlying physics/mechanics of the process, such as analytic models or finite element models, are widely used to capture the deterministic trend of the process. However, there usually exists stochastic randomness in the system, which may introduce discrepancy between physics-based model predictions and observations in reality. Alternatively, statistical models can be developed to obtain predictions purely based on the data generated from the process. However, such models tend to perform poorly when predictions are made away from the observed data points. This dissertation contributes to model enhancement research by integrating physics-based and statistical models to mitigate their individual drawbacks and provide models with better accuracy by combining the strengths of both. The proposed model enhancement methodologies include two streams: (1) a data-driven enhancement approach and (2) an engineering-driven enhancement approach. Through these efforts, more adequate models are obtained, which leads to better performance in system forecasting, process monitoring and decision optimization. Among data-driven enhancement approaches, the Gaussian Process (GP) model provides a powerful methodology for calibrating a physical model in the presence of model uncertainties. However, if the data contain systematic experimental errors, the GP model can lead to an unnecessarily complex adjustment of the physical model. In Chapter 2, we propose a novel enhancement procedure, named "Minimal Adjustment", which brings the physical model closer to the data by making minimal changes to it. This is achieved by approximating the GP model by a linear regression model and then applying simultaneous variable selection over the model and experimental bias terms. Two real examples and simulations are presented to demonstrate the advantages of the proposed approach. Rather than enhancing the model from a data-driven perspective, an alternative approach is to adjust the model by incorporating additional domain or engineering knowledge when available. This often leads to models that are very simple and easy to interpret. The concepts of engineering-driven enhancement are carried out through two applications to demonstrate the proposed methodologies. In the first application, where polymer composite quality is the focus, nanoparticle dispersion has been identified as a crucial factor affecting the mechanical properties. Transmission Electron Microscopy (TEM) images are commonly used to represent nanoparticle dispersion without further quantification of its characteristics. In Chapter 3, we develop an engineering-driven nonhomogeneous Poisson random field modeling strategy to characterize the nanoparticle dispersion status of nanocomposite polymer, which quantitatively represents the nanomaterial quality presented through image data. The model parameters are estimated through the Bayesian MCMC technique to overcome the challenge of the limited amount of accessible data due to time-consuming sampling schemes. The second application statistically calibrates the engineering-driven force models of the laser-assisted micro milling (LAMM) process, which facilitates a systematic understanding and optimization of the targeted processes. In Chapter 4, the force prediction interval is derived by incorporating the variability in the runout parameters as well as the variability in the measured cutting forces. The experimental results indicate that the model predicts the cutting force profile with good accuracy using a 95% confidence interval. To conclude, this dissertation draws attention to model enhancement, which has considerable impact on the modeling, design, and optimization of various processes and systems. The fundamental methodologies of model enhancement are developed and applied to various applications. These research activities develop engineering-compliant models for adequate system predictions based on observational data with complex variable relationships and uncertainty, which facilitate process planning, monitoring, and real-time control.
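
    As a hedged sketch of the data-driven enhancement idea (physics model plus learned discrepancy), the snippet below fits a Gaussian Process to the residuals between a placeholder physics model and synthetic observations. It is not the dissertation's Minimal Adjustment procedure, which additionally approximates the GP by a linear model with variable selection.

```python
# Data-driven enhancement sketch: enhanced(x) = physics(x) + GP discrepancy(x).
# physics_model and the observations are illustrative stand-ins.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def physics_model(x):
    return 2.0 * np.sin(x)          # placeholder deterministic model

rng = np.random.default_rng(3)
x_obs = np.linspace(0, 6, 25)
# Observations contain a trend the physics model misses (0.3 * x)
y_obs = 2.0 * np.sin(x_obs) + 0.3 * x_obs + rng.normal(0, 0.1, x_obs.size)

residuals = y_obs - physics_model(x_obs)
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(x_obs.reshape(-1, 1), residuals)

x_new = np.array([[2.5]])
enhanced = physics_model(x_new.ravel()) + gp.predict(x_new)
print(enhanced)   # physics prediction corrected by the learned discrepancy
```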

  15. Toward automated biochemotype annotation for large compound libraries.

    PubMed

    Chen, Xian; Liang, Yizeng; Xu, Jun

    2006-08-01

    Combinatorial chemistry allows scientists to probe large synthetically accessible chemical space. However, identifying the sub-space which is selectively associated with a biological target of interest is crucial to drug discovery and the life sciences. This paper describes a process to automatically annotate biochemotypes of compounds in a library and thus to identify bioactivity-related chemotypes (biochemotypes) from a large library of compounds. The process consists of two steps: (1) predicting all possible bioactivities for each compound in a library, and (2) deriving possible biochemotypes based on the predictions. The Prediction of Activity Spectra for Substances program (PASS) was used in the first step. In the second step, structural similarity and scaffold-hopping technologies are employed. These technologies are used to derive biochemotypes from bioactivity predictions and the corresponding annotated biochemotypes from the MDL Drug Data Report (MDDR) database. A library of about one million (982,889) commercially available compounds (CACL) was tested using this process. This paper demonstrates the feasibility of automatically annotating biochemotypes for large libraries of compounds. Nevertheless, some issues need to be considered in order to improve the process. First, the prediction accuracy of the PASS program has no significant correlation with the number of compounds in a training set. Larger training sets do not necessarily increase the maximal error of prediction (MEP), nor do they increase the hit structural diversity. Smaller training sets do not necessarily decrease MEP, nor do they decrease the hit structural diversity. Second, the success of systematic bioactivity prediction relies on modeling, training data, and the definition of bioactivities (biochemotype ontology). Unfortunately, the biochemotype ontology was not well developed in the PASS program. Consequently, "ill-defined" bioactivities can reduce the quality of predictions. This paper suggests ways in which the systematic bioactivity prediction program should be improved.

  16. Untrained consumer assessment of the eating quality of beef: 1. A single composite score can predict beef quality grades.

    PubMed

    Bonny, S P F; Hocquette, J-F; Pethick, D W; Legrand, I; Wierzbicki, J; Allen, P; Farmer, L J; Polkinghorne, R J; Gardner, G E

    2017-08-01

    Quantifying consumer responses to beef across a broad range of demographics, nationalities and cooking methods is vitally important for any system evaluating beef eating quality. On the basis of previous work, it was expected that consumer scores would be highly accurate in determining quality grades for beef, thereby providing evidence that such a technique could form the basis of an eating quality grading system for beef. Following the Australian MSA (Meat Standards Australia) testing protocols, over 19 000 consumers from Northern Ireland, Poland, Ireland, France and Australia tasted cooked beef samples, then allocated them to a quality grade: unsatisfactory, good-every-day, better-than-every-day and premium. The consumers also scored beef samples for tenderness, juiciness, flavour-liking and overall-liking. The beef was sourced from all countries involved in the study and cooked by four different cooking methods and to three different degrees of doneness, with each experimental group in the study consisting of a single cooking doneness within a cooking method for each country. For each experimental group, and for the data set as a whole, a linear discriminant function was calculated, using the four sensory scores to predict the quality grade. This process was repeated using two conglomerate scores derived from weighting and combining the consumer sensory scores for tenderness, juiciness, flavour-liking and overall-liking: the original meat quality 4 score (oMQ4) (0.4, 0.1, 0.2, 0.3) and the current meat quality 4 score (cMQ4) (0.3, 0.1, 0.3, 0.3). From the results of these analyses, the optimal weightings of the sensory scores to generate an 'ideal meat quality 4 score' (MQ4) for each country were calculated, and the MQ4 values that reflected the boundaries between the four quality grades were determined. The oMQ4 weightings were far more accurate in categorising European meat samples than the cMQ4 weightings, highlighting that tenderness is more important than flavour to the consumer when determining quality. The accuracy of the discriminant analysis in predicting the consumer-scored quality grades was similar across all consumer groups, 68%, and similar to previously reported values. These results demonstrate that this technique, as used in the MSA system, could be used to predict consumer assessment of beef eating quality and therefore to underpin a commercial eating quality guarantee for all European consumers.
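
    The weighted-score-plus-discriminant procedure can be sketched as follows, using the cMQ4 weights (0.3, 0.1, 0.3, 0.3) quoted above; the consumer scores and grade boundaries are synthetic placeholders, not MSA data.

```python
# Combine the four sensory scores with the cMQ4 weights and fit a linear
# discriminant to predict the quality grade. Scores and grade cut-offs
# below are synthetic illustrations.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(5)
# Columns: tenderness, juiciness, flavour-liking, overall-liking (0-100)
scores = rng.uniform(0, 100, size=(300, 4))
cmq4 = scores @ np.array([0.3, 0.1, 0.3, 0.3])
# Hypothetical MQ4 boundaries between the four grades
grades = np.digitize(cmq4, [30, 50, 70])  # 0=unsatisfactory ... 3=premium

lda = LinearDiscriminantAnalysis().fit(scores, grades)
print(lda.predict([[75, 60, 80, 78]]))    # predicted quality grade
```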

  17. Laser Brazing with Beam Scanning: Experimental and Simulative Analysis

    NASA Astrophysics Data System (ADS)

    Heitmanek, M.; Dobler, M.; Graudenz, M.; Perret, W.; Göbel, G.; Schmidt, M.; Beyer, E.

    Laser beam brazing with copper-based filler wire is a widely established technology for joining zinc-coated steel plates in the body shop. Successful applications are the divided tailgate and the zero-gap joint, which represents the joint between the side panel and the roof top of the body-in-white. These joints are in direct view of the customer and therefore have to fulfil the highest optical quality requirements. For this reason a stable and efficient laser brazing process is essential. In this paper the current results on quality improvement due to one-dimensional laser beam deflection in the feed direction are presented. In addition to the experimental results, a transient three-dimensional simulation model for the laser beam brazing process is taken into account. With this model the influence of scanning parameters on filler wire temperature and melt pool characteristics is analyzed. The theoretical predictions are in good accordance with the experimental results. They show that the beam scanning approach is a very promising method to increase process stability and seam quality.

  18. Modeling Benthic Sediment Processes to Predict Water ...

    EPA Pesticide Factsheets

    The benthic sediment acts as a huge reservoir of particulate and dissolved material (within interstitial water) which can contribute to the loading of contaminants and nutrients to the water column. A benthic sediment model is presented in this report to predict spatial and temporal benthic fluxes of nutrients and chemicals into the water column in Narragansett Bay. Accounting for benthic flux is essential to properly model water quality and ecology in estuarine and coastal systems.

  19. Improving the Accuracy of Extracting Surface Water Quality Levels (SWQLs) Using Remote Sensing and Artificial Neural Network: a Case Study in the Saint John River, Canada

    NASA Astrophysics Data System (ADS)

    Sammartano, G.; Spanò, A.

    2017-09-01

    Delineating accurate surface water quality levels (SWQLs) presents a great challenge to researchers. Existing methods of assessing surface water quality only provide concentrations at individual monitoring stations without providing overall SWQLs, so their results are usually difficult for decision-makers to interpret. Conversely, the water quality index (WQI) can simplify the surface water quality assessment process and make it accessible to decision-makers. However, in most cases the WQI reflects inaccurate SWQLs due to the lack of representative water samples, which are costly and time consuming to obtain. To solve this problem, we introduce a cost-effective method that combines Landsat-8 imagery and artificial intelligence to develop models that derive representative water samples by correlating concentrations of ground-truth water samples with satellite spectral information. Our method was validated, and the correlation between concentrations of ground-truth water samples and concentrations predicted by the developed models reached a coefficient of determination (R2) above 0.80, indicating trustworthy predictions. Afterwards, the predicted concentrations over each pixel of the study area were used as input to the WQI developed by the Canadian Council of Ministers of the Environment to extract accurate SWQLs, for drinking purposes, in the Saint John River. The results indicated that the SWQL was 67 (Fair) and 59 (Marginal) for the lower and middle basins of the river, respectively. These findings demonstrate the potential of using our approach in surface water quality management.
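
    A compact sketch of the CCME WQI aggregation that the predicted per-pixel concentrations feed into is shown below, assuming the standard F1 (scope), F2 (frequency), F3 (amplitude) formulation; the variables, objectives, and sample values are illustrative.

```python
# CCME WQI 1.0 sketch with upper-limit objectives; inputs are illustrative.
import numpy as np

def ccme_wqi(samples, objectives):
    """samples: {variable: measurements}; objectives: {variable: upper limit}."""
    failed_vars = failed_tests = n_tests = 0
    excursions = 0.0
    for var, values in samples.items():
        vals = np.asarray(values, dtype=float)
        limit = objectives[var]
        exceed = vals > limit
        n_tests += vals.size
        failed_tests += int(exceed.sum())
        if exceed.any():
            failed_vars += 1
            excursions += float((vals[exceed] / limit - 1.0).sum())
    f1 = 100.0 * failed_vars / len(samples)   # scope: % of variables failing
    f2 = 100.0 * failed_tests / n_tests       # frequency: % of tests failing
    nse = excursions / n_tests                # normalized sum of excursions
    f3 = nse / (0.01 * nse + 0.01)            # amplitude
    return 100.0 - np.sqrt(f1**2 + f2**2 + f3**2) / 1.732

samples = {"turbidity": np.array([4.0, 6.5, 5.2]),   # NTU
           "nitrate":   np.array([8.0, 12.0, 9.5])}  # mg/L
objectives = {"turbidity": 5.0, "nitrate": 10.0}
print(round(ccme_wqi(samples, objectives), 1))
```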

  20. Arabinoxylan content and characterisation throughout the bread-baking process

    USDA-ARS?s Scientific Manuscript database

    End-use quality of wheat (Triticum aestivum L.) is influenced in a variety of ways by non-starch polysaccharides, especially arabinoxylans (AX). The assessment of AX content and structural properties is often performed on flour and extrapolated to predict the role that AX may play in baked products....

  1. A REGIONAL MODEL FOR PCDD/F'S BASED ON A PHOTOCHEMICAL MODEL FOR AIR QUALITY AND PARTICULATE MATTER

    EPA Science Inventory

    How important is gas to particle partitioning in predicting air concentrations and deposition of Poly-Chlorinated Dibenzo-p-Dioxins and Furans (PCDD/F's)? Literature indicates that the fate of emissions changes because the summation of atmospheric processes has a different balanc...

  2. Improving SWAT model prediction using an upgraded denitrification scheme and constrained auto calibration

    USDA-ARS?s Scientific Manuscript database

    The reliability of common calibration practices for process based water quality models has recently been questioned. A so-called “adequately calibrated model” may contain input errors not readily identifiable by model users, or may not realistically represent intra-watershed responses. These short...

  3. Green-ampt infiltration parameters in riparian buffers

    Treesearch

    L.M. Stahr; D.E. Eisenhauer; M.J. Helmers; Mike G. Dosskey; T.G. Franti

    2004-01-01

    Riparian buffers can improve surface water quality by filtering contaminants from runoff before they enter streams. Infiltration is an important process in riparian buffers. Computer models are often used to assess the performance of riparian buffers. Accurate prediction of infiltration by these models is dependent upon accurate estimates of infiltration parameters....
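
    For orientation, the Green-Ampt model named above can be solved for cumulative infiltration with a short fixed-point iteration, as sketched below; the soil parameters are typical textbook values, not riparian-buffer measurements.

```python
# Green-Ampt: cumulative infiltration F(t) satisfies
#   F = K*t + psi*dtheta*ln(1 + F/(psi*dtheta)),
# solved here by fixed-point iteration.
import math

def green_ampt_F(t, K, psi, dtheta, tol=1e-8):
    pd = psi * dtheta
    F = max(K * t, 1e-6)                   # initial guess
    while True:
        F_new = K * t + pd * math.log(1.0 + F / pd)
        if abs(F_new - F) < tol:
            return F_new
        F = F_new

# Typical sandy-loam values: K [cm/h], psi [cm], dtheta [-]
K, psi, dtheta = 1.09, 11.01, 0.247
F = green_ampt_F(t=2.0, K=K, psi=psi, dtheta=dtheta)
f = K * (1.0 + psi * dtheta / F)           # infiltration rate at t = 2 h
print(f"F = {F:.2f} cm, f = {f:.2f} cm/h")
```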

  4. Simplification of a light-based model for estimating final internode length in greenhouse cucumber canopies.

    PubMed

    Kahlen, Katrin; Stützel, Hartmut

    2011-10-01

    Light quantity and quality affect internode lengths in cucumber (Cucumis sativus), whereby leaf area and the optical properties of the leaves mainly control light quality within a cucumber plant community. This modelling study aimed at providing a simple, non-destructive method to predict final internode lengths (FILs) using light quantity and leaf area data. Several simplifications of a light quantity and quality sensitive model for estimating FILs in cucumber have been tested. The direct simplifications substitute the term for the red : far-red (R : FR) ratio with a term for (a) the leaf area index (LAI, m(2) m(-2)) or (b) partial LAI, the cumulative leaf area per m(2) ground, where leaf area per m(2) ground is accumulated from the top of each plant until a number, n, of leaves per plant is reached. The indirect simplifications estimate the input R : FR ratio based on partial leaf area and plant density. In all models, simulated FILs were in line with the measured FILs over various canopy architectures and light conditions, but the prediction quality varied. The indirect simplification based on the leaf area of ten leaves revealed the best fit with measured data. Its prediction quality was even higher than that of the original model. This study showed that for vertically trained cucumber plants, leaf area data can substitute local light quality data for estimating FIL data. In unstressed canopies, leaf area over the upper ten ranks seems to represent the feedback of the growing architecture on internode elongation with respect to light quality. This highlights the role of this domain of leaves as the primary source for the specific R : FR signal controlling the final length of an internode and could therefore guide future research on up-scaling local processes to the crop level.

  5. Modeling of weld bead geometry for rapid manufacturing by robotic GMAW

    NASA Astrophysics Data System (ADS)

    Yang, Tao; Xiong, Jun; Chen, Hui; Chen, Yong

    2015-03-01

    Weld-based rapid prototyping (RP) has shown great promise for fabricating complex 3D parts. During the layered deposition of metallic parts with robotic gas metal arc welding, the geometry of a single weld bead has an important influence on the surface finish quality, layer thickness and dimensional accuracy of the deposited layer. In order to obtain accurate, predictable and controllable bead geometry, it is essential to understand the relationships between the process variables and the bead geometry (bead width, bead height and ratio of bead width to bead height). This paper highlights an experimental study carried out to develop mathematical models that predict deposited bead geometry through the quadratic general rotary unitized design. The adequacy and significance of the models were verified via analysis of variance. Complicated cause-effect relationships between the process parameters and the bead geometry were revealed. Results show that the developed models can be applied to predict the desired bead geometry with great accuracy in layered deposition in accordance with the slicing process of RP.
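
    A second-order regression of bead geometry on the process variables, in the spirit of the quadratic rotary design described above, might look like the following; the variable ranges and response are synthetic placeholders.

```python
# Quadratic (second-order) response-surface regression of bead width on
# hypothetical process variables. Data are synthetic illustrations.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
# Columns: wire feed speed [m/min], welding speed [mm/s], arc voltage [V]
X = rng.uniform([3, 4, 18], [9, 12, 26], size=(50, 3))
# Synthetic bead width [mm] with an interaction term and noise
bead_width = 2 + 0.8 * X[:, 0] - 0.3 * X[:, 1] + 0.05 * X[:, 0] * X[:, 2] \
             + rng.normal(0, 0.1, 50)

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, bead_width)
print(model.predict([[6.0, 8.0, 22.0]]))  # predicted bead width [mm]
```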

  6. Understanding and Predicting the Process of Software Maintenance Releases

    NASA Technical Reports Server (NTRS)

    Basili, Victor; Briand, Lionel; Condon, Steven; Kim, Yong-Mi; Melo, Walcelio L.; Valett, Jon D.

    1996-01-01

    One of the major concerns of any maintenance organization is to understand and estimate the cost of maintenance releases of software systems. Planning the next release so as to maximize the increase in functionality and the improvement in quality is vital to successful maintenance management. The objective of this paper is to present the results of a case study in which an incremental approach was used to better understand the effort distribution of releases and build a predictive effort model for software maintenance releases. This study was conducted in the Flight Dynamics Division (FDD) of NASA Goddard Space Flight Center (GSFC). This paper presents three main results: 1) a predictive effort model developed for the FDD's software maintenance release process; 2) measurement-based lessons learned about the maintenance process in the FDD; and 3) a set of lessons learned about the establishment of a measurement-based software maintenance improvement program. In addition, this study provides insights and guidelines for obtaining similar results in other maintenance organizations.

  7. Predicting Mountainous Watershed Biogeochemical Dynamics, Including Response to Droughts and Early Snowmelt

    NASA Astrophysics Data System (ADS)

    Hubbard, S. S.; Williams, K. H.; Long, P.; Agarwal, D.; Banfield, J. F.; Beller, H. R.; Bouskill, N.; Brodie, E.; Maxwell, R. M.; Nico, P. S.; Steefel, C. I.; Steltzer, H.; Tokunaga, T. K.; Wainwright, H. M.

    2016-12-01

    Climate change, extreme weather, land-use change, and other perturbations are significantly reshaping interactions within watersheds throughout the world. While mountainous watersheds are recognized as the water towers of the world, hydrological processes in watersheds also mediate biogeochemical processes that support all terrestrial life. Developing predictive understanding of watershed hydrological and biogeochemical functioning is challenging, as complex interactions occurring within a heterogeneous watershed can lead to a cascade of effects on downstream water availability and quality. Although these interactions can have significant implications for energy production, agriculture, water quality, and other benefits valued by society, uncertainty associated with predicting watershed function is high. The Watershed Function project aims to substantially reduce this uncertainty through developing a predictive understanding of how mountainous watersheds retain and release downgradient water, nutrients, carbon, and metals. In particular, the project is exploring how early snowmelt, drought, and other disturbances will influence mountainous watershed dynamics at seasonal to decadal timescales. The Watershed Function project is being carried out in a headwater mountainous catchment of the Upper Colorado River Basin, within a watershed characterized by significant gradients in elevation, vegetation and hydrogeology. A system-within-system project perspective posits that the integrated watershed response to disturbances can be adequately predicted through consideration of interactions and feedbacks occurring within a limited number of subsystems, each having distinct vegetation-subsurface biogeochemical-hydrological characteristics. A key technological goal is the development of scale-adaptive simulation capabilities that can incorporate genomic information where and when it is useful for predicting the overall watershed response to disturbance. Through developing and integrating new microbial ecology, geochemical, hydrological, ecohydrological, computational and geophysical approaches, the project is developing new insights about biogeochemical dynamics from genome to watershed scales.

  8. Decision Making on the Labor and Delivery Unit: An Investigation of Influencing Factors.

    PubMed

    Gregory, Megan E; Sonesh, Shirley C; Feitosa, Jennifer; Benishek, Lauren E; Hughes, Ashley M; Salas, Eduardo

    2017-09-01

    Objective The aim of this study was to describe the relationship between negative affect (NA), decision-making style, time stress, and decision quality in health care. Background Health care providers must often make swift, high-stakes decisions. Influencing factors of the decision-making process in this context have been understudied. Method Within a sample of labor and delivery nurses, physicians, and allied personnel, we used self-report measures to examine the impact of trait factors, including NA, decision-making style, and perceived time stress, on decision quality in a situational judgment test (Study 1). In Study 2, we observed the influence of state NA, state decision-making style, state time stress, and their relationship with decision quality on real clinical decisions. Results In Study 1, we found that trait NA significantly predicted avoidant decision-making style. Furthermore, those who were higher on trait time stress and trait avoidant decision-making style exhibited poorer decisions. In Study 2, we observed associations between state NA with state avoidant and analytical decision-making styles. We also observed that these decision-making styles, when considered in tandem with time stress, were influential in predicting clinical decision quality. Conclusion NA predicts some decision-making styles, and decision-making style can affect decision quality under time stress. This is particularly true for state factors. Application Individual differences, such as affect and decision-making style, should be considered during selection. Training to reduce time stress perceptions should be provided.

  9. Predictors of affect following treatment decision-making for prostate cancer: conversations, cognitive processing, and coping.

    PubMed

    Christie, Kysa M; Meyerowitz, Beth E; Giedzinska-Simons, Antoinette; Gross, Mitchell; Agus, David B

    2009-05-01

    Research suggests that cancer patients who are more involved in treatment decision-making (TDM) report better quality of life following treatment. This study examines the association and possible mechanisms between prostate cancer patient's discussions about TDM and affect following treatment. We predicted that the length of time patients spent discussing treatment options with social networks and physicians prior to treatment would predict emotional adjustment after treatment. We further predicted that cognitive processing, coping, and patient understanding of treatment options would mediate this association. Fifty-seven patients completed questionnaires prior to treatment and at 1 and 6 months following treatment completion. Findings from the present study suggest that discussing treatment options with others, prior to beginning treatment for prostate cancer, significantly contributed to improvements in affect 1 and 6 months following treatment. Residualized regression analyses indicated that discussing treatment options with patient's social networks predicted a decrease in negative affect 1 and 6 months following treatment, while discussions with physicians predicted an increase in positive affect 1 month following treatment. Patients who spent more time discussing treatment options with family and friends also reported greater pre-treatment social support and emotional expression. Mediation analyses indicated that these coping strategies facilitated cognitive processing (as measured by a decrease in intrusive thoughts) and that cognitive processing predicted improvement in affect. Greater time spent talking with family and friends about treatment options may provide opportunities for patients to cope with their cancer diagnosis and facilitate cognitive processing, which may improve patient distress over time. Copyright (c) 2008 John Wiley & Sons Ltd.

  10. Study on the Influence of Building Materials on Indoor Pollutants and Pollution Sources

    NASA Astrophysics Data System (ADS)

    Wang, Yao

    2018-01-01

    The paper summarizes the achievements and open problems of indoor air quality research at home and abroad. The pollutants and pollution sources in the room are analyzed systematically, and the types of building materials and pollutants are also discussed. The physical and chemical properties and health effects of the main pollutants were analyzed and studied. According to the principle of mass balance, the basic mathematical model of indoor air quality is established. Considering the release rate of pollutants and indoor ventilation, a mathematical model for predicting the concentration of indoor air pollutants is derived. The model can be used to analyze and describe the variation of pollutant concentration in indoor air, and to predict and calculate the concentration of pollutants in indoor air at a given time. The results show that the mathematical model established in this study can be used to analyze and predict the variation of pollutant concentration in indoor air. The evaluation model can be used to assess indoor air quality impacts and current conditions. Especially in the process of building construction and interior decoration, pre-evaluation can provide reliable design parameters for selecting building materials and determining ventilation volume.
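
    In its simplest well-mixed form, the mass-balance model described here reduces to V dC/dt = E - QC; the sketch below evaluates its closed-form solution with illustrative room, ventilation, and emission values.

```python
# Well-mixed mass-balance box model: V dC/dt = E - Q*C.
# Closed-form solution: C(t) = E/Q + (C0 - E/Q) * exp(-Q*t/V).
# Parameter values are illustrative.
import numpy as np

def indoor_concentration(t, V, Q, E, C0=0.0):
    """C(t) [mg/m^3] for room volume V [m^3], ventilation Q [m^3/h],
    source emission rate E [mg/h], initial concentration C0 [mg/m^3]."""
    steady = E / Q
    return steady + (C0 - steady) * np.exp(-Q * t / V)

t = np.linspace(0, 24, 5)                        # hours
print(indoor_concentration(t, V=50.0, Q=25.0, E=5.0))
# concentrations approach the steady state E/Q = 0.2 mg/m^3
```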

  11. Improvement of Meteorological Inputs for TexAQS-II Air Quality Simulations

    NASA Astrophysics Data System (ADS)

    Ngan, F.; Byun, D.; Kim, H.; Cheng, F.; Kim, S.; Lee, D.

    2008-12-01

    An air quality forecasting system (UH-AQF) for Eastern Texas, which is in operation by the Institute for Multidimensional Air Quality Studies (IMAQS) at the University of Houston, uses the Fifth-Generation PSU/NCAR Mesoscale Model MM5 as the meteorological driver for modeling air quality with the Community Multiscale Air Quality (CMAQ) model. While the forecasting system was successfully used for the planning and implementation of various measurement activities, evaluations of the forecasting results revealed a few systematic problems in the numerical simulations. From comparison with observations, we observe at times over-prediction of northerly winds caused by inaccurate synoptic inputs, and at other times too strong southerly winds caused by local sea breeze development. Discrepancies in maximum and minimum temperature are also seen on certain days. Precipitation events, as well as clouds, are occasionally simulated at incorrect locations and times. The model sometimes simulates unrealistic thunderstorms, causing unrealistically strong outflows. To understand the physical and chemical processes influencing air quality measures, a proper description of real-world meteorological conditions is essential. The objective of this study is to generate better meteorological inputs than the AQF results to support the chemistry modeling. We utilized existing objective analysis and nudging tools in the MM5 system to develop the MUltiscale Nest-down Data Assimilation System (MUNDAS), which incorporates extensive meteorological observations available in the simulated domain for the retrospective simulation of the TexAQS-II period. With the re-simulated meteorological input, we are able to better predict ozone events during the TexAQS-II period. In addition, base datasets in MM5 such as land use/land cover, vegetation fraction, soil type and sea surface temperature are updated with satellite data to represent surface features more accurately. They are key physical input parameters affecting the transfer of heat, momentum and soil moisture in land-surface processes in MM5. Using these more accurate base input datasets, we see improved predictions of ground temperatures, winds and even thunderstorm activity within the boundary layer.

  12. CodingQuarry: highly accurate hidden Markov model gene prediction in fungal genomes using RNA-seq transcripts.

    PubMed

    Testa, Alison C; Hane, James K; Ellwood, Simon R; Oliver, Richard P

    2015-03-11

    The impact of gene annotation quality on functional and comparative genomics makes gene prediction an important process, particularly in non-model species, including many fungi. Sets of homologous protein sequences are rarely complete with respect to the fungal species of interest and are often small or unreliable, especially when closely related species have not been sequenced or annotated in detail. In these cases, protein homology-based evidence fails to correctly annotate many genes, or significantly improve ab initio predictions. Generalised hidden Markov models (GHMM) have proven to be invaluable tools in gene annotation and, recently, RNA-seq has emerged as a cost-effective means to significantly improve the quality of automated gene annotation. As these methods do not require sets of homologous proteins, improving gene prediction from these resources is of benefit to fungal researchers. While many pipelines now incorporate RNA-seq data in training GHMMs, there has been relatively little investigation into additionally combining RNA-seq data at the point of prediction, and room for improvement in this area motivates this study. CodingQuarry is a highly accurate, self-training GHMM fungal gene predictor designed to work with assembled, aligned RNA-seq transcripts. RNA-seq data informs annotations both during gene-model training and in prediction. Our approach capitalises on the high quality of fungal transcript assemblies by incorporating predictions made directly from transcript sequences. Correct predictions are made despite transcript assembly problems, including those caused by overlap between the transcripts of adjacent gene loci. Stringent benchmarking against high-confidence annotation subsets showed CodingQuarry predicted 91.3% of Schizosaccharomyces pombe genes and 90.4% of Saccharomyces cerevisiae genes perfectly. These results are 4-5% better than those of AUGUSTUS, the next best performing RNA-seq driven gene predictor tested. Comparisons against whole genome Sc. pombe and S. cerevisiae annotations further substantiate a 4-5% improvement in the number of correctly predicted genes. We demonstrate the success of a novel method of incorporating RNA-seq data into GHMM fungal gene prediction. This shows that a high quality annotation can be achieved without relying on protein homology or a training set of genes. CodingQuarry is freely available ( https://sourceforge.net/projects/codingquarry/ ), and suitable for incorporation into genome annotation pipelines.

  13. An approach to predict water quality in data-sparse catchments using hydrological catchment similarity

    NASA Astrophysics Data System (ADS)

    Pohle, Ina; Glendell, Miriam; Stutter, Marc I.; Helliwell, Rachel C.

    2017-04-01

    An understanding of catchment response to climate and land use change at a regional scale is necessary for the assessment of mitigation and adaptation options addressing diffuse nutrient pollution. It is well documented that the physicochemical properties of a river ecosystem respond to change in a non-linear fashion. This is particularly important when threshold water concentrations, relevant to national and EU legislation, are exceeded. Large scale (regional) model assessments required for regulatory purposes must represent the key processes and mechanisms that are more readily understood in catchments with water quantity and water quality data monitored at high spatial and temporal resolution. While daily discharge data are available for most catchments in Scotland, nitrate and phosphorus are mostly available on a monthly basis only, as typified by regulatory monitoring. However, high resolution (hourly to daily) water quantity and water quality data exist for a limited number of research catchments. To successfully implement adaptation measures across Scotland, an upscaling from data-rich to data-sparse catchments is required. In addition, the widespread availability of spatial datasets affecting hydrological and biogeochemical responses (e.g. soils, topography/geomorphology, land use, vegetation etc.) provides an opportunity to transfer predictions between data-rich and data-sparse areas by linking processes and responses to catchment attributes. Here, we develop a framework of catchment typologies as a prerequisite for transferring information from data-rich to data-sparse catchments by focusing on how hydrological catchment similarity can be used as an indicator of grouped behaviours in water quality response. As indicators of hydrological catchment similarity we use flow indices derived from observed discharge data across Scotland as well as hydrological model parameters. For the latter, we calibrated the lumped rainfall-runoff model TUWModel using multiple objective functions. The relationships between indicators of hydrological catchment similarity, physical catchment characteristics and nitrate and phosphorus concentrations in rivers are then investigated using multivariate statistics. This understanding of the relationship between catchment characteristics, hydrological processes and water quality will allow us to implement more efficient regulatory water quality monitoring strategies, to improve existing water quality models and to model mitigation and adaptation scenarios to global change in data-sparse catchments.

  14. Prediction of global and local model quality in CASP8 using the ModFOLD server.

    PubMed

    McGuffin, Liam J

    2009-01-01

    The development of effective methods for predicting the quality of three-dimensional (3D) models is fundamentally important for the success of tertiary structure (TS) prediction strategies. Since CASP7, the Quality Assessment (QA) category has existed to gauge the ability of various model quality assessment programs (MQAPs) at predicting the relative quality of individual 3D models. For the CASP8 experiment, automated predictions were submitted in the QA category using two methods from the ModFOLD server: ModFOLD version 1.1 and ModFOLDclust. ModFOLD version 1.1 is a single-model machine learning based method, which was used for automated predictions of global model quality (QMODE1). ModFOLDclust is a simple clustering based method, which was used for automated predictions of both global and local quality (QMODE2). In addition, manual predictions of model quality were made using ModFOLD version 2.0, an experimental method that combines the scores from ModFOLDclust and ModFOLD v1.1. Predictions from the ModFOLDclust method were the most successful of the three in terms of global model quality, whilst the ModFOLD v1.1 method was comparable in performance to other single-model based methods. In addition, the ModFOLDclust method performed well at predicting the per-residue, or local, model quality scores. Predictions of the per-residue errors in our own 3D models, selected using the ModFOLD v2.0 method, were also the most accurate compared with those from other methods. All of the MQAPs described are publicly accessible via the ModFOLD server at: http://www.reading.ac.uk/bioinf/ModFOLD/. The methods are also freely available to download from: http://www.reading.ac.uk/bioinf/downloads/. Copyright 2009 Wiley-Liss, Inc.

  15. A portable device for rapid nondestructive detection of fresh meat quality

    NASA Astrophysics Data System (ADS)

    Lin, Wan; Peng, Yankun

    2014-05-01

    Quality attributes of fresh meat influence nutritional value and consumers' purchasing decisions. In order to meet inspection departments' demand for a portable device, a rapid and nondestructive detection device for fresh meat quality based on an ARM (Advanced RISC Machines) processor and VIS/NIR technology was designed. The working principle, hardware composition, software system and functional tests are introduced. The hardware system consisted of an ARM processing unit, light source unit, detection probe unit, spectral data acquisition unit, LCD (Liquid Crystal Display) touch screen display unit, power unit and cooling unit. A Linux operating system and a quality-parameter acquisition and processing application were designed. The integrated system collects, stores, displays and processes spectral signals and weighs 3.5 kg. Forty pieces of beef were used in an experiment to validate its stability and reliability. The results indicated that the prediction model developed with PLSR, using SNV as the pre-processing method, performed well, with a correlation coefficient of 0.90 and root mean square error of 1.56 for the validation set for L*, 0.95 and 1.74 for a*, 0.94 and 0.59 for b*, 0.88 and 0.13 for pH, 0.79 and 12.46 for tenderness, and 0.89 and 0.91 for water content, respectively. The experimental results show that this device can be a useful tool for detecting meat quality.
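
    The SNV pre-processing plus PLSR calibration reported above can be sketched as follows; the spectra and L* values are synthetic stand-ins for the beef measurements.

```python
# SNV pre-processing followed by a PLSR calibration, as in the record above.
# Data are synthetic placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def snv(spectra):
    """Standard normal variate: center and scale each spectrum row-wise."""
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

rng = np.random.default_rng(4)
spectra = rng.normal(size=(40, 256))             # 40 samples x 256 bands
L_star = spectra[:, 100:120].mean(axis=1) * 10 + 45 + rng.normal(0, 1, 40)

pls = PLSRegression(n_components=5)
pls.fit(snv(spectra), L_star)
print(pls.predict(snv(spectra[:3])))             # predicted L* for 3 samples
```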

  16. Method and Apparatus for Evaluating the Visual Quality of Processed Digital Video Sequences

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B. (Inventor)

    2002-01-01

    A Digital Video Quality (DVQ) apparatus and method that incorporate a model of human visual sensitivity to predict the visibility of artifacts. The DVQ method and apparatus are used for the evaluation of the visual quality of processed digital video sequences and for adaptively controlling the bit rate of the processed digital video sequences without compromising the visual quality. The DVQ apparatus minimizes the required amount of memory and computation. The input to the DVQ apparatus is a pair of color image sequences: an original (R) non-compressed sequence, and a processed (T) sequence. Both sequences (R) and (T) are sampled, cropped, and subjected to color transformations. The sequences are then subjected to blocking and discrete cosine transformation, and the results are transformed to local contrast. The next step is a time filtering operation which implements the human sensitivity to different time frequencies. The results are converted to threshold units by dividing each discrete cosine transform coefficient by its respective visual threshold. At the next stage the two sequences are subtracted to produce an error sequence. The error sequence is subjected to a contrast masking operation, which also depends upon the reference sequence (R). The masked errors can be pooled in various ways to illustrate the perceptual error over various dimensions, and the pooled error can be converted to a visual quality measure.

  17. Peer Review Quality and Transparency of the Peer-Review Process in Open Access and Subscription Journals.

    PubMed

    Wicherts, Jelte M

    2016-01-01

    Recent controversies highlighting substandard peer review in Open Access (OA) and traditional (subscription) journals have increased the need for authors, funders, publishers, and institutions to assure quality of peer-review in academic journals. I propose that transparency of the peer-review process may be seen as an indicator of the quality of peer-review, and develop and validate a tool enabling different stakeholders to assess transparency of the peer-review process. Based on editorial guidelines and best practices, I developed a 14-item tool to rate transparency of the peer-review process on the basis of journals' websites. In Study 1, a random sample of 231 authors of papers in 92 subscription journals in different fields rated transparency of the journals that published their work. Authors' ratings of the transparency were positively associated with quality of the peer-review process but unrelated to journal's impact factors. In Study 2, 20 experts on OA publishing assessed the transparency of established (non-OA) journals, OA journals categorized as being published by potential predatory publishers, and journals from the Directory of Open Access Journals (DOAJ). Results show high reliability across items (α = .91) and sufficient reliability across raters. Ratings differentiated the three types of journals well. In Study 3, academic librarians rated a random sample of 140 DOAJ journals and another 54 journals that had received a hoax paper written by Bohannon to test peer-review quality. Journals with higher transparency ratings were less likely to accept the flawed paper and showed higher impact as measured by the h5 index from Google Scholar. The tool to assess transparency of the peer-review process at academic journals shows promising reliability and validity. The transparency of the peer-review process can be seen as an indicator of peer-review quality allowing the tool to be used to predict academic quality in new journals.

  19. Simulation of Triple Oxidation Ditch Wastewater Treatment Process

    NASA Astrophysics Data System (ADS)

    Yang, Yue; Zhang, Jinsong; Liu, Lixiang; Hu, Yongfeng; Xu, Ziming

    2010-11-01

    This paper presents the modeling mechanism and method for a sewage treatment system. A triple oxidation ditch process at a WWTP was simulated based on the activated sludge model ASM2D with the GPS-X software. In order to identify an adequate model structure to implement in the GPS-X environment, the oxidation ditch was divided into several completely stirred tank reactors depending on the distribution of aeration devices and the dissolved oxygen concentration. The removal efficiencies of COD, ammonia nitrogen, total nitrogen, total phosphorus and SS were simulated with influent quality data of this WWTP from June to August 2009, to investigate the differences between the simulated and actual results. The results showed that the simulated values reflected the actual condition of the triple oxidation ditch process well. The mathematical modeling method was appropriate for predicting effluent quality and optimizing the process.
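
    The tanks-in-series representation of the ditch can be sketched as a small ODE system, as below; the flow, volume, and first-order removal values are illustrative, not calibrated ASM2D kinetics.

```python
# Tanks-in-series sketch of an oxidation ditch: N completely stirred
# reactors with through-flow Q and a lumped first-order removal rate k.
# Values are illustrative, not the WWTP's calibrated model.
import numpy as np
from scipy.integrate import solve_ivp

Q, V, k, N = 100.0, 50.0, 0.3, 4      # m3/h, m3 per tank, 1/h, tanks
C_in = 200.0                          # influent COD, mg/L

def rhs(t, C):
    upstream = np.concatenate(([C_in], C[:-1]))   # each tank is fed by the previous
    return Q / V * (upstream - C) - k * C

sol = solve_ivp(rhs, (0.0, 24.0), np.zeros(N))
print(sol.y[-1, -1])   # effluent COD of the last tank after 24 h
```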

  20. Deep learning architecture for air quality predictions.

    PubMed

    Li, Xiang; Peng, Ling; Hu, Yuan; Shao, Jing; Chi, Tianhe

    2016-11-01

    With the rapid development of urbanization and industrialization, many developing countries are suffering from heavy air pollution. Governments and citizens have expressed increasing concern regarding air pollution because it affects human health and sustainable development worldwide. Current air quality prediction methods mainly use shallow models; however, these methods produce unsatisfactory results, which inspired us to investigate methods of predicting air quality based on deep architecture models. In this paper, a novel spatiotemporal deep learning (STDL)-based air quality prediction method that inherently considers spatial and temporal correlations is proposed. A stacked autoencoder (SAE) model is used to extract inherent air quality features, and it is trained in a greedy layer-wise manner. Compared with traditional time series prediction models, our model can predict the air quality of all stations simultaneously and shows temporal stability in all seasons. Moreover, a comparison with the spatiotemporal artificial neural network (STANN), autoregressive moving average (ARMA), and support vector regression (SVR) models demonstrates that the proposed method has superior performance in air quality prediction.
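
    A hedged sketch of greedy layer-wise autoencoder pretraining is given below; real SAE pipelines additionally fine-tune the whole stack end-to-end, and the data, dimensions, and regressor head here are illustrative, not the paper's configuration.

```python
# Greedy layer-wise stacked-autoencoder sketch: each layer is trained to
# reconstruct its input, then stacked features feed a regressor that
# predicts the next-hour pollutant level. Data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import Ridge

def train_ae_layer(X, n_hidden):
    ae = MLPRegressor(hidden_layer_sizes=(n_hidden,), activation="relu",
                      max_iter=3000, random_state=0)
    ae.fit(X, X)                                   # reconstruct the input
    W, b = ae.coefs_[0], ae.intercepts_[0]
    return np.maximum(X @ W + b, 0.0)              # encoded features (ReLU)

rng = np.random.default_rng(6)
X = rng.normal(size=(300, 24))    # e.g. 24 h of readings at correlated stations
y = X[:, -1] * 0.8 + rng.normal(0, 0.1, 300)       # next-hour target, synthetic

h1 = train_ae_layer(X, 16)        # layer 1 pretraining
h2 = train_ae_layer(h1, 8)        # layer 2 pretraining
reg = Ridge().fit(h2, y)          # supervised head on stacked features
print("fit R^2:", reg.score(h2, y))
```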

  1. Biogenic organic emissions, air quality and climate

    NASA Astrophysics Data System (ADS)

    Guenther, A. B.

    2015-12-01

    Living organisms produce copious amounts of a diverse array of metabolites, including many volatile organic compounds that are released into the atmosphere. These compounds participate in numerous chemical reactions that influence the atmospheric abundance of important air pollutants and short-lived climate forcers, including organic aerosol, ozone and methane. The production and release of these organics are strongly influenced by environmental conditions including air pollution, temperature, solar radiation, and water availability, and they are highly sensitive to stress and extreme events. As a result, releases of biogenic organics to the atmosphere have an impact on, and are sensitive to, air quality and climate, leading to potential feedback couplings. Their role in linking air quality and climate is conceptually clear, but an accurate quantitative representation is needed for predictive models. Progress towards this goal will be presented, including numerical model development and assessments of the predictive capability of the Model of Emissions of Gases and Aerosols from Nature (MEGAN). Recent studies of processes controlling the magnitude and variations in biogenic organic emissions will be described, and observations of their impact on atmospheric composition will be shown. Recent advances and priorities for future research will be discussed, including laboratory process studies, long-term measurements, multi-scale regional studies, global satellite observations, and the development of a next-generation model for simulating land-atmosphere chemical exchange.

  2. Performance prediction of optical image stabilizer using SVM for shaker-free production line

    NASA Astrophysics Data System (ADS)

    Kim, HyungKwan; Lee, JungHyun; Hyun, JinWook; Lim, Haekeun; Kim, GyuYeol; Moon, HyukSoo

    2016-04-01

    Recent smartphones adopt camera modules with an optical image stabilizer (OIS) to enhance imaging quality under hand-shake conditions. However, compared to a non-OIS camera module, the cost of implementing the OIS module is still high. One reason is that the production line for the OIS camera module requires a highly precise shaker table in the final test process, which increases the unit cost of production. In this paper, we propose a framework for OIS quality prediction that is trained with a support vector machine on the following module-characterizing features: the noise spectral density of the gyroscope, and the optically measured linearity and cross-axis movement of the Hall sensor and actuator. The classifier was tested on an actual production line and achieved a recall rate of 88%.
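
    A minimal sketch of such a classifier with scikit-learn; the four synthetic feature columns stand in for the gyroscope and Hall/actuator measurements named above, and the labeling rule is an illustrative assumption:

        import numpy as np
        from sklearn.metrics import recall_score
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)

        # Columns stand in for: gyro noise spectral density, Hall linearity,
        # actuator linearity, cross-axis movement (synthetic, illustrative).
        X = rng.normal(size=(500, 4))
        y = (X @ np.array([0.9, -0.7, 0.5, -0.4])
             + 0.3 * rng.normal(size=500) > 0).astype(int)   # 1 = module passes

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        clf.fit(X_tr, y_tr)
        print("recall on held-out modules: %.2f" % recall_score(y_te, clf.predict(X_te)))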

  3. Two personalities, one relationship: both partners' personality traits shape the quality of their relationship.

    PubMed

    Robins, R W; Caspi, A; Moffitt, T E

    2000-08-01

    This research tested 6 models of the independent and interactive effects of stable personality traits on each partner's reports of relationship satisfaction and quality. Both members of 360 couples (N = 720) completed the Multidimensional Personality Questionnaire and were interviewed about their relationship. Findings show that a woman's relationship happiness is predicted by her partner's low Negative Emotionality, high Positive Emotionality, and high Constraint, whereas a man's relationship happiness is predicted only by his partner's low Negative Emotionality. Findings also show evidence of additive but not interactive effects: Each partner's personality contributed independently to relationship outcomes but not in a synergistic way. These results are discussed in relation to models that seek to integrate research on individual differences in personality traits with research on interpersonal processes in intimate relationships.

  4. Groundwater pollution by nitrates from livestock wastes.

    PubMed Central

    Goldberg, V M

    1989-01-01

    Utilization of wastes from livestock complexes for irrigation involves the danger of groundwater pollution by nitrates. In order to prevent and minimize pollution, it is necessary to apply geological-hydrogeological evidence and concepts to the situation of wastewater irrigation for the purposes of studying natural groundwater protectiveness and predicting changes in groundwater quality as a result of infiltrating wastes. The procedure of protectiveness evaluation and quality prediction is described. With groundwater pollution by nitrate nitrogen, the concentration of ammonium nitrogen noticeably increases. One of the reasons for this change is the process of denitrification due to changes in the hydrogeochemical conditions in a layer. At representative field sites, it is necessary to collect systematic stationary observations of the concentrations of nitrogenous compounds in groundwater and changes in redox conditions and temperature. PMID:2620669

  5. Challenges and opportunities to improve understanding on wetland ecosystem and function at the local catchment scale: data fusion, data-model integration, and prediction uncertainty.

    NASA Astrophysics Data System (ADS)

    Yeo, I. Y.

    2016-12-01

    Wetlands are valuable landscape features that provide important ecosystem functions and services. The ecosystem processes in wetlands are highly dependent on hydrology. However, the hydroperiod (i.e., the dynamics of inundation extent) is highly variable spatially and temporally, and extremely difficult to predict owing to the complexity of hydrological processes within wetlands and their interaction with surrounding areas. This study reports the challenges and progress in assessing the catchment-scale benefits of wetlands in regulating the hydrological regime and improving water quality in an agricultural watershed. A process-based watershed model, the Soil and Water Assessment Tool (SWAT), was improved to simulate the cumulative downstream impacts of wetlands. Newly developed remote sensing products from LiDAR intensity and time series Landsat records, which show the inter-annual changes in fractional inundation, were utilized to describe the changing status of inundated areas within forested wetlands, develop spatially varying wetland parameters, and evaluate the predicted inundated areas at the landscape level. We outline the challenges of developing time series inundation mapping products at high spatial and temporal resolution and of reconciling the catchment-scale model with moderate-resolution remote sensing products. We then highlight the importance of integrating spatialized information into model calibration and evaluation to address the issues of equifinality and prediction uncertainty. This integrated approach was applied to the upper region of the Choptank River Watershed, an agricultural watershed in the Coastal Plain of the Chesapeake Bay Watershed (US). In the Mid-Atlantic US, the provision of pollution-regulation services by wetlands has been emphasized due to declining water quality within the Chesapeake Bay and its watersheds, and the preservation and restoration of wetlands has become the top priority for managing nonpoint source water pollution.

  6. Spatiotemporal dynamics of landscape pattern and hydrologic process in watershed systems

    NASA Astrophysics Data System (ADS)

    Randhir, Timothy O.; Tsvetkova, Olga

    2011-06-01

    Land use change is influenced by spatial and temporal factors that interact with watershed resources. Modeling these changes is critical to evaluate emerging land use patterns and to predict variation in water quantity and quality. The objective of this study is to model the nature and emergence of spatial patterns in land use and water resource impacts using a spatially explicit and dynamic landscape simulation. Temporal changes are predicted using a probabilistic Markovian process and spatial interaction through cellular automata. The Markov chain Monte Carlo (MCMC) analysis with cellular automata is linked to hydrologic equations to simulate landscape patterns and processes. The spatiotemporal watershed dynamics (SWD) model is applied to a subwatershed of the Blackstone River watershed of Massachusetts to predict potential land use changes and expected runoff and sediment loading. Changes in watershed land use and water resources are evaluated over 100 years at a yearly time step. Results show a high potential for rapid urbanization that could result in lowered groundwater recharge and increased storm water peaks. The watershed faces potential decreases in agricultural and forest area that affect the open space and pervious cover of the watershed system. Water quality deteriorated due to increased runoff, which can also impact stream morphology. While overland erosion decreased, instream erosion increased owing to increased runoff from urban areas. Use of urban best management practices (BMPs) in sensitive locations, preventive strategies, and long-term conservation planning will be useful in sustaining the watershed system.
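
    The coupling of a Markov transition model with a cellular-automaton neighborhood effect can be sketched as follows; the transition matrix, coupling strength, grid size and class set are illustrative assumptions, not the SWD model's calibrated values:

        import numpy as np

        rng = np.random.default_rng(0)
        URBAN, AGRI, FOREST = 0, 1, 2

        # Annual land-use transition probabilities (rows: from, cols: to).
        P = np.array([[0.98, 0.01, 0.01],
                      [0.06, 0.90, 0.04],
                      [0.05, 0.03, 0.92]])

        grid = rng.choice(3, size=(30, 30), p=[0.2, 0.4, 0.4])

        def step(grid):
            """One yearly step: Markov transitions biased by urban neighbors."""
            new = np.empty_like(grid)
            for i in range(grid.shape[0]):
                for j in range(grid.shape[1]):
                    p = P[grid[i, j]].copy()
                    nb = grid[max(0, i - 1):i + 2, max(0, j - 1):j + 2]
                    p[URBAN] += 0.02 * np.sum(nb == URBAN)   # CA coupling
                    new[i, j] = rng.choice(3, p=p / p.sum())
            return new

        for year in range(25):
            grid = step(grid)
        print("urban fraction after 25 years: %.2f" % np.mean(grid == URBAN))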

  7. A dyadic analysis of stress processes in Latinas with breast cancer and their family caregivers.

    PubMed

    Segrin, Chris; Badger, Terry A; Sikorskii, Alla; Crane, Tracy E; Pace, Thaddeus W W

    2018-03-01

    Breast cancer diagnosis and treatment negatively affect quality of life for survivors and their family caregivers. The stress process model has been useful for describing the cascade of social and psychological experiences that culminate in degraded quality of life for both survivors and their family caregivers. This study is designed to test theoretically specified predictors of negative psychosocial outcomes in a dyadic context. Participants were 230 dyads composed of Latinas recently diagnosed with breast cancer and their primary family caregiver, who completed measures of socioeconomic status, stress, family conflict, depression, and anxiety. Data were analyzed following the Actor-Partner Interdependence Mediation Model in structural equation modeling. For both survivors and caregivers, there were significant direct and indirect actor effects (through family conflict) of perceived stress on depression and anxiety. Several indirect partner effects were also evident in this sample. Specifically, caregivers' stress was predictive of survivors' depression and anxiety through survivors' increased perceptions of family conflict. As predicted by the stress process model, stress and family conflict were predictive of psychological distress in breast cancer survivors and their family caregivers. Significant partner effects in the Actor-Partner Interdependence Mediation Model suggest that there are some dyadic influences, particularly from caregivers' stress to survivors' perceptions of exacerbated family conflict. These findings show how strained family relationships can undermine the well-being of cancer survivors and their family caregivers through this challenging experience. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Risk Factors Analysis and Death Prediction in Some Life-Threatening Ailments Using Chi-Square Case-Based Reasoning (χ2 CBR) Model.

    PubMed

    Adeniyi, D A; Wei, Z; Yang, Y

    2018-01-30

    A wealth of data is available within the health care system; however, effective analysis tools for exploring the hidden patterns in these datasets are lacking. To alleviate this limitation, this paper proposes a simple but promising hybrid predictive model that suitably combines the Chi-square distance measurement with the case-based reasoning technique. The study presents the realization of an automated risk calculator and death prediction in some life-threatening ailments using the Chi-square case-based reasoning (χ2 CBR) model. The proposed predictive engine is capable of reducing runtime and speeding up the execution process through the use of a critical χ2 distribution value. This work also showcases the development of a novel feature selection method referred to as the frequent item-based rule (FIBR) method. This FIBR method is used for selecting the best features for the proposed χ2 CBR model at the preprocessing stage of the predictive procedure. The implementation of the proposed risk calculator is achieved through an in-house developed PHP program running on a XAMPP/Apache HTTP server as the hosting server. The process of data acquisition and case-base development is implemented using the MySQL application. Performance comparison between our system and the NBY, ED-KNN, ANN, SVM, Random Forest and traditional CBR techniques shows that the quality of predictions produced by our system outperforms the baseline methods studied. The results of our experiments show that the precision rate and predictive quality of our system are in most cases equal to or greater than 70%. Our results also show that the proposed system executes faster than the baseline methods studied. Therefore, the proposed risk calculator is capable of providing useful, consistent, fast, accurate and efficient risk-level prediction to both patients and physicians at any time, online and on a real-time basis.
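
    The retrieve-and-reuse core of chi-square case-based reasoning can be sketched briefly; the feature vectors and labels below are synthetic, and the authors' FIBR feature selection and critical-value speedup are omitted:

        import numpy as np

        def chi_square_distance(x, y, eps=1e-12):
            """Chi-square distance between two non-negative feature vectors."""
            return 0.5 * np.sum((x - y) ** 2 / (x + y + eps))

        def cbr_predict(case_base, outcomes, query, k=5):
            """Retrieve the k nearest stored cases by chi-square distance and
            reuse their majority outcome (the CBR retrieve/reuse steps)."""
            d = np.array([chi_square_distance(c, query) for c in case_base])
            nearest = np.argsort(d)[:k]
            return np.bincount(outcomes[nearest]).argmax()

        rng = np.random.default_rng(0)
        cases = rng.random((200, 6))        # stored patient feature vectors
        labels = rng.integers(0, 2, 200)    # 0 = low risk, 1 = high risk
        print("predicted risk class:", cbr_predict(cases, labels, rng.random(6)))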

  9. Evolution of Quality Assurance for Clinical Immunohistochemistry in the Era of Precision Medicine: Part 4: Tissue Tools for Quality Assurance in Immunohistochemistry.

    PubMed

    Cheung, Carol C; D'Arrigo, Corrado; Dietel, Manfred; Francis, Glenn D; Fulton, Regan; Gilks, C Blake; Hall, Jacqueline A; Hornick, Jason L; Ibrahim, Merdol; Marchetti, Antonio; Miller, Keith; van Krieken, J Han; Nielsen, Soren; Swanson, Paul E; Taylor, Clive R; Vyberg, Mogens; Zhou, Xiaoge; Torlakovic, Emina E

    2017-04-01

    The numbers of diagnostic, prognostic, and predictive immunohistochemistry (IHC) tests are increasing; the implementation and validation of new IHC tests, revalidation of existing tests, as well as the on-going need for daily quality assurance monitoring present significant challenges to clinical laboratories. There is a need for proper quality tools, specifically tissue tools that will enable laboratories to successfully carry out these processes. This paper clarifies, through the lens of laboratory tissue tools, how validation, verification, and revalidation of IHC tests can be performed in order to develop and maintain high quality "fit-for-purpose" IHC testing in the era of precision medicine. This is the final part of the 4-part series "Evolution of Quality Assurance for Clinical Immunohistochemistry in the Era of Precision Medicine."

  10. Using technology to enhance the quality of home health care: three case studies of health information technology initiatives at the visiting nurse service of New York.

    PubMed

    Russell, David; Rosenfeld, Peri; Ames, Sylvia; Rosati, Robert J

    2010-01-01

    There is a growing recognition among health services researchers and policy makers that Health Information Technology (HIT) has the potential to address challenging issues that face patients and providers of healthcare. The Visiting Nurse Service of New York (VNSNY), a large not-for-profit home healthcare agency, has integrated technology applications into the service delivery model of several programs. Case studies of three informatics initiatives at VNSNY, covering their development and implementation, are presented: (1) Quality Scorecards that utilize process, outcomes, cost, and satisfaction measures to assess performance among clinical staff and programs; (2) a tool to identify patients at risk of being hospitalized; and (3) a predictive model that identifies patients who are eligible for physical rehabilitation services. Following a description of these initiatives, we discuss their impact on quality and process indicators, as well as the opportunities and challenges to implementation. © 2010 National Association for Healthcare Quality.

  11. Measuring up: Implementing a dental quality measure in the electronic health record context.

    PubMed

    Bhardwaj, Aarti; Ramoni, Rachel; Kalenderian, Elsbeth; Neumann, Ana; Hebballi, Nutan B; White, Joel M; McClellan, Lyle; Walji, Muhammad F

    2016-01-01

    Quality improvement requires using quality measures that can be implemented in a valid manner. Using guidelines set forth by the Meaningful Use portion of the Health Information Technology for Economic and Clinical Health Act, the authors assessed the feasibility and performance of an automated electronic Meaningful Use dental clinical quality measure to determine the percentage of children who received fluoride varnish. The authors defined how to implement the automated measure queries in a dental electronic health record. Within records identified through automated query, the authors manually reviewed a subsample to assess the performance of the query. The automated query results revealed that 71.0% of patients had fluoride varnish compared with the manual chart review results that indicated 77.6% of patients had fluoride varnish. The automated quality measure performance results indicated 90.5% sensitivity, 90.8% specificity, 96.9% positive predictive value, and 75.2% negative predictive value. The authors' findings support the feasibility of using automated dental quality measure queries in the context of sufficient structured data. Information noted only in free text rather than in structured data would require using natural language processing approaches to effectively query electronic health records. To participate in self-directed quality improvement, dental clinicians must embrace the accountability era. Commitment to quality will require enhanced documentation to support near-term automated calculation of quality measures. Copyright © 2016 American Dental Association. Published by Elsevier Inc. All rights reserved.
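
    The four agreement statistics reported above follow directly from a confusion matrix of the automated query against manual chart review. A minimal sketch; the counts are back-calculated illustrative values chosen to roughly reproduce the reported rates, not the study's actual cell counts:

        def query_performance(tp, fp, fn, tn):
            """Agreement of an automated query with manual chart review
            (the reference standard), from the four confusion-matrix cells."""
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv": tp / (tp + fp),
                "npv": tn / (tn + fn),
            }

        # Illustrative counts only, back-calculated to roughly match the
        # rates reported above (90.5 / 90.8 / 96.9 / 75.2%).
        for name, value in query_performance(tp=460, fp=15, fn=48, tn=146).items():
            print("%s: %.1f%%" % (name, 100 * value))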

  12. Comparison of laboratory and field remote sensing methods to measure forage quality.

    PubMed

    Guo, Xulin; Wilmshurst, John F; Li, Zhaoqin

    2010-09-01

    Recent research in range ecology has emphasized the importance of forage quality as a key indicator of rangeland condition. However, we lack tools to evaluate forage quality at scales appropriate for management. Canopy reflectance data have been used to measure forage quality at the laboratory and field levels separately, but little work has been done to evaluate these methods simultaneously. The objective of this study is to find a reliable way of assessing grassland quality by measuring forage chemistry with reflectance. We studied a mixed grass ecosystem in Grasslands National Park of Canada and surrounding pastures, located in southern Saskatchewan. Spectral reflectance was collected both in situ at the field level and in the laboratory. Vegetation samples were collected at each site, sorted into the green grass portion, and then sent to a chemical company for measurement of forage quality variables, including protein, lignin, ash, moisture at 135 °C, Neutral Detergent Fiber (NDF), Acid Detergent Fiber (ADF), Total Digestible Nutrients, Digestible Energy, Net Energy for Lactation, Net Energy for Maintenance, and Net Energy for Gain. Reflectance data were processed with a first-derivative transformation and the continuum removal method. Correlation analysis was conducted on the spectral and forage quality variables. A regression model was further built to investigate the possibility of using canopy spectral measurements to predict grassland quality. Results indicated that field-level prediction of the protein content of mixed grass species was possible (r² = 0.63). However, the relationship between canopy reflectance and the other forage quality variables was not strong.
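
    Continuum removal, one of the two spectral pre-treatments used here, divides a spectrum by its upper convex hull so that absorption-feature depths become comparable across samples. A minimal sketch on a synthetic spectrum (the hull construction is Andrew's monotone chain; the band position and shape are illustrative):

        import numpy as np

        def continuum_removal(wavelengths, reflectance):
            """Divide a spectrum by its upper convex hull (the 'continuum'),
            which emphasizes the absorption features tied to forage chemistry."""
            pts = list(zip(wavelengths, reflectance))
            hull = [pts[0]]
            for p in pts[1:]:
                # pop left turns so only the upper hull remains
                while len(hull) >= 2:
                    (x1, y1), (x2, y2) = hull[-2], hull[-1]
                    if (x2 - x1) * (p[1] - y1) - (y2 - y1) * (p[0] - x1) >= 0:
                        hull.pop()
                    else:
                        break
                hull.append(p)
            hx, hy = zip(*hull)
            return reflectance / np.interp(wavelengths, hx, hy)

        wl = np.linspace(400, 2400, 201)                       # nm
        spectrum = (0.3 + 0.0002 * (wl - 400)                  # sloping baseline
                    - 0.15 * np.exp(-((wl - 1680) / 60) ** 2)) # protein-like feature
        cr = continuum_removal(wl, spectrum)
        print("band depth near 1680 nm: %.3f" % (1 - cr.min()))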

  13. Applying Knowledge Discovery in Databases in Public Health Data Set: Challenges and Concerns

    PubMed Central

    Volrathongchia, Kanittha

    2003-01-01

    In attempting to apply Knowledge Discovery in Databases (KDD) to generate a predictive model from a health care dataset that is currently available to the public, the first step is to pre-process the data to overcome the challenges of missing data, redundant observations, and records containing inaccurate data. This study will demonstrate how to use simple pre-processing methods to improve the quality of input data. PMID:14728545
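
    A minimal sketch of the three pre-processing steps named above (duplicates, inaccurate records, missing values) with pandas; the toy table and the imputation choices are illustrative assumptions:

        import numpy as np
        import pandas as pd

        # Toy public-health extract with the three problems named above:
        # a duplicated record, missing values, and an implausible entry.
        df = pd.DataFrame({
            "age":    [34, 34, np.nan, 29, 290],
            "bmi":    [22.1, 22.1, 27.4, np.nan, 24.0],
            "county": ["A", "A", "B", "B", "C"],
        })

        df = df.drop_duplicates()                                 # redundant observations
        df = df[df["age"].between(0, 120) | df["age"].isna()]     # inaccurate records
        df["age"] = df["age"].fillna(df["age"].median())          # simple imputation
        df["bmi"] = df["bmi"].fillna(df.groupby("county")["bmi"].transform("median"))
        print(df)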

  14. Laser welding of polymers: phenomenological model for a quick and reliable process quality estimation considering beam shape influences

    NASA Astrophysics Data System (ADS)

    Timpe, Nathalie F.; Stuch, Julia; Scholl, Marcus; Russek, Ulrich A.

    2016-03-01

    This contribution presents a phenomenological, analytical model for laser welding of polymers which is suited for quick process quality estimation by the practitioner. Besides the material properties of the polymer and processing parameters such as welding pressure, feed rate and laser power, the model is based on a simple few-parameter description of the size and shape of the laser power density distribution (PDD) in the processing zone. The model allows an estimation of the weld seam tensile strength. It is based on energy balance considerations within a thin sheet, with the thickness of the optical penetration depth, on the surface of the absorbing welding partner. The joining process itself is modelled by a phenomenological approach. The model correctly reproduces the experimentally known process windows for the main process parameters. Using the parameters describing the shape of the laser PDD, the critical dependence of the process windows on the PDD shape is predicted and compared with experiments. The adaption of the model to other laser manufacturing processes in which the PDD influence can be modelled comparably is discussed.

  15. Quality of Education Predicts Performance on the Wide Range Achievement Test-4th Edition Word Reading Subtest

    PubMed Central

    Sayegh, Philip; Arentoft, Alyssa; Thaler, Nicholas S.; Dean, Andy C.; Thames, April D.

    2014-01-01

    The current study examined whether self-rated education quality predicts Wide Range Achievement Test-4th Edition (WRAT-4) Word Reading subtest and neurocognitive performance, and aimed to establish this subtest's construct validity as an educational quality measure. In a community-based adult sample (N = 106), we tested whether education quality both increased the prediction of Word Reading scores beyond demographic variables and predicted global neurocognitive functioning after adjusting for WRAT-4. As expected, race/ethnicity and education predicted WRAT-4 reading performance. Hierarchical regression revealed that when including education quality, the amount of WRAT-4's explained variance increased significantly, with race/ethnicity and both education quality and years as significant predictors. Finally, WRAT-4 scores, but not education quality, predicted neurocognitive performance. Results support WRAT-4 Word Reading as a valid proxy measure for education quality and a key predictor of neurocognitive performance. Future research should examine these findings in larger, more diverse samples to determine their robust nature. PMID:25404004

  16. Fluorescence Spectroscopy and Chemometric Modeling for Bioprocess Monitoring

    PubMed Central

    Faassen, Saskia M.; Hitzmann, Bernd

    2015-01-01

    On-line sensors for the detection of crucial process parameters are desirable for the monitoring, control and automation of processes in the biotechnology, food and pharma industries. Fluorescence spectroscopy is a highly developed, non-invasive technique that enables on-line measurement of substrate and product concentrations and the identification of characteristic process states. During a cultivation process, significant changes occur in the fluorescence spectra. By means of chemometric modeling, prediction models can be calculated and applied for process supervision and control to increase the quality and productivity of bioprocesses. A range of applications for different microorganisms and analytes has been proposed in recent years. This contribution provides an overview of different analysis methods for the measured fluorescence spectra and of the model-building chemometric methods used for various microbial cultivations. Most of these processes are observed using the BioView® sensor, thanks to its robustness and insensitivity to adverse process conditions. Beyond that, the partial least squares (PLS) method is the most frequently used chemometric method for the calculation of process models and the prediction of process variables. PMID:25942644
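
    A minimal sketch of the chemometric step: PLS regression from spectra to an analyte concentration, evaluated by cross-validation. The synthetic "fluorescence" data and the number of latent variables are illustrative assumptions, not a BioView® dataset:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(0)

        # Synthetic spectra: 120 samples x 150 channels; the analyte
        # concentration drives one broad emission band plus noise.
        conc = rng.uniform(0, 10, 120)                        # e.g. glucose, g/L
        band = np.exp(-((np.arange(150) - 70) / 25.0) ** 2)
        spectra = np.outer(conc, band) + 0.05 * rng.normal(size=(120, 150))

        pls = PLSRegression(n_components=3)                   # latent variables
        pred = cross_val_predict(pls, spectra, conc, cv=10).ravel()
        print("cross-validated RMSEP: %.3f g/L" % np.sqrt(np.mean((pred - conc) ** 2)))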

  17. Using prediction markets to forecast research evaluations.

    PubMed

    Munafo, Marcus R; Pfeiffer, Thomas; Altmejd, Adam; Heikensten, Emma; Almenberg, Johan; Bird, Alexander; Chen, Yiling; Wilson, Brad; Johannesson, Magnus; Dreber, Anna

    2015-10-01

    The 2014 Research Excellence Framework (REF2014) was conducted to assess the quality of research carried out at higher education institutions in the UK over a 6 year period. However, the process was criticized for being expensive and bureaucratic, and it was argued that similar information could be obtained more simply from various existing metrics. We were interested in whether a prediction market on the outcome of REF2014 for 33 chemistry departments in the UK would provide information similar to that obtained during the REF2014 process. Prediction markets have become increasingly popular as a means of capturing what is colloquially known as the 'wisdom of crowds', and enable individuals to trade 'bets' on whether a specific outcome will occur or not. These have been shown to be successful at predicting various outcomes in a number of domains (e.g. sport, entertainment and politics), but have rarely been tested against outcomes based on expert judgements such as those that formed the basis of REF2014.

  18. Using prediction markets to forecast research evaluations

    PubMed Central

    Munafo, Marcus R.; Pfeiffer, Thomas; Altmejd, Adam; Heikensten, Emma; Almenberg, Johan; Bird, Alexander; Chen, Yiling; Wilson, Brad; Johannesson, Magnus; Dreber, Anna

    2015-01-01

    The 2014 Research Excellence Framework (REF2014) was conducted to assess the quality of research carried out at higher education institutions in the UK over a 6 year period. However, the process was criticized for being expensive and bureaucratic, and it was argued that similar information could be obtained more simply from various existing metrics. We were interested in whether a prediction market on the outcome of REF2014 for 33 chemistry departments in the UK would provide information similar to that obtained during the REF2014 process. Prediction markets have become increasingly popular as a means of capturing what is colloquially known as the ‘wisdom of crowds’, and enable individuals to trade ‘bets’ on whether a specific outcome will occur or not. These have been shown to be successful at predicting various outcomes in a number of domains (e.g. sport, entertainment and politics), but have rarely been tested against outcomes based on expert judgements such as those that formed the basis of REF2014. PMID:26587243

  19. Maps showing predicted probabilities for selected dissolved oxygen and dissolved manganese threshold events in depth zones used by the domestic and public drinking water supply wells, Central Valley, California

    USGS Publications Warehouse

    Rosecrans, Celia Z.; Nolan, Bernard T.; Gronberg, JoAnn M.

    2018-01-31

    The prediction grids for selected redox constituents—dissolved oxygen and dissolved manganese—are intended to provide an understanding of groundwater-quality conditions at domestic and public-supply drinking water depths. The chemical quality of groundwater and the fate of many contaminants are influenced by redox processes in all aquifers, and understanding the redox conditions horizontally and vertically is critical in evaluating groundwater quality. The redox condition of groundwater—whether oxic (oxygen present) or anoxic (oxygen absent)—strongly influences the oxidation state of a chemical in groundwater. The anoxic dissolved oxygen thresholds of <0.5 milligram per liter (mg/L), <1.0 mg/L, and <2.0 mg/L were selected to apply broadly to regional groundwater-quality investigations. Although the presence of dissolved manganese in groundwater indicates strongly reducing (anoxic) conditions, it is also considered a “nuisance” constituent in drinking water, making drinking water undesirable with respect to taste, staining, or scaling. Three dissolved manganese thresholds, <50 micrograms per liter (µg/L), <150 µg/L, and <300 µg/L, were selected to create predicted probabilities of exceedances in depth zones used by domestic and public-supply water wells. The 50 µg/L event threshold represents the secondary maximum contaminant level (SMCL) benchmark for manganese (U.S. Environmental Protection Agency, 2017; California Division of Drinking Water, 2014), whereas the 300 µg/L event threshold represents the U.S. Geological Survey (USGS) health-based screening level (HBSL) benchmark, used to put measured concentrations of drinking-water contaminants into a human-health context (Toccalino and others, 2014). The 150 µg/L event threshold represents one-half the USGS HBSL. The resulting dissolved oxygen and dissolved manganese prediction grids may be of interest to water-resource managers, water-quality researchers, and groundwater modelers concerned with the occurrence of natural and anthropogenic contaminants related to anoxic conditions. Prediction grids for the selected redox constituents and thresholds were created by the USGS National Water-Quality Assessment (NAWQA) modeling and mapping team.

  20. Longitudinal associations between sleep and anxiety during pregnancy, and the moderating effect of resilience, using parallel process latent growth curve models.

    PubMed

    van der Zwan, Judith Esi; de Vente, Wieke; Tolvanen, Mimmi; Karlsson, Hasse; Buil, J Marieke; Koot, Hans M; Paavonen, E Juulia; Polo-Kantola, Päivi; Huizink, Anja C; Karlsson, Linnea

    2017-12-01

    For many women, pregnancy-related sleep disturbances and pregnancy-related anxiety change as pregnancy progresses and both are associated with lower maternal quality of life and less favorable birth outcomes. Thus, the interplay between these two problems across pregnancy is of interest. In addition, psychological resilience may explain individual differences in this association, as it may promote coping with both sleep disturbances and anxiety, and thereby reduce their mutual effects. Therefore, the aim of the current study was to examine whether sleep quality and sleep duration, and changes in sleep are associated with the level of and changes in anxiety during pregnancy. Furthermore, the study tested the moderating effect of resilience on these associations. At gestational weeks 14, 24, and 34, 532 pregnant women from the FinnBrain Birth Cohort Study in Finland filled out questionnaires on general sleep quality, sleep duration and pregnancy-related anxiety; resilience was assessed in week 14. Parallel process latent growth curve models showed that shorter initial sleep duration predicted a higher initial level of anxiety, and a higher initial anxiety level predicted a faster shortening of sleep duration. Changes in sleep duration and changes in anxiety over the course of pregnancy were not related. The predicted moderating effect of resilience was not found. The results suggested that pregnant women reporting anxiety problems should also be screened for sleeping problems, and vice versa, because women who experienced one of these pregnancy-related problems were also at risk of experiencing or developing the other problem. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Prediction Interval Development for Wind-Tunnel Balance Check-Loading

    NASA Technical Reports Server (NTRS)

    Landman, Drew; Toro, Kenneth G.; Commo, Sean A.; Lynn, Keith C.

    2014-01-01

    Results from the Facility Analysis Verification and Operational Reliability project revealed a critical gap in capability in ground-based aeronautics research applications. Without a standardized process for check-loading the wind-tunnel balance or the model system, the quality of the aerodynamic force data collected varied significantly between facilities. A prediction interval is required in order to confirm a check-loading. The prediction interval provides expected upper and lower bounds on the balance load prediction at a given confidence level. A method has been developed which accounts for sources of variability due to calibration and check-load application. The prediction interval method of calculation and a case study demonstrating its use are provided. Validation of the method is demonstrated for the case study based on the probability of capture of confirmation points.
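
    The textbook two-sided prediction interval for a new observation in an ordinary least squares calibration is ŷ0 ± t(α/2, n−p) · s · √(1 + x0ᵀ(XᵀX)⁻¹x0). A minimal sketch of that calculation (the paper's method additionally accounts for check-load application variability, which is omitted here):

        import numpy as np
        from scipy import stats

        def prediction_interval(X, y, x0, alpha=0.05):
            """Two-sided OLS prediction interval at a new point x0."""
            n, p = X.shape
            beta = np.linalg.solve(X.T @ X, X.T @ y)
            s2 = np.sum((y - X @ beta) ** 2) / (n - p)        # residual variance
            half = (stats.t.ppf(1 - alpha / 2, n - p)
                    * np.sqrt(s2 * (1 + x0 @ np.linalg.solve(X.T @ X, x0))))
            return x0 @ beta - half, x0 @ beta + half

        rng = np.random.default_rng(0)
        load = rng.uniform(0, 100, 30)                        # applied calibration loads
        X = np.column_stack([np.ones(30), load])              # intercept + linear term
        y = 2.0 + 0.5 * load + rng.normal(0, 0.8, 30)         # balance response
        lo, hi = prediction_interval(X, y, np.array([1.0, 55.0]))
        print("95%% prediction interval at load 55: [%.2f, %.2f]" % (lo, hi))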

  2. Lexical quality and executive control predict children's first and second language reading comprehension.

    PubMed

    Raudszus, Henriette; Segers, Eliane; Verhoeven, Ludo

    2018-01-01

    This study compared how lexical quality (vocabulary and decoding) and executive control (working memory and inhibition) predict reading comprehension directly as well as indirectly, via syntactic integration, in monolingual and bilingual fourth grade children. The participants were 76 monolingual and 102 bilingual children (mean age 10 years, SD = 5 months) learning to read Dutch in the Netherlands. Bilingual children showed lower Dutch vocabulary, syntactic integration and reading comprehension skills, but better decoding skills than their monolingual peers. There were no differences in working memory or inhibition. Multigroup path analysis showed relatively invariant connections between predictors and reading comprehension for monolingual and bilingual readers. For both groups, there was a direct effect of lexical quality on reading comprehension. In addition, lexical quality and executive control indirectly influenced reading comprehension via syntactic integration. The groups differed in that inhibition more strongly predicted syntactic integration for bilingual than for monolingual children. For a subgroup of bilingual children, for whom home language vocabulary data were available (n = 56), there was an additional positive effect of home language vocabulary on second language reading comprehension. Together, the results suggest that similar processes underlie reading comprehension in first and second language readers, but that syntactic integration requires more executive control in second language reading. Moreover, bilingual readers additionally benefit from first language vocabulary to arrive at second language reading comprehension.

  3. Stress generation in a developmental context: the role of youth depressive symptoms, maternal depression, the parent-child relationship, and family stress.

    PubMed

    Chan, Priscilla T; Doan, Stacey N; Tompson, Martha C

    2014-02-01

    The present study examined stress generation in a developmental and family context among 171 mothers and their preadolescent children, ages 8-12 years, at baseline (Time 1) and 1-year follow-up (Time 2). In the current study, we examined the bidirectional relationship between children's depressive symptoms and dependent family stress. Results suggest that children's baseline level of depressive symptoms predicted the generation of dependent family stress 1 year later. However, baseline dependent family stress did not predict an increase in children's depressive symptoms 1 year later. In addition, we examined whether a larger context of both child chronic strain (indicated by academic, behavioral, and peer stress) and family factors, including socioeconomic status and parent-child relationship quality, would influence the stress generation process. Although both chronic strain and socioeconomic status were not associated with dependent family stress at Time 2, poorer parent-child relationship quality significantly predicted greater dependent family stress at Time 2. Child chronic strain, but neither socioeconomic status nor parent-child relationship quality, predicted children's depression symptoms at Time 2. Finally, gender, maternal depression history, and current maternal depressive symptoms did not moderate the relationship between level of dependent family stress and depressive symptoms. Overall, findings provide partial support for a developmental stress generation model operating in the preadolescent period.

  4. A hybrid artificial neural network as a software sensor for optimal control of a wastewater treatment process.

    PubMed

    Choi, D J; Park, H

    2001-11-01

    For the control and automation of biological treatment processes, the lack of reliable on-line sensors to measure water quality parameters is one of the most important problems to overcome. Many parameters cannot be measured directly with on-line sensors. The accuracy of existing hardware sensors is also not sufficient, and maintenance problems such as electrode fouling often cause trouble. This paper deals with the development of software sensor techniques that estimate a target water quality parameter from other parameters using the correlations between water quality parameters. We focus our attention on the preprocessing of noisy data and the selection of the model best suited to the situation. Problems of existing approaches are also discussed. We propose a hybrid neural network as a software sensor for inferring wastewater quality parameters. Multivariate regression, artificial neural networks (ANN), and a hybrid technique that combines principal component analysis as a preprocessing stage are applied to data from industrial wastewater processes. The hybrid ANN technique shows enhanced prediction capability and reduces the overfitting problem of neural networks. The results show that the hybrid ANN technique can be used to extract information from noisy data and to describe the nonlinearity of complex wastewater treatment processes.
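
    The hybrid structure described above, principal component analysis as a denoising pre-processing stage feeding a neural network, can be sketched as a scikit-learn pipeline; the synthetic plant signals and network size are illustrative assumptions:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)

        # Twelve noisy, correlated on-line signals driven by a 3-D latent
        # process state; the target is a hard-to-measure quality parameter.
        Z = rng.normal(size=(400, 3))
        X = Z @ rng.normal(size=(3, 12)) + 0.3 * rng.normal(size=(400, 12))
        y = 50 + 10 * Z[:, 0] - 5 * Z[:, 1] ** 2 + rng.normal(0, 1, 400)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        soft_sensor = make_pipeline(
            StandardScaler(),
            PCA(n_components=3),                    # denoising preprocessing stage
            MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000, random_state=0),
        )
        soft_sensor.fit(X_tr, y_tr)
        print("held-out R^2: %.2f" % soft_sensor.score(X_te, y_te))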

  5. Influence of Traffic Vehicles Against Ground Fundamental Frequency Prediction using Ambient Vibration Technique

    NASA Astrophysics Data System (ADS)

    Kamarudin, A. F.; Noh, M. S. Md; Mokhatar, S. N.; Anuar, M. A. Mohd; Ibrahim, A.; Ibrahim, Z.; Daud, M. E.

    2018-04-01

    The ambient vibration (AV) technique is widely used nowadays for predicting the fundamental frequency of the ground. The technique is easy, quick and non-destructive, requires few operators, and yields reliable results. The input motions of ambient vibration originate from surrounding natural and artificial excitations. However, careful control of the data acquisition must be implemented to reduce the intrusion of short-period noise that could impair the quality of the frequency prediction at an investigated site. In this study, the intrusion of noise under peak (morning, afternoon and evening) and off-peak (early morning) traffic flows (only 8 meters from the sensor to the road shoulder) was investigated with respect to the stability and quality of the ground fundamental frequency prediction. No specific standard is available for AV data acquisition and processing; thus, field and processing parameters recommended by previous studies and guidelines were adopted. Two 1 Hz tri-axial seismometers were positioned close together in front of the main entrance of Universiti Tun Hussein Onn Malaysia. Recordings of 15 minutes were taken during peak and off-peak periods of traffic flow. All passing vehicles were counted and grouped into four classes. The three components of the ambient vibration time series, recorded in the North-South (NS), East-West (EW) and vertical (UD) directions, were automatically computed into the horizontal-to-vertical spectral ratio (HVSR) using the open-source software GEOPSY to determine the fundamental ground frequency, Fo. A single sharp peak in the HVSR curves was obtained at frequencies between 1.33 and 1.38 Hz, which falls under the soft-to-dense soil classification. Although nearly identical HVSR curve patterns with close frequency predictions were obtained for both measurement periods, the total numbers of stable, good-quality windows selected for the HVSR computation differed significantly; both, however, satisfied the requirements of the SESAME (2004) guideline. In addition, a second peak between 8.23 and 8.55 Hz at very low amplitude (Ao < 2) was clearly indicated in the early-morning HVSR curve, but it should be neglected according to the same guideline criteria. In conclusion, the ground fundamental frequency was successfully determined with the HVSR method using 1 Hz seismometers, with the recommended field and data-processing parameters taken into consideration, without disruption from nearby traffic excitations.
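
    A bare-bones HVSR computation can be sketched as follows: window the three components, take amplitude spectra, form the ratio of the geometric-mean horizontal spectrum to the vertical spectrum, and average over windows. GEOPSY's window-selection criteria and spectral smoothing (e.g., Konno-Ohmachi) are omitted, and the synthetic record's 1.35 Hz resonance is an illustrative assumption:

        import numpy as np

        def hvsr(ns, ew, ud, fs, nseg=4096):
            """Average H/V ratio over non-overlapping windows, with H taken as
            the geometric mean of the NS and EW amplitude spectra."""
            win, ratios = np.hanning(nseg), []
            for k in range(len(ud) // nseg):
                sl = slice(k * nseg, (k + 1) * nseg)
                amp = lambda x: np.abs(np.fft.rfft(x[sl] * win))
                h = np.sqrt(amp(ns) * amp(ew))
                ratios.append(h / np.maximum(amp(ud), 1e-12))
            return np.fft.rfftfreq(nseg, 1.0 / fs), np.mean(ratios, axis=0)

        # Synthetic 15-minute record at 100 Hz with a 1.35 Hz horizontal resonance.
        fs = 100.0
        t = np.arange(0, 900, 1 / fs)
        rng = np.random.default_rng(0)
        ns = rng.normal(size=t.size) + 3 * np.sin(2 * np.pi * 1.35 * t)
        ew = rng.normal(size=t.size) + 3 * np.sin(2 * np.pi * 1.35 * t + 0.5)
        ud = rng.normal(size=t.size)

        f, hv = hvsr(ns, ew, ud, fs)
        print("predicted fundamental frequency: %.2f Hz" % f[np.argmax(hv)])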

  6. The structure and health correlates of trait repetitive thought in older adults.

    PubMed

    Segerstrom, Suzanne C; Roach, Abbey R; Evans, Daniel R; Schipper, Lindsey J; Darville, Audrey K

    2010-09-01

    Repetitive thought (RT) involves frequent or prolonged thoughts about oneself and one's world, encompassing discrete forms such as trait worry, rumination, processing, and reminiscing. These forms of RT can be described using 3 basic, underlying qualities: total propensity for RT of all types, valence (positive vs. negative content), and purpose (searching or uncertainty vs. solving or certainty). The adaptiveness of discrete forms with regard to health is likely to be related to these qualities, particularly valence and total propensity. The present study confirmed the model and identified the relationship of these qualities of RT to subjective psychological, physical, and cognitive health in older adults aged 60-94 (N = 179). As predicted, more negatively valenced trait RT was associated with worse psychological, physical, and cognitive health. More total propensity for RT was associated only with worse psychological health. Searching purpose was associated only with worse cognitive health. In turn, negatively valenced RT was predicted by poorer executive functions, suggesting that such functions may be important for directing this quality of RT. The valence of older adults' RT is important insofar as it may contribute to their sense of good or ill health. However, the propensity for all kinds of RT to associate with poorer psychological health may reflect the co-occurrence of negative and positive RT, such as rumination and emotional processing. Although RT has not been extensively investigated in older adults, it appears to play an important role in their subjective health. (c) 2010 APA, all rights reserved.

  7. Image enhancement using the hypothesis selection filter: theory and application to JPEG decoding.

    PubMed

    Wong, Tak-Shing; Bouman, Charles A; Pollak, Ilya

    2013-03-01

    We introduce the hypothesis selection filter (HSF) as a new approach for image quality enhancement. We assume that a set of filters has been selected a priori to improve the quality of a distorted image containing regions with different characteristics. At each pixel, HSF uses a locally computed feature vector to predict the relative performance of the filters in estimating the corresponding pixel intensity in the original undistorted image. The prediction result then determines the proportion of each filter used to obtain the final processed output. In this way, the HSF serves as a framework for combining the outputs of a number of different user selected filters, each best suited for a different region of an image. We formulate our scheme in a probabilistic framework where the HSF output is obtained as the Bayesian minimum mean square error estimate of the original image. Maximum likelihood estimates of the model parameters are determined from an offline fully unsupervised training procedure that is derived from the expectation-maximization algorithm. To illustrate how to apply the HSF and to demonstrate its potential, we apply our scheme as a post-processing step to improve the decoding quality of JPEG-encoded document images. The scheme consistently improves the quality of the decoded image over a variety of image content with different characteristics. We show that our scheme results in quantitative improvements over several other state-of-the-art JPEG decoding methods.
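
    A toy illustration of the weighted-combination idea: several pre-selected filters are fused per pixel with weights derived from a local feature. The hand-set variance-based weighting below replaces the HSF's trained probabilistic model and is purely an illustrative assumption:

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(0)
        clean = np.tile(np.linspace(0, 1, 64), (64, 1))     # smooth test "image"
        noisy = clean + 0.1 * rng.normal(size=clean.shape)

        # A small bank of pre-selected filters, each suited to different regions.
        outputs = np.stack([
            ndimage.uniform_filter(noisy, 3),
            ndimage.median_filter(noisy, 3),
            noisy,                                          # identity filter
        ])

        # Per-pixel scores from a local feature (here local variance), turned
        # into soft filter-selection weights; the HSF instead learns this
        # mapping with a trained probabilistic model.
        var = ndimage.uniform_filter(noisy ** 2, 5) - ndimage.uniform_filter(noisy, 5) ** 2
        scores = np.stack([var / 0.005, var / 0.01, np.ones_like(var)])
        w = np.exp(scores) / np.exp(scores).sum(axis=0)
        fused = (w * outputs).sum(axis=0)
        print("MSE noisy %.5f -> fused %.5f" %
              (np.mean((noisy - clean) ** 2), np.mean((fused - clean) ** 2)))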

  8. Direct and Indirect Effects of a Family-Based Intervention in Early Adolescence on Parent-Youth Relationship Quality, Late Adolescent Health, and Early Adult Obesity

    PubMed Central

    Van Ryzin, Mark J.; Nowicka, Paulina

    2013-01-01

    We explored family processes in adolescence that may influence the likelihood of obesity in early adulthood using a randomized trial of a family-based intervention (the Family Check-Up, or FCU). The FCU has been shown to reduce escalations in antisocial behavior and depression in adolescence by supporting positive family management practices, but no research has examined the mechanisms by which the FCU could influence health-related attitudes and behaviors linked to obesity. Participants were 998 adolescents (n = 526 male; n = 423 European American; M age 12.21 yrs) and their families, recruited in 6th grade from 3 middle schools in the Pacific Northwest. We used structural equation modeling (SEM) and an Intent-To-Treat (ITT) design to evaluate the direct and indirect effects of the FCU on parent–youth relationship quality (ages 12–15), healthy lifestyle behaviors, eating attitudes, depressive symptoms (all measured at age 17), and obesity (age 22). We found that the FCU led to greater parent–youth relationship quality, which predicted enhanced health-related behaviors, reduced maladaptive eating attitudes, and reduced depression. In turn, reduced maladaptive eating attitudes predicted reduced odds of obesity. The indirect effect of the FCU on obesity by way of parent–youth relationship quality and eating attitudes was significant. Our findings illustrate how family processes may influence adolescent health and suggest that family functioning may be an additional factor to consider when developing intervention programs for obesity. PMID:23421838

  9. Media milling process optimization for manufacture of drug nanoparticles using design of experiments (DOE).

    PubMed

    Nekkanti, Vijaykumar; Marwah, Ashwani; Pillai, Raviraj

    2015-01-01

    Design of experiments (DOE), a component of Quality by Design (QbD), is the systematic and simultaneous evaluation of process variables to develop a product with predetermined quality attributes. This article presents a case study to understand the effects of process variables in a bead milling process used for the manufacture of drug nanoparticles. Experiments were designed and results were computed according to a 3-factor, 3-level face-centered central composite design (CCD). The factors investigated were motor speed, pump speed and bead volume. The responses analyzed for evaluating these effects and interactions were milling time, particle size and process yield. Process validation batches were executed using the optimum process conditions obtained from the Design-Expert® software to evaluate both the repeatability and reproducibility of the bead milling technique. Milling time was optimized to <5 h to obtain the desired particle size (d90 < 400 nm). A desirability function was used to optimize the response variables, and the predicted responses were in agreement with experimental values. These results demonstrated the reliability of the selected model for the manufacture of drug nanoparticles with predictable quality attributes. The optimization of bead milling process variables by applying DOE resulted in a considerable decrease in milling time to achieve the desired particle size. The study indicates the applicability of the DOE approach to optimize critical process parameters in the manufacture of drug nanoparticles.
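
    A face-centered CCD in coded units is easy to generate directly: 2^k factorial corners, 2k axial points with alpha = 1, plus center replicates. A minimal sketch (three center points are an illustrative choice):

        import itertools
        import numpy as np

        def face_centered_ccd(n_factors=3, n_center=3):
            """Coded design matrix: 2^k factorial corners, 2k axial points on
            the faces (alpha = 1), and replicated center points."""
            corners = np.array(list(itertools.product([-1.0, 1.0], repeat=n_factors)))
            axial = np.vstack([v * e for e in np.eye(n_factors) for v in (-1.0, 1.0)])
            center = np.zeros((n_center, n_factors))
            return np.vstack([corners, axial, center])

        # Coded factors (-1..+1): motor speed, pump speed, bead volume.
        design = face_centered_ccd()
        print(design.shape)    # (8 + 6 + 3, 3) = (17, 3) runs
        print(design)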

  10. Development of a design space and predictive statistical model for capsule filling of low-fill-weight inhalation products.

    PubMed

    Faulhammer, E; Llusa, M; Wahl, P R; Paudel, A; Lawrence, S; Biserni, S; Calzolari, V; Khinast, J G

    2016-01-01

    The objectives of this study were to develop a predictive statistical model for low-fill-weight capsule filling of inhalation products with dosator nozzles via the quality by design (QbD) approach and, based on that, to create refined models that include quadratic terms for significant parameters. Various controllable process parameters and uncontrolled material attributes of 12 powders were initially screened using a linear model with partial least squares (PLS) regression to determine their effect on the critical quality attributes (CQAs; fill weight and weight variability). After identifying the critical material attributes (CMAs) and critical process parameters (CPPs) that influenced the CQAs, model refinement was performed to study whether interactions or quadratic terms influence the model. Based on the assessment of the effects of the CPPs and CMAs on fill weight and weight variability for low-fill-weight inhalation products, we developed an excellent linear predictive model for fill weight (R² = 0.96, Q² = 0.96 for powders with good flow properties and R² = 0.94, Q² = 0.93 for cohesive powders) and a model that provides a good approximation of the fill weight variability for each powder group. We validated the model, established a design space for the performance of different types of inhalation-grade lactose in low-fill-weight capsule filling, and successfully used the CMAs and CPPs to predict the fill weight of powders that were not included in the development set.

  11. Crop biometric maps: the key to prediction.

    PubMed

    Rovira-Más, Francisco; Sáiz-Rubio, Verónica

    2013-09-23

    The sustainability of agricultural production in the twenty-first century, both in industrialized and developing countries, benefits from the integration of farm management with information technology such that individual plants, rows, or subfields may be endowed with a singular "identity." This approach approximates the nature of agricultural processes to the engineering of industrial processes. In order to cope with the vast variability of nature and the uncertainties of agricultural production, the concept of crop biometrics is defined as the scientific analysis of agricultural observations confined to spaces of reduced dimensions and known position with the purpose of building prediction models. This article develops the idea of crop biometrics by setting its principles, discussing the selection and quantization of biometric traits, and analyzing the mathematical relationships among measured and predicted traits. Crop biometric maps were applied to the case of a wine-production vineyard, in which vegetation amount, relative altitude in the field, soil compaction, berry size, grape yield, juice pH, and grape sugar content were selected as biometric traits. The enological potential of grapes was assessed with a quality-index map defined as a combination of titratable acidity, sugar content, and must pH. Prediction models for yield and quality were developed for high and low resolution maps, showing the great potential of crop biometric maps as a strategic tool for vineyard growers as well as for crop managers in general, due to the wide versatility of the methodology proposed.

  12. Crop Biometric Maps: The Key to Prediction

    PubMed Central

    Rovira-Más, Francisco; Sáiz-Rubio, Verónica

    2013-01-01

    The sustainability of agricultural production in the twenty-first century, both in industrialized and developing countries, benefits from the integration of farm management with information technology such that individual plants, rows, or subfields may be endowed with a singular “identity.” This approach approximates the nature of agricultural processes to the engineering of industrial processes. In order to cope with the vast variability of nature and the uncertainties of agricultural production, the concept of crop biometrics is defined as the scientific analysis of agricultural observations confined to spaces of reduced dimensions and known position with the purpose of building prediction models. This article develops the idea of crop biometrics by setting its principles, discussing the selection and quantization of biometric traits, and analyzing the mathematical relationships among measured and predicted traits. Crop biometric maps were applied to the case of a wine-production vineyard, in which vegetation amount, relative altitude in the field, soil compaction, berry size, grape yield, juice pH, and grape sugar content were selected as biometric traits. The enological potential of grapes was assessed with a quality-index map defined as a combination of titratable acidity, sugar content, and must pH. Prediction models for yield and quality were developed for high and low resolution maps, showing the great potential of crop biometric maps as a strategic tool for vineyard growers as well as for crop managers in general, due to the wide versatility of the methodology proposed. PMID:24064605

  13. Seasonal Drought Prediction: Advances, Challenges, and Future Prospects

    NASA Astrophysics Data System (ADS)

    Hao, Zengchao; Singh, Vijay P.; Xia, Youlong

    2018-03-01

    Drought prediction is of critical importance to early warning for drought management. This review provides a synthesis of drought prediction based on statistical, dynamical, and hybrid methods. Statistical drought prediction is achieved by modeling the relationship between drought indices of interest and a suite of potential predictors, including large-scale climate indices, local climate variables, and land initial conditions. Dynamical meteorological drought prediction relies on seasonal climate forecasts from general circulation models (GCMs), which can be employed to drive hydrological models for agricultural and hydrological drought prediction, with the predictability determined by both climate forcings and initial conditions. Challenges still exist in drought prediction at long lead times and under a changing environment resulting from natural and anthropogenic factors. Future research prospects to improve drought prediction include, but are not limited to, high-quality data assimilation, improved model development with key processes related to drought occurrence, optimal ensemble forecasting to select or weight ensembles, and hybrid drought prediction to merge statistical and dynamical forecasts.

  14. Determining habitat quality for species that demonstrate dynamic habitat selection

    USGS Publications Warehouse

    Beerens, James M.; Frederick, Peter C; Noonburg, Erik G; Gawlik, Dale E.

    2015-01-01

    Determining habitat quality for wildlife populations requires relating a species' habitat to its survival and reproduction. Within a season, species occurrence and density can be disconnected from measures of habitat quality when resources are highly seasonal, unpredictable over time, and patchy. Here we establish an explicit link among dynamic selection of changing resources, spatio-temporal species distributions, and fitness for predictive abundance and occurrence models that are used for short-term water management and long-term restoration planning. We used the wading bird distribution and evaluation models (WADEM) that estimate (1) daily changes in selection across resource gradients, (2) landscape abundance of flocks and individuals, (3) conspecific foraging aggregation, and (4) resource unit occurrence (at fixed 400 m cells) to quantify habitat quality and its consequences on reproduction for wetland indicator species. We linked maximum annual numbers of nests detected across the study area and nesting success of Great Egrets (Ardea alba), White Ibises (Eudocimus albus), and Wood Storks (Mycteria americana) over a 20-year period to estimated daily dynamics of food resources produced by WADEM over a 7490 km2 area. For all species, increases in predicted species abundance in March and high abundance in April were strongly linked to breeding responses. Great Egret nesting effort and success were higher when birds also showed greater conspecific foraging aggregation. Synthesis and applications: This study provides the first empirical evidence that dynamic habitat selection processes and distributions of wading birds over environmental gradients are linked with reproductive measures over periods of decades. Further, predictor variables at a variety of temporal (daily-multiannual) resolutions and spatial (400 m to regional) scales effectively explained variation in ecological processes that change habitat quality. The process used here allows managers to develop short- and long-term conservation strategies that (1) consider flexible behavioral patterns and (2) are robust to environmental variation over time.

  15. A survey of quality assurance practices in biomedical open source software projects.

    PubMed

    Koru, Günes; El Emam, Khaled; Neisa, Angelica; Umarji, Medha

    2007-05-07

    Open source (OS) software is continuously gaining recognition and use in the biomedical domain, for example, in health informatics and bioinformatics. Given the mission critical nature of applications in this domain and their potential impact on patient safety, it is important to understand to what degree and how effectively biomedical OS developers perform standard quality assurance (QA) activities such as peer reviews and testing. This would allow the users of biomedical OS software to better understand the quality risks, if any, and the developers to identify process improvement opportunities to produce higher quality software. A survey of developers working on biomedical OS projects was conducted to examine the QA activities that are performed. We took a descriptive approach to summarize the implementation of QA activities and then examined some of the factors that may be related to the implementation of such practices. Our descriptive results show that 63% (95% CI, 54-72) of projects did not include peer reviews in their development process, while 82% (95% CI, 75-89) did include testing. Approximately 74% (95% CI, 67-81) of developers did not have a background in computing, 80% (95% CI, 74-87) were paid for their contributions to the project, and 52% (95% CI, 43-60) had PhDs. A multivariate logistic regression model to predict the implementation of peer reviews was not significant (likelihood ratio test = 16.86, 9 df, P = .051) and neither was a model to predict the implementation of testing (likelihood ratio test = 3.34, 9 df, P = .95). Less attention is paid to peer review than testing. However, the former is a complementary, and necessary, QA practice rather than an alternative. Therefore, one can argue that there are quality risks, at least at this point in time, in transitioning biomedical OS software into any critical settings that may have operational, financial, or safety implications. Developers of biomedical OS applications should invest more effort in implementing systemic peer review practices throughout the development and maintenance processes.
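
    The reported 95% confidence intervals for proportions can be reproduced with a standard normal approximation. A minimal Python sketch follows; the counts are illustrative, not the study's raw data.

      import math

      def proportion_ci(successes, n, z=1.96):
          """Normal-approximation 95% confidence interval for a proportion."""
          p = successes / n
          half = z * math.sqrt(p * (1 - p) / n)
          return p - half, p + half

      # Illustrative: 76 of 120 projects lacking peer review gives about
      # 63% (95% CI, 54-72), the form of the intervals reported above.
      print(proportion_ci(76, 120))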

  16. An Overview of Atmospheric Chemistry and Air Quality Modeling

    NASA Technical Reports Server (NTRS)

    Johnson, Matthew S.

    2017-01-01

    This presentation will introduce participants of the NASA Student Airborne Research Program (SARP 2017) to my research experience and provide an overview of atmospheric chemistry and air quality modeling. The presentation will also provide examples of ways to apply airborne observations to chemical transport model (CTM) and air quality (AQ) model evaluation. CTM and AQ models are important tools in understanding tropospheric-stratospheric composition, atmospheric chemistry processes, meteorology, and air quality. This presentation will focus on how NASA scientists currently apply CTM and AQ models to better understand these topics. Finally, the importance of airborne observations in evaluating these topics, and how in situ and remote sensing observations can be used to evaluate and improve CTM and AQ model predictions, will be highlighted.

  17. Emotional Intelligence predicts individual differences in social exchange reasoning.

    PubMed

    Reis, Deidre L; Brackett, Marc A; Shamosh, Noah A; Kiehl, Kent A; Salovey, Peter; Gray, Jeremy R

    2007-04-15

    When assessed with performance measures, Emotional Intelligence (EI) correlates positively with the quality of social relationships. However, the bases of such correlations are not understood in terms of cognitive and neural information processing mechanisms. We investigated whether a performance measure of EI is related to reasoning about social situations (specifically social exchange reasoning) using versions of the Wason Card Selection Task. In an fMRI study (N=16), higher EI predicted hemodynamic responses during social reasoning in the left frontal polar and left anterior temporal brain regions, even when controlling for responses on a very closely matched task (precautionary reasoning). In a larger behavioral study (N=48), higher EI predicted faster social exchange reasoning, after controlling for precautionary reasoning. The results are the first to directly suggest that EI is mediated in part by mechanisms supporting social reasoning and validate a new approach to investigating EI in terms of more basic information processing mechanisms.

  18. Computationally efficient confidence intervals for cross-validated area under the ROC curve estimates.

    PubMed

    LeDell, Erin; Petersen, Maya; van der Laan, Mark

    In binary classification problems, the area under the ROC curve (AUC) is commonly used to evaluate the performance of a prediction model. Often, it is combined with cross-validation in order to assess how the results will generalize to an independent data set. In order to evaluate the quality of an estimate for cross-validated AUC, we obtain an estimate of its variance. For massive data sets, the process of generating a single performance estimate can be computationally expensive. Additionally, when using a complex prediction method, the process of cross-validating a predictive model on even a relatively small data set can still require a large amount of computation time. Thus, in many practical settings, the bootstrap is a computationally intractable approach to variance estimation. As an alternative to the bootstrap, we demonstrate a computationally efficient influence curve based approach to obtaining a variance estimate for cross-validated AUC.
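
    A compact sketch of the influence-curve idea for a single set of scores and labels follows (the full method aggregates fold-level estimates across the cross-validation). It is written from the standard influence function of the empirical AUC and is not the authors' implementation.

      import numpy as np

      def auc_and_ic_variance(scores, labels):
          """Empirical AUC and an influence-curve-based variance estimate.

          For a positive case, the influence-curve term compares the fraction
          of negatives it outranks with the AUC; symmetrically for negatives.
          """
          scores = np.asarray(scores, float)
          labels = np.asarray(labels, int)
          pos, neg = scores[labels == 1], scores[labels == 0]
          n, p_hat = len(scores), labels.mean()
          gt = (pos[:, None] > neg[None, :]) + 0.5 * (pos[:, None] == neg[None, :])
          auc = gt.mean()  # pairwise comparisons; ties get half credit
          ic = np.empty(n)
          for i, (s, y) in enumerate(zip(scores, labels)):
              if y == 1:
                  frac = np.mean((s > neg) + 0.5 * (s == neg))
                  ic[i] = (frac - auc) / p_hat
              else:
                  frac = np.mean((pos > s) + 0.5 * (pos == s))
                  ic[i] = (frac - auc) / (1.0 - p_hat)
          return auc, ic.var() / n  # variance estimate for the AUC

      rng = np.random.default_rng(0)
      y = rng.integers(0, 2, 500)
      f = y + rng.normal(0, 1.2, 500)  # noisy scores correlated with labels
      auc, var = auc_and_ic_variance(f, y)
      print(auc, 1.96 * var ** 0.5)  # AUC and the half-width of a 95% CI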

  19. Computationally efficient confidence intervals for cross-validated area under the ROC curve estimates

    PubMed Central

    Petersen, Maya; van der Laan, Mark

    2015-01-01

    In binary classification problems, the area under the ROC curve (AUC) is commonly used to evaluate the performance of a prediction model. Often, it is combined with cross-validation in order to assess how the results will generalize to an independent data set. In order to evaluate the quality of an estimate for cross-validated AUC, we obtain an estimate of its variance. For massive data sets, the process of generating a single performance estimate can be computationally expensive. Additionally, when using a complex prediction method, the process of cross-validating a predictive model on even a relatively small data set can still require a large amount of computation time. Thus, in many practical settings, the bootstrap is a computationally intractable approach to variance estimation. As an alternative to the bootstrap, we demonstrate a computationally efficient influence curve based approach to obtaining a variance estimate for cross-validated AUC. PMID:26279737

  20. Impact of chemical polishing on surface roughness and dimensional quality of electron beam melting process (EBM) parts

    NASA Astrophysics Data System (ADS)

    Dolimont, Adrien; Rivière-Lorphèvre, Edouard; Ducobu, François; Backaert, Stéphane

    2018-05-01

    Additive manufacturing is growing rapidly, which motivates the study of the functionalization of parts produced by these processes. Electron beam melting (EBM) is one of these technologies: a powder-based additive manufacturing (AM) method with which it is possible to manufacture high-density metal parts with complex topology. A major issue with these technologies is the surface finish, so finishing operations are needed to improve surface quality. In this study, the focus is set on chemical polishing. The goal is to determine how chemical etching impacts the dimensional accuracy and the surface roughness of EBM parts. To this end, an experimental campaign was carried out on the most widely used material in EBM, Ti6Al4V. Different exposure times were tested and their impact on surface quality was evaluated. To help predict the excess thickness to be provided, the dimensional impact of chemical polishing on EBM parts was estimated: 15 parts were measured before and after chemical machining. The improvement in surface quality was also evaluated after each treatment.

  1. A novel frame-level constant-distortion bit allocation for smooth H.264/AVC video quality

    NASA Astrophysics Data System (ADS)

    Liu, Li; Zhuang, Xinhua

    2009-01-01

    It is known that quality fluctuation has a major negative effect on visual perception. In previous work, we introduced a constant-distortion bit allocation method [1] for the H.263+ encoder. However, the method in [1] cannot be adapted directly to the newer H.264/AVC encoder because of the well-known chicken-and-egg dilemma arising from the rate-distortion optimization (RDO) decision process. To solve this problem, we propose a new two-stage constant-distortion bit allocation (CDBA) algorithm with enhanced rate control for the H.264/AVC encoder. In stage 1, the algorithm performs the RD optimization process with a constant quantization parameter (QP). Based on prediction residual signals from stage 1 and the target distortion for smooth video quality, the frame-level bit target is allocated using a closed-form approximation of the rate-distortion relationship similar to [1], and a fast stage-2 encoding process is performed with enhanced basic-unit rate control. Experimental results show that, compared with the original rate control algorithm provided by the H.264/AVC reference software JM12.1, the proposed constant-distortion frame-level bit allocation scheme reduces quality fluctuation and delivers much smoother PSNR on all testing sequences.
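
    The paper's exact closed-form approximation is not reproduced in this record; the sketch below instead uses the generic logarithmic Gaussian rate-distortion model R(D) = 0.5*log2(sigma^2/D) bits per sample to show how a frame-level bit target can follow from a residual variance and a fixed target distortion. All names and numbers are illustrative.

      import math

      def frame_bit_target(residual_var, target_mse, n_samples, overhead_bits=0):
          """Bit target from the model R(D) = 0.5*log2(var/D) for D < var, else 0.
          residual_var would come from a stage-1 pass with constant QP."""
          if target_mse >= residual_var:
              rate = 0.0  # target distortion already met without spending bits
          else:
              rate = 0.5 * math.log2(residual_var / target_mse)
          return int(n_samples * rate) + overhead_bits

      # Illustrative: CIF luma frame (352*288 samples), residual variance 400,
      # constant target MSE of 30 to keep quality smooth across frames.
      print(frame_bit_target(400.0, 30.0, 352 * 288))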

  2. Abrasive slurry jet cutting model based on fuzzy relations

    NASA Astrophysics Data System (ADS)

    Qiang, C. H.; Guo, C. W.

    2017-12-01

    The cutting process of a pre-mixed abrasive slurry or suspension jet (ASJ) is a complex process affected by many factors, and there is a highly nonlinear relationship between the cutting parameters and cutting quality. In this paper, guided by fuzzy theory, a fuzzy cutting model of ASJ was developed. In the modeling of surface roughness, prediction models for the upper and lower surface roughness were established separately. The adaptive neuro-fuzzy inference system combines the learning mechanism of neural networks with the linguistic reasoning ability of fuzzy systems; membership functions and fuzzy rules are obtained by adaptive adjustment, so the modeling process is fast and effective. In this paper, the ANFIS module of the MATLAB fuzzy logic toolbox was used to establish the fuzzy cutting model of ASJ, which was found to be quite instrumental for ASJ cutting applications.

  3. SENSITIVITY OF OZONE AND AEROSOL PREDICTIONS TO THE TRANSPORT ALGORITHMS IN THE MODELS-3 COMMUNITY MULTI-SCALE AIR QUALITY (CMAQ) MODELING SYSTEM

    EPA Science Inventory

    EPA's Models-3 CMAQ system is intended to provide a community modeling paradigm that allows continuous improvement of the one-atmosphere modeling capability in a unified fashion. CMAQ's modular design promotes incorporation of several sets of science process modules representing ...

  4. GlutoPeak profile analysis for wheat classification: skipping the refinement process

    USDA-ARS?s Scientific Manuscript database

    The GlutoPeak test can predict wheat flour quality by measuring gluten aggregation properties in a short time and using a small amount of sample, and thus is useful along the entire wheat delivery chain. However, no information on the suitability of this new test for whole grain flours is available...

  5. Over a Barrel: The High Costs of Rising Tuitions.

    ERIC Educational Resources Information Center

    Yanikowski, Richard A.

    1986-01-01

    Participants in the tuition-setting process lean toward "aggressive" pricing strategies because they want to maintain or improve quality, assure continued vitality, and keep the campus in good repair. Recent trends in tuition pricing are reviewed and some elements of budgetary strategies predicated on tuition increases above inflation are examined.…

  6. Heat Transfer during Blanching and Hydrocooling of Broccoli Florets.

    PubMed

    Iribe-Salazar, Rosalina; Caro-Corrales, José; Hernández-Calderón, Óscar; Zazueta-Niebla, Jorge; Gutiérrez-Dorado, Roberto; Carrazco-Escalante, Marco; Vázquez-López, Yessica

    2015-12-01

    The objective of this work was to simulate heat transfer during blanching (90 °C) and hydrocooling (5 °C) of broccoli florets (Brassica oleracea L. Italica) and to evaluate the impact of these processes on physicochemical and nutritional quality properties. Thermophysical properties (thermal conductivity [line heat source], specific heat capacity [differential scanning calorimetry], and bulk density [volume displacement]) of stem and inflorescence were measured as a function of temperature (5, 10, 20, 40, 60, and 80 °C). The activation energy and the frequency factor (Arrhenius model) of these thermophysical properties were calculated. A 3-dimensional finite element model was developed to predict the temperature history at different points inside the product, and the theoretical and experimental temperature histories were compared. Quality parameters (firmness, total color difference, and vitamin C content) and peroxidase activity were measured. The satisfactory validation of the finite element model allows the prediction of temperature histories and profiles under different process conditions, which could lead to an eventual optimization aimed at minimizing the nutritional and sensorial losses in broccoli florets. © 2015 Institute of Food Technologists®
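
    The Arrhenius treatment of the thermophysical properties amounts to a linear fit: write k(T) = k0*exp(-Ea/(R*T)) and regress ln k on 1/T, so the slope gives the activation energy and the intercept the frequency factor. A minimal sketch follows; the property values are illustrative placeholders, not the measured broccoli data.

      import numpy as np

      R = 8.314  # gas constant, J/(mol*K)

      T_c = np.array([5, 10, 20, 40, 60, 80], float)      # temperatures, deg C
      k = np.array([0.48, 0.49, 0.51, 0.55, 0.58, 0.62])  # illustrative values
      T = T_c + 273.15                                    # absolute temperature

      slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
      Ea = -slope * R         # activation energy, J/mol
      k0 = np.exp(intercept)  # frequency (pre-exponential) factor
      print(f"Ea = {Ea:.0f} J/mol, k0 = {k0:.3f}")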

  7. Group-regularized individual prediction: theory and application to pain.

    PubMed

    Lindquist, Martin A; Krishnan, Anjali; López-Solà, Marina; Jepma, Marieke; Woo, Choong-Wan; Koban, Leonie; Roy, Mathieu; Atlas, Lauren Y; Schmidt, Liane; Chang, Luke J; Reynolds Losin, Elizabeth A; Eisenbarth, Hedwig; Ashar, Yoni K; Delk, Elizabeth; Wager, Tor D

    2017-01-15

    Multivariate pattern analysis (MVPA) has become an important tool for identifying brain representations of psychological processes and clinical outcomes using fMRI and related methods. Such methods can be used to predict or 'decode' psychological states in individual subjects. Single-subject MVPA approaches, however, are limited by the amount and quality of individual-subject data. In spite of higher spatial resolution, predictive accuracy from single-subject data often does not exceed what can be accomplished using coarser, group-level maps, because single-subject patterns are trained on limited amounts of often-noisy data. Here, we present a method that combines population-level priors, in the form of biomarker patterns developed on prior samples, with single-subject MVPA maps to improve single-subject prediction. Theoretical results and simulations motivate a weighting based on the relative variances of biomarker-based prediction (based on population-level predictive maps from prior groups) and individual-subject, cross-validated prediction. Empirical results predicting pain using brain activity on a trial-by-trial basis (single-trial prediction) across 6 studies (N=180 participants) confirm the theoretical predictions. Regularization based on a population-level biomarker (in this case, the Neurologic Pain Signature, NPS) improved single-subject prediction accuracy compared with idiographic maps based on the individuals' data alone. The regularization scheme that we propose, which we term group-regularized individual prediction (GRIP), can be applied broadly to within-person MVPA-based prediction. We also show how GRIP can be used to evaluate data quality and provide benchmarks for the appropriateness of population-level maps like the NPS for a given individual or study. Copyright © 2015 Elsevier Inc. All rights reserved.
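
    In the spirit of the variance-based weighting described above, a precision-weighted blend of a population-level prediction and an individual cross-validated prediction can be sketched as follows. This is a simplification for illustration, not the authors' estimator; all values are hypothetical.

      import numpy as np

      def grip_combine(pred_group, pred_indiv, var_group, var_indiv):
          """Blend two predictions with weights inversely proportional
          to their respective variances (precision weighting)."""
          w = (1 / var_group) / (1 / var_group + 1 / var_indiv)
          return w * pred_group + (1 - w) * pred_indiv

      # A noisy single-subject map (large variance) receives less weight.
      print(grip_combine(pred_group=5.2, pred_indiv=6.8, var_group=0.4, var_indiv=1.6))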

  8. Perceptual quality prediction on authentically distorted images using a bag of features approach

    PubMed Central

    Ghadiyaram, Deepti; Bovik, Alan C.

    2017-01-01

    Current top-performing blind perceptual image quality prediction models are generally trained on legacy databases of human quality opinion scores on synthetically distorted images. Therefore, they learn image features that effectively predict human visual quality judgments of inauthentic and usually isolated (single) distortions. However, real-world images usually contain complex composite mixtures of multiple distortions. We study the perceptually relevant natural scene statistics of such authentically distorted images in different color spaces and transform domains. We propose a “bag of feature maps” approach that avoids assumptions about the type of distortion(s) contained in an image and instead focuses on capturing consistencies—or departures therefrom—of the statistics of real-world images. Using a large database of authentically distorted images, human opinions of them, and bags of features computed on them, we train a regressor to conduct image quality prediction. We demonstrate the effectiveness of the features in improving automatic perceptual quality prediction by testing a learned algorithm using them on a benchmark legacy database as well as on a newly introduced distortion-realistic resource called the LIVE In the Wild Image Quality Challenge Database. We extensively evaluate the perceptual quality prediction model and algorithm and show that it achieves quality-prediction power better than that of other leading models. PMID:28129417

  9. Quality-by-Design approach to monitor the operation of a batch bioreactor in an industrial avian vaccine manufacturing process.

    PubMed

    Largoni, Martina; Facco, Pierantonio; Bernini, Donatella; Bezzo, Fabrizio; Barolo, Massimiliano

    2015-10-10

    Monitoring batch bioreactors is a complex task, because several sources of variability can affect a running batch and impact the final product quality. Additionally, the product quality itself may not be measurable on line, but requires sampling and lab analysis taking several days to complete. In this study we show that, by using appropriate process analytical technology tools, the operation of an industrial batch bioreactor used in avian vaccine manufacturing can be effectively monitored as the batch progresses. Multivariate statistical models are built from historical databases of batches already completed, and they are used to enable the real-time identification of the variability sources, to reliably predict the final product quality, and to improve process understanding, paving the way to a reduction of final product rejections, as well as to a reduction of the product cycle time. It is also shown that the product quality "builds up" mainly during the first half of a batch, suggesting on the one hand that reducing the variability during this period is crucial, and on the other hand that the batch length could possibly be shortened. Overall, the study demonstrates that, by using a Quality-by-Design approach centered on the appropriate use of mathematical modeling, quality can indeed be built "by design" into the final product, whereas end-point product testing can progressively lose importance in product manufacturing. Copyright © 2015 Elsevier B.V. All rights reserved.
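
    A minimal sketch of the modeling step: a latent-variable (PLS) regression trained on summaries of completed batches and applied mid-course to a running batch. The data are synthetic and the variable counts are assumptions, not details of the industrial process; scikit-learn is assumed available.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(1)
      # Illustrative historical database: 40 completed batches, 12 process
      # variables summarized over the first half of each batch.
      X = rng.normal(size=(40, 12))
      y = 0.8 * X[:, 0] - 0.5 * X[:, 3] + rng.normal(0, 0.2, 40)  # final quality

      pls = PLSRegression(n_components=3).fit(X, y)  # latent-variable model
      x_new = rng.normal(size=(1, 12))               # running batch, mid-course
      print(pls.predict(x_new))                      # real-time quality estimate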

  10. Model-Based PAT for Quality Management in Pharmaceuticals Freeze-Drying: State of the Art

    PubMed Central

    Fissore, Davide

    2017-01-01

    Model-based process analytical technologies can be used for the in-line control and optimization of a pharmaceuticals freeze-drying process, as well as for the off-line design of the process, i.e., the identification of the optimal operating conditions. This paper aims at presenting the state of the art in this field, focusing, particularly, on three groups of systems, namely, those based on the temperature measurement (i.e., the soft sensor), on the chamber pressure measurement (i.e., the systems based on the tests of pressure rise and pressure decrease), and on the sublimation flux estimate (i.e., tunable diode laser absorption spectroscopy and the valveless monitoring system). The application of these systems for in-line process optimization (e.g., using a model predictive control algorithm) and for achieving true quality by design (e.g., through the off-line calculation of the design space of the process) is presented and discussed. PMID:28224123

  11. Association of journal quality indicators with methodological quality of clinical research articles.

    PubMed

    Lee, Kirby P; Schotland, Marieka; Bacchetti, Peter; Bero, Lisa A

    2002-06-05

    The ability to identify scientific journals that publish high-quality research would help clinicians, scientists, and health-policy analysts to select the most up-to-date medical literature to review. To assess whether journal characteristics of (1) peer-review status, (2) citation rate, (3) impact factor, (4) circulation, (5) manuscript acceptance rate, (6) MEDLINE indexing, and (7) Brandon/Hill Library List indexing are predictors of methodological quality of research articles, we conducted a cross-sectional study of 243 original research articles involving human subjects published in general internal medical journals. The mean (SD) quality score of the 243 articles was 1.37 (0.22). All journals reported a peer-review process and were indexed on MEDLINE. In models that controlled for article type (randomized controlled trial [RCT] or non-RCT), journal citation rate was the most statistically significant predictor (0.051 increase per doubling; 95% confidence interval [CI], 0.037-0.065; P<.001). In separate analyses by article type, acceptance rate was the strongest predictor for RCT quality (-0.113 per doubling; 95% CI, -0.148 to -0.078; P<.001), while journal citation rate was the most predictive factor for non-RCT quality (0.051 per doubling; 95% CI, 0.044-0.059; P<.001). High citation rates, impact factors, and circulation rates, and low manuscript acceptance rates and indexing on Brandon/Hill Library List appear to be predictive of higher methodological quality scores for journal articles.

  12. Static Thermochemical Model of COREX Melter Gasifier

    NASA Astrophysics Data System (ADS)

    Srishilan, C.; Shukla, Ajay Kumar

    2018-02-01

    COREX is one of the commercial smelting reduction processes. It uses finer-sized ore and semi-soft coal instead of metallurgical coke to produce hot metal from iron ore. The use of top gas with a high calorific value as a by-product export gas makes the process economical and green. The predictive thermochemical model of the COREX process presented here enables rapid computation of process parameters such as (1) the required amounts of ore, coal, and flux; (2) the amounts of slag and gas generated; and (3) gas compositions (based on the raw material and desired hot metal quality). The model helps in predicting the variations in process parameters with respect to (1) the degree of metallization and (2) the post-combustion ratio for given raw material conditions. In general, a reduction in the coal, flux, and oxygen requirement is concomitant with an increase in the degree of metallization and post-combustion ratio. The model reported here has been benchmarked using industrial data obtained from the JSW Steel Plant, India.

  13. Denitrification in Agricultural Soils: Integrated control and Modelling at various scales (DASIM)

    NASA Astrophysics Data System (ADS)

    Müller, Christoph; Well, Reinhard; Böttcher, Jürgen; Butterbach-Bahl, Klaus; Dannenmann, Michael; Deppe, Marianna; Dittert, Klaus; Dörsch, Peter; Horn, Marcus; Ippisch, Olaf; Mikutta, Robert; Senbayram, Mehmet; Vogel, Hans-Jörg; Wrage-Mönnig, Nicole; Müller, Carsten

    2016-04-01

    The new research unit DASIM brings together the expertise of 11 working groups to study the process of denitrification at unprecedented spatial and temporal resolution. Based on state-of-the-art analytical techniques, our aim is to develop improved denitrification models ranging from the microscale to the field/plot scale. Denitrification, the process of nitrate reduction allowing microbes to sustain respiration under anaerobic conditions, is the key process returning reactive nitrogen as N2 to the atmosphere. Actively denitrifying communities in soil show distinct regulatory phenotypes (DRPs) with characteristic controls on the single reaction steps and end-products. It is unresolved whether DRPs are anchored in the taxonomic composition of denitrifier communities and how environmental conditions shape them. Despite being intensively studied for more than 100 years, denitrification rates and emissions of its gaseous products can still not be satisfactorily predicted. While the impact of single environmental parameters is well understood, the complexity of the process itself, with its intricate cellular regulation in response to highly variable factors in the soil matrix, prevents robust prediction of gaseous emissions. Key parameters in soil are pO2, organic matter content and quality, pH, and the microbial community structure, which in turn are affected by the soil structure, chemistry, and soil-plant interactions. In the DASIM research unit, we aim at the quantitative prediction of denitrification rates as a function of microscale soil structure, organic matter quality, DRPs, and atmospheric boundary conditions via a combination of state-of-the-art experimental and analytical tools (X-ray μCT, 15N tracing, NanoSIMS, microsensors, advanced flux detection, NMR spectroscopy, and molecular methods including next-generation sequencing of functional gene transcripts). We actively seek collaboration with researchers working in the field of denitrification.

  14. Chesapeake Bay Forecast System: Oxygen Prediction for the Sustainable Ecosystem Management

    NASA Astrophysics Data System (ADS)

    Mathukumalli, B.; Long, W.; Zhang, X.; Wood, R.; Murtugudde, R. G.

    2010-12-01

    The Chesapeake Bay Forecast System (CBFS) is a flexible, end-to-end expert prediction tool for decision makers that provides customizable, user-specified predictions and projections of the region’s climate, air and water quality, local chemistry, and ecosystems at time scales of days to decades. As part of CBFS, long-term water quality data were collected and assembled to develop ecological models for the sustainable management of the Chesapeake Bay. Cultural eutrophication depletes oxygen levels in this ecosystem, particularly in summer, with several negative implications for the structure and function of the ecosystem. In order to understand the dynamics and prediction of spatially explicit oxygen levels in the Bay, an empirical process-based ecological model was developed with long-term control variables (water temperature, salinity, nitrogen, and phosphorus). Statistical validation methods were employed to demonstrate the usability of predictions for management purposes, and the predicted oxygen levels are quite faithful to observations. The predicted oxygen values and other physical outputs from downscaling of regional weather and climate predictions, or forecasts from hydrodynamic models, can be used to forecast various ecological components. Such forecasts would be useful for both recreational and commercial users of the bay (for example, bass fishing). Furthermore, this work can also be used to predict the extent of hypoxia/anoxia not only from anthropogenic nutrient pollution, but also from global warming. Some hindcasts and forecasts are discussed along with the ongoing efforts at a mechanistic ecosystem model to provide prognostic oxygen predictions and projections and upper trophic modeling using an energetics approach.

  15. Assessment of Process Capability: the case of Soft Drinks Processing Unit

    NASA Astrophysics Data System (ADS)

    Sri Yogi, Kottala

    2018-03-01

    Process capability studies play a significant role in investigating process variation, which is important for achieving desired product quality characteristics. Capability indices measure the inherent variability of a process and thus help improve process performance. The main objective of this paper is to assess whether the process of a soft drinks processing unit, a premier brand marketed in India, produces within specification. A few selected critical parameters in soft drinks processing were considered for this study: concentration of gas volume, Brix concentration, and torque of crock. Relevant statistical parameters were assessed from a process capability indices perspective: short-term capability and long-term capability. Real-time data from a soft drinks bottling company located in the state of Chhattisgarh, India, were used for the assessment. The analysis suggested reasons for variations in the process, which were validated using ANOVA; a Taguchi cost (loss) function was also fitted, and the predicted waste was assessed in monetary terms, which the organization can use to improve its process parameters. This research work has substantially benefited the organization in understanding the variation of the selected critical parameters and in working toward zero rejection.
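
    The capability indices and Taguchi loss mentioned above follow standard formulas: Cp = (USL - LSL)/(6*sigma), Cpk = min(USL - mu, mu - LSL)/(3*sigma), and L = k*(y - m)^2. A minimal sketch with illustrative gas-volume readings (not the plant's data):

      import numpy as np

      def capability(x, lsl, usl):
          """Short-term process capability indices from sample data."""
          mu, sigma = np.mean(x), np.std(x, ddof=1)
          cp = (usl - lsl) / (6 * sigma)
          cpk = min(usl - mu, mu - lsl) / (3 * sigma)
          return cp, cpk

      def taguchi_loss(x, target, k):
          """Average Taguchi quadratic loss k*(y - m)^2 per unit."""
          x = np.asarray(x, float)
          return k * np.mean((x - target) ** 2)

      readings = np.array([3.61, 3.58, 3.66, 3.55, 3.63, 3.59, 3.64, 3.60])
      print(capability(readings, lsl=3.4, usl=3.8))
      print(taguchi_loss(readings, target=3.6, k=50.0))  # k maps deviation to cost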

  16. Energy-based culture medium design for biomanufacturing optimization: A case study in monoclonal antibody production by GS-NS0 cells.

    PubMed

    Quiroga-Campano, Ana L; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2018-03-02

    Demand for high-value biologics, a rapidly growing pipeline, and pressure from competition, time-to-market, and regulators necessitate novel biomanufacturing approaches, including Quality by Design (QbD) principles and Process Analytical Technologies (PAT), to facilitate accelerated, efficient, and effective process development platforms that ensure consistent product quality and reduced lot-to-lot variability. Herein, QbD and PAT principles were incorporated within an innovative in vitro-in silico integrated framework for upstream process development (UPD). The central component of the UPD framework is a mathematical model that predicts dynamic nutrient uptake and average intracellular ATP content, based on biochemical reaction networks, to quantify and characterize energy metabolism and its adaptive response, metabolic shifts, to maintain ATP homeostasis. The accuracy and flexibility of the model depend on critical cell type/product/clone-specific parameters, which are experimentally estimated. The integrated in vitro-in silico platform and the model's predictive capacity reduced the burden, time, and expense of experimentation, resulting in an optimal medium design (an 80% amino acid reduction compared to commercially available culture media) and a fed-batch feeding strategy that increased productivity by 129%. The framework represents a flexible and efficient tool that transforms, improves, and accelerates conventional process development in biomanufacturing, with wide applications including stem cell-based therapies. Copyright © 2018. Published by Elsevier Inc.

  17. Data assimilation of GNSS zenith total delays from a Nordic processing centre

    NASA Astrophysics Data System (ADS)

    Lindskog, Magnus; Ridal, Martin; Thorsteinsson, Sigurdur; Ning, Tong

    2017-11-01

    Atmospheric moisture-related information estimated from Global Navigation Satellite System (GNSS) ground-based receiver stations by the Nordic GNSS Analysis Centre (NGAA) has been used within a state-of-the-art kilometre-scale numerical weather prediction system. Different processing techniques have been implemented to derive the moisture-related GNSS information in the form of zenith total delays (ZTDs), and these are described and compared. In addition, full-scale data assimilation and modelling experiments have been carried out to investigate the impact of utilizing moisture-related GNSS data from the NGAA processing centre on a numerical weather prediction (NWP) model initial state and on the ensuing forecast quality. The sensitivity of the results to aspects of the data processing, station density, bias correction, and data assimilation has been investigated. Results show benefits to forecast quality when using GNSS ZTD as an additional observation type. The results also show a sensitivity to the thinning distance applied to GNSS ZTD observations, but not to modifications to the number of predictors used in the variational bias correction applied. In addition, it is demonstrated that the assimilation of GNSS ZTD can benefit from more general data assimilation enhancements and that there is an interaction of GNSS ZTD with other types of observations used in the data assimilation. Future plans include further investigation of optimal thinning distances and the application of more advanced data assimilation techniques.

  18. Operational prediction of air quality for the United States: applications of satellite observations

    NASA Astrophysics Data System (ADS)

    Stajner, Ivanka; Lee, Pius; Tong, Daniel; Pan, Li; McQueen, Jeff; Huang, Jianping; Huang, Ho-Chun; Draxler, Roland; Kondragunta, Shobha; Upadhayay, Sikchya

    2015-04-01

    Operational predictions of ozone and wildfire smoke over the United States (U.S.) and predictions of airborne dust over the contiguous 48 states are provided by NOAA at http://airquality.weather.gov/. North American Mesoscale (NAM) weather predictions are combined with inventory-based emissions estimates from the U.S. Environmental Protection Agency (EPA) and chemical processes within the Community Multiscale Air Quality (CMAQ) model to produce ozone predictions. The Hybrid Single Particle Lagrangian Integrated Trajectory (HYSPLIT) model is used to produce wildfire smoke and dust storm predictions. Routine verification of ozone predictions relies on the AIRNow compilation of observations from surface monitors. Retrievals of smoke column integrals from GOES satellites and dust column integrals from MODIS satellite instruments are used for verification of smoke and dust predictions. Recent updates of NOAA's operational air quality predictions have focused on mobile emissions using the projections of mobile sources for 2012. Since emission inventories are complex and take years to assemble and evaluate, causing a lag in information, we recently began combining inventory information with projections of mobile sources. In order to evaluate this emission update, the changes in projected NOx emissions from 2005 to 2012 were compared with observed changes in Ozone Monitoring Instrument (OMI) NO2 observations and NOx measured by surface monitors over large U.S. cities over the same period. Comparisons indicate that the projected decreases in NOx emissions from 2005 to 2012 are similar to, but not as strong as, the decreases in the observed NOx concentrations and in OMI NO2 retrievals. Nevertheless, the use of projected mobile NOx emissions in the predictions reduced biases in predicted NOx concentrations, with the largest improvement in urban areas. Ozone biases are reduced as well, with the largest improvement seen in rural areas. Recent testing of PM2.5 predictions relies on emissions inventories augmented by real-time sources from wildfires and dust storms. The evaluation of these test predictions relies on surface monitor data, but efforts are in progress to include comparisons with satellite-observed aerosol optical depth (AOD) products. Testing of PM2.5 predictions continues to exhibit seasonal biases: overprediction in the winter and underprediction in the summer. Current efforts focus on bias correction and the development of linkages with global atmospheric composition predictions.

  19. Developing and implementing the use of predictive models for estimating water quality at Great Lakes beaches

    USGS Publications Warehouse

    Francy, Donna S.; Brady, Amie M.G.; Carvin, Rebecca B.; Corsi, Steven R.; Fuller, Lori M.; Harrison, John H.; Hayhurst, Brett A.; Lant, Jeremiah; Nevers, Meredith B.; Terrio, Paul J.; Zimmerman, Tammy M.

    2013-01-01

    Predictive models have been used at beaches to improve the timeliness and accuracy of recreational water-quality assessments over the most common current approach to water-quality monitoring, which relies on culturing fecal-indicator bacteria such as Escherichia coli (E. coli). Beach-specific predictive models use environmental and water-quality variables that are easily and quickly measured as surrogates to estimate concentrations of fecal-indicator bacteria or to provide the probability that a State recreational water-quality standard will be exceeded. When predictive models are used for beach closure or advisory decisions, they are referred to as “nowcasts.” During the recreational seasons of 2010-12, the U.S. Geological Survey (USGS), in cooperation with 23 local and State agencies, worked to improve existing nowcasts at 4 beaches, validate predictive models at another 38 beaches, and collect data for predictive-model development at 7 beaches throughout the Great Lakes. This report summarizes efforts to collect data and develop predictive models by multiple agencies and to compile existing information on the beaches and beach-monitoring programs into one comprehensive report. Local agencies measured E. coli concentrations and variables expected to affect E. coli concentrations such as wave height, turbidity, water temperature, and numbers of birds at the time of sampling. In addition to these field measurements, equipment was installed by the USGS or local agencies at or near several beaches to collect water-quality and meteorological measurements in near real time, including nearshore buoys, weather stations, and tributary staff gages and monitors. The USGS worked with local agencies to retrieve data from existing sources either manually or by use of tools designed specifically to compile and process data for predictive-model development. Predictive models were developed by use of linear regression and (or) partial least squares techniques for 42 beaches that had at least 2 years of data (2010-11 and sometimes earlier) and for 1 beach that had 1 year of data. For most models, software designed for model development by the U.S. Environmental Protection Agency (Virtual Beach) was used. The selected model for each beach was based on a combination of explanatory variables including, most commonly, turbidity, day of the year, change in lake level over 24 hours, wave height, wind direction and speed, and antecedent rainfall for various time periods. Forty-two predictive models were validated against data collected during an independent year (2012) and compared to the current method for assessing recreational water quality, which uses the previous day’s E. coli concentration (the persistence model). Goals for good predictive-model performance were responses that were at least 5 percent greater than the persistence model and overall correct responses greater than or equal to 80 percent, sensitivities (percentage of exceedances of the bathing-water standard that were correctly predicted by the model) greater than or equal to 50 percent, and specificities (percentage of nonexceedances correctly predicted by the model) greater than or equal to 85 percent. Out of 42 predictive models, 24 models yielded overall correct responses that were at least 5 percent greater than the use of the persistence model.
Predictive-model responses met the performance goals more often than the persistence-model responses in terms of overall correctness (28 versus 17 models, respectively), sensitivity (17 versus 4 models), and specificity (34 versus 25 models). Gaining knowledge of each beach and the factors that affect E. coli concentrations is important for developing good predictive models. Collection of additional years of data with a wide range of environmental conditions may also help to improve future model performance. The USGS will continue to work with local agencies in 2013 and beyond to develop and validate predictive models at beaches and improve existing nowcasts, restructuring monitoring activities to accommodate future uncertainties in funding and resources.
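
    The performance goals described above reduce to three confusion-matrix rates. A minimal sketch of how a season of nowcast decisions could be scored follows; the decision vectors are illustrative.

      import numpy as np

      def nowcast_performance(predicted_exceed, observed_exceed):
          """Overall correctness, sensitivity, and specificity, where an
          exceedance means E. coli above the bathing-water standard."""
          p = np.asarray(predicted_exceed, bool)
          o = np.asarray(observed_exceed, bool)
          overall = np.mean(p == o)
          sensitivity = np.mean(p[o]) if o.any() else float("nan")
          specificity = np.mean(~p[~o]) if (~o).any() else float("nan")
          return overall, sensitivity, specificity

      pred = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]  # model decisions for ten days
      obs = [1, 0, 0, 0, 0, 1, 0, 1, 0, 0]   # observed exceedances
      print(nowcast_performance(pred, obs))  # goals: >=0.80, >=0.50, >=0.85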

  20. A proposed framework on hybrid feature selection techniques for handling high dimensional educational data

    NASA Astrophysics Data System (ADS)

    Shahiri, Amirah Mohamed; Husain, Wahidah; Rashid, Nur'Aini Abd

    2017-10-01

    Huge amounts of data in educational datasets may cause problems in producing quality data. Recently, data mining approaches have been increasingly used by educational data mining researchers for analyzing data patterns. However, many research studies have concentrated on selecting suitable learning algorithms instead of performing a feature selection process. As a result, these data suffer from computational complexity and require longer computational times for classification. The main objective of this research is to provide an overview of feature selection techniques that have been used to analyze the most significant features. This research then proposes a framework to improve the quality of students' datasets. The proposed framework uses filter- and wrapper-based techniques to support the prediction process in future study.
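
    A minimal sketch of the filter-plus-wrapper idea using scikit-learn: an ANOVA F-score filter first prunes the feature space, then recursive feature elimination (a wrapper around a classifier) selects the final subset. The dataset is synthetic and the stage sizes are assumptions, not the framework's prescribed settings.

      from sklearn.datasets import make_classification
      from sklearn.feature_selection import RFE, SelectKBest, f_classif
      from sklearn.linear_model import LogisticRegression

      # Illustrative high-dimensional dataset standing in for educational data.
      X, y = make_classification(n_samples=300, n_features=200,
                                 n_informative=10, random_state=0)

      # Filter stage: keep the 50 features with the highest ANOVA F-scores.
      filt = SelectKBest(f_classif, k=50).fit(X, y)
      X_filt = filt.transform(X)

      # Wrapper stage: recursive feature elimination around a classifier.
      wrapper = RFE(LogisticRegression(max_iter=1000),
                    n_features_to_select=10).fit(X_filt, y)
      print(wrapper.support_.sum(), "features kept for the prediction stage")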

  1. Dispositional optimism and sleep quality: a test of mediating pathways

    PubMed Central

    Cribbet, Matthew; Kent de Grey, Robert G.; Cronan, Sierra; Trettevik, Ryan; Smith, Timothy W.

    2016-01-01

    Dispositional optimism has been related to beneficial influences on physical health outcomes. However, its links to global sleep quality and the psychological mediators responsible for such associations are less studied. This study thus examined if trait optimism predicted global sleep quality, and if measures of subjective well-being were statistical mediators of such links. A community sample of 175 participants (93 men, 82 women) completed measures of trait optimism, depression, and life satisfaction. Global sleep quality was assessed using the Pittsburgh Sleep Quality Index. Results indicated that trait optimism was a strong predictor of better PSQI global sleep quality. Moreover, this association was mediated by depression and life satisfaction in both single and multiple mediator models. These results highlight the importance of optimism for the restorative process of sleep, as well as the utility of multiple mediator models in testing distinct psychological pathways. PMID:27592128
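
    The mediation analysis described above rests on the product-of-coefficients logic: path a (predictor to mediator) times path b (mediator to outcome, controlling for the predictor) gives the indirect effect. A minimal sketch on synthetic data (not the study's dataset; the coefficients are arbitrary):

      import numpy as np

      rng = np.random.default_rng(2)
      n = 175
      x = rng.normal(size=n)                       # trait optimism
      m = -0.5 * x + rng.normal(size=n)            # mediator, e.g. depression
      y = -0.6 * m + 0.1 * x + rng.normal(size=n)  # outcome, sleep quality

      a = np.polyfit(x, m, 1)[0]  # path a: predictor -> mediator
      # Path b: mediator -> outcome controlling for the predictor.
      design = np.column_stack([m, x, np.ones(n)])
      b = np.linalg.lstsq(design, y, rcond=None)[0][0]
      print("indirect effect a*b =", a * b)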

  2. Dispositional optimism and sleep quality: a test of mediating pathways.

    PubMed

    Uchino, Bert N; Cribbet, Matthew; de Grey, Robert G Kent; Cronan, Sierra; Trettevik, Ryan; Smith, Timothy W

    2017-04-01

    Dispositional optimism has been related to beneficial influences on physical health outcomes. However, its links to global sleep quality and the psychological mediators responsible for such associations are less studied. This study thus examined if trait optimism predicted global sleep quality, and if measures of subjective well-being were statistical mediators of such links. A community sample of 175 participants (93 men, 82 women) completed measures of trait optimism, depression, and life satisfaction. Global sleep quality was assessed using the Pittsburgh Sleep Quality Index. Results indicated that trait optimism was a strong predictor of better PSQI global sleep quality. Moreover, this association was mediated by depression and life satisfaction in both single and multiple mediator models. These results highlight the importance of optimism for the restorative process of sleep, as well as the utility of multiple mediator models in testing distinct psychological pathways.

  3. Nondestructive detection of pork quality based on dual-band VIS/NIR spectroscopy

    NASA Astrophysics Data System (ADS)

    Wang, Wenxiu; Peng, Yankun; Li, Yongyu; Tang, Xiuying; Liu, Yuanyuan

    2015-05-01

    With the continuous development of living standards and the relative change in dietary structure, consumers increasingly demand better quality meat. Colour, pH value, and cooking loss are important quality attributes when evaluating meat, and simultaneous nondestructive detection of multiple meat quality parameters is in demand in the production and processing of meat and meat products. The objectives of this research were to compare the effectiveness of two bands for the rapid, nondestructive, and simultaneous detection of pork quality attributes. Reflectance spectra of 60 chilled pork samples were collected from a dual-band visible/near-infrared spectroscopy system covering 350-1100 nm and 1000-2600 nm. Colour, pH value, and cooking loss were then determined by standard methods as reference values. The standard normal variate transform (SNVT) was employed to eliminate spectral noise. A spectrum connection method was put forward for effective integration of the dual-band spectra to make full use of the available information. Partial least squares regression (PLSR) and principal component analysis (PCA) were applied to establish prediction models based on the single-band and dual-band spectra, respectively. The experimental results showed that the PLSR model based on dual-band spectral information was superior to the models based on single-band spectral information, with lower root mean square error (RMSE) and higher accuracy. The PLSR model based on the dual-band spectrum (using the overlapping part of the first band) yielded the best prediction results, with correlation coefficients of validation (Rv) of 0.9469, 0.9495, 0.9180, 0.9054, and 0.8789 for L*, a*, b*, pH value, and cooking loss, respectively. This is mainly because the dual-band spectrum provides sufficient and comprehensive information reflecting the quality attributes. Data fusion from the dual-band spectrum could significantly improve the prediction performance for pork quality parameters. The research also indicated that multi-band spectral information fusion has potential to comprehensively evaluate other quality and safety attributes of pork.
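
    The SNVT pretreatment mentioned above standardizes each spectrum individually: subtract the spectrum's own mean and divide by its own standard deviation, removing baseline and scale effects. A minimal sketch with synthetic spectra:

      import numpy as np

      def snv(spectra):
          """Standard normal variate: center and scale each spectrum (row)."""
          x = np.asarray(spectra, float)
          return (x - x.mean(axis=1, keepdims=True)) / x.std(axis=1, keepdims=True)

      # Two synthetic reflectance spectra with different baselines and scales.
      raw = np.array([[0.20, 0.24, 0.31, 0.28],
                      [0.50, 0.58, 0.73, 0.66]])
      print(snv(raw).round(3))  # rows now have zero mean and unit variance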

  4. Adult Age Differences in Dual Information Processes: Implications for the Role of Affective and Deliberative Processes in Older Adults' Decision Making.

    PubMed

    Peters, Ellen; Hess, Thomas M; Västfjäll, Daniel; Auman, Corinne

    2007-03-01

    Age differences in affective/experiential and deliberative processes have important theoretical implications for judgment and decision theory and important pragmatic implications for older-adult decision making. Age-related declines in the efficiency of deliberative processes predict poorer-quality decisions as we age. However, age-related adaptive processes, including motivated selectivity in the use of deliberative capacity, an increased focus on emotional goals, and greater experience, predict better or worse decisions for older adults depending on the situation. The aim of the current review is to examine adult age differences in affective and deliberative information processes in order to understand their potential impact on judgments and decisions. We review evidence for the role of these dual processes in judgment and decision making and then review two representative life-span perspectives (based on aging-related changes to cognitive or motivational processes) on the interplay between these processes. We present relevant predictions for older-adult decisions and make note of contradictions and gaps that currently exist in the literature. Finally, we review the sparse evidence about age differences in decision making and how theories and findings regarding dual processes could be applied to decision theory and decision aiding. In particular, we focus on prospect theory (Kahneman & Tversky, 1979) and how prospect theory and theories regarding age differences in information processing can inform one another. © 2007 Association for Psychological Science.

  5. Fecal indicator organism modeling and microbial source tracking in environmental waters: Chapter 3.4.6

    USGS Publications Warehouse

    Nevers, Meredith; Byappanahalli, Muruleedhara; Phanikumar, Mantha S.; Whitman, Richard L.

    2016-01-01

    Mathematical models have been widely applied to surface waters to estimate rates of settling, resuspension, flow, dispersion, and advection in order to calculate movement of particles that influence water quality. Of particular interest are the movement, survival, and persistence of microbial pathogens or their surrogates, which may contaminate recreational water, drinking water, or shellfish. Most models devoted to microbial water quality have been focused on fecal indicator organisms (FIO), which act as a surrogate for pathogens and viruses. Process-based modeling and statistical modeling have been used to track contamination events to source and to predict future events. The use of these two types of models require different levels of expertise and input; process-based models rely on theoretical physical constructs to explain present conditions and biological distribution while data-based, statistical models use extant paired data to do the same. The selection of the appropriate model and interpretation of results is critical to proper use of these tools in microbial source tracking. Integration of the modeling approaches could provide insight for tracking and predicting contamination events in real time. A review of modeling efforts reveals that process-based modeling has great promise for microbial source tracking efforts; further, combining the understanding of physical processes influencing FIO contamination developed with process-based models and molecular characterization of the population by gene-based (i.e., biological) or chemical markers may be an effective approach for locating sources and remediating contamination in order to protect human health better.

  6. Implementation of a GPS-RO data processing system for the KIAPS-LETKF data assimilation system

    NASA Astrophysics Data System (ADS)

    Kwon, H.; Kang, J.-S.; Jo, Y.; Kang, J. H.

    2015-03-01

    The Korea Institute of Atmospheric Prediction Systems (KIAPS) has been developing a new global numerical weather prediction model and an advanced data assimilation system. As part of the KIAPS package for observation processing (KPOP) for data assimilation, preprocessing and quality control modules for bending-angle measurements of global positioning system radio occultation (GPS-RO) data have been implemented and examined. The GPS-RO data processing system is composed of several steps for checking observation locations, missing values, and physical values for the Earth radius of curvature and geoid undulation. An observation-minus-background check is implemented by use of a one-dimensional observational bending-angle operator, and tangent point drift is also considered in the quality control process. We have tested GPS-RO observations utilized by the Korean Meteorological Administration (KMA) within KPOP, based on both the KMA global model and the National Center for Atmospheric Research Community Atmosphere Model with Spectral Element dynamical core (CAM-SE) as a model background. Background fields from the CAM-SE model are incorporated for the preparation of assimilation experiments with the KIAPS local ensemble transform Kalman filter (LETKF) data assimilation system, which has been successfully implemented for a cubed-sphere model with unstructured quadrilateral meshes. As a result of data processing, the bending-angle departure statistics between observation and background show significant improvement. Also, the first experiment in assimilating GPS-RO bending angles from KPOP within KIAPS-LETKF shows encouraging results.
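
    A toy illustration of the kind of checks described: location and missing-value screening followed by a normalized observation-minus-background (O-B) departure test. The function, thresholds, and data are illustrative assumptions, not the KPOP implementation.

      import numpy as np

      def qc_bending_angle(obs, background, lat, lon, max_departure=3.0, sigma=1.0):
          """Return a boolean mask of bending-angle levels passing basic QC."""
          obs = np.asarray(obs, float)
          background = np.asarray(background, float)
          if not (-90 <= lat <= 90 and -180 <= lon <= 180):
              return np.zeros(obs.shape, bool)          # reject the whole profile
          ok = np.isfinite(obs) & (obs > 0)             # missing/physical check
          departure = np.abs(obs - background) / sigma  # normalized O-B departure
          return ok & (departure < max_departure)

      obs = np.array([0.021, 0.018, np.nan, 0.090, 0.012])
      bg = np.array([0.020, 0.019, 0.015, 0.014, 0.012])
      print(qc_bending_angle(obs, bg, lat=37.5, lon=127.0, sigma=0.005))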

  7. Identifying the Machine Translation Error Types with the Greatest Impact on Post-editing Effort

    PubMed Central

    Daems, Joke; Vandepitte, Sonia; Hartsuiker, Robert J.; Macken, Lieve

    2017-01-01

    Translation Environment Tools make translators’ work easier by providing them with term lists, translation memories and machine translation output. Ideally, such tools automatically predict whether it is more effortful to post-edit than to translate from scratch, and determine whether or not to provide translators with machine translation output. Current machine translation quality estimation systems heavily rely on automatic metrics, even though they do not accurately capture actual post-editing effort. In addition, these systems do not take translator experience into account, even though novices’ translation processes are different from those of professional translators. In this paper, we report on the impact of machine translation errors on various types of post-editing effort indicators, for professional translators as well as student translators. We compare the impact of MT quality on a product effort indicator (HTER) with that on various process effort indicators. The translation and post-editing process of student translators and professional translators was logged with a combination of keystroke logging and eye-tracking, and the MT output was analyzed with a fine-grained translation quality assessment approach. We find that most post-editing effort indicators (product as well as process) are influenced by machine translation quality, but that different error types affect different post-editing effort indicators, confirming that a more fine-grained MT quality analysis is needed to correctly estimate actual post-editing effort. Coherence, meaning shifts, and structural issues are shown to be good indicators of post-editing effort. The additional impact of experience on these interactions between MT quality and post-editing effort is smaller than expected. PMID:28824482

  8. Identifying the Machine Translation Error Types with the Greatest Impact on Post-editing Effort.

    PubMed

    Daems, Joke; Vandepitte, Sonia; Hartsuiker, Robert J; Macken, Lieve

    2017-01-01

    Translation Environment Tools make translators' work easier by providing them with term lists, translation memories and machine translation output. Ideally, such tools automatically predict whether it is more effortful to post-edit than to translate from scratch, and determine whether or not to provide translators with machine translation output. Current machine translation quality estimation systems heavily rely on automatic metrics, even though they do not accurately capture actual post-editing effort. In addition, these systems do not take translator experience into account, even though novices' translation processes are different from those of professional translators. In this paper, we report on the impact of machine translation errors on various types of post-editing effort indicators, for professional translators as well as student translators. We compare the impact of MT quality on a product effort indicator (HTER) with that on various process effort indicators. The translation and post-editing process of student translators and professional translators was logged with a combination of keystroke logging and eye-tracking, and the MT output was analyzed with a fine-grained translation quality assessment approach. We find that most post-editing effort indicators (product as well as process) are influenced by machine translation quality, but that different error types affect different post-editing effort indicators, confirming that a more fine-grained MT quality analysis is needed to correctly estimate actual post-editing effort. Coherence, meaning shifts, and structural issues are shown to be good indicators of post-editing effort. The additional impact of experience on these interactions between MT quality and post-editing effort is smaller than expected.
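
    HTER, the product effort indicator compared above, is an edit distance between the raw MT output and its post-edited version, normalized by length. A simplified token-level sketch follows; true HTER is based on TER, which also allows block shifts, so this version (insertions, deletions, and substitutions only) is an approximation.

      def hter(mt_tokens, pe_tokens):
          """Simplified HTER: token Levenshtein distance / post-edit length."""
          m, n = len(mt_tokens), len(pe_tokens)
          d = [[0] * (n + 1) for _ in range(m + 1)]
          for i in range(m + 1):
              d[i][0] = i
          for j in range(n + 1):
              d[0][j] = j
          for i in range(1, m + 1):
              for j in range(1, n + 1):
                  cost = 0 if mt_tokens[i - 1] == pe_tokens[j - 1] else 1
                  d[i][j] = min(d[i - 1][j] + 1,         # deletion
                                d[i][j - 1] + 1,         # insertion
                                d[i - 1][j - 1] + cost)  # substitution
          return d[m][n] / n

      print(hter("the cat sat in mat".split(), "the cat sat on the mat".split()))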

  9. Intra Frame Coding In Advanced Video Coding Standard (H.264) to Obtain Consistent PSNR and Reduce Bit Rate for Diagonal Down Left Mode Using Gaussian Pulse

    NASA Astrophysics Data System (ADS)

    Manjanaik, N.; Parameshachari, B. D.; Hanumanthappa, S. N.; Banu, Reshma

    2017-08-01

    The intra-prediction process of the H.264 video coding standard is used to code the first frame of a video (the intra frame) and achieves good coding efficiency compared to previous video coding standards. A further benefit of intra-frame coding is that it reduces spatial pixel redundancy within the current frame, reduces computational complexity, and provides better rate-distortion performance. Intra frames are conventionally coded with the rate-distortion optimization (RDO) method, which increases computational complexity and bit rate and reduces picture quality, making it difficult to implement in real-time applications; many researchers have therefore developed fast mode-decision algorithms for intra-frame coding. Previous work on intra-frame coding in the H.264 standard using fast mode-decision intra-prediction algorithms based on different techniques suffered increased bit rate and degraded picture quality (PSNR) at different quantization parameters. Many previous fast mode-decision approaches to intra-frame coding achieved only a reduction in computational complexity, or saved encoding time, with the limitation of an increase in bit rate and a loss of picture quality. In order to avoid the increase in bit rate and the loss of picture quality, a better approach was developed. In this paper, a Gaussian pulse is applied to intra-frame coding for the diagonal down-left intra-prediction mode to achieve higher coding efficiency in terms of PSNR and bit rate. In the proposed method, a Gaussian pulse is multiplied with the 4x4 frequency-domain coefficients of each 4x4 sub-macroblock of the current frame before the quantization process. Multiplying each 4x4 integer-transform coefficient block by a Gaussian pulse at the macroblock level scales the information in the coefficients in a reversible manner: frequency samples are attenuated in a known and controllable way without intermixing of coefficients, which prevents the picture from being badly hit at higher values of the quantization parameter. The proposed work was implemented using MATLAB and the JM 18.6 reference software. It measures the performance parameters PSNR, bit rate, and compression of intra frames of YUV video sequences at QCIF resolution under different values of the quantization parameter, with the Gaussian value applied to the diagonal down-left intra-prediction mode. The simulation results of the proposed algorithm are tabulated and compared with the previous algorithm, i.e., the Tian et al. method. The proposed algorithm achieved an average bit-rate reduction of 30.98% and maintained consistent picture quality for QCIF sequences compared to the previous algorithm.
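
    A minimal sketch of the coefficient-scaling idea: multiply a 4x4 block of transform coefficients, element by element, with a 2-D Gaussian centred on the DC term, which attenuates high frequencies in a known, controllable way and is undone by division. This illustrates the principle only; it is not the JM-based implementation, and the block values are arbitrary.

      import numpy as np

      def gaussian_scale_block(coeffs, sigma=2.0):
          """Reversibly scale a 4x4 coefficient block with a Gaussian window."""
          u, v = np.meshgrid(np.arange(4), np.arange(4), indexing="ij")
          g = np.exp(-(u**2 + v**2) / (2.0 * sigma**2))
          return coeffs * g, g  # keep g so a decoder can divide it back out

      block = np.array([[52, 10, 4, 1],
                        [9, 6, 2, 0],
                        [3, 2, 1, 0],
                        [1, 0, 0, 0]], float)  # illustrative 4x4 coefficients
      scaled, g = gaussian_scale_block(block)
      print(np.allclose(scaled / g, block))  # True: the scaling is reversible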

  10. CVD2014-A Database for Evaluating No-Reference Video Quality Assessment Algorithms.

    PubMed

    Nuutinen, Mikko; Virtanen, Toni; Vaahteranoksa, Mikko; Vuori, Tero; Oittinen, Pirkko; Hakkinen, Jukka

    2016-07-01

    In this paper, we present a new video database: CVD2014-Camera Video Database. In contrast to previous video databases, this database uses real cameras rather than introducing distortions via post-processing, which results in a complex distortion space in regard to the video acquisition process. CVD2014 contains a total of 234 videos that are recorded using 78 different cameras. Moreover, this database contains the observer-specific quality evaluation scores rather than only providing mean opinion scores. We have also collected open-ended quality descriptions that are provided by the observers. These descriptions were used to define the quality dimensions for the videos in CVD2014. The dimensions included sharpness, graininess, color balance, darkness, and jerkiness. At the end of this paper, a performance study of image and video quality algorithms for predicting the subjective video quality is reported. For this performance study, we proposed a new performance measure that accounts for observer variance. The performance study revealed that there is room for improvement regarding the video quality assessment algorithms. The CVD2014 video database has been made publicly available for the research community. All video sequences and corresponding subjective ratings can be obtained from the CVD2014 project page (http://www.helsinki.fi/psychology/groups/visualcognition/).

  11. Characterization of Adipose Tissue Product Quality Using Measurements of Oxygen Consumption Rate.

    PubMed

    Suszynski, Thomas M; Sieber, David A; Mueller, Kathryn; Van Beek, Allen L; Cunningham, Bruce L; Kenkel, Jeffrey M

    2018-03-14

    Fat grafting is a common procedure in plastic surgery but is associated with unpredictable graft retention. Adipose tissue (AT) "product" quality is affected by the methods used for harvest, processing and transfer, which vary widely amongst surgeons. Currently, there is no method available to accurately assess the quality of AT. In this study, we present a novel method for the assessment of AT product quality through direct measurements of oxygen consumption rate (OCR). OCR has exhibited potential in predicting outcomes following pancreatic islet transplant. Our study aim was to repurpose existing technology for use with AT preparations and to confirm that these measurements are feasible. OCR was successfully measured for en bloc and post-processed AT using a stirred microchamber system. OCR was then normalized to DNA content (OCR/DNA), which represents the AT product quality. Mean (±SE) OCR/DNA values for fresh en bloc and post-processed AT were 149.8 (±9.1) and 61.1 (±6.1) nmol/min/mg DNA, respectively. These preliminary data suggest that: (1) OCR and OCR/DNA measurements of AT harvested using a conventional protocol are feasible; and (2) standard AT processing results in a decrease in overall AT product quality. OCR measurements of AT using existing technology are feasible and enable accurate, real-time, quantitative assessment of the quality of the AT product prior to transfer. The availability and further validation of this type of assay could enable optimization of fat grafting protocols by providing a tool for the more detailed study of procedural variables that affect AT product quality.
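
    The quality index itself is simple arithmetic: the measured oxygen consumption rate divided by the DNA content of the same sample. A minimal sketch with illustrative numbers:

      # OCR (nmol/min) normalized to DNA content (mg) gives OCR/DNA,
      # the viability-normalized quality index described above.
      ocr_nmol_per_min = 3.0    # measured in the stirred microchamber (illustrative)
      dna_mg = 0.02             # DNA content of the same sample (illustrative)
      ocr_per_dna = ocr_nmol_per_min / dna_mg
      print(f"OCR/DNA = {ocr_per_dna:.1f} nmol/min/mg DNA")  # 150.0, in the range reported for fresh en bloc AT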

  12. Workshop on Satellite and In situ Observations for Climate Prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Acker, J.G.; Busalacchi, A.

    1995-02-01

    Participants in this workshop, which convened in Venice, Italy, 6-8 May 1993, met to consider the current state of climate monitoring programs and instrumentation for the purpose of climatological prediction on short-term (seasonal to interannual) timescales. Data quality and coverage requirements for definition of oceanographic heat and momentum fluxes, scales of inter- and intra-annual variability, and land-ocean-atmosphere exchange processes were examined. Advantages and disadvantages of earth-based and spaceborne monitoring systems were considered, as were the structures for future monitoring networks, research programs, and modeling studies.

  13. Workshop on Satellite and In situ Observations for Climate Prediction

    NASA Technical Reports Server (NTRS)

    Acker, James G.; Busalacchi, Antonio

    1995-01-01

    Participants in this workshop, which convened in Venice, Italy, 6-8 May 1993, met to consider the current state of climate monitoring programs and instrumentation for the purpose of climatological prediction on short-term (seasonal to interannual) timescales. Data quality and coverage requirements for definition of oceanographic heat and momentum fluxes, scales of inter- and intra-annual variability, and land-ocean-atmosphere exchange processes were examined. Advantages and disadvantages of earth-based and spaceborne monitoring systems were considered, as were the structures for future monitoring networks, research programs, and modeling studies.

  14. Ares I-X Range Safety Trajectory Analyses Overview and Independent Validation and Verification

    NASA Technical Reports Server (NTRS)

    Tarpley, Ashley F.; Starr, Brett R.; Tartabini, Paul V.; Craig, A. Scott; Merry, Carl M.; Brewer, Joan D.; Davis, Jerel G.; Dulski, Matthew B.; Gimenez, Adrian; Barron, M. Kyle

    2011-01-01

    All Flight Analysis data products were successfully generated and delivered to the 45SW in time to support the launch. The IV&V effort allowed data generators to work through issues early. Data consistency, proven through the IV&V process, provided confidence that the delivered data were of high quality. Flight plan approval was granted for the launch. The test flight was successful and had no safety-related issues. The flight occurred within the predicted flight envelopes. Post-flight reconstruction results verified that the simulations accurately predicted the FTV trajectory.

  15. Side-gate modulation effects on high-quality BN-Graphene-BN nanoribbon capacitors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yang; Chen, Xiaolong; Ye, Weiguang

    High-quality BN-Graphene-BN nanoribbon capacitors with double side-gates of graphene have been experimentally realized. The double side-gates can effectively modulate the electronic properties of graphene nanoribbon capacitors. By applying anti-symmetric side-gate voltages, we observed significant upward shifting and flattening of the V-shaped capacitance curve near the charge neutrality point. Symmetric side-gate voltages, however, only resulted in tilted upward shifting along the opposite direction of applied gate voltages. These modulation effects followed the behavior of graphene nanoribbons predicted theoretically for metallic side-gate modulation. The negative quantum capacitance phenomenon predicted by numerical simulations for graphene nanoribbons modulated by graphene side-gates was not observed, possibly due to the weakened interactions between the graphene nanoribbon and side-gate electrodes caused by the Ga⁺ beam etching process.

  16. A recursive linear predictive vocoder

    NASA Astrophysics Data System (ADS)

    Janssen, W. A.

    1983-12-01

    A non-real-time, 10-pole recursive autocorrelation linear predictive coding vocoder was created for use in studying the effects of recursive autocorrelation on speech. The vocoder is composed of two interchangeable pitch detectors, a speech analyzer, and a speech synthesizer. The time between updates of the filter coefficients is allowed to vary from 0.125 msec to 20 msec. The best quality was found using 0.125 msec between updates. The greatest change in quality was noted when changing from 20 msec/update to 10 msec/update. Pitch period plots for the center-clipping autocorrelation pitch detector and the simplified inverse filtering technique are provided. Plots of speech into and out of the vocoder are given. Formant-versus-time three-dimensional plots are shown, along with the effects of noise on pitch detection and formants. Noise affects the voiced/unvoiced decision process, causing voiced speech to be reconstructed as unvoiced.
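
    The analysis core of such a vocoder estimates the predictor coefficients for each frame from its autocorrelation sequence. Below is a hedged sketch using the Levinson-Durbin recursion for a 10-pole model; the frame length, sampling rate and test signal are illustrative assumptions.

      import numpy as np

      def levinson_durbin(r, order):
          """Solve the Toeplitz normal equations for `order` LPC coefficients."""
          a = np.zeros(order + 1)
          a[0] = 1.0
          err = r[0]
          for i in range(1, order + 1):
              acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
              k = -acc / err                        # reflection coefficient
              a[1:i + 1] += k * a[i - 1::-1][:i]    # order-i coefficient update
              err *= (1.0 - k * k)                  # prediction-error energy
          return a, err

      fs = 8000
      rng = np.random.default_rng(0)
      frame = np.sin(2 * np.pi * 150 * np.arange(200) / fs) + 0.01 * rng.normal(size=200)
      r = np.correlate(frame, frame, mode="full")[len(frame) - 1:]  # lags 0..N-1
      a, residual_energy = levinson_durbin(r[:11], order=10)
      print(a)  # [1, a1..a10]: denominator of the all-pole synthesis filter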

  17. Support vector machine-an alternative to artificial neuron network for water quality forecasting in an agricultural nonpoint source polluted river?

    PubMed

    Liu, Mei; Lu, Jun

    2014-09-01

    Water quality forecasting in agricultural drainage river basins is difficult because of the complicated nonpoint source (NPS) pollution transport processes and river self-purification processes involved in highly nonlinear problems. Artificial neural network (ANN) and support vector machine (SVM) models were developed to predict total nitrogen (TN) and total phosphorus (TP) concentrations at any location of a river polluted by agricultural NPS pollution in eastern China. River flow, water temperature, flow travel time, rainfall, dissolved oxygen, and upstream TN or TP concentrations were selected as initial inputs of the two models. Monthly, bimonthly, and trimonthly datasets were selected to train the two models, respectively, and the same monthly dataset, which had not been used for training, was chosen to test the models in order to compare their generalization performance. Trial-and-error analysis and genetic algorithms (GA) were employed to optimize the parameters of the ANN and SVM models, respectively. The results indicated that the proposed SVM models showed better generalization ability, because they avoid overtraining and optimize fewer parameters based on the structural risk minimization (SRM) principle. Furthermore, both the TN and TP SVM models trained on trimonthly datasets achieved greater forecasting accuracy than the corresponding ANN models. Thus, SVM models are a powerful alternative: an efficient and economical tool to accurately predict water quality with low risk. The sensitivity analyses of the two models indicated that decreasing upstream input concentrations during the dry season and NPS emissions along the reach during the average or flood season should be an effective way to improve Changle River water quality. If the necessary water quality and hydrology data, even trimonthly data, are available, the SVM methodology developed here can easily be applied to other NPS-polluted rivers.
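
    As an illustration of the SVM side of this comparison, the hedged sketch below fits a support vector regressor to toy data with the same six inputs; a grid search stands in for the paper's GA-based parameter optimization, and all data and column meanings are placeholders.

      import numpy as np
      from sklearn.model_selection import GridSearchCV
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVR

      # X columns (assumed): flow, water temperature, travel time, rainfall,
      # dissolved oxygen, upstream TN; y: downstream TN concentration.
      rng = np.random.default_rng(0)
      X = rng.random((120, 6))
      y = 2.0 * X[:, 5] + 0.5 * X[:, 3] + rng.normal(0, 0.05, 120)  # toy response

      model = GridSearchCV(
          make_pipeline(StandardScaler(), SVR(kernel="rbf")),
          param_grid={"svr__C": [1, 10, 100], "svr__gamma": [0.01, 0.1, 1.0]},
          cv=5,
      )
      model.fit(X, y)
      print(model.best_params_, round(model.best_score_, 3))  # cross-validated R^2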

  18. Use of predictive models and rapid methods to nowcast bacteria levels at coastal beaches

    USGS Publications Warehouse

    Francy, Donna S.

    2009-01-01

    The need for rapid assessments of recreational water quality to better protect public health is well accepted throughout the research and regulatory communities. Rapid analytical methods, such as quantitative polymerase chain reaction (qPCR) and immunomagnetic separation/adenosine triphosphate (ATP) analysis, are being tested but are not yet ready for widespread use. Another solution is the use of predictive models, wherein variable(s) that are easily and quickly measured are surrogates for concentrations of fecal-indicator bacteria. Rainfall-based alerts, the simplest type of model, have been used by several communities for a number of years. Deterministic models use mathematical representations of the processes that affect bacteria concentrations; this type of model is being used for beach-closure decisions at one location in the USA. Multivariable statistical models are being developed and tested in many areas of the USA; however, they are only used in three areas of the Great Lakes to aid in notifications of beach advisories or closings. These “operational” statistical models can result in more accurate assessments of recreational water quality than use of the previous day's Escherichia coli (E. coli) concentration as determined by traditional culture methods. The Ohio Nowcast, at Huntington Beach, Bay Village, Ohio, is described in this paper as an example of an operational statistical model. Because predictive modeling is a dynamic process, water-resource managers continue to collect additional data to improve the predictive ability of the nowcast and expand the nowcast to other Ohio beaches and a recreational river. Although predictive models have been shown to work well at some beaches and are becoming more widely accepted, implementation in many areas is limited by funding, lack of coordinated technical leadership, and lack of supporting epidemiological data.
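
    A minimal sketch of an "operational" multivariable statistical model of the kind described: easily measured surrogates predict log10 E. coli, and the prediction is compared with the single-sample standard. The predictors and data are illustrative assumptions, not the Ohio Nowcast's actual model.

      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(1)
      X = np.column_stack([
          rng.gamma(2.0, 5.0, 200),    # turbidity (NTU)
          rng.exponential(5.0, 200),   # 24-h rainfall (mm)
          rng.random(200) * 2.0,       # wave height (m)
      ])
      log_ecoli = 1.0 + 0.02 * X[:, 0] + 0.05 * X[:, 1] + 0.3 * X[:, 2] \
                  + rng.normal(0, 0.3, 200)

      nowcast = LinearRegression().fit(X, log_ecoli)
      predicted = 10 ** nowcast.predict([[30.0, 12.0, 1.5]])[0]
      print(predicted > 235)  # advisory if above 235 CFU/100 mL (single-sample standard)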

  19. The Impact of Iodide-Mediated Ozone Deposition and ...

    EPA Pesticide Factsheets

    The air quality of many large coastal areas in the United States is affected by the confluence of polluted urban and relatively clean marine airmasses, each with distinct atmospheric chemistry. In this context, the role of iodide-mediated ozone (O3) deposition over seawater and marine halogen chemistry accounted for in both the lateral boundary conditions and coastal waters surrounding the continental U.S. is examined using the Community Multiscale Air Quality (CMAQ) model. Several nested simulations are conducted in which these halogen processes are implemented separately in the continental U.S. and hemispheric CMAQ domains, the latter providing lateral boundary conditions for the former. Overall, it is the combination of these processes within both the continental U.S. domain and from lateral boundary conditions that leads to the largest reductions in modeled surface O3 concentrations. Predicted reductions in surface O3 concentrations occur mainly along the coast where CMAQ typically has large overpredictions. These results suggest that a realistic representation of halogen processes in marine regions can improve model prediction of O3 concentrations near the coast.

  20. Evaluation of the AnnAGNPS Model for Predicting Runoff and Nutrient Export in a Typical Small Watershed in the Hilly Region of Taihu Lake.

    PubMed

    Luo, Chuan; Li, Zhaofu; Li, Hengpeng; Chen, Xiaomin

    2015-09-02

    The application of hydrological and water quality models is an efficient approach to better understand the processes of environmental deterioration. This study evaluated the ability of the Annualized Agricultural Non-Point Source (AnnAGNPS) model to predict runoff, total nitrogen (TN) and total phosphorus (TP) loading in a typical small watershed of a hilly region near Taihu Lake, China. Runoff was calibrated and validated at both annual and monthly scales, and parameter sensitivity analysis was performed for TN and TP before the two water quality components were calibrated. The results showed that the model satisfactorily simulated runoff at annual and monthly scales, during both the calibration and validation processes. Additionally, the sensitivity analysis showed that TN output was most sensitive to the parameters Fertilizer rate, Fertilizer organic, Canopy cover and Fertilizer inorganic. For TP, the most sensitive parameters were Residue mass ratio, Fertilizer rate, Fertilizer inorganic and Canopy cover. Calibration was performed on the basis of these sensitive parameters. TN loading produced satisfactory results for both the calibration and validation processes, whereas the performance for TP loading was slightly poorer. The simulation results showed that AnnAGNPS has the potential to be used as a valuable tool for the planning and management of watersheds.

  1. Robustness and cognition in stabilization problem of dynamical systems based on asymptotic methods

    NASA Astrophysics Data System (ADS)

    Dubovik, S. A.; Kabanov, A. A.

    2017-01-01

    The problem of synthesizing stabilizing systems based on principles of cognitive (logical-dynamic) control for mobile objects used under uncertain conditions is considered. This direction in control theory is based on the principles of guaranteeing robust synthesis focused on worst-case scenarios of the controlled process. The guaranteeing approach can provide functioning of the system with the required quality and reliability only under sufficiently low disturbances and in the absence of large deviations from the regular behaviour of the controlled process. The main tool for the analysis of large deviations and the prediction of critical states here is the action functional. After the forecast is built, the choice of anti-crisis control is a supervisory control problem that optimizes the control system in normal mode and prevents escape of the controlled process into critical states. An essential aspect of the approach presented here is the presence of two-level (logical-dynamic) control: the input data are used not only for generating synthesized feedback (local robust synthesis) in advance (off-line), but also to make decisions about the current (on-line) quality of stabilization in the global sense. An example of using the presented approach for developing a ship tilting prediction system is considered.

  2. Investigation of improving MEMS-type VOA reliability

    NASA Astrophysics Data System (ADS)

    Hong, Seok K.; Lee, Yeong G.; Park, Moo Y.

    2003-12-01

    MEMS technologies have been applied in many areas, such as optical communications, gyroscopes and bio-medical components. In the optical communication field, MEMS technologies are essential, especially in multi-dimensional optical switches and Variable Optical Attenuators (VOAs). This paper describes the process for developing MEMS-type VOAs with good optical performance and improved reliability. Generally, MEMS VOAs are fabricated by a silicon micro-machining process, precise fibre alignment and a sophisticated packaging process. Because the device is composed of many structures with various materials, it is difficult to make it reliable. We have developed MEMS-type VOAs with many failure-mode considerations (FMEA: Failure Mode Effect Analysis) in the initial design step, predicted critical failure factors and revised the design accordingly, and confirmed the reliability by preliminary tests. The predicted failure factors were moisture, the bonding strength of the wire between the MEMS chip and the TO-CAN, and instability of the supplied signals. Statistical quality control tools (ANOVA, t-test and so on) were used to control these potential failure factors and produce optimum manufacturing conditions. To sum up, we have successfully developed reliable MEMS-type VOAs with good optical performance by controlling potential failure factors and using statistical quality control tools. As a result, the developed VOAs passed international reliability standards (Telcordia GR-1221-CORE).

  3. Investigation of improving MEMS-type VOA reliability

    NASA Astrophysics Data System (ADS)

    Hong, Seok K.; Lee, Yeong G.; Park, Moo Y.

    2004-01-01

    MEMS technologies have been applied in many areas, such as optical communications, gyroscopes and bio-medical components. In the optical communication field, MEMS technologies are essential, especially in multi-dimensional optical switches and Variable Optical Attenuators (VOAs). This paper describes the process for developing MEMS-type VOAs with good optical performance and improved reliability. Generally, MEMS VOAs are fabricated by a silicon micro-machining process, precise fibre alignment and a sophisticated packaging process. Because the device is composed of many structures with various materials, it is difficult to make it reliable. We have developed MEMS-type VOAs with many failure-mode considerations (FMEA: Failure Mode Effect Analysis) in the initial design step, predicted critical failure factors and revised the design accordingly, and confirmed the reliability by preliminary tests. The predicted failure factors were moisture, the bonding strength of the wire between the MEMS chip and the TO-CAN, and instability of the supplied signals. Statistical quality control tools (ANOVA, t-test and so on) were used to control these potential failure factors and produce optimum manufacturing conditions. To sum up, we have successfully developed reliable MEMS-type VOAs with good optical performance by controlling potential failure factors and using statistical quality control tools. As a result, the developed VOAs passed international reliability standards (Telcordia GR-1221-CORE).

  4. Prognostics using Engineering and Environmental Parameters as Applied to State of Health (SOH) Radionuclide Aerosol Sampler Analyzer (RASA) Real-Time Monitoring

    NASA Astrophysics Data System (ADS)

    Hutchenson, K. D.; Hartley-McBride, S.; Saults, T.; Schmidt, D. P.

    2006-05-01

    The International Monitoring System (IMS) is composed in part of radionuclide particulate and gas monitoring systems. Monitoring the operational status of these systems is an important aspect of nuclear weapon test monitoring. Quality data, process control techniques, and predictive models are necessary to detect and predict system component failures. Predicting failures in advance provides time to mitigate them, thus minimizing operational downtime. The Provisional Technical Secretariat (PTS) requires IMS radionuclide systems to be operational 95 percent of the time. The United States National Data Center (US NDC) offers contributing components to the IMS. This effort focuses on the initial research and process development using prognostics for monitoring and predicting failures of the RASA two (2) days into the future. The predictions, using time series methods, are input to an expert decision system called SHADES (State of Health Airflow and Detection Expert System). The results enable personnel to make informed judgments about the health of the RASA system. Data are read from a relational database, processed, and displayed to the user in a GIS as a prototype GUI. This procedure mimics the real-time application process that could be implemented as an operational system. This initial proof-of-concept effort developed predictive models focused on RASA components for a single site (USP79). Future work shall include the incorporation of other RASA systems, as well as the environmental conditions that play a significant role in their performance. Similarly, SHADES currently accommodates specific component behaviors at this one site; future work shall also incorporate the important environmental variables into the prediction algorithms.

  5. The Effect of Hospital Service Quality on Patient's Trust.

    PubMed

    Zarei, Ehsan; Daneshkohan, Abbas; Khabiri, Roghayeh; Arab, Mohammad

    2015-01-01

    Trust here means the patient's belief that the practitioner or the hospital seeks the best for the patient and will provide suitable care and treatment. One of the main determinants of patient trust is service quality. This study aimed to examine the effect of the quality of services provided in private hospitals on patient trust. In this descriptive cross-sectional study, 969 patients were selected using the consecutive method from eight private general hospitals of Tehran, Iran, in 2010. Data were collected through a questionnaire containing 20 items (14 items for quality, 6 items for trust) whose validity and reliability were confirmed. Data were analyzed using descriptive statistics and multivariate regression. The mean score of patients' perception of trust was 3.80, and of service quality 4.01. Approximately 38% of the variance in patient trust was explained by service quality dimensions. Quality of interaction and quality of process (P < 0.001) were the strongest factors in predicting patient trust, but the quality of the environment had no significant effect on patients' degree of trust. Interaction quality and process quality were the key determinants of patient trust in the private hospitals of Tehran. To enhance patients' trust, quality improvement efforts should focus on service delivery aspects such as scheduling, timely and accurate delivery of the service, and strengthening the interpersonal aspects of care and the communication skills of doctors, nurses and staff.

  6. Objective Quality and Intelligibility Prediction for Users of Assistive Listening Devices

    PubMed Central

    Falk, Tiago H.; Parsa, Vijay; Santos, João F.; Arehart, Kathryn; Hazrati, Oldooz; Huber, Rainer; Kates, James M.; Scollie, Susan

    2015-01-01

    This article presents an overview of twelve existing objective speech quality and intelligibility prediction tools. Two classes of algorithms are presented, namely intrusive and non-intrusive, with the former requiring the use of a reference signal, while the latter does not. Investigated metrics include both those developed for normal hearing listeners, as well as those tailored particularly for hearing impaired (HI) listeners who are users of assistive listening devices (i.e., hearing aids, HAs, and cochlear implants, CIs). Representative examples of those optimized for HI listeners include the speech-to-reverberation modulation energy ratio, tailored to hearing aids (SRMR-HA) and to cochlear implants (SRMR-CI); the modulation spectrum area (ModA); the hearing aid speech quality (HASQI) and perception indices (HASPI); and the PErception MOdel - hearing impairment quality (PEMO-Q-HI). The objective metrics are tested on three subjectively-rated speech datasets covering reverberation-alone, noise-alone, and reverberation-plus-noise degradation conditions, as well as degradations resultant from nonlinear frequency compression and different speech enhancement strategies. The advantages and limitations of each measure are highlighted and recommendations are given for suggested uses of the different tools under specific environmental and processing conditions. PMID:26052190

  7. Imaging characteristics of photogrammetric camera systems

    USGS Publications Warehouse

    Welch, R.; Halliday, J.

    1973-01-01

    In view of the current interest in high-altitude and space photographic systems for photogrammetric mapping, the United States Geological Survey (U.S.G.S.) undertook a comprehensive research project designed to explore the practical aspects of applying the latest image quality evaluation techniques to the analysis of such systems. The project had two direct objectives: (1) to evaluate the imaging characteristics of current U.S.G.S. photogrammetric camera systems; and (2) to develop methodologies for predicting the imaging capabilities of photogrammetric camera systems, comparing conventional systems with new or different types of systems, and analyzing the image quality of photographs. Image quality was judged in terms of a number of evaluation factors including response functions, resolving power, and the detectability and measurability of small detail. The limiting capabilities of the U.S.G.S. 6-inch and 12-inch focal length camera systems were established by analyzing laboratory and aerial photographs in terms of these evaluation factors. In the process, the contributing effects of relevant parameters such as lens aberrations, lens aperture, shutter function, image motion, film type, and target contrast were established, along with procedures for analyzing image quality and predicting and comparing performance capabilities. © 1973.

  8. Improving Air Quality (and Weather) Predictions using Advanced Data Assimilation Techniques Applied to Coupled Models during KORUS-AQ

    NASA Astrophysics Data System (ADS)

    Carmichael, G. R.; Saide, P. E.; Gao, M.; Streets, D. G.; Kim, J.; Woo, J. H.

    2017-12-01

    Ambient aerosols are important air pollutants with direct impacts on human health and on the Earth's weather and climate systems through their interactions with radiation and clouds. Their role is dependent on their distributions of size, number, phase and composition, which vary significantly in space and time. There remain large uncertainties in simulated aerosol distributions due to uncertainties in emission estimates and in chemical and physical processes associated with their formation and removal. These uncertainties lead to large uncertainties in weather and air quality predictions and in estimates of health and climate change impacts. Despite these uncertainties and challenges, regional-scale coupled chemistry-meteorological models such as WRF-Chem have significant capabilities in predicting aerosol distributions and explaining aerosol-weather interactions. We explore the hypothesis that new advances in on-line, coupled atmospheric chemistry/meteorological models, and new emission inversion and data assimilation techniques applicable to such coupled models, can be applied in innovative ways using current and evolving observation systems to improve predictions of aerosol distributions at regional scales. We investigate the impacts of assimilating AOD from geostationary satellite (GOCI) and surface PM2.5 measurements on predictions of AOD and PM in Korea during KORUS-AQ through a series of experiments. The results suggest assimilating datasets from multiple platforms can improve the predictions of aerosol temporal and spatial distributions.

  9. Machine Learning and Deep Learning Models to Predict Runoff Water Quantity and Quality

    NASA Astrophysics Data System (ADS)

    Bradford, S. A.; Liang, J.; Li, W.; Murata, T.; Simunek, J.

    2017-12-01

    Contaminants can be rapidly transported at the soil surface by runoff to surface water bodies. Physically-based models, which are based on the mathematical description of the main hydrological processes, are key tools for predicting surface water impairment. Along with physically-based models, data-driven models are becoming increasingly popular for describing the behavior of hydrological and water resources systems, since these models can be used to complement or even replace physically-based models. In this presentation we propose a new data-driven model as an alternative to a physically-based overland flow and transport model. First, we developed a physically-based numerical model to simulate overland flow and contaminant transport (the HYDRUS-1D overland flow module). A large number of numerical simulations were carried out to develop a database containing information about the impact of various input parameters (weather patterns, surface topography, vegetation, soil conditions, contaminants, and best management practices) on runoff water quantity and quality outputs. This database was used to train data-driven models. Three different methods (Neural Networks, Support Vector Machines, and Recurrent Neural Networks) were explored to prepare input-output functional relations. Results demonstrate the ability and limitations of machine learning and deep learning models to predict runoff water quantity and quality.
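
    The workflow above (a physics model generates a training database, a data-driven model emulates it) can be sketched briefly. In the hedged example below, run_physical_model is a hypothetical stand-in for HYDRUS-1D runs, and a small neural network is trained as the surrogate.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPRegressor

      def run_physical_model(params):
          # Hypothetical stand-in for one simulator run: inputs are
          # rainfall, slope, roughness and initial concentration.
          rain, slope, roughness, conc0 = params
          return rain * slope / (roughness + 0.1) * conc0   # toy runoff-load response

      rng = np.random.default_rng(2)
      X = rng.random((500, 4))                              # sampled input parameters
      y = np.array([run_physical_model(p) for p in X])      # simulated outputs

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                               random_state=0).fit(X_tr, y_tr)
      print(round(surrogate.score(X_te, y_te), 3))          # R^2 on held-out runs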

  10. Usability Prediction & Ranking of SDLC Models Using Fuzzy Hierarchical Usability Model

    NASA Astrophysics Data System (ADS)

    Gupta, Deepak; Ahlawat, Anil K.; Sagar, Kalpna

    2017-06-01

    Evaluation of software quality is an important aspect of controlling and managing software; such evaluation enables improvements in the software process. Software quality depends significantly on software usability. Many researchers have proposed usability models; each considers a set of usability factors, but none covers all usability aspects. Practical implementation of these models is still missing, as a precise definition of usability is lacking, and it is very difficult to integrate these models into current software engineering practices. In order to overcome these challenges, this paper aims to define the term `usability' using the proposed hierarchical usability model with its detailed taxonomy. The taxonomy considers generic evaluation criteria for identifying the quality components, bringing together factors, attributes and characteristics defined in various HCI and software models. For the first time, the usability model is also implemented to predict more accurate usability values. The proposed system is named the fuzzy hierarchical usability model and can be easily integrated into current software engineering practices. In order to validate the work, a dataset of six software development life cycle models was created and employed. These models are ranked according to their predicted usability values. This research also provides a detailed comparison of the proposed model with existing usability models.

  11. Sequence similarity is more relevant than species specificity in probabilistic backtranslation.

    PubMed

    Ferro, Alfredo; Giugno, Rosalba; Pigola, Giuseppe; Pulvirenti, Alfredo; Di Pietro, Cinzia; Purrello, Michele; Ragusa, Marco

    2007-02-21

    Backtranslation is the process of decoding a sequence of amino acids into the corresponding codons. All synthetic gene design systems include a backtranslation module. The degeneracy of the genetic code makes backtranslation potentially ambiguous, since most amino acids are encoded by multiple codons. The common approach to overcoming this difficulty is based on imitation of codon usage within the target species. This paper describes EasyBack, a new parameter-free, fully automated software for backtranslation using Hidden Markov Models. EasyBack is not based on imitation of codon usage within the target species, but instead uses a sequence-similarity criterion. The model is trained with a set of proteins with known cDNA coding sequences, constructed from the input protein by querying the NCBI databases with BLAST. Unlike existing software, the proposed method allows the quality of prediction to be estimated. When tested on a group of proteins that show different degrees of sequence conservation, EasyBack outperforms other published methods in terms of precision. The prediction quality of protein backtranslation is markedly increased by replacing the most-used-codon criterion of the target species with a Hidden Markov Model trained on a set of the most similar sequences from all species. Moreover, the proposed method allows the quality of prediction to be estimated probabilistically.
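
    The training-set idea can be illustrated compactly. The hedged sketch below scores codons by their frequency in coding sequences of similar proteins (a hypothetical similar_cds list standing in for BLAST hits) rather than by a species-wide usage table; EasyBack itself uses a Hidden Markov Model, so this greedy per-residue choice only illustrates the similarity criterion. The toy genetic-code table covers just the residues used.

      from collections import Counter

      GENETIC_CODE = {"ATG": "M", "AAA": "K", "AAG": "K", "GGT": "G", "GGC": "G"}

      def codon_counts(cds_list):
          # Count codon occurrences in the similar coding sequences.
          counts = Counter()
          for cds in cds_list:
              for i in range(0, len(cds) - 2, 3):
                  counts[cds[i:i + 3]] += 1
          return counts

      def backtranslate(protein, cds_list):
          counts = codon_counts(cds_list)
          out = []
          for aa in protein:
              options = [c for c, a in GENETIC_CODE.items() if a == aa]
              out.append(max(options, key=lambda c: counts[c]))  # most frequent option
          return "".join(out)

      similar_cds = ["ATGAAAGGT", "ATGAAGGGT", "ATGAAAGGC"]  # hypothetical BLAST hits
      print(backtranslate("MKG", similar_cds))               # ATGAAAGGT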

  12. Predictive assimilation framework to support contaminated site understanding and remediation

    NASA Astrophysics Data System (ADS)

    Versteeg, R. J.; Bianchi, M.; Hubbard, S. S.

    2014-12-01

    Subsurface system behavior at contaminated sites is driven and controlled by the interplay of physical, chemical, and biological processes occurring at multiple temporal and spatial scales. Effective remediation and monitoring planning requires an understanding of this complexity that is current, predictive (with some level of confidence) and actionable. We present and demonstrate a predictive assimilation framework (PAF). This framework automatically ingests, quality-controls and stores near real-time environmental data and processes these data using different inversion and modeling codes to provide information on the current state and evolution of the subsurface system. PAF is implemented as a cloud-based software application with five components: (1) data acquisition, (2) data management, (3) data assimilation and processing, (4) visualization and result delivery, and (5) orchestration. Access to and interaction with PAF is done through a standard browser. PAF is designed to be modular so that it can ingest and process different data streams depending on the site. We will present an implementation of PAF that uses data from a highly instrumented site (the DOE Rifle Subsurface Biogeochemistry Field Observatory in Rifle, Colorado), for which PAF automatically ingests hydrological data and forward-models groundwater flow in the saturated zone.

  13. The variation in the eating quality of beef from different sexes and breed classes cannot be completely explained by carcass measurements.

    PubMed

    Bonny, S P F; Hocquette, J-F; Pethick, D W; Farmer, L J; Legrand, I; Wierzbicki, J; Allen, P; Polkinghorne, R J; Gardner, G E

    2016-06-01

    Delivering beef of consistent quality to the consumer is vital for consumer satisfaction and will help to ensure demand and therefore profitability within the beef industry. In Australia, this is being tackled with Meat Standards Australia (MSA), which uses carcass traits and processing factors to deliver an individual eating quality guarantee to the consumer for 135 different 'cut by cooking method' combinations from each carcass. The carcass traits used in the MSA model, such as ossification score, carcass weight and marbling, explain the majority of the differences between breeds and sexes. Therefore, it was expected that the model would predict the eating quality of bulls and dairy breeds with good accuracy. In total, 8128 muscle samples from 482 carcasses from France, Poland, Ireland and Northern Ireland were MSA graded at slaughter and then evaluated for tenderness, juiciness, flavour liking and overall liking by untrained consumers, according to MSA protocols. The scores were weighted (0.3, 0.1, 0.3, 0.3) and combined to form a global eating quality (meat quality, MQ4) score. The carcasses were grouped into one of three breed categories: beef breeds, dairy breeds and crosses. The differences between the actual and the MSA-predicted MQ4 scores were analysed using a linear mixed effects model including fixed effects for carcass hang method, cook type, muscle type, sex, country, breed category and post-mortem ageing period, and random terms for animal identification, consumer country and kill group. Bulls had lower MQ4 scores than steers and females and were predicted less accurately by the MSA model. Beef breeds had lower eating quality scores than dairy breeds and crosses for five of the 16 muscles tested. Beef breeds were also over-predicted in comparison with the cross and dairy breeds for six of the 16 muscles tested. Therefore, even after accounting for differences in carcass traits, bulls still differ in eating quality compared with females and steers. Breed also influenced eating quality beyond differences in carcass traits, although only for certain muscles. This should be taken into account when estimating the eating quality of meat. In addition, the coefficients used by the Australian MSA model for some muscles, marbling score and ultimate pH do not exactly reflect the influence of these factors on eating quality in this data set, and if this system were to be applied in Europe, the coefficients for these muscles and covariates would need further investigation.
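
    The MQ4 weighting is straightforward to reproduce. A minimal sketch with illustrative consumer scores:

      # Tenderness, juiciness, flavour liking and overall liking combined with
      # the weights (0.3, 0.1, 0.3, 0.3) quoted above; the scores are illustrative.
      weights = {"tenderness": 0.3, "juiciness": 0.1, "flavour": 0.3, "overall": 0.3}
      scores = {"tenderness": 62.0, "juiciness": 58.0, "flavour": 65.0, "overall": 64.0}
      mq4 = sum(weights[k] * scores[k] for k in weights)
      print(f"MQ4 = {mq4:.1f}")  # 18.6 + 5.8 + 19.5 + 19.2 = 63.1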

  14. Perceptual video quality assessment in H.264 video coding standard using objective modeling.

    PubMed

    Karthikeyan, Ramasamy; Sainarayanan, Gopalakrishnan; Deepa, Subramaniam Nachimuthu

    2014-01-01

    Since usage of digital video is widespread nowadays, quality considerations have become essential, and industry demand for video quality measurement is rising. This proposal provides a method of perceptual quality assessment for the H.264 standard encoder using objective modeling. For this purpose, quality impairments are calculated and a model is developed to compute a perceptual video quality metric based on a no-reference method. The encoding process, including intra and inter prediction, introduces subtle differences between the original video and the encoded video that degrade the quality of the encoded picture. The proposed model takes into account the artifacts introduced by these spatial and temporal activities in hybrid block-based coding methods, and an objective mapping of these artifacts to a subjective quality estimate is proposed. The proposed model calculates the objective quality metric using subjective impairments (blockiness, blur and jerkiness), in contrast to the bitrate-only calculation defined in the ITU G.1070 model. The accuracy of the proposed perceptual video quality metric is compared against popular full-reference objective methods as defined by VQEG.

  15. Comparative analysis of quality parameters of Italian extra virgin olive oils according to their region of origin

    NASA Astrophysics Data System (ADS)

    Mignani, Anna Grazia; García-Allende, Pilar Beatriz; Ciaccheri, Leonardo; Conde, Olga M.; Cimato, Antonio; Attilio, Cristina; Tura, Debora

    2008-04-01

    Italian extra virgin olive oils from four regions covering different latitudes of the country were considered. They were analyzed by means of absorption spectroscopy in the wide 200-2800 nm spectral range, and multivariate data processing was applied. These spectra were virtually a signature identification from which to extract information on the region of origin and on the most important quality indicators. A classification map was created which was able to group the 80 oils on the basis of their region of origin. Furthermore, a model for the prediction of quality parameters such as oleic acidity, peroxide number, K232, K270 and Delta K was developed.

  16. Linear prediction data extrapolation superresolution radar imaging

    NASA Astrophysics Data System (ADS)

    Zhu, Zhaoda; Ye, Zhenru; Wu, Xiaoqing

    1993-05-01

    Range resolution and cross-range resolution of range-doppler imaging radars are related to the effective bandwidth of transmitted signal and the angle through which the object rotates relatively to the radar line of sight (RLOS) during the coherent processing time, respectively. In this paper, linear prediction data extrapolation discrete Fourier transform (LPDEDFT) superresolution imaging method is investigated for the purpose of surpassing the limitation imposed by the conventional FFT range-doppler processing and improving the resolution capability of range-doppler imaging radar. The LPDEDFT superresolution imaging method, which is conceptually simple, consists of extrapolating observed data beyond the observation windows by means of linear prediction, and then performing the conventional IDFT of the extrapolated data. The live data of a metalized scale model B-52 aircraft mounted on a rotating platform in a microwave anechoic chamber and a flying Boeing-727 aircraft were processed. It is concluded that, compared to the conventional Fourier method, either higher resolution for the same effective bandwidth of transmitted signals and total rotation angle of the object or equal-quality images from smaller bandwidth and total angle may be obtained by LPDEDFT.
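
    The two-step idea (linear-prediction extrapolation of the observed data, then a conventional DFT) can be sketched directly. In the hedged example below, AR coefficients are fitted by least squares, the record is extrapolated to four times the original aperture, and the FFT of the extended data separates two closely spaced tones that the 64-point FFT cannot resolve; the model order and signal parameters are illustrative.

      import numpy as np

      def lp_coefficients(x, order):
          # Fit x[n] ~= sum_k a[k] * x[n-1-k] by least squares.
          rows = [x[i:i + order][::-1] for i in range(len(x) - order)]
          A, b = np.array(rows), x[order:]
          return np.linalg.lstsq(A, b, rcond=None)[0]

      def extrapolate(x, a, extra):
          # Run the fitted predictor forward beyond the observation window.
          y = list(x)
          for _ in range(extra):
              y.append(np.dot(a, y[-1:-len(a) - 1:-1]))
          return np.array(y)

      n = np.arange(64)
      x = np.exp(2j * np.pi * 0.110 * n) + np.exp(2j * np.pi * 0.122 * n)  # two "scatterers"
      a = lp_coefficients(x, order=8)
      x_ext = extrapolate(x, a, extra=192)     # 4x the original aperture
      spec = np.abs(np.fft.fft(x_ext))
      print(np.sort(np.argsort(spec)[-2:]))    # bins of the two now-separated peaks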

  17. New submodel for watershed-scale simulations of fecal bacteria fate and transport at agricultural and pasture lands

    USDA-ARS?s Scientific Manuscript database

    Microbial contamination of waters is a critical public health issue. Watershed-scale, process-based modeling of bacteria fate and transport (F&T) has proven to be a useful tool for predicting microbial water quality and evaluating management practices. The objective of this work is...

  18. The Use of Interactive Methods in the Educational Process of the Higher Education Institution

    ERIC Educational Resources Information Center

    Kutbiddinova, Rimma A.; Eromasova, Aleksandra A.; Romanova, Marina A.

    2016-01-01

    The modernization of higher education and the transition to the new Federal Education Standards require higher-quality training of graduates. The training of highly qualified specialists must meet strict requirements: a high level of professional competence, well-developed communication skills, the ability to predict the results of one's own…

  19. From Intent to Enrollment, Attendance, and Participation in Preventive Parenting Groups

    ERIC Educational Resources Information Center

    Dumas, Jean E.; Nissley-Tsiopinis, Jenelle; Moreland, Angela D.

    2007-01-01

    Applying the Theory of Planned Behavior (TPB) to the process of engagement in preventive parenting groups, we tested the ability of family and child measures to predict intent to enroll, enrollment, attendance, and quality of participation in PACE (Parenting Our Children to Excellence). PACE is a prevention trial testing the efficacy of a…

  20. Learning Strategies Assessed by Journal Writing: Prediction of Learning Outcomes by Quantity, Quality, and Combinations of Learning Strategies

    ERIC Educational Resources Information Center

    Glogger, Inga; Schwonke, Rolf; Holzapfel, Lars; Nuckles, Matthias; Renkl, Alexander

    2012-01-01

    Recently, there have been efforts to rethink assessment. Instead of informing about (relatively stable) learner characteristics, assessment should assist instruction by looking at the learning process, facilitating feedback about what students' next step in learning could be. Similarly, new forms of strategy assessment aim at capturing…

  1. Predicting Personal Healthcare Management: Impact of Individual Characteristics on Patient Use of Health Information Technology

    ERIC Educational Resources Information Center

    Sandefer, Ryan Heath

    2017-01-01

    The use of health information and health information technology by consumers is a major factor in the current healthcare systems' effort to address issues related to quality, cost, and access. Patient engagement in the healthcare process through access to information related to diagnoses, procedures, and treatment has the potential to improve…

  2. Tell Me Why! Content Knowledge Predicts Process-Orientation of Math Researchers' and Math Teachers' Explanations

    ERIC Educational Resources Information Center

    Lachner, Andreas; Nückles, Matthias

    2016-01-01

    In two studies, we investigated the impact of instructors' different knowledge bases on the quality of their instructional explanations. In Study 1, we asked 20 mathematics teachers (with high pedagogical content knowledge, but lower content knowledge) and 15 mathematicians (with lower pedagogical content knowledge, but high content knowledge) to…

  3. Tracking reliability for space cabin-borne equipment in development by Crow model.

    PubMed

    Chen, J D; Jiao, S J; Sun, H L

    2001-12-01

    Objective. To study and track the reliability growth of manned spaceflight cabin-borne equipment in the course of its development. Method. A new technique for reliability growth estimation and prediction, composed of the Crow model and the test data conversion (TDC) method, was used. Result. The estimated and predicted reliability growth values conformed to expectations. Conclusion. The method could dynamically estimate and predict the reliability of the equipment by making full use of the various test information generated in the course of its development. It offered not only the possibility of tracking the equipment's reliability growth, but also a reference for quality control in the design and development process of manned spaceflight cabin-borne equipment.
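
    The Crow (AMSAA) model treats development failures as a power-law non-homogeneous Poisson process, N(t) = lambda * t**beta. A minimal sketch of the maximum-likelihood estimates for a time-truncated test, with illustrative failure times:

      import math

      failure_times = [4.3, 10.8, 25.0, 53.1, 110.0, 220.5]  # cumulative test hours
      T = 300.0                                              # total test time

      n = len(failure_times)
      beta = n / sum(math.log(T / t) for t in failure_times)   # shape (growth) parameter
      lam = n / T ** beta                                      # scale parameter
      mtbf_now = 1.0 / (lam * beta * T ** (beta - 1))          # instantaneous MTBF at T

      print(f"beta = {beta:.3f} (beta < 1 indicates reliability growth)")
      print(f"current MTBF = {mtbf_now:.1f} h")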

  4. Analytical Quality by Design Approach in RP-HPLC Method Development for the Assay of Etofenamate in Dosage Forms

    PubMed Central

    Peraman, R.; Bhadraya, K.; Reddy, Y. Padmanabha; Reddy, C. Surayaprakash; Lokesh, T.

    2015-01-01

    In view of the current regulatory requirements for analytical method development, a reversed-phase high performance liquid chromatographic method for routine analysis of etofenamate in dosage form was optimized using an analytical quality by design approach. Unlike the routine approach, the present study was initiated with an understanding of the quality target product profile, the analytical target profile, and a risk assessment for method variables that affect the method response. A liquid chromatography system equipped with a C18 column (250×4.6 mm, 5 μ), a binary pump and a photodiode array detector was used in this work. The experiments were conducted according to a central composite design plan, which could save time, reagents and other resources. Sigma Tech software was used to plan and analyze the experimental observations and to obtain the quadratic process model. The process model was used to predict retention time. Predictions from the retention-time contour diagram were verified experimentally and agreed with the actual data. The optimized method used a flow rate of 1.2 ml/min and a mobile phase of methanol and 0.2% triethylamine in water (85:15, % v/v), with pH adjusted to 6.5. The method was validated and verified for the targeted method performance, robustness and system suitability during method transfer. PMID:26997704
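
    The design-and-model step can be sketched generically. The hedged example below fits a quadratic response surface to runs laid out as a central composite design and uses it to predict retention time; the factor levels and responses are illustrative placeholders, not the paper's data.

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import PolynomialFeatures

      # Factors: flow rate (ml/min), methanol fraction (%), pH.
      X = np.array([
          [1.0, 80, 6.0], [1.4, 80, 6.0], [1.0, 90, 6.0], [1.4, 90, 6.0],
          [1.0, 80, 7.0], [1.4, 80, 7.0], [1.0, 90, 7.0], [1.4, 90, 7.0],   # factorial
          [0.9, 85, 6.5], [1.5, 85, 6.5], [1.2, 77, 6.5], [1.2, 93, 6.5],   # axial
          [1.2, 85, 5.7], [1.2, 85, 7.3], [1.2, 85, 6.5], [1.2, 85, 6.5],   # axial + centre
      ])
      rt = np.array([9.8, 7.1, 6.5, 4.9, 9.2, 6.8, 6.1, 4.6,
                     10.4, 4.5, 10.9, 4.2, 8.1, 6.9, 7.0, 7.1])             # retention times

      quadratic = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
      quadratic.fit(X, rt)
      print(quadratic.predict([[1.2, 85, 6.5]]))  # predicted retention time at the optimum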

  5. No-reference quality assessment based on visual perception

    NASA Astrophysics Data System (ADS)

    Li, Junshan; Yang, Yawei; Hu, Shuangyan; Zhang, Jiao

    2014-11-01

    The visual quality assessment of images/videos is an ongoing hot research topic that has become more and more important for numerous image and video processing applications with the rapid development of digital imaging and communication technologies. The goal of image quality assessment (IQA) algorithms is to automatically assess the quality of images/videos in agreement with human quality judgments. Up to now, two kinds of models have been used for IQA, namely full-reference (FR) and no-reference (NR) models. FR IQA algorithms interpret image quality as fidelity or similarity to a perfect image in some perceptual space. However, the reference image is not available in many practical applications, and an NR IQA approach is desired. Considering natural vision as optimized by millions of years of evolutionary pressure, many methods attempt to achieve consistency in quality prediction by modeling salient physiological and psychological features of the human visual system (HVS). To reach this goal, researchers try to simulate the HVS with sparse image coding and supervised machine learning, two main characteristics of the HVS: a typical HVS captures scenes by sparse coding and uses learned knowledge to perceive objects. In this paper, we propose a novel IQA approach based on visual perception. First, a standard model of the HVS is studied and analyzed, and a sparse representation of the image is computed with this model; then, the mapping between sparse codes and subjective quality scores is learned with the regression technique of least squares support vector machine (LS-SVM), yielding a regressor that can predict image quality; finally, the visual quality metric of an image is predicted with the trained regressor. We validate the performance of the proposed approach on the Laboratory for Image and Video Engineering (LIVE) database, which contains 227 JPEG2000 images, 233 JPEG images, 174 white noise images, 174 Gaussian blur images and 174 fast fading images, together with a subjective differential mean opinion score (DMOS) for each image. The experimental results show that the proposed approach can assess the quality of many kinds of distorted images and exhibits superior accuracy and monotonicity.
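
    A hedged sketch of this pipeline: sparse-code image patches with a learned dictionary, pool the codes into per-image features, and regress DMOS with a kernel least-squares model. scikit-learn's KernelRidge is used here as a stand-in for LS-SVM (the two are closely related), and all data are random placeholders.

      import numpy as np
      from sklearn.decomposition import MiniBatchDictionaryLearning
      from sklearn.kernel_ridge import KernelRidge

      rng = np.random.default_rng(3)
      patches = rng.random((500, 64))          # 8x8 patches from training images
      dico = MiniBatchDictionaryLearning(n_components=32, transform_algorithm="omp",
                                         transform_n_nonzero_coefs=5, random_state=0)
      codes = dico.fit(patches).transform(patches)   # sparse representation

      # One feature vector per "image": mean absolute activation per atom.
      n_images, patches_per_image = 50, 10
      feats = np.abs(codes).reshape(n_images, patches_per_image, 32).mean(axis=1)
      dmos = rng.random(n_images) * 100        # placeholder subjective scores

      regressor = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.5).fit(feats, dmos)
      print(regressor.predict(feats[:3]))      # predicted quality for three images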

  6. New directions: Time for a new approach to modeling surface-atmosphere exchanges in air quality models?

    NASA Astrophysics Data System (ADS)

    Saylor, Rick D.; Hicks, Bruce B.

    2016-03-01

    Just as the exchange of heat, moisture and momentum between the Earth's surface and the atmosphere is a critical component of meteorological and climate models, the surface-atmosphere exchange of many trace gases and aerosol particles is a vitally important process in air quality (AQ) models. Current state-of-the-art AQ models treat the emission and deposition of most gases and particles as separate model parameterizations, even though evidence has accumulated over time that the emission and deposition processes of many constituents are often two sides of the same coin, with the upward (emission) or downward (deposition) flux over a landscape depending on a range of environmental, seasonal and biological variables. In this note we argue that the time has come to integrate the treatment of these processes in AQ models to provide biological, physical and chemical consistency and improved predictions of trace gases and particles.

  7. Near infrared (NIR) spectroscopy for in-line monitoring of polymer extrusion processes.

    PubMed

    Rohe, T; Becker, W; Kölle, S; Eisenreich, N; Eyerer, P

    1999-09-13

    In recent years, near infrared (NIR) spectroscopy has become an analytical tool frequently used in many chemical production processes. In particular, on-line measurements are of interest to increase process stability and to document constant product quality. Application to polymer processing, e.g. polymer extrusion, could further increase product quality. Parameters of interest include the composition of the processed polymer, moisture, and reaction status in reactive extrusion. To this end, a transmission sensor was developed for applying NIR spectroscopy to extrusion processes. This sensor includes fibre-optic probes and a measuring cell that can be adapted to various extruders for in-line measurements. In contrast to infrared sensors, it uses only optical quartz components. Extrusion processes at temperatures up to 300 degrees C and pressures up to 37 MPa have been investigated. Application of multivariate data analysis (e.g. partial least squares, PLS) demonstrated the performance of the system with respect to process monitoring: in the case of polymer blending, deviations between predicted and actual polymer composition were quite low (in the range of ±0.25%). The complete system is thus suitable for harsh industrial environments and could lead to improved polymer extrusion processes.
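
    The chemometric step can be illustrated briefly. In the hedged sketch below, a PLS model maps synthetic stand-in spectra to blend composition; real use would substitute in-line transmission spectra from the extruder cell.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(4)
      composition = rng.random(80) * 100        # % of polymer A in the blend
      base_a, base_b = rng.random(200), rng.random(200)    # pure-component "spectra"
      spectra = (np.outer(composition, base_a) + np.outer(100 - composition, base_b)
                 + rng.normal(0, 0.5, (80, 200)))          # mixture spectra + noise

      pls = PLSRegression(n_components=4).fit(spectra, composition)
      pred = pls.predict(spectra).ravel()
      print(round(np.abs(pred - composition).max(), 3))    # compare with the ±0.25% above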

  8. Optimal changes in the prepress process: a model for evaluation of state and direction of productivity and quality improvements

    NASA Astrophysics Data System (ADS)

    Kihlberg, Henrik; Lindgren, Mats

    1998-09-01

    Demands change with the customers' marketplace, which makes it crucial for prepress companies of today and tomorrow to be able to adapt their services. Production tools are becoming more standardized and similar throughout the industry. Intelligent tools are being developed at a rapid pace, which creates opportunities to automate these processes. Key success factors, today and tomorrow, are the ability to change and to understand the customers' market. The market demands shorter delivery times and lower costs. The total number of printed editions is decreasing, while each edition contains an increasing number of pages and images. Customers require higher quality, with the ability to control and predict the end result. Case studies, interviews and workshops have been carried out at commercial printing companies, prepress houses, image bureaus, advertising agencies and digital photographers in Sweden. A major part of the research focuses on the digital image process at eleven companies in the graphic arts industry, all of which have prepress operations. The analysis has resulted in thorough knowledge of both the production process and the parameters used to measure productivity and quality. A model for the evaluation of changes is presented, with measurable values for productivity and quality. The model can be used to map and compare the states prepress operations are in, and/or to evaluate whether changes are needed.

  9. Resource Management Scheme Based on Ubiquitous Data Analysis

    PubMed Central

    Lee, Heung Ki; Jung, Jaehee

    2014-01-01

    Resource management of the main memory and process handler is critical to enhancing the system performance of a web server. Owing to the transaction delay time that affects incoming requests from web clients, web server systems utilize several web processes to anticipate future requests. This procedure is able to decrease the web generation time because there are enough processes to handle the incoming requests from web browsers. However, inefficient process management results in low service quality for the web server system. Proper pregenerated process mechanisms are required for dealing with the clients' requests. Unfortunately, it is difficult to predict how many requests a web server system is going to receive. If a web server system builds too many web processes, it wastes a considerable amount of memory space, and thus performance is reduced. We propose an adaptive web process manager scheme based on the analysis of web log mining. In the proposed scheme, the number of web processes is controlled through prediction of incoming requests, and accordingly, the web process management scheme consumes the least possible web transaction resources. In experiments, real web trace data were used to prove the improved performance of the proposed scheme. PMID:25197692
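
    The adaptive idea reduces to forecasting the next interval's request count and sizing the worker pool to match. A minimal sketch, with the smoothing constant and per-worker capacity as assumptions:

      class AdaptivePoolSizer:
          """Sizes the pregenerated web-process pool from a request forecast."""

          def __init__(self, requests_per_worker=50, alpha=0.4):
              self.capacity = requests_per_worker  # requests one process absorbs per interval
              self.alpha = alpha                   # exponential-smoothing weight
              self.forecast = 0.0

          def observe(self, request_count):
              # Update the one-step-ahead forecast from the last interval's
              # count, e.g. a count mined from the web access log.
              self.forecast = self.alpha * request_count + (1 - self.alpha) * self.forecast

          def workers_needed(self):
              # Enough processes for the predicted load, never fewer than one.
              return max(1, round(self.forecast / self.capacity))

      sizer = AdaptivePoolSizer()
      for count in [120, 180, 240, 400, 390]:      # counts mined from web logs
          sizer.observe(count)
      print(sizer.workers_needed())                # processes to keep pregenerated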

  10. The Impact of 3D Data Quality on Improving GNSS Performance Using City Models - Initial Simulations

    NASA Astrophysics Data System (ADS)

    Ellul, C.; Adjrad, M.; Groves, P.

    2016-10-01

    There is an increasing demand for highly accurate positioning information in urban areas, to support applications such as people and vehicle tracking, real-time air quality detection and navigation. However, systems such as GPS typically perform poorly in dense urban areas. A number of authors have made use of 3D city models to enhance accuracy, obtaining good results, but to date the influence of the quality of the 3D city model on these results has not been tested. This paper addresses the following question: how does the quality of a 3D dataset (in particular its variation in height, level of generalization, completeness and currency) impact the results obtained for the preliminary calculations in a process known as Shadow Matching, which takes into account not only where satellite signals are visible on the street but also where they are predicted to be absent. We describe initial simulations to address this issue, examining the variation in elevation angle (i.e. the angle above which the satellite is visible) for three 3D city models in a test area in London, and note that even within one dataset, using different available height values could cause a difference in elevation angle of up to 29°. Missing or extra buildings result in an elevation variation of around 85°. Variations such as these can significantly influence the predicted satellite visibility, which will then not correspond to that experienced on the ground, reducing the accuracy of the resulting Shadow Matching process.
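
    The sensitivity reported above can be reproduced with elementary geometry: on a flat street, a building blocks satellites below an elevation angle of atan((building height - antenna height) / horizontal distance). The sketch below, with assumed heights and distances, shows how a small height error shifts that angle.

        # Elevation angle blocked by a building, for assumed geometry.
        import math

        def blocking_elevation_deg(building_height_m, antenna_height_m, distance_m):
            return math.degrees(math.atan2(building_height_m - antenna_height_m,
                                           distance_m))

        # A 2 m error in the stored building height shifts the cutoff angle:
        for h in (20.0, 22.0):
            print(h, "m ->", round(blocking_elevation_deg(h, 1.5, 10.0), 1), "deg")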

  11. Direct and indirect effects of a family-based intervention in early adolescence on parent-youth relationship quality, late adolescent health, and early adult obesity.

    PubMed

    Van Ryzin, Mark J; Nowicka, Paulina

    2013-02-01

    We explored family processes in adolescence that may influence the likelihood of obesity in early adulthood using a randomized trial of a family-based intervention (the Family Check-Up, or FCU). The FCU has been shown to reduce escalations in antisocial behavior and depression in adolescence by supporting positive family management practices, but no research has examined the mechanisms by which the FCU could influence health-related attitudes and behaviors linked to obesity. Participants were 998 adolescents (n = 526 male; n = 423 European American; M age 12.21 years) and their families, recruited in 6th grade from 3 middle schools in the Pacific Northwest. We used structural equation modeling (SEM) and an Intent-To-Treat (ITT) design to evaluate the direct and indirect effects of the FCU on parent-youth relationship quality (ages 12-15), healthy lifestyle behaviors, eating attitudes, depressive symptoms (all measured at age 17), and obesity (age 22). We found that the FCU led to greater parent-youth relationship quality, which predicted enhanced health-related behaviors, reduced maladaptive eating attitudes, and reduced depression. In turn, reduced maladaptive eating attitudes predicted reduced odds of obesity. The indirect effect of the FCU on obesity by way of parent-youth relationship quality and eating attitudes was significant. Our findings illustrate how family processes may influence adolescent health and suggest that family functioning may be an additional factor to consider when developing intervention programs for obesity. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  12. Different Statistical Approaches to Investigate Porcine Muscle Metabolome Profiles to Highlight New Biomarkers for Pork Quality Assessment

    PubMed Central

    Welzenbach, Julia; Neuhoff, Christiane; Looft, Christian; Schellander, Karl; Tholen, Ernst; Große-Brinkhaus, Christine

    2016-01-01

    The aim of this study was to elucidate the underlying biochemical processes and to identify potential key molecules of the meat quality traits drip loss, pH of meat 1 h post-mortem (pH1), pH of meat 24 h post-mortem (pH24) and meat color. An untargeted metabolomics approach detected the profiles of 393 annotated and 1,600 unknown metabolites in 97 Duroc × Pietrain pigs. Despite obvious differences between the statistical approaches, the four applied methods, namely correlation analysis, principal component analysis, weighted network analysis (WNA) and random forest regression (RFR), revealed largely concordant results. Our findings lead to the conclusion that the meat quality traits pH1, pH24 and color are strongly influenced by processes of post-mortem energy metabolism such as glycolysis and the pentose phosphate pathway, whereas drip loss is significantly associated with metabolites of lipid metabolism. In the case of drip loss, RFR was the most suitable method to identify reliable biomarkers and to predict the phenotype based on metabolites. On the other hand, WNA provided the best parameters to investigate the metabolite interactions and to clarify the complex molecular background of meat quality traits. In summary, it was possible to attain findings on the interaction of meat quality traits and their underlying biochemical processes. The detected key metabolites might be better indicators of meat quality, especially of drip loss, than the measured phenotype itself and might potentially be used as bioindicators. PMID:26919205
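
    A minimal sketch of the random forest regression step, predicting a quality phenotype from a metabolite matrix and ranking metabolites as candidate biomarkers; the simulated data and hyperparameters below are illustrative, not the study's.

        # Random forest regression of a phenotype on metabolite profiles.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(97, 393))   # 97 animals x 393 annotated metabolites
        drip_loss = (2.0 + 0.8 * X[:, 0] - 0.5 * X[:, 1]
                     + rng.normal(scale=0.3, size=97))

        rfr = RandomForestRegressor(n_estimators=500, random_state=0)
        print("cross-validated R^2:", cross_val_score(rfr, X, drip_loss, cv=5).mean())
        rfr.fit(X, drip_loss)
        top = np.argsort(rfr.feature_importances_)[::-1][:5]
        print("candidate biomarker columns:", top)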

  13. Using a topographic index to distribute variable source area runoff predicted with the SCS curve-number equation

    NASA Astrophysics Data System (ADS)

    Lyon, Steve W.; Walter, M. Todd; Gérard-Marchant, Pierre; Steenhuis, Tammo S.

    2004-10-01

    Because the traditional Soil Conservation Service curve-number (SCS-CN) approach continues to be used ubiquitously in water quality models, new application methods are needed that are consistent with variable source area (VSA) hydrological processes in the landscape. We developed and tested a distributed approach for applying the traditional SCS-CN equation to watersheds where VSA hydrology is a dominant process. Predicting the location of source areas is important for watershed planning because restricting potentially polluting activities from runoff source areas is fundamental to controlling non-point-source pollution. The method presented here used the traditional SCS-CN approach to predict runoff volume and spatial extent of saturated areas and a topographic index, like that used in TOPMODEL, to distribute runoff source areas through watersheds. The resulting distributed CN-VSA method was applied to two subwatersheds of the Delaware basin in the Catskill Mountains region of New York State and one watershed in south-eastern Australia to produce runoff-probability maps. Observed saturated area locations in the watersheds agreed with the distributed CN-VSA method. Results showed good agreement with those obtained from the previously validated soil moisture routing (SMR) model. When compared with the traditional SCS-CN method, the distributed CN-VSA method predicted a similar total volume of runoff, but vastly different locations of runoff generation. Thus, the distributed CN-VSA approach provides a physically based method that is simple enough to be incorporated into water quality models, and other tools that currently use the traditional SCS-CN method, while still adhering to the principles of VSA hydrology.
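
    For reference, the traditional SCS-CN runoff equation that the distributed method re-applies is the textbook form below (written in inches, with the usual initial abstraction Ia = 0.2S); the distributed CN-VSA extension itself is not reproduced here.

        # Textbook SCS curve-number runoff equation.
        def scs_runoff(P, CN, ia_ratio=0.2):
            """Runoff depth Q (inches) from rainfall P (inches) and curve number CN."""
            S = 1000.0 / CN - 10.0      # potential maximum retention
            Ia = ia_ratio * S           # initial abstraction
            if P <= Ia:
                return 0.0
            return (P - Ia) ** 2 / (P - Ia + S)

        print(scs_runoff(P=3.0, CN=75))  # about 0.96 inches of runoff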

  14. Vector Adaptive/Predictive Encoding Of Speech

    NASA Technical Reports Server (NTRS)

    Chen, Juin-Hwey; Gersho, Allen

    1989-01-01

    A vector adaptive/predictive technique for digital encoding of speech signals yields decoded speech of very good quality after transmission at a coding rate of 9.6 kb/s, and of reasonably good quality at 4.8 kb/s, while requiring only 3 to 4 million multiplications and additions per second. It combines the advantages of adaptive/predictive coding and of code-excited linear prediction, which yields speech of high quality but requires 600 million multiplications and additions per second at an encoding rate of 4.8 kb/s. The vector adaptive/predictive coding technique thus bridges the gaps in performance and complexity between adaptive/predictive coding and code-excited linear prediction.

  15. Model of Silicon Refining During Tapping: Removal of Ca, Al, and Other Selected Element Groups

    NASA Astrophysics Data System (ADS)

    Olsen, Jan Erik; Kero, Ida T.; Engh, Thorvald A.; Tranell, Gabriella

    2017-04-01

    A mathematical model for the industrial refining of silicon alloys has been developed for the so-called oxidative ladle refining process. It is a lumped (zero-dimensional) model, based on the mass balances of metal, slag, and gas in the ladle, developed to run with relatively short computational times for the sake of industrial relevance. The model accounts for a semi-continuous process which includes both the tapping and post-tapping refining stages. It predicts the concentrations of Ca, Al, and trace elements, most notably the alkali metals, alkaline earth metals, and rare earth metals. The predictive power of the model depends on the quality of the model coefficients: the kinetic coefficient, τ, and the equilibrium partition coefficient, L, for a given element. A sensitivity analysis indicates that the model results are most sensitive to L. The model has been compared with industrial measurement data and found to be able to qualitatively, and to some extent quantitatively, predict the data. The model is very well suited to alkali and alkaline earth metals, which respond relatively quickly to the refining process. It is less well suited to elements such as the lanthanides and Al, which are refined more slowly. A major challenge for predicting the behavior of the rare earth metals is that reliable thermodynamic data for true equilibrium conditions relevant to the industrial process are not typically available in the literature.
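
    A minimal sketch of the kind of first-order kinetics such a lumped refining model builds on: the melt concentration relaxes toward an equilibrium value at a rate set by a kinetic coefficient, so fast-responding elements (small τ) are refined well within the process time while slow ones are not. This toy relaxation, with assumed numbers, is not the authors' coupled metal/slag/gas mass balance.

        # First-order approach to equilibrium for a refined element.
        import numpy as np

        def refine(c0, c_eq, tau_min, t_min):
            """Concentration after t_min minutes, relaxing from c0 toward c_eq."""
            return c_eq + (c0 - c_eq) * np.exp(-t_min / tau_min)

        t = np.linspace(0, 60, 7)                        # minutes
        print(refine(0.20, 0.02, tau_min=10.0, t_min=t)) # fast (alkali-like)
        print(refine(0.20, 0.02, tau_min=60.0, t_min=t)) # slow (lanthanide-like)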

  16. Detection and assessment of flaws in friction stir welded metallic plates

    NASA Astrophysics Data System (ADS)

    Fakih, Mohammad Ali; Mustapha, Samir; Tarraf, Jaafar; Ayoub, Georges; Hamade, Ramsey

    2017-04-01

    The ability of ultrasonic guided waves to detect flaws and assess the quality of friction stir welds (FSW) was investigated. AZ31B magnesium plates were friction stir welded. While the process parameters of spindle speed and tool feed were fixed, the shoulder penetration depth was varied, resulting in welds of varying quality. Ultrasonic waves were excited at different frequencies using piezoelectric wafers, and the fundamental symmetric (S0) mode was selected to detect the flaws resulting from the welding process. The front of the first transmitted wave signal was used to capture the S0 mode. A damage index (DI) measure was defined based on the amplitude attenuation after wave interaction with the welded zone. Computed tomography (CT) scanning was employed as a nondestructive testing (NDT) technique to assess the actual weld quality. The derived DI values were plotted against the CT-derived flaw volume, resulting in a perfectly linear fit. The proposed approach showed high sensitivity of the S0 mode to internal flaws within the weld. As such, this methodology bears great potential as a future predictive method for the evaluation of FSW weld quality.
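
    A sketch of an amplitude-attenuation damage index of the kind described, under the assumption DI = 1 - A_weld / A_baseline for the first-arrival S0 packet; the authors' exact normalization may differ.

        # Damage index from S0 amplitude attenuation (illustrative signals).
        import numpy as np

        def damage_index(baseline_signal, weld_signal):
            a0 = np.max(np.abs(baseline_signal))   # peak of pristine first arrival
            a1 = np.max(np.abs(weld_signal))       # peak after crossing the weld
            return 1.0 - a1 / a0

        t = np.linspace(0, 1e-4, 1000)
        baseline = np.sin(2 * np.pi * 2e5 * t) * np.exp(-((t - 3e-5) / 1e-5) ** 2)
        flawed = 0.62 * baseline                   # 38% attenuation at a flaw
        print("DI =", round(damage_index(baseline, flawed), 2))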

  17. Documentation of pain care processes does not accurately reflect pain management delivered in primary care.

    PubMed

    Krebs, Erin E; Bair, Matthew J; Carey, Timothy S; Weinberger, Morris

    2010-03-01

    Researchers and quality improvement advocates sometimes use review of chart-documented pain care processes to assess the quality of pain management. Studies have found that primary care providers frequently fail to document pain assessment and management. To assess documentation of pain care processes in an academic primary care clinic and evaluate the validity of this documentation as a measure of pain care delivered. Prospective observational study. 237 adult patients at a university-affiliated internal medicine clinic who reported any pain in the last week. Immediately after a visit, we asked patients to report the pain treatment they received. Patients completed the Brief Pain Inventory (BPI) to assess pain severity at baseline and 1 month later. We extracted documentation of pain care processes from the medical record and used kappa statistics to assess agreement between documentation and patient report of pain treatment. Using multivariable linear regression, we modeled whether documented or patient-reported pain care predicted change in pain at 1 month. Participants' mean age was 53.7 years, 66% were female, and 74% had chronic pain. Physicians documented pain assessment for 83% of visits. Patients reported receiving pain treatment more often (67%) than was documented by physicians (54%). Agreement between documentation and patient report was moderate for receiving a new pain medication (k = 0.50) and slight for receiving pain management advice (k = 0.13). In multivariable models, documentation of new pain treatment was not associated with change in pain (p = 0.134). In contrast, patient-reported receipt of new pain treatment predicted pain improvement (p = 0.005). Chart documentation underestimated pain care delivered, compared with patient report. Documented pain care processes had no relationship with pain outcomes at 1 month, but patient report of receiving care predicted clinically significant improvement. Chart review measures may not accurately reflect the pain management patients receive in primary care.

  18. Applicability of near-infrared spectroscopy in the monitoring of film coating and curing process of the prolonged release coated pellets.

    PubMed

    Korasa, Klemen; Hudovornik, Grega; Vrečer, Franc

    2016-10-10

    Although process analytical technology (PAT) guidance was introduced to the pharmaceutical industry just a decade ago, this innovative approach has already become an important part of efficient pharmaceutical development, manufacturing, and quality assurance. PAT tools are especially important in technologically complex operations which require strict control of critical process parameters and have a significant effect on final product quality. The manufacturing of prolonged release film coated pellets is definitely one such process. The aim of the present work was to study the applicability of an at-line near-infrared spectroscopy (NIR) approach in the monitoring of the pellet film coating and curing steps. Film coated pellets were manufactured by coating the active ingredient containing pellets with a film coating based on polymethacrylate polymers (Eudragit® RS/RL). NIR proved to be a useful tool for monitoring the curing process, since it was able to determine the extent of curing and hence predict the drug release rate by using a partial least squares (PLS) model. However, this approach also showed a number of limitations, such as low reliability and high susceptibility to pellet moisture content, and was thus not able to predict drug release from pellets with high moisture content. On the other hand, at-line NIR was capable of predicting the thickness of the Eudragit® RS/RL film coating over a wide range (up to 40 μm) with good accuracy, even in pellets with high moisture content. In summary, the high applicability of at-line NIR in the monitoring of prolonged release pellet production was demonstrated in the present study. These findings may contribute to more efficient and reliable PAT solutions in the manufacturing of prolonged release dosage forms. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Ecotoxicological criteria for final storage quality: Possibilities and limits

    NASA Astrophysics Data System (ADS)

    Zeyer, Josef; Meyer, Joseph

    Landfills are complex chemical and biological reactors whose internal processes are often beyond the immediate control of process engineers. Therefore, the concept of a "Final Storage Landfill" may be deceptive. Furthermore, traditional approaches to establishing discharge criteria and treatment requirements for industrial effluents may not work well for landfill emissions. Factories can often be treated as steady-state processes whose inputs and outputs are predictable; however, landfills are batch reactors whose contents and emissions may be unknown and will vary temporally and spatially. If the contents of a landfill are known, the sequence of chemical reactions can be predicted qualitatively. Even if that sequence is predictable, though, quantitative ecotoxicological criteria will be difficult to establish, and risk assessments based on chemical "laundry lists" will be questionable. The situation is not hopeless, though. New approaches can be developed to monitor and predict landfill emissions. We believe these will include (1) testing (biological and chemical) of internal components of landfills as well as emissions; (2) development of laboratory and/or field methods in which the chemical and biological evolution of landfills can be studied at accelerated rates, thus allowing better prediction of future emissions; and (3) flexible ecotoxicological criteria that are adaptable to the evolving nature of landfill emissions. These criteria should be based on complementary chemical analyses and biological tests that fit into a hierarchical (decision-tree) hazard assessment strategy.

  20. Statistical model selection for better prediction and discovering science mechanisms that affect reliability

    DOE PAGES

    Anderson-Cook, Christine M.; Morzinski, Jerome; Blecker, Kenneth D.

    2015-08-19

    Understanding the impact of production, environmental exposure and age characteristics on the reliability of a population is frequently based on underlying science and empirical assessment. When there is incomplete science to prescribe which inputs should be included in a model of reliability to predict future trends, statistical model/variable selection techniques can be leveraged on a stockpile or population of units to improve reliability predictions as well as to suggest new mechanisms affecting reliability to explore. We describe a five-step process for exploring relationships between available summaries of age, usage and environmental exposure and reliability. The process involves, first, identifying potential candidate inputs and, second, organizing the data for analysis. Third, a variety of models with different combinations of the inputs are estimated, and fourth, flexible metrics are used to compare them. Fifth, plots of the predicted relationships are examined to distill the leading model contenders into a prioritized list for subject matter experts to understand and compare. The complexity of the model, the quality of prediction and the cost of future data collection are all factors to be considered by the subject matter experts when selecting a final model.
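
    Steps three and four of the process can be sketched as an exhaustive fit-and-compare loop over input combinations; the inputs, model class and cross-validated RMSE metric below are illustrative assumptions, since the paper leaves the model family and metrics flexible.

        # Fit models over all input subsets and rank them by a comparison metric.
        import numpy as np
        from itertools import combinations
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(2)
        names = ["age", "usage", "humidity", "temperature"]   # assumed inputs
        X = rng.normal(size=(200, 4))
        reliability = (0.9 - 0.05 * X[:, 0] - 0.03 * X[:, 1]
                       + rng.normal(scale=0.02, size=200))

        results = []
        for k in range(1, len(names) + 1):
            for cols in combinations(range(len(names)), k):
                rmse = -cross_val_score(LinearRegression(), X[:, cols], reliability,
                                        cv=5, scoring="neg_root_mean_squared_error").mean()
                results.append((rmse, [names[c] for c in cols]))
        for rmse, inputs in sorted(results)[:3]:   # prioritized list for experts
            print(round(rmse, 4), inputs)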

  1. Investigation on Effect of Material Hardness in High Speed CNC End Milling Process.

    PubMed

    Dhandapani, N V; Thangarasu, V S; Sureshkannan, G

    2015-01-01

    This research paper analyzes the effects of material properties on surface roughness, material removal rate, and tool wear in high speed CNC end milling of various ferrous and nonferrous materials. It addresses the challenge of making material-specific decisions on the process parameters of spindle speed, feed rate, depth of cut, coolant flow rate, cutting tool material, and type of cutting tool coating for the required quality and quantity of production. Generally, the decision made by the operator on the floor is based on values suggested by the tool manufacturer or on trial and error. This paper describes the effect of the various parameters on the surface roughness characteristics of the precision-machined part. The suggested prediction method is based on experimental analyses of the parameters under different combinations of input conditions, which would benefit industry in standardizing high speed CNC end milling processes. The results provide a basis for selecting parameters that yield better surface roughness values, as predicted by the case study results.

  2. Investigation on Effect of Material Hardness in High Speed CNC End Milling Process

    PubMed Central

    Dhandapani, N. V.; Thangarasu, V. S.; Sureshkannan, G.

    2015-01-01

    This research paper analyzes the effects of material properties on surface roughness, material removal rate, and tool wear in high speed CNC end milling of various ferrous and nonferrous materials. It addresses the challenge of making material-specific decisions on the process parameters of spindle speed, feed rate, depth of cut, coolant flow rate, cutting tool material, and type of cutting tool coating for the required quality and quantity of production. Generally, the decision made by the operator on the floor is based on values suggested by the tool manufacturer or on trial and error. This paper describes the effect of the various parameters on the surface roughness characteristics of the precision-machined part. The suggested prediction method is based on experimental analyses of the parameters under different combinations of input conditions, which would benefit industry in standardizing high speed CNC end milling processes. The results provide a basis for selecting parameters that yield better surface roughness values, as predicted by the case study results. PMID:26881267

  3. Managing meat tenderness.

    PubMed

    Thompson, John

    2002-11-01

    This paper discusses the management of meat tenderness using a carcass grading scheme that applies the concept of total quality management to those factors which impact beef palatability. The scheme, called Meat Standards Australia (MSA), has identified the Critical Control Points (CCPs) from the production, pre-slaughter, processing and value-adding sectors of the beef supply chain and quantified their relative importance using large-scale consumer testing. These CCPs have been used to manage beef palatability in two ways. Firstly, CCPs from the pre-slaughter and processing sectors have been used as mandatory criteria for carcasses to be graded. Secondly, other CCPs from the production and processing sectors have been incorporated into a model to predict palatability for individual muscles. The evidence for the importance of CCPs from the production (breed, growth path and HGP implants), pre-slaughter and processing (pH/temperature window, alternative carcass suspension, marbling and ageing) sectors is reviewed, and the accuracy of the model in predicting palatability for specific muscle × cooking technique combinations is presented.

  4. The effects of autonomy-supportive coaching, need satisfaction, and self-perceptions on initiative and identity in youth swimmers.

    PubMed

    Coatsworth, J Douglas; Conroy, David E

    2009-03-01

    This study tested a sequential process model linking youth sport coaching climates (perceived coach behaviors and perceived need satisfaction) to youth self-perceptions (perceived competence and global self-esteem) and youth development outcomes (initiative, identity reflection, identity exploration). A sample of 119 youth between the ages of 10 and 18 who participated in a community-directed summer swim league completed questionnaires over the course of the 7-week season. Results indicated that coaches' autonomy support, particularly via process-focused praise, predicted youth competence need satisfaction and relatedness need satisfaction in the coaching relationship. Youth competence need satisfaction predicted self-esteem indirectly via perceived competence. Finally, self-esteem predicted identity reflection, and perceived competence predicted both identity reflection and initiative. Effects of age, sex, and perceptions of direct contact with the coach were not significant. Findings suggest that the quality of the coaching climate is an important predictor of the developmental benefits of sport participation and that one pathway by which the coaching climate has its effect on initiative and identity reflection is through developing youth self-perceptions.

  5. The Effects of Autonomy-supportive Coaching, Need Satisfaction and Self-Perceptions on Initiative and Identity in Youth Swimmers

    PubMed Central

    Coatsworth, J. Douglas; Conroy, David E.

    2015-01-01

    This study tested a sequential process model linking youth sport coaching climates (perceived coach behaviors and perceived need satisfaction) to youth self-perceptions (perceived competence and global self-esteem) and youth development outcomes (initiative, identity reflection, identity exploration). A sample of 119 youth between the ages 10–18 who participated in a community-directed summer swim league completed questionnaires over the course of the seven-week season. Results indicated that coaches’ autonomy support, particularly via process-focused praise, predicted youth competence and relatedness need satisfaction in the coaching relationship. Youth competence need satisfaction predicted self-esteem indirectly via perceived competence. Finally, self-esteem predicted identity reflection and perceived competence predicted both identity reflection and initiative. Effects of age, sex, and perceptions of direct contact with the coach were not significant. Findings suggest that the quality of the coaching climate is an important predictor of the developmental benefits of sport participation and that one pathway by which the coaching climate has its effect on initiative and identity reflection is through developing youth self-perceptions. PMID:19271821

  6. Automatic evidence quality prediction to support evidence-based decision making.

    PubMed

    Sarker, Abeed; Mollá, Diego; Paris, Cécile

    2015-06-01

    Evidence-based medicine practice requires practitioners to obtain the best available medical evidence and to appraise its quality when making clinical decisions. Primarily because of the plethora of electronically available data from the medical literature, manual appraisal of the quality of evidence is a time-consuming process. We present a fully automatic approach for predicting the quality of medical evidence in order to aid practitioners at point-of-care. Our approach extracts relevant information from medical article abstracts and utilises data from a specialised corpus to apply supervised machine learning for the prediction of quality grades. Following an in-depth analysis of the usefulness of features (e.g., the publication types of articles), the features are extracted from the text via rule-based approaches and from the meta-data associated with the articles, and then applied in the supervised classification model. We propose the use of a highly scalable and portable approach using a sequence of high precision classifiers, and introduce a simple evaluation metric called average error distance (AED) that simplifies the comparison of systems. We also perform elaborate human evaluations to compare the performance of our system against human judgments. We test and evaluate our approaches on a publicly available, specialised, annotated corpus containing 1132 evidence-based recommendations. Our rule-based approach performs exceptionally well at the automatic extraction of publication types of articles, with F-scores of up to 0.99 for high-quality publication types. For evidence quality classification, our approach obtains an accuracy of 63.84% and an AED of 0.271. The human evaluations show that the performance of our system, in terms of AED and accuracy, is comparable to the performance of humans on the same data. The experiments suggest that our structured text classification framework achieves evaluation results comparable to those of human performance. Our overall classification approach and evaluation technique are also highly portable and can be used for various evidence grading scales. Copyright © 2015 Elsevier B.V. All rights reserved.
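
    As we read it, the average error distance treats the evidence grades as an ordinal scale and averages the absolute distance between predicted and true grades; the A/B/C encoding below is our illustrative assumption.

        # Average error distance (AED) over ordinal evidence grades.
        def average_error_distance(true_grades, predicted_grades):
            scale = {"A": 0, "B": 1, "C": 2}
            return sum(abs(scale[t] - scale[p])
                       for t, p in zip(true_grades, predicted_grades)) / len(true_grades)

        # One of five predictions is off by a single grade:
        print(average_error_distance(list("AABBC"), list("ABBBC")))  # 0.2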

  7. Low-Quality Structural and Interaction Data Improves Binding Affinity Prediction via Random Forest.

    PubMed

    Li, Hongjian; Leung, Kwong-Sak; Wong, Man-Hon; Ballester, Pedro J

    2015-06-12

    Docking scoring functions can be used to predict the strength of protein-ligand binding. It is widely believed that training a scoring function with low-quality data is detrimental to its predictive performance. Nevertheless, there is a surprising lack of systematic validation experiments in support of this hypothesis. In this study, we investigated to what extent training a scoring function with data containing low-quality structural and binding information is detrimental to predictive performance. We actually found that low-quality data is not only non-detrimental, but beneficial for the predictive performance of machine-learning scoring functions, although the improvement is smaller than that coming from high-quality data. Furthermore, we observed that classical scoring functions are not able to effectively exploit data beyond an early threshold, regardless of its quality. This demonstrates that exploiting a larger data volume is more important for the performance of machine-learning scoring functions than restricting to a smaller set of higher-quality data.

  8. Quantitative Prediction of Beef Quality Using Visible and NIR Spectroscopy with Large Data Samples Under Industry Conditions

    NASA Astrophysics Data System (ADS)

    Qiao, T.; Ren, J.; Craigie, C.; Zabalza, J.; Maltin, Ch.; Marshall, S.

    2015-03-01

    It is well known that the eating quality of beef has a significant influence on the repurchase behavior of consumers. There are several key factors that affect the perception of quality, including color, tenderness, juiciness, and flavor. To support consumer repurchase choices, there is a need for an objective measurement of quality that could be applied to meat prior to its sale. Objective approaches such as those offered by spectral technologies may be useful, but the analytical algorithms used remain to be optimized. For visible and near infrared (VISNIR) spectroscopy, Partial Least Squares Regression (PLSR) is a widely used technique for meat-related quality modeling and prediction. In this paper, a Support Vector Machine (SVM) based machine learning approach is presented to predict beef eating quality traits. Although SVM has been successfully used in various disciplines, it has not been applied extensively to the analysis of meat quality parameters. To this end, the performance of PLSR and SVM as tools for the analysis of meat tenderness is evaluated, using a large dataset acquired under industrial conditions. The spectral dataset was collected using VISNIR spectroscopy with wavelengths ranging from 350 to 1800 nm on 234 beef M. longissimus thoracis steaks from heifers, steers, and young bulls. As the dimensionality of the VISNIR data is very high (over 1600 spectral bands), the Principal Component Analysis (PCA) technique was applied for feature extraction and data reduction. The extracted principal components (fewer than 100) were then used for data modeling and prediction. The prediction results showed that SVM has a greater potential to predict beef eating quality than PLSR, especially for the prediction of tenderness. The influence of animal gender on beef quality prediction was also investigated, and it was found that beef quality traits were predicted most accurately in beef from young bulls.
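
    A minimal sketch of the PCA-then-SVM pipeline on simulated spectra of the stated dimensions; the component count and SVR hyperparameters are placeholders rather than the paper's tuned values.

        # PCA feature reduction followed by support vector regression.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR

        rng = np.random.default_rng(3)
        X = rng.normal(size=(234, 1600))       # 234 steaks x ~1600 VISNIR bands
        tenderness = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=234)

        model = make_pipeline(StandardScaler(), PCA(n_components=50), SVR(C=10.0))
        print("cross-validated R^2:", cross_val_score(model, X, tenderness, cv=5).mean())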

  9. Issues of upscaling in space and time with soil erosion models

    NASA Astrophysics Data System (ADS)

    Brazier, R. E.; Parsons, A. J.; Wainwright, J.; Hutton, C.

    2009-04-01

    Soil erosion - the entrainment, transport and deposition of soil particles - is an important phenomenon to understand; the quantity of soil loss determines the long-term on-site sustainability of agricultural production (Pimental et al., 1995), and has potentially important off-site impacts on water quality (Bilotta and Brazier, 2008). The fundamental mechanisms of the soil erosion process have been studied at the laboratory scale, the plot scale (Wainwright et al., 2000), the small catchment scale and the river basin scale through sediment yield and budgeting work. Soil erosion models have developed alongside and directly from this empirical work, from data-based models such as the USLE (Wischmeier and Smith, 1978) to 'physics- or process-based' models such as EUROSEM (Morgan et al., 1998) and WEPP (Nearing et al., 1989). Model development has helped to structure our understanding of the fundamental factors that control the soil erosion process at the plot and field scale. Despite these advances, however, our understanding of, and ability to predict, erosion and sediment yield at the same plot and field scales, and also at larger catchment scales, remains poor. Sediment yield has been shown to both increase and decrease as a function of drainage area (de Vente et al., 2006); the lack of a simple relationship demonstrates complex and scale-dependent process domination throughout a catchment, and emphasises our uncertainty and poor conceptual basis for predicting plot- to catchment-scale erosion rates and sediment yields (Parsons et al., 2006b). This paper therefore presents a review of the problems associated with modelling soil erosion across spatial and temporal scales and suggests some potential solutions to address these problems. The transport-distance approach to scaling erosion rates (Wainwright et al., 2008) is assessed and discussed in light of alternative techniques to predict erosion across spatial and temporal scales. References: Bilotta, G.S. and Brazier, R.E., 2008. Understanding the influence of suspended solids on water quality and aquatic biota. Water Research, 42(12): 2849-2861. de Vente, J., Poesen, J., Bazzoffi, P., Van Ropaey, A.V. and Verstraeten, G., 2006. Predicting catchment sediment yield in Mediterranean environments: the importance of sediment sources and connectivity in Italian drainage basins. Earth Surface Processes and Landforms, 31: 1017-1034. Morgan, R.P.C. et al., 1998. The European soil erosion model (EUROSEM): a dynamic approach for predicting sediment transport from fields to small catchments. Earth Surface Processes and Landforms, 23: 527-544. Nearing, M.A., Foster, G.R., Lane, L.J. and Finkner, S.C., 1989. A process-based soil erosion model for USDA Water Erosion Prediction Project technology. Transactions of the ASAE, 32(5): 1587-1593. Parsons, A.J., Brazier, R.E., Wainwright, J. and Powell, D.M., 2006a. Scale relationships in hillslope runoff and erosion. Earth Surface Processes and Landforms, 31(11): 1384-1393. Parsons, A.J., Wainwright, J., Brazier, R.E. and Powell, D.M., 2006b. Is sediment delivery a fallacy? Earth Surface Processes and Landforms, 31(10): 1325-1328. Pimental, D. et al., 1995. Environmental and economic costs of soil erosion and conservation benefits. Science, 267: 1117-1122. Wainwright, J., Parsons, A.J. and Abrahams, A.D., 2000. Plot-scale studies of vegetation, overland flow and erosion interactions: case studies from Arizona and New Mexico. Hydrological Processes, 14(16-17): 2921-2943. Wischmeier, W.H. and Smith, D.D., 1978. Predicting rainfall erosion losses - a guide for conservation planning. USDA Agriculture Handbook 537.

  10. Near-roadway monitoring of vehicle emissions as a function of mode of operation for light-duty vehicles.

    PubMed

    Wen, Dongqi; Zhai, Wenjuan; Xiang, Sheng; Hu, Zhice; Wei, Tongchuan; Noll, Kenneth E

    2017-11-01

    Determination of the effect of vehicle emissions on air quality near roadways is important because vehicles are a major source of air pollution. A near-roadway monitoring program was undertaken in Chicago between August 4 and October 30, 2014, to measure ultrafine particles, carbon dioxide, carbon monoxide, traffic volume and speed, and wind direction and speed. The objective of this study was to develop a method to relate short-term changes in traffic mode of operation to air quality near roadways using data averaged over 5-min intervals, to provide a better understanding of the processes controlling air pollution concentrations near roadways. Three different types of data analysis are provided to demonstrate the type of results that can be obtained from a near-roadway sampling program based on 5-min measurements: (1) development of vehicle emission factors (EFs) for ultrafine particles as a function of vehicle mode of operation, (2) comparison of measured and modeled CO2 concentrations, and (3) application of dispersion models to determine concentrations near roadways. EFs for ultrafine particles are developed that are a function of traffic volume and mode of operation (free flow and congestion) for light-duty vehicles (LDVs) under real-world conditions. Two air quality models, CALINE4 (California Line Source Dispersion Model, version 4) and AERMOD (American Meteorological Society/U.S. Environmental Protection Agency Regulatory Model), are used to predict the ultrafine particulate concentrations near roadways for comparison with measured concentrations. When using CALINE4 to predict air quality levels in the mixing cell, changes in surface roughness and stability class have no effect on the predicted concentrations. However, when using AERMOD to predict air quality in the mixing cell, changes in surface roughness have a significant impact on the predicted concentrations. The paper provides emission factors (EFs) that are a function of traffic volume and mode of operation (free flow and congestion) for LDVs under real-world conditions. The good agreement between monitoring and modeling results indicates that high-resolution, simultaneous measurements of air quality and meteorological and traffic conditions can be used to determine real-world, fleet-wide vehicle EFs as a function of vehicle mode of operation under actual driving conditions.

  11. [Monitoring method for macroporous resin column chromatography process of salvianolic acids based on near infrared spectroscopy].

    PubMed

    Hou, Xiang-Mei; Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang

    2016-07-01

    To study and establish a monitoring method for the macroporous resin column chromatography process of salvianolic acids using near infrared spectroscopy (NIR) as a process analytical technology (PAT), a multivariate statistical process control (MSPC) model was developed based on 7 normal operation batches, and 2 test batches (one normal and one abnormal operation batch) were used to verify the monitoring performance of this model. The results showed that the MSPC model had a good monitoring ability for the column chromatography process. Meanwhile, an NIR quantitative calibration model was established for three key quality indexes (rosmarinic acid, lithospermic acid and salvianolic acid B) by using the partial least squares (PLS) algorithm. The verification results demonstrated that this model had satisfactory prediction performance. The combined application of the above two models can effectively achieve real-time monitoring of the macroporous resin column chromatography process of salvianolic acids and can be used to conduct on-line analysis of the key quality indexes. The established process monitoring method provides a reference for the development of process analytical technology for traditional Chinese medicine manufacturing. Copyright© by the Chinese Pharmaceutical Association.
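
    The MSPC idea can be sketched as scoring new spectra against a PCA model of normal batches and flagging excursions with a Hotelling T² statistic; the simulated data and the empirical-percentile control limit below are illustrative assumptions (published MSPC charts typically use an F-distribution limit).

        # PCA-based Hotelling T^2 monitoring of a chromatography process.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(4)
        latent = rng.normal(size=(70, 3))                # normal process variation
        loadings = rng.normal(size=(3, 100))
        normal_batches = latent @ loadings + 0.1 * rng.normal(size=(70, 100))
        pca = PCA(n_components=3).fit(normal_batches)

        def hotelling_t2(spectra):
            scores = pca.transform(spectra)
            return np.sum(scores ** 2 / pca.explained_variance_, axis=1)

        limit = np.percentile(hotelling_t2(normal_batches), 99)
        abnormal = 4.0 * latent[:1] @ loadings           # exaggerated excursion
        print("T2 =", hotelling_t2(abnormal)[0], "limit =", limit)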

  12. Vision-based weld pool boundary extraction and width measurement during keyhole fiber laser welding

    NASA Astrophysics Data System (ADS)

    Luo, Masiyang; Shin, Yung C.

    2015-01-01

    In keyhole fiber laser welding processes, the weld pool behavior is essential to determining welding quality. To better observe and control the welding process, accurate extraction of the weld pool boundary as well as of its width is required. This work presents a weld pool edge detection technique based on an off-axial green illumination laser and a coaxial image capturing system that consists of a CMOS camera and optical filters. To accommodate differences in image quality, a complete edge detection algorithm is developed, based on searching for the local maximum greyness gradient combined with linear interpolation. The extracted weld pool geometry and width are validated against actual weld width measurements and against predictions from a numerical multi-phase model.

  13. Performance evaluation of various classifiers for color prediction of rice paddy plant leaf

    NASA Astrophysics Data System (ADS)

    Singh, Amandeep; Singh, Maninder Lal

    2016-11-01

    The food industry is one of the industries that use machine vision for non-destructive quality evaluation of produce. These quality-measuring systems and their software are built on various image-processing algorithms, which generally use a particular type of classifier. These classifiers play a vital role in making the algorithms intelligent enough to contribute their best to the quality evaluations, translating human perception into machine vision and hence machine learning. The crop of interest is rice, and the color of this crop indicates the health status of the plant. An enormous number of classifiers are available for color prediction, but choosing the best among them is the focus of this paper. The performance of a total of 60 classifiers has been analyzed from an application point of view, and the results are discussed. The motivation comes from the idea of providing a set of classifiers with excellent performance and implementing them in a single algorithm for the improvement of machine vision learning and, hence, its associated applications.

  14. Assessing the Association Between Asthma and Air Quality in the Presence of Wildfires

    NASA Technical Reports Server (NTRS)

    Young, L. J.; Lopiano, K. K.; Xu, X.; Holt, N. M.; Leary, E.; Al-Hamdan, M. Z.; Crosson, W. L.; Estes, M. G.; Luvall, J. C.; Estes, S. M.; hide

    2012-01-01

    Asthma hospital/emergency room (patient) data are used as the foundation for creating a health outcome indicator of human response to environmental air quality. Daily U.S. Environmental Protection Agency (EPA) Air Quality System (AQS) fine particulate (PM2.5) ground data and U.S. National Aeronautics and Space Administration (NASA) MODIS aerosol optical depth (AOD) data were acquired and processed for the years 2007 and 2008. Figure 1 shows the PM2.5 annual mean composite of all the 2007 B-spline daily surfaces. Initial models for predicting the number of weekly asthma cases within a Florida county have focused on environmental variables. Weekly maximums of PM2.5, relative humidity, and the proportions of the county with smoke and fire were the environmental variables included in the model. Cosine and sine functions of time were used to account for seasonality in asthma cases. Counties were treated as random effects, thereby adjusting for differences in socio-demographics and other factors. The 2007 predictions for Miami-Dade county when using B-spline PM2.5 surfaces are displayed in Figure 2.

  15. The role of female search behaviour in determining host plant range in plant feeding insects: a test of the information processing hypothesis

    PubMed Central

    Janz, N.; Nylin, S.

    1997-01-01

    Recent theoretical studies have suggested that host range in herbivorous insects may be more restricted by constraints on information processing on the ovipositing females than by trade-offs in larval feeding efficiency. We have investigated if females from polyphagous species have to pay for their ability to localize and evaluate plants from different species with a lower ability to discriminate between conspecific host plants with differences in quality. Females of the monophagous butterflies Polygonia satyrus, Vanessa indica and Inachis io and the polyphagous P. c-album and Cynthia cardui (all in Lepidoptera, Nymphalidae) were given a simultaneous choice of stinging nettles (Urtica dioica) of different quality. In addition, the same choice trial was given to females from two populations of P. c-album with different degrees of specificity. As predicted from the information processing hypothesis, all specialists discriminated significantly against the bad quality nettle, whereas the generalists laid an equal amount of eggs on both types of nettle. There were no corresponding differences between specialist and generalist larvae in their ability to utilize poor quality leaves. Our study therefore suggests that female host-searching behaviour plays an important role in determining host plant range.

  16. The Role of Female Search Behaviour in Determining Host Plant Range in Plant Feeding Insects: A Test of the Information Processing Hypothesis

    NASA Astrophysics Data System (ADS)

    Janz, Niklas; Nylin, Soren

    1997-05-01

    Recent theoretical studies have suggested that host range in herbivorous insects may be more restricted by constraints on information processing on the ovipositing females than by trade-offs in larval feeding efficiency. We have investigated if females from polyphagous species have to pay for their ability to localize and evaluate plants from different species with a lower ability to discriminate between conspecific host plants with differences in quality. Females of the monophagous butterflies Polygonia satyrus, Vanessa indica and Inachis io and the polyphagous P. c-album and Cynthia cardui (all in Lepidoptera, Nymphalidae) were given a simultaneous choice of stinging nettles (Urtica dioica) of different quality. In addition, the same choice trial was given to females from two populations of P. c-album with different degrees of specificity. As predicted from the information processing hypothesis, all specialists discriminated significantly against the bad quality nettle, whereas the generalists laid an equal amount of eggs on both types of nettle. There were no corresponding differences between specialist and generalist larvae in their ability to utilize poor quality leaves. Our study therefore suggests that female host-searching behaviour plays an important role in determining host plant range.

  17. Examining the mediational role of psychological flexibility, pain catastrophizing, and visceral sensitivity in the relationship between psychological distress, irritable bowel symptom frequency, and quality of life.

    PubMed

    Cassar, G E; Knowles, S; Youssef, G J; Moulding, R; Uiterwijk, D; Waters, L; Austin, D W

    2018-06-08

    The aim of the current study was to use Structural Equation Modelling (SEM) to examine whether psychological flexibility (i.e. mindfulness, acceptance, valued-living) mediates the relationship between distress, irritable bowel syndrome (IBS) symptom frequency, and quality of life (QoL). Ninety-two individuals participated in the study (12 male, 80 female, M age  = 36.24) by completing an online survey including measures of visceral sensitivity, distress, IBS-related QoL, mindfulness, bowel symptoms, pain catastrophizing, acceptance, and valued-living. A final model with excellent fit was identified. Psychological distress significantly and directly predicted pain catastrophizing, valued-living, and IBS symptom frequency. Pain catastrophizing directly predicted visceral sensitivity and acceptance, while visceral sensitivity significantly and directly predicted IBS symptom frequency and QoL. Symptom frequency also had a direct and significant relationship with QoL. The current findings suggest that interventions designed to address unhelpful cognitive processes related to visceral sensitivity, pain catastrophizing, and psychological distress may be of most benefit to IBS-related QoL.

  18. Acoustics Research of Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Gao, Ximing; Houston, Janice

    2014-01-01

    The liftoff phase induces high acoustic loading over a broad frequency range for a launch vehicle. These external acoustic environments are used in the prediction of the internal vibration responses of the vehicle and its components. Present liftoff vehicle acoustic environment prediction methods utilize stationary data from previously conducted hold-down tests to generate 1/3 octave band Sound Pressure Level (SPL) spectra. In an effort to improve the accuracy and quality of liftoff acoustic loading predictions, non-stationary flight data from the Ares I-X were processed in PC-Signal in two flight phases: simulated hold-down and liftoff. In conjunction, the Prediction of Acoustic Vehicle Environments (PAVE) program was developed in MATLAB to allow for efficient predictions of sound pressure levels (SPLs) as a function of station number along the vehicle using semi-empirical methods. This consisted of generating the Dimensionless Spectrum Function (DSF) and Dimensionless Source Location (DSL) curves from the Ares I-X flight data, which are then used in the MATLAB program to generate the 1/3 octave band SPL spectra. Concluding results show major differences in SPLs between the hold-down test data and the processed Ares I-X flight data, making the Ares I-X flight data more suitable for future vehicle acoustic environment predictions.

  19. Optimising the Encapsulation of an Aqueous Bitter Melon Extract by Spray-Drying

    PubMed Central

    Tan, Sing Pei; Kha, Tuyen Chan; Parks, Sophie; Stathopoulos, Costas; Roach, Paul D.

    2015-01-01

    Our aim was to optimise the encapsulation of an aqueous bitter melon extract by spray-drying with maltodextrin (MD) and gum Arabic (GA). The response surface methodology models accurately predicted the process yield and retentions of bioactive concentrations and activity (R2 > 0.87). The optimal formulation was predicted and validated as 35% (w/w) stock solution (MD:GA, 1:1) and a ratio of 1.5:1 g/g of the extract to the stock solution. The spray-dried powder had a high process yield (66.2% ± 9.4%) and high retention (>79.5% ± 8.4%) and the quality of the powder was high. Therefore, the bitter melon extract was well encapsulated into a powder using MD/GA and spray-drying. PMID:28231214

  20. Forecasting in the presence of expectations

    NASA Astrophysics Data System (ADS)

    Allen, R.; Zivin, J. G.; Shrader, J.

    2016-05-01

    Physical processes routinely influence economic outcomes, and actions by economic agents can, in turn, influence physical processes. This feedback creates challenges for forecasting and inference, creating the potential for complementarity between models from different academic disciplines. Using the example of prediction of water availability during a drought, we illustrate the potential biases in forecasts that only take part of a coupled system into account. In particular, we show that forecasts can alter the feedbacks between supply and demand, leading to inaccurate prediction about future states of the system. Although the example is specific to drought, the problem of feedback between expectations and forecast quality is not isolated to this particular model - it is relevant to areas as diverse as population assessments for conservation, balancing the electrical grid, and setting macroeconomic policy.

  1. A mathematical model of reservoir sediment quality prediction based on land-use and erosion processes in watershed

    NASA Astrophysics Data System (ADS)

    Junakova, N.; Balintova, M.; Junak, J.

    2017-10-01

    The aim of this paper is to propose a mathematical model for determining the total nitrogen (N) and phosphorus (P) content in eroded soil particles, with emphasis on predicting bottom sediment quality in reservoirs. The adsorbed nutrient concentrations are calculated using the Universal Soil Loss Equation (USLE), extended by the determination of the average soil nutrient concentration in topsoils. The average annual vegetation and management factor is divided into five periods of the cropping cycle. For selected plants, the average plant nutrient uptake, divided into the five cropping periods, is also proposed. The average nutrient concentrations in eroded soil particles in adsorbed form are modified by a sediment enrichment ratio to obtain the total nutrient content in transported soil particles. The model was designed for the conditions of north-eastern Slovakia. The study was carried out in the agricultural basin of the small water reservoir Klusov.
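
    A minimal sketch of the proposed chain: USLE soil loss, scaled by the topsoil nutrient concentration and a sediment enrichment ratio, gives the nutrient load attached to eroded particles. All factor values below are illustrative, not the Klusov catchment parameters.

        # USLE soil loss scaled to an adsorbed-nutrient load.
        def nutrient_load_g_per_ha(R, K, LS, C, P, nutrient_mg_per_kg, enrichment_ratio):
            soil_loss_t_per_ha = R * K * LS * C * P   # USLE: A = R * K * LS * C * P
            # 1 mg/kg equals 1 g/t, so t/ha * mg/kg gives g/ha.
            return soil_loss_t_per_ha * nutrient_mg_per_kg * enrichment_ratio

        print(nutrient_load_g_per_ha(R=550, K=0.03, LS=1.2, C=0.25, P=1.0,
                                     nutrient_mg_per_kg=800, enrichment_ratio=1.5))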

  2. Non-destructive and non-invasive observation of friction and wear of human joints and of fracture initiation by acoustic emission.

    PubMed

    Schwalbe, H J; Bamfaste, G; Franke, R P

    1999-01-01

    Quality control in orthopaedic diagnostics according to DIN EN ISO 9000ff requires methods of non-destructive process control that do not harm the patient through radiation or invasive examinations. To improve health economics, quality-controlled, non-destructive measurements have to be introduced into the diagnostics and therapy of human joints and bones. A non-invasive evaluation of the state of wear of human joints and of the cracking tendency of bones is, according to the current state of knowledge, not established. The analysis of acoustic emission signals allows the prediction of bone rupture far below the fracture load. The evaluation of dry and wet bone samples revealed that it is possible to infer bone strength from crack initiation and thus to predict the probability of bone rupture.

  3. Chronic parenting stress and mood reactivity: The role of sleep quality.

    PubMed

    da Estrela, Chelsea; Barker, Erin T; Lantagne, Sarah; Gouin, Jean-Philippe

    2018-04-01

    Sleep is a basic biological process supporting emotion regulation. The emotion regulation function of sleep may be particularly important in the context of chronic stress. To better understand how chronic stress and sleep interact to predict mood, 66 parents of children with autism completed daily diaries assessing parenting stress, negative mood, and sleep quality for 6 consecutive days. Hierarchical linear modelling revealed that daily negative mood was predicted by between-person differences in parenting stress and between-person differences in sleep efficiency. Further, between-person differences in sleep efficiency and within-person differences in sleep satisfaction moderated the impact of stress on mood. These data suggest that sleep disturbances may exacerbate the association between stress and mood in the context of chronic parenting stress. Further, high parenting stress appears to heighten the impact of transient sleep disturbances on mood. Copyright © 2017 John Wiley & Sons, Ltd.

  4. Utilizing electronic health records to predict acute kidney injury risk and outcomes: workgroup statements from the 15th ADQI Consensus Conference.

    PubMed

    Sutherland, Scott M; Chawla, Lakhmir S; Kane-Gill, Sandra L; Hsu, Raymond K; Kramer, Andrew A; Goldstein, Stuart L; Kellum, John A; Ronco, Claudio; Bagshaw, Sean M

    2016-01-01

    The data contained within the electronic health record (EHR) is "big" from the standpoint of volume, velocity, and variety. These circumstances and the pervasive trend towards EHR adoption have sparked interest in applying big data predictive analytic techniques to EHR data. Acute kidney injury (AKI) is a condition well suited to prediction and risk forecasting; not only does the consensus definition for AKI allow temporal anchoring of events, but no treatments exist once AKI develops, underscoring the importance of early identification and prevention. The Acute Dialysis Quality Initiative (ADQI) convened a group of key opinion leaders and stakeholders to consider how best to approach AKI research and care in the "Big Data" era. This manuscript addresses the core elements of AKI risk prediction and outlines potential pathways and processes. We describe AKI prediction targets, feature selection, model development, and data display.

  5. Cost-Sensitive Radial Basis Function Neural Network Classifier for Software Defect Prediction

    PubMed Central

    Venkatesan, R.

    2016-01-01

    Effective prediction of software modules that are prone to defects will enable software developers to allocate resources efficiently and to concentrate on quality assurance activities. The software development life cycle basically includes design, analysis, implementation, testing, and release phases. Software testing is a critical task in the software development process, whose aim is to save time and budget by detecting defects as early as possible and to deliver a product without defects to the customers. This testing phase should be operated carefully and effectively in order to release a defect-free (bug-free) software product. To improve the software testing process, fault prediction methods identify the software parts that are most likely to be defect-prone. This paper proposes a prediction approach based on a conventional radial basis function neural network (RBFNN) and the novel adaptive dimensional biogeography based optimization (ADBBO) model. The developed ADBBO-based RBFNN model is tested with five publicly available datasets from the NASA data program repository. The computed results prove the effectiveness of the proposed ADBBO-RBFNN classifier approach with respect to the considered metrics in comparison with early predictors available in the literature for the same datasets. PMID:27738649

  6. Cost-Sensitive Radial Basis Function Neural Network Classifier for Software Defect Prediction.

    PubMed

    Kumudha, P; Venkatesan, R

    Effective prediction of defect-prone software modules enables developers to allocate resources efficiently and to concentrate quality assurance activities where they matter most. The software development life cycle basically includes design, analysis, implementation, testing, and release phases. Software testing is a critical task in this process: detecting defects as early as possible saves time and budget and allows a defect-free (bug-free) product to be delivered to the customers. To improve the software testing process, fault prediction methods identify the software parts that are most likely to be defect-prone. This paper proposes a prediction approach based on a conventional radial basis function neural network (RBFNN) and the novel adaptive dimensional biogeography based optimization (ADBBO) model. The developed ADBBO-based RBFNN model is tested with five publicly available datasets from the NASA data program repository. The computed results prove the effectiveness of the proposed ADBBO-RBFNN classifier approach with respect to the considered metrics in comparison with the early predictors available in the literature for the same datasets.
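
    A compact sketch of the classifier family the paper builds on: an RBF network with k-means-located centers and a cost-sensitive output layer. This substitutes k-means and class weights for the paper's ADBBO optimizer, so it illustrates the general approach, not the ADBBO-RBFNN itself.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.linear_model import LogisticRegression

    def rbf_features(X, centers, gamma):
        """Gaussian activation of each sample with respect to each RBF center."""
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-gamma * d2)

    def fit_rbfnn(X, y, n_centers=20, gamma=0.5, fn_cost=5.0):
        """Centers via k-means; cost sensitivity via a heavier weight on the
        defect class, so missed defects (false negatives) are penalized more."""
        centers = KMeans(n_clusters=n_centers, n_init=10).fit(X).cluster_centers_
        clf = LogisticRegression(class_weight={0: 1.0, 1: fn_cost}, max_iter=1000)
        clf.fit(rbf_features(X, centers, gamma), y)
        return centers, clf

    def predict_rbfnn(model, X, gamma=0.5):
        centers, clf = model
        return clf.predict(rbf_features(X, centers, gamma))
    ```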

  7. [Influence of Spectral Pre-Processing on PLS Quantitative Model of Detecting Cu in Navel Orange by LIBS].

    PubMed

    Li, Wen-bing; Yao, Lin-tao; Liu, Mu-hua; Huang, Lin; Yao, Ming-yin; Chen, Tian-bing; He, Xiu-wen; Yang, Ping; Hu, Hui-qin; Nie, Jiang-hui

    2015-05-01

    Cu in navel orange was detected rapidly by laser-induced breakdown spectroscopy (LIBS) combined with partial least squares (PLS) quantitative analysis, and the effect of different spectral data pretreatment methods on the detection accuracy of the model was explored. Spectral data for 52 Gannan navel orange samples were pretreated by different combinations of data smoothing, mean centering, and standard normal variate transformation. The 319~338 nm wavelength section containing characteristic spectral lines of Cu was then selected to build PLS models, and the main evaluation indexes of the models, such as the regression coefficient (r), root mean square error of cross validation (RMSECV) and root mean square error of prediction (RMSEP), were compared and analyzed. The three indicators of the PLS model built after 13-point smoothing and mean centering reached 0.9928, 3.43 and 3.4, respectively, and the average relative error of the prediction model was only 5.55%; this model gave the best calibration and prediction quality. The results show that selecting an appropriate data pre-processing method can effectively improve the prediction accuracy of PLS quantitative models for fruits and vegetables detected by LIBS, providing a new method for fast and accurate detection of fruits and vegetables by LIBS.
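
    The pre-treatment comparison above can be sketched with standard tools; the window length and component count below are illustrative, and X/y stand in for the LIBS spectra and reference Cu concentrations.

    ```python
    import numpy as np
    from scipy.signal import savgol_filter
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    def snv(spectra):
        """Standard normal variate: standardize each spectrum (row) separately."""
        return ((spectra - spectra.mean(axis=1, keepdims=True))
                / spectra.std(axis=1, keepdims=True))

    def smooth_and_center(spectra, window=13):
        """13-point smoothing followed by mean centering across samples."""
        smoothed = savgol_filter(spectra, window_length=window, polyorder=2, axis=1)
        return smoothed - smoothed.mean(axis=0)

    # X: (52, n_wavelengths) spectra restricted to the 319~338 nm window;
    # y: reference Cu concentrations (both hypothetical here).
    # y_cv = cross_val_predict(PLSRegression(n_components=5),
    #                          smooth_and_center(X), y, cv=10)
    # rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
    ```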

  8. Basecalling with LifeTrace

    PubMed Central

    Walther, Dirk; Bartha, Gábor; Morris, Macdonald

    2001-01-01

    A pivotal step in electrophoresis sequencing is the conversion of the raw, continuous chromatogram data into the actual sequence of discrete nucleotides, a process referred to as basecalling. We describe a novel algorithm for basecalling implemented in the program LifeTrace. Like Phred, currently the most widely used basecalling software program, LifeTrace takes processed trace data as input. It was designed to be tolerant to variable peak spacing by means of an improved peak-detection algorithm that emphasizes local chromatogram information over global properties. LifeTrace is shown to generate high-quality basecalls and reliable quality scores. It proved particularly effective when applied to MegaBACE capillary sequencing machines. In a benchmark test of 8372 dye-primer MegaBACE chromatograms, LifeTrace generated 17% fewer substitution errors, 16% fewer insertion/deletion errors, and 2.4% more aligned bases to the finished sequence than did Phred. For two sets totaling 6624 dye-terminator chromatograms, the performance improvement was 15% fewer substitution errors, 10% fewer insertion/deletion errors, and 2.1% more aligned bases. The processing time required by LifeTrace is comparable to that of Phred. The predicted quality scores were in line with observed quality scores, permitting direct use for quality clipping and in silico single nucleotide polymorphism (SNP) detection. Furthermore, we introduce a new type of quality score associated with every basecall: the gap-quality. It estimates the probability of a deletion error between the current and the following basecall. This additional quality score improves detection of single basepair deletions when used for locating potential basecalling errors during the alignment. We also describe a new protocol for benchmarking that we believe better discerns basecaller performance differences than methods previously published. PMID:11337481
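
    For readers unfamiliar with trace processing, a toy peak detector illustrates the kind of local, spacing-tolerant peak picking described above (not LifeTrace's actual algorithm).

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def call_peaks(trace, expected_spacing):
        """Find peaks in one dye channel, tolerating variable peak spacing by
        constraining only the minimum distance between calls."""
        peaks, props = find_peaks(
            trace,
            distance=int(0.6 * expected_spacing),  # allow locally compressed peaks
            prominence=0.1 * trace.max(),          # reject baseline noise
        )
        return peaks, props["prominences"]

    # Synthetic channel with drifting peak spacing.
    x = np.arange(1000)
    centers = np.cumsum(8 + 4 * np.random.rand(100))
    trace = sum(np.exp(-0.5 * ((x - c) / 4.0) ** 2) for c in centers)
    peaks, prom = call_peaks(trace, expected_spacing=10)
    ```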

  9. A Review and Analysis of Remote Sensing Capability for Air Quality Measurements as a Potential Decision Support Tool Conducted by the NASA DEVELOP Program

    NASA Technical Reports Server (NTRS)

    Ross, A.; Richards, A.; Keith, K.; Frew, C.; Boseck, J.; Sutton, S.; Watts, C.; Rickman, D.

    2007-01-01

    This project focused on a comprehensive utilization of air quality model products as decision support tools (DST) needed for public health applications. A review of past and future air quality measurement methods and their uncertainty, along with the relationship of air quality to national and global public health, is vital. This project described current and future NASA satellite remote sensing and ground sensing capabilities and the potential for using these sensors to enhance the prediction, prevention, and control of public health effects that result from poor air quality. The qualitative uncertainty of current satellite remotely sensed air quality, the ground-based remotely sensed air quality, the air quality/public health model, and the decision making process is evaluated in this study. Current peer-reviewed literature suggests that remotely sensed air quality parameters correlate well with ground-based sensor data. Complementary satellite and ground-based remote sensing data are needed to enhance the models and tools used by policy makers for the protection of national and global public health communities.

  10. A systematic review and meta-analysis of tests to predict wound healing in diabetic foot.

    PubMed

    Wang, Zhen; Hasan, Rim; Firwana, Belal; Elraiyah, Tarig; Tsapas, Apostolos; Prokop, Larry; Mills, Joseph L; Murad, Mohammad Hassan

    2016-02-01

    This systematic review summarized the evidence on noninvasive screening tests for the prediction of wound healing and the risk of amputation in diabetic foot ulcers. We searched MEDLINE In-Process & Other Non-Indexed Citations, MEDLINE, Embase, Cochrane Database of Systematic Reviews, Cochrane Central Register of Controlled Trials, and Scopus from database inception to October 2011. We pooled sensitivity, specificity, and diagnostic odds ratio (DOR) and compared test performance. Thirty-seven studies met the inclusion criteria. Eight tests were used to predict wound healing in this setting, including ankle-brachial index (ABI), ankle peak systolic velocity, transcutaneous oxygen measurement (TcPo2), toe-brachial index, toe systolic blood pressure, microvascular oxygen saturation, skin perfusion pressure, and hyperspectral imaging. For the TcPo2 test, the pooled DOR was 15.81 (95% confidence interval [CI], 3.36-74.45) for wound healing and 4.14 (95% CI, 2.98-5.76) for the risk of amputation. ABI was also predictive of the risk of amputation, although to a lesser degree (DOR, 2.89; 95% CI, 1.65-5.05), but not of wound healing (DOR, 1.02; 95% CI, 0.40-2.64). It was not feasible to perform meta-analysis comparing the remaining tests. The overall quality of evidence was limited by the risk of bias and imprecision (wide CIs due to small sample size). Several tests may predict wound healing in the setting of diabetic foot ulcer; however, most of the available evidence evaluates only TcPo2 and ABI. The overall quality of the evidence is low, and further research is needed to provide higher quality comparative effectiveness evidence. Copyright © 2016 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
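
    The pooled DOR values quoted above come from meta-analytic pooling; a minimal fixed-effect sketch on the log-DOR scale follows (the review itself may have used a random-effects model, and the 2x2 counts here are hypothetical).

    ```python
    import numpy as np

    def pooled_dor(tables):
        """Fixed-effect pooled DOR from (TP, FP, FN, TN) tables, 0.5 correction."""
        log_dor, weights = [], []
        for tp, fp, fn, tn in tables:
            tp, fp, fn, tn = (x + 0.5 for x in (tp, fp, fn, tn))
            log_dor.append(np.log((tp * tn) / (fp * fn)))
            weights.append(1.0 / (1/tp + 1/fp + 1/fn + 1/tn))  # inverse variance
        log_dor, weights = np.array(log_dor), np.array(weights)
        mean = (weights * log_dor).sum() / weights.sum()
        se = np.sqrt(1.0 / weights.sum())
        return np.exp(mean), np.exp([mean - 1.96 * se, mean + 1.96 * se])

    # Hypothetical per-study 2x2 counts for a test vs. a healing outcome:
    print(pooled_dor([(30, 5, 4, 20), (45, 8, 6, 25), (12, 3, 2, 10)]))
    ```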

  11. Modeling regional air quality and climate: improving organic aerosol and aerosol activation processes in WRF/Chem version 3.7.1

    NASA Astrophysics Data System (ADS)

    Yahya, Khairunnisa; Glotfelty, Timothy; Wang, Kai; Zhang, Yang; Nenes, Athanasios

    2017-06-01

    Air quality and climate influence each other through the uncertain processes of aerosol formation and cloud droplet activation. In this study, both processes are improved in the Weather Research and Forecasting model with Chemistry (WRF/Chem) version 3.7.1. The existing Volatility Basis Set (VBS) treatments for organic aerosol (OA) formation in WRF/Chem are improved by considering the following: the secondary OA (SOA) formation from semi-volatile primary organic aerosol (POA), a semi-empirical formulation for the enthalpy of vaporization of SOA, and functionalization and fragmentation reactions for multiple generations of products from the oxidation of VOCs. Over the continental US, 2-month-long simulations (May to June 2010) are conducted and results are evaluated against surface and aircraft observations during the Nexus of Air Quality and Climate Change (CalNex) campaign. Among all the configurations considered, the best performance is found for the simulation with the 2005 Carbon Bond mechanism (CB05) and the VBS SOA module with semivolatile POA treatment, 25 % fragmentation, and the emissions of semi-volatile and intermediate volatile organic compounds being 3 times the original POA emissions. Among the three gas-phase mechanisms (CB05, CB6, and SAPRC07) used, CB05 gives the best performance for surface ozone and PM2.5 concentrations. Differences in SOA predictions are larger for the simulations with different VBS treatments (e.g., nonvolatile POA versus semivolatile POA) compared to the simulations with different gas-phase mechanisms. Compared to the simulation with CB05 and the default SOA module, the simulations with the VBS treatment improve cloud droplet number concentration (CDNC) predictions (normalized mean biases from -40.8 % to a range of -34.6 to -27.7 %), with large differences between CB05-CB6 and SAPRC07 due to large differences in their OH and HO2 predictions. An advanced aerosol activation parameterization based on the Fountoukis and Nenes (2005) series reduces the large negative CDNC bias associated with the default Abdul-Razzak and Ghan (2000) parameterization from -35.4 % to a range of -0.8 to 7.1 %. However, it increases the errors due to overpredictions of CDNC, mainly over the northeastern US. This work indicates a need to improve other aerosol-cloud-radiation processes in the model, such as the spatial distribution of aerosol optical depth and cloud condensation nuclei, in order to further improve CDNC predictions.
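
    The normalized mean bias used to report CDNC performance above is a standard evaluation metric; for reference, a minimal implementation:

    ```python
    import numpy as np

    def nmb(model, obs):
        """Normalized mean bias in percent: 100 * sum(model - obs) / sum(obs)."""
        model, obs = np.asarray(model, float), np.asarray(obs, float)
        return 100.0 * (model - obs).sum() / obs.sum()

    print(nmb([95, 110, 80], [100, 120, 90]))  # about -8.1 % (toy numbers)
    ```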

  12. [Application of quality by design in granulation process for ginkgo leaf tablet (Ⅱ): identification of critical quality attributes].

    PubMed

    Xu, Bing; Cui, Xiang-Long; Yang, Chan; Wang, Xin; Shi, Xin-Yuan; Qiao, Yan-Jiang

    2017-03-01

    Quality by design (QbD) highlights the concept of "begin with the end", which means to thoroughly understand the target product quality first and then let it guide pharmaceutical process development and quality control throughout the whole manufacturing process. In this paper, the Ginkgo biloba granule intermediates were taken as the research object, and the requirements on the tensile strength of the tablets were treated as the goal in establishing methods for identifying the granules' critical quality attributes (CQAs) and setting CQA limits. Firstly, an orthogonal partial least squares (OPLS) model was adopted to relate the micromeritic properties of 29 batches of granules to the tensile strength of ginkgo leaf tablets, and the potential critical quality attributes (pCQAs) were screened by variable importance in projection (VIP) indexes. Then, a series of OPLS models were rebuilt by removing pCQA variables one by one, in order of VIP values from low to high. The model performance results demonstrated that the calibration and predictive performance of the model showed no decreasing trend after variable reduction. Considering the results of variable selection together with the collinearity test and the testability of the pCQAs, the median particle size (D₅₀) and the bulk density (Da) were identified as critical quality attributes (CQAs). The design space of the CQAs was developed based on a multiple linear regression model established between the CQAs (D₅₀ and Da) and the tensile strength. The control constraints of the CQAs were determined as 170 μm < D₅₀ < 500 μm and 0.30 g•cm⁻³
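
    The VIP screening step can be reproduced with any PLS implementation; below is a sketch using scikit-learn's PLSRegression and the standard VIP formula (granule-property data are assumed, not supplied).

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def vip_scores(pls):
        """Variable importance in projection for a fitted PLSRegression."""
        W, T, Q = pls.x_weights_, pls.x_scores_, pls.y_loadings_
        p, A = W.shape
        # Explained sum of squares of y per PLS component.
        ssy = np.array([(T[:, a] ** 2).sum() * (Q[:, a] ** 2).sum()
                        for a in range(A)])
        wnorm2 = (W ** 2) / (W ** 2).sum(axis=0, keepdims=True)
        return np.sqrt(p * (wnorm2 * ssy).sum(axis=1) / ssy.sum())

    # Usage with hypothetical granule data X (n_batches, n_properties) and y
    # (tensile strength): variables with VIP > 1 are commonly screened in.
    # pls = PLSRegression(n_components=2).fit(X, y)
    # pcqa_mask = vip_scores(pls) > 1.0
    ```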

  13. The roles of associative and executive processes in creative cognition.

    PubMed

    Beaty, Roger E; Silvia, Paul J; Nusbaum, Emily C; Jauk, Emanuel; Benedek, Mathias

    2014-10-01

    How does the mind produce creative ideas? Past research has pointed to important roles of both executive and associative processes in creative cognition. But such work has largely focused on the influence of one ability or the other-executive or associative-so the extent to which both abilities may jointly affect creative thought remains unclear. Using multivariate structural equation modeling, we conducted two studies to determine the relative influences of executive and associative processes in domain-general creative cognition (i.e., divergent thinking). Participants completed a series of verbal fluency tasks, and their responses were analyzed by means of latent semantic analysis (LSA) and scored for semantic distance as a measure of associative ability. Participants also completed several measures of executive function-including broad retrieval ability (Gr) and fluid intelligence (Gf). Across both studies, we found substantial effects of both associative and executive abilities: As the average semantic distance between verbal fluency responses and cues increased, so did the creative quality of divergent-thinking responses (Study 1 and Study 2). Moreover, the creative quality of divergent-thinking responses was predicted by the executive variables-Gr (Study 1) and Gf (Study 2). Importantly, the effects of semantic distance and the executive function variables remained robust in the same structural equation model predicting divergent thinking, suggesting unique contributions of both constructs. The present research extends recent applications of LSA in creativity research and provides support for the notion that both associative and executive processes underlie the production of novel ideas.
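
    Scoring semantic distance via LSA follows a familiar pipeline of TF-IDF, truncated SVD, and cosine distance; here is a toy reconstruction with an illustrative five-document corpus (the study's corpus and dimensionality would differ).

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    corpus = [  # toy reference corpus; a real LSA space needs a large one
        "dogs chase cats in the yard",
        "cats sleep on warm windowsills",
        "cars burn fuel on the highway",
        "planes and cars are vehicles",
        "dogs are loyal animals",
    ]
    vec = TfidfVectorizer()
    svd = TruncatedSVD(n_components=3)  # latent semantic dimensions
    svd.fit(vec.fit_transform(corpus))

    def semantic_distance(cue, response):
        """1 - cosine similarity in the latent space; higher = more remote."""
        a = svd.transform(vec.transform([cue]))
        b = svd.transform(vec.transform([response]))
        return 1.0 - cosine_similarity(a, b)[0, 0]

    print(semantic_distance("dogs", "cats"))
    ```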

  14. Using remote sensing to monitor past changes and assess future scenarios for the Sacramento-San Joaquin River Delta waterways, California USA

    NASA Astrophysics Data System (ADS)

    Santos, Maria J.; Hestir, Erin; Khanna, Shruti; Ustin, Susan L.

    2017-04-01

    Historically, deltas have been extensively affected both by natural processes and human intervention. Thus, understanding drivers, predicting impacts and optimizing solutions to delta problems requires a holistic approach spanning many sectors, disciplines and fields of expertise. Deltas are ideal model systems to understand the effects of the interaction between social and ecological domains, as they face unprecedented disturbances and threats to their biological and ecological sustainability. The challenge for deltas is to meet the goals of supporting biodiversity and ecosystem processes while also provisioning fresh water resources for human use. We provide an overview of the last 150 years of the Sacramento-San Joaquin River Delta, illustrating the parallel increase in disturbances and zooming in on the current cascading effects of invasive species on geophysical and biological processes. Using remote sensing data coupled with in situ measurements of water quality, turbidity, and species presence, we show how the spread and persistence of aquatic invasive species affect sedimentation processes and ecosystem functioning. Our results show that the interactions between the biological and physical conditions in the Delta affect the trajectory of dominance by native and invasive aquatic plant species. Trends in growth and community characteristics associated with predicted impacts of climate change (sea level rise, warmer temperatures, changes in the hydrograph with high winter and low summer outflows) do not provide simple predictions. Individually, the impact of specific environmental changes on the biological components can be predicted; however, it is the complex interactions of biological communities with the suite of physical changes that make predictions uncertain. Systematic monitoring is critical to provide the data needed to document and understand change in these delta systems, and to identify successful adaptation strategies.

  15. In line NIR quantification of film thickness on pharmaceutical pellets during a fluid bed coating process.

    PubMed

    Lee, Min-Jeong; Seo, Da-Young; Lee, Hea-Eun; Wang, In-Chun; Kim, Woo-Sik; Jeong, Myung-Yung; Choi, Guang J

    2011-01-17

    Along with the risk-based approach, process analytical technology (PAT) has emerged as one of the key elements to fully implement QbD (quality-by-design). Near-infrared (NIR) spectroscopy has been extensively applied as an in-line/on-line analytical tool in biomedical and chemical industries. In this study, the film thickness on pharmaceutical pellets was quantified using in-line NIR spectroscopy during a fluid-bed coating process. Precise monitoring of coating thickness and its prediction with a suitable control strategy is crucial to the quality assurance of solid dosage forms, including dissolution characteristics. Pellets of a test formulation were manufactured and coated in a fluid bed by spraying a hydroxypropyl methylcellulose (HPMC) coating solution. NIR spectra were acquired via a fiber-optic probe during the coating process, followed by multivariate analysis utilizing partial least squares (PLS) calibration models. The actual coating thickness of pellets was measured by two separate methods, confocal laser scanning microscopy (CLSM) and laser diffraction particle size analysis (LD-PSA). Both characterization methods gave superb correlation results, and all coefficient of determination (R²) values exceeded 0.995. In addition, a 70 min prediction coating experiment demonstrated that the end-point can be accurately designated via NIR in-line monitoring with appropriate calibration models. In conclusion, our approach combining in-line NIR monitoring with CLSM and LD-PSA can be applied as an effective PAT tool for fluid-bed pellet coating processes. Copyright © 2010 Elsevier B.V. All rights reserved.
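
    End-point designation from in-line predictions can be sketched as a simple monitoring loop; `pls` and `preprocess` stand for a previously calibrated model and its pre-treatment, and the target and debounce values are illustrative.

    ```python
    TARGET_UM = 25.0   # hypothetical target film thickness in micrometres
    CONSECUTIVE = 3    # consecutive above-target scans required (debounce)

    def coating_endpoint(spectra_stream, pls, preprocess):
        """Return (scan_index, predicted_thickness) at the designated end-point."""
        hits = 0
        for i, spectrum in enumerate(spectra_stream):
            thickness = float(pls.predict(preprocess(spectrum)[None, :]).ravel()[0])
            hits = hits + 1 if thickness >= TARGET_UM else 0
            if hits >= CONSECUTIVE:
                return i, thickness
        return None
    ```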

  16. Gaining Control and Predictability of Software-Intensive Systems Development and Sustainment

    DTIC Science & Technology

    2015-02-04

    implementation of the baselines, audits, and technical reviews within an overarching systems engineering process (SEP; Defense Acquisition University...warfighters' needs. This management and metrics effort supplements and supports the system's technical development through the baselines, audits and...other areas that could be researched and added into the nine-tier model. Areas including software metrics, quality assurance, software-oriented

  17. Biophysical modelling of intra-ring variations in tracheid features and wood density of Pinus pinaster trees exposed to seasonal droughts

    Treesearch

    Sarah Wilkinson; Jerome Ogee; Jean-Christophe Domec; Mark Rayment; Lisa Wingate

    2015-01-01

    Process-based models that link seasonally varying environmental signals to morphological features within tree rings are essential tools to predict tree growth response and commercially important wood quality traits under future climate scenarios. This study evaluated model portrayal of radial growth and wood anatomy observations within a mature maritime pine (Pinus...

  18. Evaluating a Brief Measure of Reading Comprehension for Narrative and Expository Text: The Convergent and Predictive Validity of the Reading Retell Rubric

    ERIC Educational Resources Information Center

    Thomas, Lisa B.

    2012-01-01

    Reading comprehension is a critical aspect of the reading process. Children who experience significant problems in reading comprehension are at risk for long-term academic and social problems. High-quality measures are needed for early, efficient, and effective identification of children in need of remediation in reading comprehension. Substantial…

  19. Young Adolescents' Metacognition and Domain Knowledge as Predictors of Hypothesis-Development Performance in a Computer-Supported Context

    ERIC Educational Resources Information Center

    Kim, Hye Jeong; Pedersen, Susan

    2010-01-01

    Recently, the importance of ill-structured problem-solving in real-world contexts has become a focus of educational research. Particularly, the hypothesis-development process has been examined as one of the keys to developing a high-quality solution in a problem context. The authors of this study examined predictive relations between young…

  20. Review of nitrogen fate models applicable to forest landscapes in the Southern U.S.

    Treesearch

    D. M. Amatya; C. G. Rossi; A. Saleh; Z. Dai; M. A. Youssef; R. G. Williams; D. D. Bosch; G. M. Chescheir; G. Sun; R. W. Skaggs; C. C. Trettin; E. D. Vance; J. E. Nettles; S. Tian

    2013-01-01

    Assessing the environmental impacts of fertilizer nitrogen (N) used to increase productivity in managed forests is complex due to a wide range of abiotic and biotic factors affecting its forms and movement. Models developed to predict fertilizer N fate (e.g., cycling processes) and water quality impacts vary widely in their design, scope, and potential application. We...

  1. The Effect of Hospital Service Quality on Patient's Trust

    PubMed Central

    Zarei, Ehsan; Daneshkohan, Abbas; Khabiri, Roghayeh; Arab, Mohammad

    2014-01-01

    Background: Trust here means the patient's belief that the practitioner or the hospital seeks the best for the patient and will provide suitable care and treatment for him/her. One of the main determinants of patients' trust is service quality. Objectives: This study aimed to examine the effect of the quality of services provided in private hospitals on patients' trust. Patients and Methods: In this descriptive cross-sectional study, 969 patients were selected by consecutive sampling from eight private general hospitals of Tehran, Iran, in 2010. Data were collected through a questionnaire containing 20 items (14 items for quality, 6 items for trust), and its validity and reliability were confirmed. Data were analyzed using descriptive statistics and multivariate regression. Results: The mean score of patients' perception of trust was 3.80, and that of service quality was 4.01. Approximately 38% of the variance in patient trust was explained by service quality dimensions. Quality of interaction and quality of process (P < 0.001) were the strongest factors in predicting patients' trust, whereas the quality of the environment had no significant effect on the patients' degree of trust. Conclusions: Interaction quality and process quality were the key determinants of patients' trust in the private hospitals of Tehran. To enhance patients' trust, quality improvement efforts should focus on service delivery aspects such as scheduling, timely and accurate performance of services, and strengthening the interpersonal aspects of care and the communication skills of doctors, nurses and staff. PMID:25763258

  2. Applying quality by design (QbD) concept for fabrication of chitosan coated nanoliposomes.

    PubMed

    Pandey, Abhijeet P; Karande, Kiran P; Sonawane, Raju O; Deshmukh, Prashant K

    2014-03-01

    In the present investigation, a quality by design (QbD) strategy was successfully applied to the fabrication of chitosan-coated nanoliposomes (CH-NLPs) encapsulating a hydrophilic drug. The effects of the processing variables on the particle size, encapsulation efficiency (%EE) and coating efficiency (%CE) of CH-NLPs (prepared using a modified ethanol injection method) were investigated. The concentrations of lipid, cholesterol, drug and chitosan; stirring speed, sonication time; organic:aqueous phase ratio; and temperature were identified as the key factors after risk analysis for conducting a screening design study. A separate study was designed to investigate the robustness of the predicted design space. The particle size, %EE and %CE of the optimized CH-NLPs were 111.3 nm, 33.4% and 35.2%, respectively. The observed responses were in accordance with the predicted response, which confirms the suitability and robustness of the design space for CH-NLP formulation. In conclusion, optimization of the selected key variables will help minimize the problems related to size, %EE and %CE that are generally encountered when scaling up processes for NLP formulations. The robustness of the design space will help minimize both intra-batch and inter-batch variations, which are quite common in the pharmaceutical industry.

  3. Template-based protein structure modeling using the RaptorX web server.

    PubMed

    Källberg, Morten; Wang, Haipeng; Wang, Sheng; Peng, Jian; Wang, Zhiyong; Lu, Hui; Xu, Jinbo

    2012-07-19

    A key challenge of modern biology is to uncover the functional role of the protein entities that compose cellular proteomes. To this end, the availability of reliable three-dimensional atomic models of proteins is often crucial. This protocol presents a community-wide web-based method using RaptorX (http://raptorx.uchicago.edu/) for protein secondary structure prediction, template-based tertiary structure modeling, alignment quality assessment and sophisticated probabilistic alignment sampling. RaptorX distinguishes itself from other servers by the quality of the alignment between a target sequence and one or multiple distantly related template proteins (especially those with sparse sequence profiles) and by a novel nonlinear scoring function and a probabilistic-consistency algorithm. Consequently, RaptorX delivers high-quality structural models for many targets with only remote templates. At present, it takes RaptorX ~35 min to finish processing a sequence of 200 amino acids. Since its official release in August 2011, RaptorX has processed ~6,000 sequences submitted by ~1,600 users from around the world.

  4. Template-based protein structure modeling using the RaptorX web server

    PubMed Central

    Källberg, Morten; Wang, Haipeng; Wang, Sheng; Peng, Jian; Wang, Zhiyong; Lu, Hui; Xu, Jinbo

    2016-01-01

    A key challenge of modern biology is to uncover the functional role of the protein entities that compose cellular proteomes. To this end, the availability of reliable three-dimensional atomic models of proteins is often crucial. This protocol presents a community-wide web-based method using RaptorX (http://raptorx.uchicago.edu/) for protein secondary structure prediction, template-based tertiary structure modeling, alignment quality assessment and sophisticated probabilistic alignment sampling. RaptorX distinguishes itself from other servers by the quality of the alignment between a target sequence and one or multiple distantly related template proteins (especially those with sparse sequence profiles) and by a novel nonlinear scoring function and a probabilistic-consistency algorithm. Consequently, RaptorX delivers high-quality structural models for many targets with only remote templates. At present, it takes RaptorX ~35 min to finish processing a sequence of 200 amino acids. Since its official release in August 2011, RaptorX has processed ~6,000 sequences submitted by ~1,600 users from around the world. PMID:22814390

  5. Space shuttle propulsion parameter estimation using optimal estimation techniques

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The first twelve system state variables are presented with the necessary mathematical developments for incorporating them into the filter/smoother algorithm. Other state variables (e.g., aerodynamic coefficients), representing uncertain parameters, can easily be incorporated into the estimation algorithm, but for initial checkout purposes they are treated as known quantities. An approach for incorporating the NASA propulsion predictive model results into the optimal estimation algorithm was identified. This approach utilizes numerical derivatives and nominal predictions within the algorithm, with global iterations of the algorithm. The iterative process is terminated when the quality of the estimates no longer improves significantly.
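
    The filter half of the filter/smoother machinery is the standard linear Kalman recursion; a generic sketch follows (a 1-D-friendly matrix form, not the shuttle propulsion model itself).

    ```python
    import numpy as np

    def kalman_filter(z, F, H, Q, R, x0, P0):
        """z: measurements; F/H: transition/observation; Q/R: noise covariances."""
        x, P = x0, P0
        estimates = []
        for zk in z:
            # Predict
            x = F @ x
            P = F @ P @ F.T + Q
            # Update
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
            x = x + K @ (zk - H @ x)
            P = (np.eye(len(x)) - K @ H) @ P
            estimates.append(x.copy())
        return np.array(estimates)
    ```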

  6. Modeling and evaluating of surface roughness prediction in micro-grinding on soda-lime glass considering tool characterization

    NASA Astrophysics Data System (ADS)

    Cheng, Jun; Gong, Yadong; Wang, Jinsheng

    2013-11-01

    Current research on micro-grinding mainly focuses on the optimal processing technology for different materials. However, the material removal mechanism in micro-grinding is the basis of achieving a high-quality processed surface. Therefore, a novel method for predicting surface roughness in micro-grinding of hard brittle materials that considers the grain protrusion topography of the micro-grinding tool is proposed in this paper. The differences in material removal mechanism between the conventional grinding process and the micro-grinding process are analyzed. Topography characterization is performed on micro-grinding tools fabricated by electroplating. Models of grain density generation and grain interval are built, and a new model for predicting micro-ground surface roughness is developed. To verify the precision and applicability of the proposed surface roughness prediction model, an orthogonal micro-grinding experiment on soda-lime glass is designed and conducted. A series of micro-machined surfaces of the brittle material with roughness ranging from 78 nm to 0.98 μm is achieved. The experimental roughness results and the predicted roughness data coincide closely, and the coefficient describing size effects in the prediction model is calculated to be 1.5×10⁷ by an inverse method based on the experimental results. The proposed model builds a distribution that accounts for grain densities at different protrusion heights, and the micro-grinding tools used in the experiment are characterized based on this distribution. The significant agreement between the model predictions and the experimental measurements demonstrates the effectiveness of the model, which provides theoretical and experimental reference for the material removal mechanism in micro-grinding of soda-lime glass.

  7. Managing uncertainty in metabolic network structure and improving predictions using EnsembleFBA

    PubMed Central

    2017-01-01

    Genome-scale metabolic network reconstructions (GENREs) are repositories of knowledge about the metabolic processes that occur in an organism. GENREs have been used to discover and interpret metabolic functions, and to engineer novel network structures. A major barrier preventing more widespread use of GENREs, particularly to study non-model organisms, is the extensive time required to produce a high-quality GENRE. Many automated approaches have been developed which reduce this time requirement, but automatically-reconstructed draft GENREs still require curation before useful predictions can be made. We present a novel approach to the analysis of GENREs which improves the predictive capabilities of draft GENREs by representing many alternative network structures, all equally consistent with available data, and generating predictions from this ensemble. This ensemble approach is compatible with many reconstruction methods. We refer to this new approach as Ensemble Flux Balance Analysis (EnsembleFBA). We validate EnsembleFBA by predicting growth and gene essentiality in the model organism Pseudomonas aeruginosa UCBPP-PA14. We demonstrate how EnsembleFBA can be included in a systems biology workflow by predicting essential genes in six Streptococcus species and mapping the essential genes to small molecule ligands from DrugBank. We found that some metabolic subsystems contributed disproportionately to the set of predicted essential reactions in a way that was unique to each Streptococcus species, leading to species-specific outcomes from small molecule interactions. Through our analyses of P. aeruginosa and six Streptococci, we show that ensembles increase the quality of predictions without drastically increasing reconstruction time, thus making GENRE approaches more practical for applications which require predictions for many non-model organisms. All of our functions and accompanying example code are available in an open online repository. PMID:28263984

  8. Managing uncertainty in metabolic network structure and improving predictions using EnsembleFBA.

    PubMed

    Biggs, Matthew B; Papin, Jason A

    2017-03-01

    Genome-scale metabolic network reconstructions (GENREs) are repositories of knowledge about the metabolic processes that occur in an organism. GENREs have been used to discover and interpret metabolic functions, and to engineer novel network structures. A major barrier preventing more widespread use of GENREs, particularly to study non-model organisms, is the extensive time required to produce a high-quality GENRE. Many automated approaches have been developed which reduce this time requirement, but automatically-reconstructed draft GENREs still require curation before useful predictions can be made. We present a novel approach to the analysis of GENREs which improves the predictive capabilities of draft GENREs by representing many alternative network structures, all equally consistent with available data, and generating predictions from this ensemble. This ensemble approach is compatible with many reconstruction methods. We refer to this new approach as Ensemble Flux Balance Analysis (EnsembleFBA). We validate EnsembleFBA by predicting growth and gene essentiality in the model organism Pseudomonas aeruginosa UCBPP-PA14. We demonstrate how EnsembleFBA can be included in a systems biology workflow by predicting essential genes in six Streptococcus species and mapping the essential genes to small molecule ligands from DrugBank. We found that some metabolic subsystems contributed disproportionately to the set of predicted essential reactions in a way that was unique to each Streptococcus species, leading to species-specific outcomes from small molecule interactions. Through our analyses of P. aeruginosa and six Streptococci, we show that ensembles increase the quality of predictions without drastically increasing reconstruction time, thus making GENRE approaches more practical for applications which require predictions for many non-model organisms. All of our functions and accompanying example code are available in an open online repository.
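
    At prediction time, the ensemble idea reduces to voting across draft reconstructions; a schematic sketch in which each member is a hypothetical object exposing a predict_essential(gene) -> bool method (not the EnsembleFBA code itself).

    ```python
    import numpy as np

    def ensemble_essentiality(models, genes, threshold=0.5):
        """Call a gene essential when >= threshold of ensemble members agree."""
        votes = np.array([[m.predict_essential(g) for g in genes] for m in models])
        frequency = votes.mean(axis=0)  # fraction of members voting "essential"
        return {g: bool(f >= threshold) for g, f in zip(genes, frequency)}
    ```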

  9. In-line and Real-time Monitoring of Resonant Acoustic Mixing by Near-infrared Spectroscopy Combined with Chemometric Technology for Process Analytical Technology Applications in Pharmaceutical Powder Blending Systems.

    PubMed

    Tanaka, Ryoma; Takahashi, Naoyuki; Nakamura, Yasuaki; Hattori, Yusuke; Ashizawa, Kazuhide; Otsuka, Makoto

    2017-01-01

    Resonant Acoustic® Mixing (RAM) technology is a system that performs high-speed mixing by vibration through the control of acceleration and frequency. In recent years, real-time process monitoring and prediction have become of increasing interest, and process analytical technology (PAT) systems will be increasingly introduced into actual manufacturing processes. This study examined the application of PAT with the combination of RAM, near-infrared spectroscopy, and chemometric technology as a set of PAT tools for introduction into actual pharmaceutical powder blending processes. Content uniformity assessment was based on a robust partial least squares regression (PLSR) model constructed to account for the RAM configuration parameters and the changing concentrations of the components. As a result, in-line real-time prediction of active pharmaceutical ingredients and other additives using chemometric technology was successfully demonstrated. This system is expected to be applicable to the RAM method for the risk management of quality.

  10. Adaptation of the quality by design concept in early pharmaceutical development of an intranasal nanosized formulation.

    PubMed

    Pallagi, Edina; Ambrus, Rita; Szabó-Révész, Piroska; Csóka, Ildikó

    2015-08-01

    Regulatory science based pharmaceutical development and product manufacturing are highly recommended by the authorities nowadays. The aim of this study was to adapt regulatory science even in early nano-pharmaceutical development. The authors applied the quality by design (QbD) concept in the early development phase of nano-systems, with meloxicam as the illustrative material. The meloxicam nanoparticles produced by a co-grinding method for nasal administration were studied according to the QbD policy, and a QbD-based risk assessment (RA) was performed. The steps were implemented according to the relevant regulatory guidelines (quality target product profile (QTPP) determination, selection of critical quality attributes (CQAs) and critical process parameters (CPPs)), and special software (Lean QbD Software®) was used for the RA, which represents a novelty in this field. The RA was able to theoretically predict and identify the factors (e.g. sample composition, production method parameters, etc.) that have the highest impact on the desired meloxicam-product quality. The results of the practical research justified the theoretical prediction. This method can improve pharmaceutical nano-developments by achieving shorter development time, lower cost, saving human resource efforts and more effective target-orientation. It makes it possible to focus resources on the selected parameters and areas during practical product development. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Predicting the Overall Spatial Quality of Automotive Audio Systems

    NASA Astrophysics Data System (ADS)

    Koya, Daisuke

    The spatial quality of automotive audio systems is often compromised by their non-ideal listening environments. Automotive audio systems need to be developed quickly due to industry demands. A suitable perceptual model could evaluate the spatial quality of automotive audio systems with similar reliability to formal listening tests but take less time. Such a model is developed in this research project by adapting an existing model of spatial quality for automotive audio use. The requirements for the adaptation were investigated in a literature review. A perceptual model called QESTRAL was reviewed, which predicts the overall spatial quality of domestic multichannel audio systems. It was determined that automotive audio systems are likely to be impaired in terms of the spatial attributes that were not considered in developing the QESTRAL model, but metrics are available that might predict these attributes. To establish whether the QESTRAL model in its current form can accurately predict the overall spatial quality of automotive audio systems, MUSHRA listening tests using headphone auralisation with head tracking were conducted to collect results to be compared against predictions by the model. Based on guideline criteria, the model in its current form could not accurately predict the overall spatial quality of automotive audio systems. To improve prediction performance, the QESTRAL model was recalibrated and modified using existing metrics of the model, those proposed from the literature review, and newly developed metrics. The most important metrics for predicting the overall spatial quality of automotive audio systems included those that are interaural cross-correlation (IACC) based, relate to localisation of the frontal audio scene, and account for the perceived scene width in front of the listener. Modifying the model for automotive audio systems did not invalidate its use for domestic audio systems. The resulting model predicts the overall spatial quality of 2- and 5-channel automotive audio systems with a cross-validation performance of R² = 0.85 and a root-mean-square error (RMSE) of 11.03%.

  12. Combining microwave resonance technology to multivariate data analysis as a novel PAT tool to improve process understanding in fluid bed granulation.

    PubMed

    Lourenço, Vera; Herdling, Thorsten; Reich, Gabriele; Menezes, José C; Lochmann, Dirk

    2011-08-01

    A set of 192 fluid bed granulation batches at industrial scale were monitored in-line using microwave resonance technology (MRT) to determine moisture, temperature and density of the granules. Multivariate data analysis techniques such as multiway partial least squares (PLS), multiway principal component analysis (PCA) and multivariate batch control charts were applied to the collected batch data sets. The combination of all these techniques, along with off-line particle size measurements, led to significantly increased process understanding. A seasonality effect was demonstrated that impacted further processing through its influence on the final granule size. Moreover, it was demonstrated by means of PLS that a relation between the particle size and the MRT measurements can be quantitatively defined, highlighting a potential ability of the MRT sensor to predict information about the final granule size. This study has contributed to improving a fluid bed granulation process, and the process knowledge obtained shows that the product quality can be built into process design, following Quality by Design (QbD) and Process Analytical Technology (PAT) principles. Copyright © 2011. Published by Elsevier B.V.

  13. A quality-by-design approach to risk reduction and optimization for human embryonic stem cell cryopreservation processes.

    PubMed

    Mitchell, Peter D; Ratcliffe, Elizabeth; Hourd, Paul; Williams, David J; Thomas, Robert J

    2014-12-01

    It is well documented that cryopreservation and resuscitation of human embryonic stem cells (hESCs) is complex and ill-defined, and often suffers poor cell recovery and increased levels of undesirable cell differentiation. In this study we have applied Quality-by-Design (QbD) concepts to the critical processes of slow-freeze cryopreservation and resuscitation of hESC colony cultures. Optimized subprocesses were linked together to deliver a controlled complete process. We have demonstrated a rapid, high-throughput, and stable system for measurement of cell adherence and viability as robust markers of in-process and postrecovery cell state. We observed that measurement of adherence and viability of adhered cells at 1 h postseeding was predictive of cell proliferative ability up to 96 h in this system. Application of factorial design defined the operating spaces for cryopreservation and resuscitation, critically linking the performance of these two processes. Optimization of both processes resulted in enhanced reattachment and post-thaw viability, resulting in substantially greater recovery of cryopreserved, pluripotent cell colonies. This study demonstrates the importance of QbD concepts and tools for rapid, robust, and low-risk process design that can inform manufacturing controls and logistics.

  14. Potential application of machine vision technology to saffron (Crocus sativus L.) quality characterization.

    PubMed

    Kiani, Sajad; Minaei, Saeid

    2016-12-01

    Saffron quality characterization is an important issue in the food industry and of interest to the consumers. This paper proposes an expert system based on the application of machine vision technology for characterization of saffron and shows how it can be employed in practical usage. There is a correlation between saffron color and its geographic location of production and some chemical attributes which could be properly used for characterization of saffron quality and freshness. This may be accomplished by employing image processing techniques coupled with multivariate data analysis for quantification of saffron properties. Expert algorithms can be made available for prediction of saffron characteristics such as color as well as for product classification. Copyright © 2016. Published by Elsevier Ltd.

  15. Incorporating discrete event simulation into quality improvement efforts in health care systems.

    PubMed

    Rutberg, Matthew Harris; Wenczel, Sharon; Devaney, John; Goldlust, Eric Jonathan; Day, Theodore Eugene

    2015-01-01

    Quality improvement (QI) efforts are an indispensable aspect of health care delivery, particularly in an environment of increasing financial and regulatory pressures. The ability to test predictions of proposed changes to flow, policy, staffing, and other process-level factors using discrete event simulation (DES) has shown significant promise and is well reported in the literature. This article describes how to incorporate DES into QI departments and programs in order to support QI efforts, develop high-fidelity simulation models, conduct experiments, make recommendations, and support adoption of results. The authors describe how DES-enabled QI teams can partner with clinical services and administration to plan, conduct, and sustain QI investigations. © 2013 by the American College of Medical Quality.
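
    A toy SimPy model shows the kind of experiment a DES-enabled QI team might run before changing staffing; all parameters below are illustrative, not drawn from the article.

    ```python
    import random
    import simpy

    def patient(env, staff, mean_service, waits):
        arrival = env.now
        with staff.request() as req:
            yield req  # queue for a clinician
            waits.append(env.now - arrival)
            yield env.timeout(random.expovariate(1.0 / mean_service))

    def run_clinic(n_staff, arrival_rate=0.2, mean_service=8.0, horizon=480):
        env = simpy.Environment()
        staff = simpy.Resource(env, capacity=n_staff)
        waits = []

        def arrivals():
            while True:
                yield env.timeout(random.expovariate(arrival_rate))
                env.process(patient(env, staff, mean_service, waits))

        env.process(arrivals())
        env.run(until=horizon)  # one simulated shift, in minutes
        return sum(waits) / len(waits) if waits else 0.0

    # Test a proposed staffing change before implementing it:
    print(run_clinic(2), run_clinic(3))
    ```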

  16. Antecedents and Outcomes of Joint Trajectories of Mother-Son Conflict and Warmth during Middle Childhood and Adolescence

    PubMed Central

    Trentacosta, Christopher J.; Criss, Michael M.; Shaw, Daniel S.; Lacourse, Eric; Hyde, Luke W.; Dishion, Thomas J.

    2011-01-01

    This study investigated the development of mother-son relationship quality from ages 5 to 15 in a sample of 265 low-income families. Non-parametric random effects modeling was utilized to uncover distinct and homogeneous developmental trajectories of conflict and warmth; antecedents and outcomes of the trajectory groups also were examined. Four conflict trajectory groups and three warmth trajectory groups were identified. Difficult temperament in early childhood discriminated both conflict and warmth trajectory group membership (TGM), and adult relationship quality in early childhood was related to warmth trajectories. In addition, conflict TGM differentiated youth antisocial behavior during adolescence, and warmth trajectories predicted adolescent peer relationship quality and youth moral disengagement. Implications for socialization processes are discussed. PMID:21883153

  17. ConsPred: a rule-based (re-)annotation framework for prokaryotic genomes.

    PubMed

    Weinmaier, Thomas; Platzer, Alexander; Frank, Jeroen; Hellinger, Hans-Jörg; Tischler, Patrick; Rattei, Thomas

    2016-11-01

    The rapidly growing number of available prokaryotic genome sequences requires fully automated and high-quality software solutions for their initial annotation and re-annotation. Here we present ConsPred, a prokaryotic genome annotation framework that performs intrinsic gene predictions, homology searches, predictions of non-coding genes as well as CRISPR repeats, and integrates all evidence into a consensus annotation. ConsPred achieves comprehensive, high-quality annotations based on rules and priorities, similar to decision-making in manual curation, and avoids conflicting predictions. Parameters controlling the annotation process are configurable by the user. ConsPred has been used in the institutions of the authors for more than 5 years and can easily be extended and adapted to specific needs. The ConsPred algorithm for producing a consensus from the varying scores of multiple gene prediction programs approaches manual curation in accuracy. Its rule-based approach for choosing final predictions avoids overriding previous manual curations. ConsPred is implemented in Java, Perl and Shell and is freely available under the Creative Commons license as a stand-alone in-house pipeline or as an Amazon Machine Image for cloud computing; see https://sourceforge.net/projects/conspred/. Contact: thomas.rattei@univie.ac.at. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  18. Sibling Influences on Gender Development in Middle Childhood and Early Adolescence: A Longitudinal Study.

    ERIC Educational Resources Information Center

    McHale, Susan M.; Updegraff, Kimberly A.; Helms- Erikson, Heather; Crouter, Ann C.

    2001-01-01

    Examined development of gender role qualities from middle childhood to early adolescence to determine whether children's gender role qualities predicted siblings'. Found that firstborn children's qualities in Year 1 predicted second-born children's qualities in Year 3 when Year 1 parent and child qualities were controlled. Parental influence was…

  19. Mood as a resource in dealing with health recommendations: how mood affects information processing and acceptance of quit-smoking messages.

    PubMed

    Das, Enny; Vonkeman, Charlotte; Hartmann, Tilo

    2012-01-01

    An online experiment tested the effects of positive and negative mood on the processing and acceptance of health recommendations about smoking. It was hypothesised that positive mood would provide smokers with the resources to systematically process self-relevant health recommendations. One hundred and twenty-seven participants (smokers and non-smokers) read a message in which a quit-smoking programme was recommended. Participants were randomly assigned to one of four conditions: positive versus negative mood, and strong versus weak arguments for the recommended action. Systematic message processing was inferred when participants were able to distinguish between high- and low-quality arguments, and from congruence between attitudes and behavioural intentions. Persuasion was measured by participants' attitudes towards smoking and the recommended action, and by their intentions to follow the action recommendation. As predicted, smokers systematically processed the health message only under positive mood conditions; non-smokers systematically processed the health message only under negative mood conditions. Moreover, smokers' attitudes towards the health message predicted intentions to quit smoking only under positive mood conditions. Findings suggest that positive mood may decrease defensive processing of self-relevant health information.

  20. Semi-Supervised Tripled Dictionary Learning for Standard-dose PET Image Prediction using Low-dose PET and Multimodal MRI

    PubMed Central

    Wang, Yan; Ma, Guangkai; An, Le; Shi, Feng; Zhang, Pei; Lalush, David S.; Wu, Xi; Pu, Yifei; Zhou, Jiliu; Shen, Dinggang

    2017-01-01

    Objective To obtain high-quality positron emission tomography (PET) image with low-dose tracer injection, this study attempts to predict the standard-dose PET (S-PET) image from both its low-dose PET (L-PET) counterpart and corresponding magnetic resonance imaging (MRI). Methods It was achieved by patch-based sparse representation (SR), using the training samples with a complete set of MRI, L-PET and S-PET modalities for dictionary construction. However, the number of training samples with complete modalities is often limited. In practice, many samples generally have incomplete modalities (i.e., with one or two missing modalities) that thus cannot be used in the prediction process. In light of this, we develop a semi-supervised tripled dictionary learning (SSTDL) method for S-PET image prediction, which can utilize not only the samples with complete modalities (called complete samples) but also the samples with incomplete modalities (called incomplete samples), to take advantage of the large number of available training samples and thus further improve the prediction performance. Results Validation was done on a real human brain dataset consisting of 18 subjects, and the results show that our method is superior to the SR and other baseline methods. Conclusion This work proposed a new S-PET prediction method, which can significantly improve the PET image quality with low-dose injection. Significance The proposed method is favorable in clinical application since it can decrease the potential radiation risk for patients. PMID:27187939
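
    Patch-based sparse representation rests on dictionary learning plus sparse coding; below is a single-dictionary sketch with random stand-in patches (the paper couples MRI/L-PET/S-PET dictionaries, which this simplification does not reproduce).

    ```python
    import numpy as np
    from sklearn.decomposition import DictionaryLearning, SparseCoder

    rng = np.random.default_rng(0)
    train_patches = rng.random((500, 64))  # stand-in 8x8 patches, flattened

    # Learn an overcomplete dictionary on training patches.
    dl = DictionaryLearning(n_components=128, transform_algorithm="omp",
                            transform_n_nonzero_coefs=5, max_iter=50,
                            random_state=0)
    dl.fit(train_patches)

    # Sparse-code new patches and reconstruct them from a few atoms each.
    coder = SparseCoder(dictionary=dl.components_, transform_algorithm="omp",
                        transform_n_nonzero_coefs=5)
    test_patches = rng.random((10, 64))
    reconstruction = coder.transform(test_patches) @ dl.components_
    ```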

  1. PepsNMR for 1H NMR metabolomic data pre-processing.

    PubMed

    Martin, Manon; Legat, Benoît; Leenders, Justine; Vanwinsberghe, Julien; Rousseau, Réjane; Boulanger, Bruno; Eilers, Paul H C; De Tullio, Pascal; Govaerts, Bernadette

    2018-08-17

    In the analysis of biological samples, control over experimental design and data acquisition procedures alone cannot ensure well-conditioned ¹H NMR spectra with maximal information recovery for data analysis. A third major element affects the accuracy and robustness of results: the data pre-processing/pre-treatment, to which not enough attention is usually devoted, in particular in metabolomic studies. The usual approach is to use the proprietary software provided by the analytical instruments' manufacturers to conduct the entire pre-processing strategy. This widespread practice has a number of advantages, such as a user-friendly interface with graphical facilities, but it involves non-negligible drawbacks: a lack of methodological information and automation, a dependency on subjective human choices, only standard processing possibilities and an absence of objective quality criteria to evaluate pre-processing quality. This paper introduces PepsNMR to meet these needs, an R package dedicated to the whole processing chain prior to multivariate data analysis, including, among other tools, solvent signal suppression, internal calibration, phase, baseline and misalignment corrections, bucketing and normalisation. Methodological aspects are discussed and the package is compared to the gold standard procedure with two metabolomic case studies. The use of PepsNMR on these data shows better information recovery and predictive power based on objective and quantitative quality criteria. Other key assets of the package are workflow processing speed, reproducibility, reporting and flexibility, graphical outputs and documented routines. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Optimization of process parameters of pulsed TIG welded maraging steel C300

    NASA Astrophysics Data System (ADS)

    Deepak, P.; Jualeash, M. J.; Jishnu, J.; Srinivasan, P.; Arivarasu, M.; Padmanaban, R.; Thirumalini, S.

    2016-09-01

    Pulsed TIG welding technology provides excellent welding performance on thin sections, which helps to increase productivity, enhance weld quality, minimize weld costs, and boost operator efficiency, and this has drawn the attention of the welding community. Maraging C300 steel is extensively used in the defence and aerospace industries, and thus its welding becomes an area of paramount importance. In pulsed TIG welding, weld quality depends on the process parameters used. In this work, pulsed TIG bead-on-plate welding is performed on a 5 mm thick maraging C300 plate at different combinations of input parameters: peak current (Ip), base current (Ib) and pulsing frequency (HZ), as per a Box-Behnken design with three levels for each factor. Response surface methodology is utilized to establish a mathematical model for predicting the weld bead depth. The effect of Ip, Ib and HZ on the weld bead depth is investigated using the developed model. The weld bead depth is found to be affected by all three parameters. Surface and contour plots developed from the regression equation are used to optimize the processing parameters for maximizing the weld bead depth. Optimum values of Ip, Ib and HZ are obtained as 259 A, 120 A and 8 Hz, respectively. Using this optimum condition, the maximum bead depth of the weld is predicted to be 4.325 mm.
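
    The response-surface step — fit a quadratic in (Ip, Ib, HZ), then search it for the depth-maximizing setting — can be sketched as follows, with X/y standing for the Box-Behnken runs and measured bead depths (both assumed, not the paper's data).

    ```python
    import numpy as np
    from itertools import product
    from sklearn.linear_model import LinearRegression
    from sklearn.preprocessing import PolynomialFeatures

    def fit_and_optimize(X, y, bounds):
        """Fit a full quadratic surface and grid-search its maximum."""
        poly = PolynomialFeatures(degree=2, include_bias=False)
        model = LinearRegression().fit(poly.fit_transform(X), y)
        grids = [np.linspace(lo, hi, 41) for lo, hi in bounds]
        candidates = np.array(list(product(*grids)))
        depth = model.predict(poly.transform(candidates))
        return candidates[depth.argmax()], depth.max()

    # X: (n_runs, 3) Box-Behnken settings of [Ip, Ib, HZ]; y: bead depths (mm).
    # best, depth = fit_and_optimize(X, y, bounds=[(200, 300), (80, 160), (4, 12)])
    ```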

  3. Application of the target lipid model and passive samplers to characterize the toxicity of bioavailable organics in oil sands process-affected water.

    PubMed

    Redman, Aaron D; Parkerton, Thomas F; Butler, Josh David; Letinski, Daniel J; Frank, Richard A; Hewitt, L Mark; Bartlett, Adrienne J; Gillis, Patricia Leigh; Marentette, Julie R; Parrott, Joanne L; Hughes, Sarah A; Guest, Rodney; Bekele, Asfaw; Zhang, Kun; Morandi, Garrett; Wiseman, Steve B; Giesy, John P

    2018-06-14

    Oil sands operations in Alberta, Canada, will eventually include returning treated process-affected waters to the environment. Organic constituents in oil sands process-affected water (OSPW) represent complex mixtures of nonionic and ionic (e.g., naphthenic acids) compounds, and compositions can vary spatially and temporally, which has impeded the development of water quality benchmarks. To address this challenge, it was hypothesized that solid phase microextraction fibers coated with polydimethylsiloxane (PDMS) could be used as a biomimetic extraction (BE) to measure bioavailable organics in OSPW. Organic constituents of OSPW were assumed to contribute additively to toxicity, and partitioning to PDMS was assumed to be predictive of accumulation in target lipids, which were the presumed site of action. This method was tested using toxicity data for individual model compounds, defined mixtures, and organic mixtures extracted from OSPW. Toxicity was correlated with BE data, which supports the use of this method in hazard assessments of acute lethality to aquatic organisms. A species sensitivity distribution (SSD), based on target lipid model and BE values, was similar to SSDs based on residues in tissues for both nonionic and ionic organics. BE was shown to be an analytical tool that accounts for the bioaccumulation of organic compound mixtures and from which toxicity can be predicted, with the potential to aid the development of water quality guidelines.
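
    The SSD step can be illustrated with a log-normal fit (toy endpoint values, not the study's data): the hazardous concentration protecting 95% of species (HC5) falls out of the fitted distribution's 5th percentile.

        import numpy as np
        from scipy import stats

        # Acute endpoints (e.g., LC50s in mg/L) for several species (toy values).
        lc50 = np.array([3.1, 5.4, 8.9, 12.0, 15.5, 22.3, 30.1, 41.7])

        # Fit a log-normal species sensitivity distribution ...
        mu, sigma = stats.norm.fit(np.log10(lc50))
        # ... and derive HC5, the concentration expected to protect 95% of species.
        hc5 = 10 ** stats.norm.ppf(0.05, loc=mu, scale=sigma)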

  4. Evaluation of the AnnAGNPS Model for Predicting Runoff and Nutrient Export in a Typical Small Watershed in the Hilly Region of Taihu Lake

    PubMed Central

    Luo, Chuan; Li, Zhaofu; Li, Hengpeng; Chen, Xiaomin

    2015-01-01

    The application of hydrological and water quality models is an efficient approach to better understand the processes of environmental deterioration. This study evaluated the ability of the Annualized Agricultural Non-Point Source (AnnAGNPS) model to predict runoff, total nitrogen (TN) and total phosphorus (TP) loading in a typical small watershed of a hilly region near Taihu Lake, China. Runoff was calibrated and validated at both an annual and monthly scale, and parameter sensitivity analysis was performed for TN and TP before the two water quality components were calibrated. The results showed that the model satisfactorily simulated runoff at annual and monthly scales, in both the calibration and validation processes. Additionally, parameter sensitivity analysis showed that TN output was most sensitive to the parameters Fertilizer rate, Fertilizer organic, Canopy cover and Fertilizer inorganic, while TP output was most sensitive to Residue mass ratio, Fertilizer rate, Fertilizer inorganic and Canopy cover. Calibration was performed on these sensitive parameters. TN loading produced satisfactory results in both the calibration and validation processes, whereas the performance for TP loading was slightly poorer. The simulation results showed that AnnAGNPS has the potential to be used as a valuable tool for the planning and management of watersheds. PMID:26364642
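
    Runoff calibration for models of this kind is typically judged with the Nash-Sutcliffe efficiency (the abstract does not name its criteria, so NSE is shown here as the customary choice):

        import numpy as np

        def nash_sutcliffe(observed, simulated):
            """Nash-Sutcliffe efficiency: 1 is a perfect fit; values above
            roughly 0.5 are often considered satisfactory for monthly runoff."""
            obs, sim = np.asarray(observed, float), np.asarray(simulated, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        # e.g. nash_sutcliffe([10.2, 35.1, 20.4], [12.0, 33.5, 22.1])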

  5. Early warning of changing drinking water quality by trend analysis.

    PubMed

    Tomperi, Jani; Juuso, Esko; Leiviskä, Kauko

    2016-06-01

    Monitoring and control of water treatment plants play an essential role in ensuring high-quality drinking water and avoiding health-related problems or economic losses. The most common quality variables, which can also be used for assessing the efficiency of the water treatment process, are turbidity and the residual levels of coagulation and disinfection chemicals. In the present study, trend indices are developed from scaled measurements to detect warning signs of changes in the quality variables of drinking water and in some operating condition variables that strongly affect water quality. The scaling is based on monotonically increasing nonlinear functions, which are generated with generalized norms and moments. Triangular episodes are classified with the trend index and its derivative. Deviation indices are used to assess the severity of situations. The study shows the potential of the described trend analysis as a predictive monitoring tool, as it provides an advantage over the traditional manual inspection of variables by detecting changes in water quality and giving early warnings.
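
    Schematically, a trend index of this family compares the recent behaviour of a scaled variable with its longer history (the window lengths and the scaling itself are illustrative assumptions; the paper's indices are built from generalized norms and moments):

        import numpy as np

        def trend_index(x_scaled, short=7, long=28):
            """Toy trend index: difference between the short- and long-window
            means of a pre-scaled quality variable. Its sign, together with
            its derivative, classifies episodes (steady, rising, falling)."""
            x = np.asarray(x_scaled, dtype=float)
            s_mean = np.convolve(x, np.ones(short) / short, mode="valid")[-1]
            l_mean = np.convolve(x, np.ones(long) / long, mode="valid")[-1]
            return s_mean - l_mean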

  6. Bayesian Revision of Residual Detection Power

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard

    2013-01-01

    This paper addresses some issues with quality assessment and quality assurance in response surface modeling experiments executed in wind tunnels. The role of data volume in quality assurance for response surface models is reviewed. Specific wind tunnel response surface modeling experiments are considered for which apparent discrepancies exist between fit quality expectations based on implemented quality assurance tactics, and the actual fit quality achieved in those experiments. These discrepancies are resolved by using Bayesian inference to account for certain imperfections in the assessment methodology. Estimates of the fraction of out-of-tolerance model predictions based on traditional frequentist methods are revised to account for uncertainty in the residual assessment process. The number of sites in the design space for which residuals are out of tolerance is seen to exceed the number of sites where the model actually fails to fit the data. A method is presented to estimate how much of the design space is inadequately modeled by low-order polynomial approximations to the true but unknown underlying response function.
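
    The correction at the heart of the argument can be reduced to a two-line calculation (the error rates below are invented for illustration): if residual assessment flags adequate sites with probability alpha and misses inadequate ones with probability beta, the apparent out-of-tolerance fraction overstates the true one.

        # Revise the apparent out-of-tolerance fraction for imperfections in the
        # assessment itself (alpha, beta and p_obs are toy values).
        alpha = 0.05   # P(adequate site flagged out of tolerance)
        beta = 0.10    # P(inadequate site escapes detection)
        p_obs = 0.12   # apparent out-of-tolerance fraction from the residuals

        # p_obs = p_true * (1 - beta) + (1 - p_true) * alpha, solved for p_true:
        p_true = (p_obs - alpha) / (1.0 - alpha - beta)
        print(f"revised fraction of truly inadequate sites: {p_true:.3f}")   # 0.082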

  7. Nursing home consumer complaints and quality of care: a national view.

    PubMed

    Stevenson, David G

    2006-06-01

    This study uses 5 years of national data on investigated nursing home complaints (1998-2002) to evaluate whether complaints might be used to assess nursing home quality of care. On-Line Survey Certification and Reporting (OSCAR) data are used to evaluate the association between consumer complaints, facility and resident characteristics, and other nursing home quality measures. The analyses are undertaken in the context of considerable cross-state variation in nursing home complaint processes and rates. Complaints varied across facility characteristics in ways consistent with the nursing home quality literature. Complaints were significantly positively associated with survey deficiencies and the presence of serious survey deficiencies, and significantly negatively associated with nurse and nurse aide staffing. Complaint performance was also significantly predictive of survey deficiencies at subsequent inspections. This study presents the first national evidence for using consumer complaints to assess nursing home quality of care. Despite limitations, nursing home complaints appear to offer a real-time signal of quality concerns.

  8. Modeling multi-scale aerosol dynamics and micro-environmental air quality near a large highway intersection using the CTAG model.

    PubMed

    Wang, Yan Jason; Nguyen, Monica T; Steffens, Jonathan T; Tong, Zheming; Wang, Yungang; Hopke, Philip K; Zhang, K Max

    2013-01-15

    A new methodology, referred to as the multi-scale structure, integrates "tailpipe-to-road" (i.e., on-road domain) and "road-to-ambient" (i.e., near-road domain) simulations to elucidate the environmental impacts of particulate emissions from traffic sources. The multi-scale structure is implemented in the CTAG model to 1) generate process-based on-road emission rates of ultrafine particles (UFPs) by explicitly simulating the effects of exhaust properties, traffic conditions, and meteorological conditions and 2) characterize the impacts of traffic-related emissions on micro-environmental air quality near a highway intersection in Rochester, NY. The performance of CTAG, evaluated against field measurements, shows adequate agreement in capturing the dispersion of carbon monoxide (CO) and the number concentrations of UFPs in the near-road micro-environment. As a proof-of-concept case study, we also apply CTAG to separate the relative impacts of the shutdown of a large coal-fired power plant (CFPP) and the adoption of ultra-low-sulfur diesel (ULSD) on UFP concentrations in the intersection micro-environment. Although CTAG is still computationally expensive compared to the widely-used parameterized dispersion models, it has the potential to advance our capability to predict the impacts of UFP emissions and spatial/temporal variations of air pollutants in complex environments. Furthermore, for the on-road simulations, CTAG can serve as a process-based emission model; combining the on-road and near-road simulations, CTAG becomes a "plume-in-grid" model for mobile emissions. The processed emission profiles can potentially improve regional air quality and climate predictions. Copyright © 2012 Elsevier B.V. All rights reserved.
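
    For orientation, the parameterized dispersion models that CTAG is contrasted with reduce, in the simplest point-source case, to the textbook Gaussian plume formula (shown below as a generic sketch; it is not CTAG itself, which resolves turbulence and aerosol dynamics explicitly):

        import numpy as np

        def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
            """Concentration (g/m^3) at crosswind offset y and height z for a
            point source of strength q (g/s), wind speed u (m/s) and effective
            release height h (m), with ground reflection."""
            return (q / (2 * np.pi * u * sigma_y * sigma_z)
                    * np.exp(-y**2 / (2 * sigma_y**2))
                    * (np.exp(-(z - h)**2 / (2 * sigma_z**2))
                       + np.exp(-(z + h)**2 / (2 * sigma_z**2))))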

  9. A Design of Experiment approach to predict product and process parameters for a spray dried influenza vaccine.

    PubMed

    Kanojia, Gaurav; Willems, Geert-Jan; Frijlink, Henderik W; Kersten, Gideon F A; Soema, Peter C; Amorij, Jean-Pierre

    2016-09-25

    Spray dried vaccine formulations might be an alternative to traditional lyophilized vaccines. Compared to lyophilization, spray drying is a fast and cheap process extensively used for drying biologicals. The current study provides an approach that utilizes Design of Experiments (DoE) for the spray drying process to stabilize whole inactivated influenza virus (WIV) vaccine. The approach included systematically screening and optimizing the spray drying process variables, determining the desired process parameters and predicting product quality parameters. The effects of the process parameters inlet air temperature, nozzle gas flow rate and feed flow rate on WIV vaccine powder characteristics such as particle size, residual moisture content (RMC) and powder yield were investigated. Vaccine powders with a broad range of physical characteristics (RMC 1.2-4.9%, particle size 2.4-8.5 μm and powder yield 42-82%) were obtained. WIV showed no significant loss in antigenicity, as revealed by a hemagglutination test. Furthermore, the descriptive models generated by the DoE software could be used to determine and set spray drying process parameters; this was used to generate a dried WIV powder with predefined (predicted) characteristics. Moreover, the spray dried vaccine powders retained their antigenic stability even after storage for 3 months at 60°C. The approach used here enabled the generation of a thermostable, antigenic WIV vaccine powder with desired physical characteristics that could potentially be used for pulmonary administration. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
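
    The inverse use of such descriptive models, choosing settings that hit a predefined powder attribute, can be sketched as root finding (the linear moisture model and its coefficients below are hypothetical, not the study's fitted model):

        from scipy.optimize import brentq

        # Hypothetical fitted model: residual moisture content (%) vs. inlet air
        # temperature (deg C) at fixed nozzle gas flow and feed flow.
        def rmc(t_inlet):
            return 9.0 - 0.05 * t_inlet

        target = 2.5  # predefined moisture specification (%)
        t_setpoint = brentq(lambda t: rmc(t) - target, 90.0, 160.0)  # -> 130 deg C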

  10. Effect of horizontal resolution on meteorology and air-quality prediction with a regional scale model

    NASA Astrophysics Data System (ADS)

    Varghese, Saji; Langmann, Baerbel; Ceburnis, Darius; O'Dowd, Colin D.

    2011-08-01

    Horizontal resolution sensitivity can contribute significantly to the uncertainty in predictions of meteorology and air quality from a regional climate model. In the study presented here, the state-of-the-art regional scale atmospheric climate-chemistry-aerosol model REMOTE is used to understand the influence of spatial model resolutions of 1.0°, 0.5° and 0.25° on predicted meteorological and aerosol parameters for June 2003 over the European domain comprising the North-east Atlantic and Western Europe. Model precipitation appears to improve with resolution, while wind speed shows the best agreement with ECAD data at 0.25° resolution for most of the stations. Low root mean square errors and spatial biases for surface pressure, precipitation and surface temperature show that the model is very reliable. Spatial and temporal variations in black carbon, primary organic carbon, sea-salt and sulphate concentrations and their burdens are presented. In most cases, chemical species concentrations at the surface show no particular trend or improvement with increasing resolution. Horizontal resolution has a pronounced influence on the vertical distribution pattern of some aerosol species; some of these effects are due to the improvement in topographical detail, flow characteristics and the associated vertical and horizontal dynamic processes. The different sink processes contribute very differently to the various aerosol species in terms of deposition (wet and dry) and sedimentation, which are strongly linked to the meteorological processes. Overall, considering the performance of meteorological parameters and chemical species concentrations, a horizontal model resolution of 0.5° is suggested to achieve reasonable results within the limitations of this model.

  11. Predictors of facial attractiveness and health in humans.

    PubMed

    Foo, Yong Zhi; Simmons, Leigh W; Rhodes, Gillian

    2017-02-03

    Facial attractiveness has been suggested to provide signals of biological quality, particularly health, in humans. The attractive traits that have been implicated as signals of biological quality include sexual dimorphism, symmetry, averageness, adiposity, and carotenoid-based skin colour. In this study, we first provide a comprehensive examination of the traits that predict attractiveness. In men, attractiveness was predicted positively by masculinity, symmetry, averageness, and negatively by adiposity. In women, attractiveness was predicted positively by femininity and negatively by adiposity. Skin colour did not predict attractiveness in either sex, suggesting that, despite recent interest in the literature, colour may play a limited role in determining attractiveness. Male perceived health was predicted positively by averageness, symmetry, and skin yellowness, and negatively by adiposity. Female perceived health was predicted by femininity. We then examined whether appearance predicted actual health using measures that have been theoretically linked to sexual selection, including immune function, oxidative stress, and semen quality. In women, there was little evidence that female appearance predicted health. In men, we found support for the phenotype-linked fertility hypothesis that male masculinity signalled semen quality. However, we also found a negative relationship between averageness and semen quality. Overall, these results indicate weak links between attractive facial traits and health.

  12. Modeling in the quality by design environment: Regulatory requirements and recommendations for design space and control strategy appointment.

    PubMed

    Djuris, Jelena; Djuric, Zorica

    2017-11-30

    Mathematical models can be used as an integral part of the quality by design (QbD) concept throughout the product lifecycle for a variety of purposes, including appointment of the design space and control strategy, continual improvement and risk assessment. Examples of different mathematical modeling techniques (mechanistic, empirical and hybrid) in pharmaceutical development and process monitoring or control are provided in the presented review. In the QbD context, mathematical models are predominantly used to support design spaces and/or control strategies. Considering their impact on the final product quality, models can be divided into the following categories: high-, medium- and low-impact models. Although there are regulatory guidelines on the topic of modeling applications, a review of QbD-based submissions containing modeling elements revealed concerns regarding the scale-dependency of design spaces and the verification of model predictions at the commercial scale of manufacturing, especially regarding real-time release (RTR) models. The authors provide a critical overview of good modeling practices and introduce the concepts of multiple-unit, adaptive and dynamic design spaces, multivariate specifications and methods for process uncertainty analysis. RTR specification with mathematical models and different approaches to multivariate statistical process control supporting process analytical technologies are also presented. Copyright © 2017 Elsevier B.V. All rights reserved.
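
    One of the multivariate statistical process control tools mentioned can be made concrete with Hotelling's T^2 (a generic textbook sketch, not a method taken from the review): new batches are scored against an in-control reference set and compared with an F-distribution control limit.

        import numpy as np
        from scipy import stats

        def hotelling_t2(X_ref, x_new):
            """T^2 distance of a new observation from the in-control mean,
            using the reference covariance (rows of X_ref = past batches)."""
            d = x_new - X_ref.mean(axis=0)
            return d @ np.linalg.inv(np.cov(X_ref, rowvar=False)) @ d

        def t2_limit(n, p, alpha=0.01):
            """Upper control limit for a future observation (F-distribution form)."""
            return p * (n + 1) * (n - 1) / (n * (n - p)) * stats.f.ppf(1 - alpha, p, n - p)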

  13. ATLAS trigger operations: Upgrades to "Xmon" rate prediction system

    NASA Astrophysics Data System (ADS)

    Myers, Ava; Aukerman, Andrew; Hong, Tae Min; Atlas Collaboration

    2017-01-01

    We present "Xmon," a tool to monitor trigger rates in the Control Room of the ATLAS Experiment. We discuss Xmon's recent (1) updates, (2) upgrades, and (3) operations. (1) Xmon, originally written for the three-level trigger architecture of Run-1 (2009-2012), was updated for the new two-level system of Run-2 (2015-present). The tool takes the beam luminosity as input to make a rate prediction, which is compared with incoming rates to detect anomalies that occur both globally throughout a run and locally within a run. Global offsets are more readily caught by predictions based upon past runs, where offline processing allows the predictive functions to be adjusted and fit quality to be improved through outlier rejection. (2) Xmon was upgraded to detect local offsets using on-the-fly predictions, which use a sliding window of in-run rates. (3) Examples of Xmon operations are given. Future work involves further automating the steps that provide the predictive functions and that alert shifters.
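
    The on-the-fly prediction can be caricatured as a rolling linear fit of rate against luminosity (a schematic stand-in, not ATLAS code; the window length and the 3-sigma band are illustrative choices):

        import numpy as np

        def predict_next_rate(lumi, rates, window=20):
            """Fit rate vs. luminosity over a sliding window of recent points
            and return a predictor for the next luminosity value."""
            L = np.asarray(lumi[-window:], float)
            R = np.asarray(rates[-window:], float)
            slope, intercept = np.polyfit(L, R, deg=1)
            return lambda l_next: slope * l_next + intercept

        # A local offset is flagged when an incoming rate r at luminosity l
        # deviates from the prediction, e.g. abs(r - f(l)) > 3 * sigma_resid.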

  14. Predicting Trihalomethanes (THMs) in the New York City Water Supply

    NASA Astrophysics Data System (ADS)

    Mukundan, R.; Van Dreason, R.

    2013-12-01

    Chlorine, a commonly used disinfectant in most water supply systems, can combine with organic carbon to form disinfection byproducts, including carcinogenic trihalomethanes (THMs). We used water quality data from 24 monitoring sites within the New York City (NYC) water supply distribution system, measured between January 2009 and April 2012, to develop site-specific empirical models for predicting total trihalomethane (TTHM) levels. Terms in the models included various combinations of the following water quality parameters: total organic carbon, pH, specific conductivity, and water temperature. Reasonable estimates of TTHM levels were achieved, with an overall R2 of about 0.87 and predicted values within 5 μg/L of measured values. The relative importance of factors affecting TTHM formation was estimated by ranking the model regression coefficients. Site-specific models showed improved performance statistics compared to a single model for the entire system, most likely because the single model did not consider locational differences in the water treatment process. Although never out of compliance in 2011, TTHM levels in the water supply increased following tropical storms Irene and Lee, with 45% of the samples exceeding the 80 μg/L Maximum Contaminant Level (MCL) in October and November. This increase was explained by changes in water quality parameters, particularly the increases in total organic carbon concentration and pH during this period.
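
    A site-specific model of the kind described is an ordinary multiple linear regression (the records below are fabricated placeholders standing in for one monitoring site):

        import numpy as np
        from sklearn.linear_model import LinearRegression

        # TTHM (ug/L) ~ TOC (mg/L) + pH + specific conductivity (uS/cm) + temp (C)
        X = np.array([[1.8, 7.2,  95.0,  8.0],
                      [2.1, 7.4, 102.0, 18.5],
                      [2.6, 7.6, 110.0, 22.0],
                      [1.5, 7.1,  90.0, 12.0]])
        y = np.array([38.0, 46.0, 58.0, 30.0])

        model = LinearRegression().fit(X, y)
        tthm_pred = model.predict([[2.4, 7.5, 105.0, 20.0]])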

  15. Assessing the Association Between Asthma and Air Quality in the Presence of Wildfires

    NASA Astrophysics Data System (ADS)

    Young, L. J.; Al-Hamdan, M. Z.; Lopiano, K. K.; Crosson, W. L.; Gotway, C. A.; DuClos, C.; Jordan, M.; Estes, M. G.; Luvall, J. C.; Estes, S. M.; Xu, X.; Holt, N. M.; Leary, E.

    2012-12-01

    Asthma hospital/emergency room (patient) data are used as the foundation for creating a health outcome indicator of human response to environmental air quality. Daily U.S. Environmental Protection Agency (EPA) Air Quality System (AQS) fine particulate (PM2.5) ground data and U.S. National Aeronautics and Space Administration (NASA) MODIS aerosol optical depth (AOD) data were acquired and processed for the years 2007 and 2008. Initial models for predicting the number of weekly asthma cases within a Florida county have focused on environmental variables. Weekly maximums of PM2.5, relative humidity, and the proportions of the county with smoke and fire were the environmental variables included in the model. Cosine and sine functions of time were used to account for seasonality in asthma cases. Counties were treated as random effects, thereby adjusting for differences in socio-demographics and other factors. (Figure 1: PM2.5 annual mean composite of all 2007 daily surfaces, developed using the Al-Hamdan et al. (2009) B-spline fitting algorithm. Figure 2: Predicted and observed weekly asthma cases presenting to hospitals or emergency rooms in Miami-Dade County, Florida, during 2007.)
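
    A stripped-down version of such a count model, Poisson regression with an air quality covariate and seasonal harmonics, might look as follows (synthetic data, and without the county random effects of the actual analysis):

        import numpy as np
        import statsmodels.api as sm

        weeks = np.arange(104)
        cases = np.random.default_rng(2).poisson(20, weeks.size)      # toy counts
        pm25 = np.random.default_rng(3).uniform(5, 35, weeks.size)    # toy weekly max

        X = sm.add_constant(np.column_stack([
            pm25,
            np.cos(2 * np.pi * weeks / 52),   # seasonality in asthma cases
            np.sin(2 * np.pi * weeks / 52),
        ]))

        fit = sm.GLM(cases, X, family=sm.families.Poisson()).fit()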

  16. Numerical Simulation of Non-Thermal Food Preservation

    NASA Astrophysics Data System (ADS)

    Rauh, C.; Krauss, J.; Ertunc, Ö.; Delgado, a.

    2010-09-01

    Food preservation is an important process step in food technology regarding product safety and product quality. Novel preservation techniques are currently being developed that aim at improved sensory and nutritional value with safety comparable to conventional thermal preservation techniques. These novel non-thermal food preservation techniques are based, for example, on high pressures up to one GPa or on pulsed electric fields. In literature studies, the high potential of high pressure (HP) and pulsed electric field (PEF) processing is shown, due to the high retention of valuable food components such as vitamins and flavour and the selective inactivation of spoiling enzymes and microorganisms. For the design of preservation processes based on these non-thermal techniques, it is crucial to predict the effect of high pressure and pulsed electric fields on the food components and on the spoiling enzymes and microorganisms, locally and time-dependently, in the treated product. Homogeneous process conditions (especially of the temperature fields in HP and PEF processing and of the electric fields in PEF) are aimed at, to avoid the need for over-processing and the connected quality loss, and to minimize safety risks due to under-processing. The present contribution presents numerical simulations of thermofluiddynamical phenomena inside high pressure autoclaves and pulsed electric field treatment chambers; in PEF processing, the electric fields are additionally considered. Implementing the kinetics of the occurring (bio-)chemical reactions in the numerical simulations of the temperature, flow and electric fields enables the evaluation of process homogeneity and efficiency for different process parameters of the preservation techniques. Suggestions for achieving safe and high-quality products are drawn from the numerical results.
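
    The temperature-field part of such simulations rests on discretized heat transport; a one-dimensional explicit finite-difference step conveys the idea (toy geometry and boundary values; real autoclave models are three-dimensional and coupled to flow):

        import numpy as np

        alpha, dx, dt = 1.4e-7, 1e-3, 1.0    # thermal diffusivity (m^2/s), grid, step
        r = alpha * dt / dx**2               # must stay below 0.5 for stability
        T = np.full(100, 20.0)               # initial product temperature (deg C)
        T[0] = T[-1] = 40.0                  # heated walls as a toy boundary condition

        for _ in range(600):                 # march the diffusion equation in time
            T[1:-1] += r * (T[2:] - 2 * T[1:-1] + T[:-2])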

  17. Rapid non-destructive assessment of pork edible quality by using VIS/NIR spectroscopic technique

    NASA Astrophysics Data System (ADS)

    Zhang, Leilei; Peng, Yankun; Dhakal, Sagar; Song, Yulin; Zhao, Juan; Zhao, Songwei

    2013-05-01

    The objective of this research was to develop a rapid non-destructive method to evaluate the edible quality of chilled pork. A total of 42 samples were packed in sealed plastic bags and stored at 4°C for 1 to 21 days. Reflectance spectra were collected with a visible/near-infrared spectroscopy system in the range of 400 nm to 1100 nm. Microbiological, physicochemical and organoleptic characteristics such as total viable count (TVC), total volatile basic nitrogen (TVB-N), pH value and the colour parameter L* were determined to appraise pork edible quality. Savitzky-Golay (SG) smoothing based on five and eleven smoothing points, multiplicative scatter correction (MSC) and first-derivative pre-processing methods were employed to reduce spectral noise. Support vector machines (SVM) and partial least squares regression (PLSR) were applied to establish prediction models using the de-noised spectra. Linear correlations were developed between the VIS/NIR spectra and the TVC, TVB-N, pH and colour parameter L* indexes, giving prediction results with Rv of 0.931, 0.844, 0.805 and 0.852, respectively. The results demonstrated that the VIS/NIR spectroscopy technique combined with SVM possesses a powerful assessment capability and can provide a potential tool for detecting pork edible quality rapidly and non-destructively.
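
    The chemometric pipeline of smoothing followed by latent-variable regression is compact to express (random arrays stand in for the 42 spectra and reference TVB-N values; the component count is an illustrative choice):

        import numpy as np
        from scipy.signal import savgol_filter
        from sklearn.cross_decomposition import PLSRegression

        spectra = np.random.default_rng(4).normal(size=(42, 700))  # toy 400-1100 nm
        tvbn = np.random.default_rng(5).uniform(5, 25, 42)         # toy reference values

        # Savitzky-Golay smoothing (11-point window) before regression.
        X = savgol_filter(spectra, window_length=11, polyorder=2, axis=1)

        pls = PLSRegression(n_components=8).fit(X, tvbn)
        tvbn_pred = pls.predict(X[:5])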

  18. Digital contract approach for consistent and predictable multimedia information delivery in electronic commerce

    NASA Astrophysics Data System (ADS)

    Konana, Prabhudev; Gupta, Alok; Whinston, Andrew B.

    1997-01-01

    A pure 'technological' solution to network quality problems is incomplete, since any benefits from new technologies are offset by the demand from exponentially growing electronic commerce and data-intensive applications. Since an economic paradigm is implicit in electronic commerce, we propose a 'market-system' approach to improve quality of service. Quality of service for digital products takes on a different meaning, since users view quality of service differently and value information differently. We propose a framework for electronic commerce that is based on an economic paradigm and mass-customization, and works as a wide-area distributed management system. In our framework, surrogate servers act as intermediaries between information providers and end-users, and arrange for consistent and predictable information delivery through 'digital contracts.' These contracts are negotiated and priced based on economic principles. Surrogate servers pre-fetch, through replication, information from many different servers and consolidate it based on demand expectations. In order to recognize users' requirements and process requests accordingly, real-time databases are central to our framework. We also propose that multimedia information be separated into slowly changing and rapidly changing data streams to improve response times. Surrogate servers perform the integration of these data streams in a way that is transparent to end-users.
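
    The terms such a contract would carry can be pictured as a simple record (the field names and values are hypothetical; the paper does not specify a schema):

        from dataclasses import dataclass

        @dataclass
        class DigitalContract:
            """Hypothetical terms negotiated between a surrogate server
            and an end-user for predictable multimedia delivery."""
            content_id: str
            max_latency_ms: int        # delivery deadline the surrogate commits to
            min_bandwidth_kbps: int    # sustained throughput guarantee
            price: float               # negotiated on economic principles
            penalty: float             # rebate if the quality terms are violated

        contract = DigitalContract("lecture-042", 250, 512, 1.20, 0.40)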

  19. Can integrative catchment management mitigate future water quality issues caused by climate change and socio-economic development?

    NASA Astrophysics Data System (ADS)

    Honti, Mark; Schuwirth, Nele; Rieckermann, Jörg; Stamm, Christian

    2017-03-01

    The design and evaluation of solutions for integrated surface water quality management requires an integrated modelling approach. Integrated models have to be comprehensive enough to cover the aspects relevant for management decisions, allowing for mapping of larger-scale processes such as climate change to the regional and local contexts. Besides this, models have to be sufficiently simple and fast to allow proper methods of uncertainty analysis, covering model structure deficits and error propagation through the chain of sub-models. Here, we present a new integrated catchment model satisfying both conditions. The conceptual iWaQa model was developed to support the integrated management of small streams. It can be used to predict traditional water quality parameters, such as nutrients, and a wide set of organic micropollutants (plant and material protection products), by considering all major pollutant pathways in urban and agricultural environments. Due to its simplicity, the model allows for a full, propagative analysis of predictive uncertainty, including certain structural and input errors. The usefulness of the model is demonstrated by predicting future surface water quality in a small catchment with mixed land use on the Swiss Plateau. We consider climate change, population growth or decline, socio-economic development, and the implementation of management strategies to tackle urban and agricultural point and non-point sources of pollution. Our results indicate that input and model structure uncertainties are the most influential factors for certain water quality parameters; in these cases model uncertainty is already high under present conditions. Nevertheless, accounting for today's uncertainty makes management fairly robust to the foreseen range of potential changes in the coming decades. The assessment of total predictive uncertainty allows for selecting management strategies that show little sensitivity to poorly known boundary conditions. The identification of important sources of uncertainty helps to guide future monitoring efforts and pinpoints key indicators whose evolution should be closely followed to adapt management. The possible impact of climate change is clearly demonstrated by water quality changing substantially depending on the individual climate model chain. However, when all climate trajectories are combined, human land use and management decisions have the larger influence on water quality over a time horizon to 2050 in this study.
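
    The propagative uncertainty analysis that such a simple model permits amounts to Monte Carlo sampling of uncertain inputs and structure terms (all distributions and numbers below are invented for illustration and are not iWaQa's):

        import numpy as np

        rng = np.random.default_rng(6)
        n = 10_000

        export_coeff = rng.lognormal(np.log(0.5), 0.3, n)  # pollutant export, kg/ha/yr
        area_ha = 850.0                                    # catchment area
        struct = rng.normal(1.0, 0.15, n)                  # model-structure error term

        load = export_coeff * area_ha * struct             # annual load samples, kg/yr
        lo, hi = np.percentile(load, [2.5, 97.5])          # 95% predictive interval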
