Model-based software process improvement
NASA Technical Reports Server (NTRS)
Zettervall, Brenda T.
1994-01-01
The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five-stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.
Process correlation analysis model for process improvement identification.
Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong
2014-01-01
Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in those process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort involved and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID: 24977170
Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B
2007-03-01
The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding, support systematic appraisal, and identify areas for improvement in a business process. The Unified Modelling Language (UML) is a general-purpose modelling technique often used for this task. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier to efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage (especially from the pilot phase), parallel processing of data, and correctly positioned process controls should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.
Capability Maturity Model (CMM) for Software Process Improvements
NASA Technical Reports Server (NTRS)
Ling, Robert Y.
2000-01-01
This slide presentation reviews the Avionic Systems Division's implementation of the Capability Maturity Model (CMM) for improvements in the software development process. The presentation reviews the process involved in implementing the model and the benefits of using CMM to improve the software development process.
Agent-Based Computing in Distributed Adversarial Planning
2010-08-09
An agent is expected to agree to deviate from its optimal uncoordinated plan only if it improves its position. Process models for opponent modeling: we have analyzed the suitability of business process models for creating ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpson, L.; Britt, J.; Birkmire, R.
ITN Energy Systems, Inc., and Global Solar Energy, Inc., assisted by NREL's PV Manufacturing R&D program, have continued to advance CIGS production technology by developing trajectory-oriented predictive/control models, fault-tolerance control, control platform development, in-situ sensors, and process improvements. Modeling activities included developing physics-based and empirical models for CIGS and sputter-deposition processing, implementing model-based control, and applying predictive models to the construction of new evaporation sources and for control. Model-based control is enabled by implementing reduced or empirical models into a control platform. Reliability improvement activities include implementing preventive maintenance schedules; detecting failed sensors/equipment and reconfiguring to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which in turn has been enabled by control and reliability improvements due to this PV Manufacturing R&D program.
Accelerating quality improvement within your organization: Applying the Model for Improvement.
Crowl, Ashley; Sharma, Anita; Sorge, Lindsay; Sorensen, Todd
2015-01-01
To discuss the fundamentals of the Model for Improvement and how the model can be applied to quality improvement activities associated with medication use, including understanding the three essential questions that guide quality improvement, applying a process for actively testing change within an organization, and measuring the success of these changes on care delivery. PubMed from 1990 through April 2014 using the search terms quality improvement, process improvement, hospitals, and primary care. At the authors' discretion, studies were selected based on their relevance in demonstrating the quality improvement process and tests of change within an organization. Organizations are continuously seeking to enhance quality in patient care services, and much of this work focuses on improving care delivery processes. Yet change in these systems is often slow, which can lead to frustration or apathy among frontline practitioners. Adopting and applying the Model for Improvement as a core strategy for quality improvement efforts can accelerate the process. While the model is frequently well known in hospitals and primary care settings, it is not always familiar to pharmacists. In addition, while some organizations may be familiar with the "plan, do, study, act" (PDSA) cycles-one element of the Model for Improvement-many do not apply it effectively. The goal of the model is to combine a continuous process of small tests of change (PDSA cycles) within an overarching aim with a longitudinal measurement process. This process differs from other forms of improvement work that plan and implement large-scale change over an extended period, followed by months of data collection. In this scenario it may take months or years to determine whether an intervention will have a positive impact. By following the Model for Improvement, frontline practitioners and their organizational leaders quickly identify strategies that make a positive difference and result in a greater degree of success.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by the need to assess the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and of process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and provides a detailed discussion of gap-analysis findings on the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistical and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrate the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
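To make the simulation-and-dashboard idea above concrete, the following is a minimal Python sketch of a Monte Carlo baseline-and-prediction exercise for an ACSI-style satisfaction index. The driver names, weights, and score distributions are hypothetical placeholders, not values from the dissertation, and the one-at-a-time sensitivity step only illustrates the kind of analysis described.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # number of Monte Carlo trials

# Hypothetical ACSI-style drivers: (mean, std dev, weight) on a 0-100 scale.
drivers = {
    "perceived_quality":     (78.0, 6.0, 0.45),
    "perceived_value":       (72.0, 8.0, 0.25),
    "customer_expectations": (75.0, 5.0, 0.30),
}

# Each trial draws driver scores and combines them into a weighted index.
samples = np.array([
    sum(w * rng.normal(mu, sd) for mu, sd, w in drivers.values())
    for _ in range(N)
])

baseline = samples.mean()
lo, hi = np.percentile(samples, [2.5, 97.5])
print(f"Predicted index: {baseline:.1f} (95% interval {lo:.1f}-{hi:.1f})")

# One-at-a-time sensitivity: shift each driver mean up by 5 points.
for name, (mu, sd, w) in drivers.items():
    print(f"+5 on {name}: index rises to {(samples + w * 5.0).mean():.1f}")
```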
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpson, L.
ITN Energy Systems, Inc., and Global Solar Energy, Inc., with the assistance of NREL's PV Manufacturing R&D program, have continued the advancement of CIGS production technology through the development of trajectory-oriented predictive/control models, fault-tolerance control, control-platform development, in-situ sensors, and process improvements. Modeling activities to date include the development of physics-based and empirical models for CIGS and sputter-deposition processing, implementation of model-based control, and application of predictive models to the construction of new evaporation sources and for control. Model-based control is enabled through implementation of reduced or empirical models into a control platform. Reliability improvement activities include implementation of preventive maintenance schedules; detection of failed sensors/equipment and reconfiguration to continue processing; and systematic development of fault prevention and reconfiguration strategies for the full range of CIGS PV production deposition processes. In-situ sensor development activities have resulted in improved control and indicated the potential for enhanced process status monitoring and control of the deposition processes. Substantial process improvements have been made, including significant improvement in CIGS uniformity, thickness control, efficiency, yield, and throughput. In large measure, these gains have been driven by process optimization, which, in turn, has been enabled by control and reliability improvements due to this PV Manufacturing R&D program. This has resulted in substantial improvements of flexible CIGS PV module performance and efficiency.
NASA Astrophysics Data System (ADS)
Okawa, Tsutomu; Kaminishi, Tsukasa; Kojima, Yoshiyuki; Hirabayashi, Syuichi; Koizumi, Hisao
Business process modeling (BPM) is gaining attention as a means of analysing and improving business processes. BPM analyses the current business process as an AS-IS model and solves problems to improve the current business; moreover, it aims to create a value-producing business process as a TO-BE model. However, research on techniques that seamlessly connect the business process improvement obtained through BPM to the implementation of the information system is rarely reported. If the business model obtained by BPM is converted into UML and the implementation is carried out using UML techniques, an improvement in the efficiency of information system implementation can be expected. In this paper, we describe a system development method that converts the process model obtained by BPM into UML; the method is evaluated by modeling a prototype of a parts procurement system. In the evaluation, a comparison is made with the case where the system is implemented by conventional UML techniques without going via BPM.
The use of discrete-event simulation modelling to improve radiation therapy planning processes.
Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven
2009-07-01
The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.
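As an illustration of how such a planning pipeline can be expressed in a discrete-event simulation, here is a minimal Python/SimPy sketch (the study itself used Arena). The stage names, durations, staffing levels, and arrival rate are hypothetical; the comparison simply shows the kind of "what if" question the authors tested, for example adding oncologist review capacity to shrink the oncologist-related delay.

```python
import random
import simpy

random.seed(1)
PLAN_STAGES = [          # hypothetical planning stages and mean durations (hours)
    ("contouring", 4.0),
    ("oncologist_review", 8.0),   # the oncologist-related delay highlighted in the study
    ("dose_planning", 6.0),
    ("plan_check", 2.0),
]

def plan(env, staff, log):
    start = env.now
    for stage, mean_h in PLAN_STAGES:
        with staff[stage].request() as req:
            yield req
            yield env.timeout(random.expovariate(1.0 / mean_h))
    log.append(env.now - start)

def run(review_capacity=1, n_plans=200):
    env = simpy.Environment()
    staff = {stage: simpy.Resource(env, capacity=(review_capacity if stage == "oncologist_review" else 2))
             for stage, _ in PLAN_STAGES}
    log = []

    def arrivals():
        for _ in range(n_plans):
            env.process(plan(env, staff, log))
            yield env.timeout(random.expovariate(1.0 / 5.0))  # a new plan request roughly every 5 h

    env.process(arrivals())
    env.run()
    return sum(log) / len(log)

print("mean planning time, 1 reviewer :", round(run(1), 1), "h")
print("mean planning time, 2 reviewers:", round(run(2), 1), "h")
```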
NASA Astrophysics Data System (ADS)
Qian, Xiaoshan
2018-01-01
Traditional models of evaporation process parameters suffer from larger prediction errors because of the continuity and cumulative characteristics of the process. On this basis, an adaptive particle swarm neural network forecasting method is proposed, combined with an autoregressive moving average (ARMA) error-correction procedure that compensates the neural network predictions to improve prediction accuracy. Validation against production data from an alumina plant evaporation process shows that, compared with the traditional model, the prediction accuracy of the new model is greatly improved, and it can be used to predict the dynamic evaporation process of sodium aluminate solution components.
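The following Python sketch illustrates the general error-compensation idea described above: a base predictor is trained, an ARMA model is fitted to its residuals, and the ARMA forecast is added back as a correction. The data are synthetic, and a standard scikit-learn MLP trained by gradient descent stands in for the paper's adaptive particle-swarm-optimized network.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)

# Hypothetical evaporation-process data: inputs X (e.g. feed flow, steam
# pressure, temperatures) and output y (solution concentration).
n = 400
X = rng.normal(size=(n, 3))
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + 0.3 * np.cumsum(rng.normal(scale=0.05, size=n))

X_tr, X_te, y_tr, y_te = X[:300], X[300:], y[:300], y[300:]

# Base predictor (stand-in for the adaptive particle-swarm-tuned network).
net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_tr, y_tr)

# Fit an ARMA model to the training residuals and forecast their continuation,
# then add that forecast to the network output as an error-correction term.
resid = y_tr - net.predict(X_tr)
arma = ARIMA(resid, order=(2, 0, 1)).fit()
correction = arma.forecast(steps=len(y_te))

raw = net.predict(X_te)
corrected = raw + correction

rmse = lambda a, b: np.sqrt(np.mean((a - b) ** 2))
print("RMSE without correction:", rmse(raw, y_te))
print("RMSE with ARMA correction:", rmse(corrected, y_te))
```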
2007-05-01
Organizational Structure; 6.1.3 Funding Model; 6.1.4 Role of Information Technology; 6.2 Considering Process Improvement; 6.2.1 Dimensions of ... to the process definition for resiliency engineering. 6.1.3 Funding Model: Just as organizational structures tend to align across security and ... responsibility, adopting an enterprise view of operational resiliency and a process improvement approach requires that the funding model evolve to one ...
Improving Earth/Prediction Models to Improve Network Processing
NASA Astrophysics Data System (ADS)
Wagner, G. S.
2017-12-01
The United States Atomic Energy Detection System (USAEDS) primary seismic network consists of a relatively small number of arrays and three-component stations. The relatively small number of stations in the USAEDS primary network makes it both necessary and feasible to optimize both station and network processing. Station processing improvements include detector tuning efforts that use Receiver Operating Characteristic (ROC) curves to help judiciously set acceptable Type 1 (false) vs. Type 2 (miss) error rates. Other station processing improvements include the use of empirical/historical observations and continuous background noise measurements to compute time-varying, maximum likelihood probability of detection thresholds. The USAEDS network processing software makes extensive use of the azimuth and slowness information provided by frequency-wavenumber analysis at array sites, and polarization analysis at three-component sites. Most of the improvements in USAEDS network processing are due to improvements in the models used to predict azimuth, slowness, and probability of detection. Kriged travel-time, azimuth, and slowness corrections, and their associated uncertainties, are computed using a ground-truth database. Improvements in station processing and the use of improved models for azimuth, slowness, and probability of detection have led to significant improvements in USAEDS network processing.
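A minimal sketch of ROC-based detector tuning as described above, using synthetic detector statistics rather than USAEDS data: the threshold is chosen to cap the Type 1 (false-alarm) rate while minimizing the Type 2 (miss) rate.

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(3)

# Synthetic detector statistic: noise-only windows vs windows containing a signal.
noise = rng.normal(0.0, 1.0, 5000)
signal = rng.normal(2.0, 1.0, 500)
scores = np.concatenate([noise, signal])
labels = np.concatenate([np.zeros_like(noise), np.ones_like(signal)])

fpr, tpr, thresholds = roc_curve(labels, scores)

# Pick the threshold that keeps the false-alarm (Type 1) rate below 1%
# while minimizing the miss (Type 2) rate.
ok = fpr <= 0.01
best = np.argmax(tpr[ok])
print("threshold:", thresholds[ok][best])
print("false-alarm rate:", fpr[ok][best], "miss rate:", 1 - tpr[ok][best])
```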
Quality Improvement on the Acute Inpatient Psychiatry Unit Using the Model for Improvement
Singh, Kuldeep; Sanderson, Joshua; Galarneau, David; Keister, Thomas; Hickman, Dean
2013-01-01
Background: A need exists for constant evaluation and modification of processes within healthcare systems to achieve quality improvement. One common approach is the Model for Improvement that can be used to clearly define aims, measures, and changes that are then implemented through a plan-do-study-act (PDSA) cycle. This approach is a commonly used method for improving quality in a wide range of fields. The Model for Improvement allows for a systematic process that can be revised at set time intervals to achieve a desired result. Methods: We used the Model for Improvement in an acute psychiatry unit (APU) to improve the screening incidence of abnormal involuntary movements in eligible patients (those starting or continuing on standing neuroleptics) with the Abnormal Involuntary Movement Scale (AIMS). Results: After 8 weeks of using the Model for Improvement, both of the participating inpatient services in the APU showed substantial overall improvement in screening for abnormal involuntary movements using the AIMS. Conclusion: Crucial aspects of a successful quality improvement initiative based on the Model for Improvement are well-defined goals, process measures, and structured PDSA cycles. Success also requires communication, organization, and participation of the entire team. PMID: 24052768
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Frank; Dennis, John; MacCready, Parker
This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation.
Advanced Hydrogen Liquefaction Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwartz, Joseph; Kromer, Brian; Neu, Ben
2011-09-28
The project identified and quantified ways to reduce the cost of hydrogen liquefaction, and reduce the cost of hydrogen distribution. The goal was to reduce the power consumption by 20% and then to reduce the capital cost. Optimizing the process, improving process equipment, and improving ortho-para conversion significantly reduced the power consumption of liquefaction, but by less than 20%. Because the efficiency improvement was less than the target, the program was stopped before the capital cost was addressed. These efficiency improvements could provide a benefit to the public to improve the design of future hydrogen liquefiers. The project increased the understanding of hydrogen liquefaction by modeling different processes and thoroughly examining ortho-para separation and conversion. The process modeling provided a benefit to the public because the project incorporated para hydrogen into the process modeling software, so liquefaction processes can be modeled more accurately than using only normal hydrogen. Adding catalyst to the first heat exchanger, a simple method to reduce liquefaction power, was identified, analyzed, and quantified. The demonstrated performance of ortho-para separation is sufficient for at least one identified process concept to show reduced power cost when compared to hydrogen liquefaction processes using conventional ortho-para conversion. The impact of improved ortho-para conversion can be significant because ortho-para conversion uses about 20-25% of the total liquefaction power, but performance improvement is necessary to realize a substantial benefit. Most of the energy used in liquefaction is for gas compression. Improvements in hydrogen compression will have a significant impact on overall liquefier efficiency. Improvements to turbines, heat exchangers, and other process equipment will have less impact.
NASA Astrophysics Data System (ADS)
Pomeroy, J. W.; Fang, X.
2014-12-01
The vast effort in hydrology devoted to parameter calibration as a means to improve model performance assumes that the models concerned are not fundamentally wrong. By focussing on finding optimal parameter sets and ascribing poor model performance to parameter or data uncertainty, these efforts may fail to consider the need to improve models with more intelligent descriptions of hydrological processes. To test this hypothesis, a flexible physically based hydrological model including a full suite of snow hydrology processes as well as warm season, hillslope and groundwater hydrology was applied to Marmot Creek Research Basin, Canadian Rocky Mountains where excellent driving meteorology and basin biophysical descriptions exist. Model parameters were set from values found in the basin or from similar environments; no parameters were calibrated. The model was tested against snow surveys and streamflow observations. The model used algorithms that describe snow redistribution, sublimation and forest canopy effects on snowmelt and evaporative processes that are rarely implemented in hydrological models. To investigate the contribution of these processes to model predictive capability, the model was "falsified" by deleting parameterisations for forest canopy snow mass and energy, blowing snow, intercepted rain evaporation, and sublimation. Model falsification by ignoring forest canopy processes contributed to a large increase in SWE errors for forested portions of the research basin with RMSE increasing from 19 to 55 mm and mean bias (MB) increasing from 0.004 to 0.62. In the alpine tundra portion, removing blowing processes resulted in an increase in model SWE MB from 0.04 to 2.55 on north-facing slopes and -0.006 to -0.48 on south-facing slopes. Eliminating these algorithms degraded streamflow prediction with the Nash Sutcliffe efficiency dropping from 0.58 to 0.22 and MB increasing from 0.01 to 0.09. These results show dramatic model improvements by including snow redistribution and melt processes associated with wind transport and forest canopies. As most hydrological models do not currently include these processes, it is suggested that modellers first improve the realism of model structures before trying to optimise what are inherently inadequate simulations of hydrology.
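For reference, the skill scores quoted above (RMSE, mean bias, and Nash-Sutcliffe efficiency) can be computed as in the short sketch below; the observed and simulated series are placeholders, and the exact mean-bias normalisation used in the study may differ.

```python
import numpy as np

def rmse(sim, obs):
    return np.sqrt(np.mean((sim - obs) ** 2))

def mean_bias(sim, obs):
    return np.mean(sim - obs) / np.mean(obs)          # one common normalised definition

def nash_sutcliffe(sim, obs):
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Placeholder series: observations vs two model variants.
obs        = np.array([10., 25., 60., 80., 55., 20.,  5.])
full_model = np.array([12., 27., 57., 78., 50., 22.,  6.])
falsified  = np.array([30., 55., 95., 60., 20., 10.,  2.])   # e.g. canopy/blowing-snow processes removed

for name, sim in [("full", full_model), ("falsified", falsified)]:
    print(name,
          "RMSE:", round(rmse(sim, obs), 1),
          "MB:", round(mean_bias(sim, obs), 2),
          "NSE:", round(nash_sutcliffe(sim, obs), 2))
```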
A process improvement model for software verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John; Sabolish, George
1994-01-01
We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.
A Total Quality Leadership Process Improvement Model
1993-12-01
By Archester Houston, Ph.D., and Steven L. Dockstader, Ph.D.; Total Quality Leadership Office. Final report, December 1993. Approved for public release and sale; distribution unlimited.
Microphysics, Radiation and Surface Processes in the Goddard Cumulus Ensemble (GCE) Model
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo
2002-01-01
In this talk, five specific major GCE improvements: (1) ice microphysics, (2) longwave and shortwave radiative transfer processes, (3) land surface processes, (4) ocean surface fluxes and (5) ocean mixed layer processes are presented. The performance of these new GCE improvements will be examined. Observations are used for model validation.
Askari, Marjan; Westerhof, Richard; Eslami, Saied; Medlock, Stephanie; de Rooij, Sophia E; Abu-Hanna, Ameen
2013-10-01
To propose a combined disease management and process modeling approach for evaluating and improving care processes, and demonstrate its usability and usefulness in a real-world fall management case study. We identified essential disease management related concepts and mapped them into explicit questions meant to expose areas for improvement in the respective care processes. We applied the disease management oriented questions to a process model of a comprehensive real world fall prevention and treatment program covering primary and secondary care. We relied on interviews and observations to complete the process models, which were captured in UML activity diagrams. A preliminary evaluation of the usability of our approach by gauging the experience of the modeler and an external validator was conducted, and the usefulness of the method was evaluated by gathering feedback from stakeholders at an invitational conference of 75 attendees. The process model of the fall management program was organized around the clinical tasks of case finding, risk profiling, decision making, coordination and interventions. Applying the disease management questions to the process models exposed weaknesses in the process including: absence of program ownership, under-detection of falls in primary care, and lack of efficient communication among stakeholders due to missing awareness about other stakeholders' workflow. The modelers experienced the approach as usable and the attendees of the invitational conference found the analysis results to be valid. The proposed disease management view of process modeling was usable and useful for systematically identifying areas of improvement in a fall management program. Although specifically applied to fall management, we believe our case study is characteristic of various disease management settings, suggesting the wider applicability of the approach. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Wu, Sheng; Jin, Qibing; Zhang, Ridong; Zhang, Junfeng; Gao, Furong
2017-07-01
In this paper, an improved constrained tracking control design is proposed for batch processes under uncertainties. A new process model that facilitates process state and tracking error augmentation, with additional tuning freedom, is first proposed. A subsequent controller design is then formulated using robust stable constrained MPC optimization. Unlike conventional robust model predictive control (MPC), the proposed method enables the controller design to offer more degrees of tuning freedom so that improved tracking control can be achieved, which is very important since uncertainties inevitably exist in practice and cause model/plant mismatches. An injection molding process is introduced to illustrate the effectiveness of the proposed MPC approach in comparison with conventional robust MPC. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
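The abstract's key ingredient, augmenting the process state with the tracking error so the controller gains extra tuning freedom, can be sketched generically as below. This is the standard velocity-form augmentation often used in tracking MPC, not the paper's exact formulation, and the system matrices are arbitrary placeholders.

```python
import numpy as np

# Nominal discrete-time batch-process model: x_{k+1} = A x_k + B u_k, y_k = C x_k
A = np.array([[1.0, 0.1],
              [0.0, 0.9]])
B = np.array([[0.0],
              [0.1]])
C = np.array([[1.0, 0.0]])

nx, nu, ny = A.shape[0], B.shape[1], C.shape[0]

# Augment the incremental state dx_k = x_k - x_{k-1} with the tracking error
# e_k = y_k - r, so an MPC cost can penalise both directly:
#   z_k = [dx_k; e_k],  z_{k+1} = A_aug z_k + B_aug du_k
A_aug = np.block([[A,     np.zeros((nx, ny))],
                  [C @ A, np.eye(ny)        ]])
B_aug = np.vstack([B, C @ B])

print("augmented A:\n", A_aug)
print("augmented B:\n", B_aug)
```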
Improvement of radiology services based on the process management approach.
Amaral, Creusa Sayuri Tahara; Rozenfeld, Henrique; Costa, Janaina Mascarenhas Hornos; Magon, Maria de Fátima de Andrade; Mascarenhas, Yvone Maria
2011-06-01
The health sector requires continuous investments to ensure the improvement of products and services from a technological standpoint, the use of new materials, equipment and tools, and the application of process management methods. Methods associated with the process management approach, such as the development of reference models of business processes, can provide significant innovations in the health sector and respond to the current market trend for modern management in this sector (Gunderman et al. (2008) [4]). This article proposes a process model for diagnostic medical X-ray imaging, from which it derives a primary reference model and describes how this information leads to gains in quality and improvements. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Khamidullin, R. I.
2018-05-01
The paper describes milestones in developing an optimal mathematical model for a business process related to the cost estimate documentation compiled during the construction and reconstruction of oil and gas facilities. It presents the study and analysis of fundamental issues in the petroleum industry caused by economic instability and the deterioration of business strategy. Business process management is presented as business process modeling aimed at improving the studied business process, covering the main optimization criteria and recommendations for improving the above-mentioned business model.
NASA Astrophysics Data System (ADS)
Abitew, T. A.; van Griensven, A.; Bauwens, W.
2015-12-01
Evapotranspiration is the main process in hydrology (on average around 60%), though it has not received as much attention in the evaluation and calibration of hydrological models. In this study, remote sensing (RS) derived evapotranspiration (ET) is used to improve the spatially distributed representation of ET in SWAT model applications in the upper Mara basin (Kenya) and the Blue Nile basin (Ethiopia). The RS-derived ET data are obtained from recently compiled global datasets (continuous monthly data at 1 km resolution from the MOD16NBI, SSEBop, ALEXI, and CMRSET models) and from regionally applied energy balance models (for several cloud-free days). The RS-ET data are used in three ways:
Method 1) to evaluate spatially distributed evapotranspiration model results;
Method 2) to calibrate the evapotranspiration processes in the hydrological model;
Method 3) to bias-correct the evapotranspiration in the hydrological model during simulation, after changing the SWAT code.
An inter-comparison of the RS-ET products shows that at present there is a significant bias, but at the same time an agreement on the spatial variability of ET. The ensemble mean of the different ET products appears to be the most realistic estimate and was used further in this study. The results show that: (1) the spatially mapped evapotranspiration of hydrological models differs clearly from RS-derived evapotranspiration (low correlations), and evapotranspiration in forested areas in particular is strongly underestimated compared with other land covers; (2) calibration improves the correlations between the RS and hydrological model results to some extent; (3) bias corrections are efficient in producing (seasonal or annual) evapotranspiration maps from hydrological models that are very similar to the patterns obtained from RS data. Although the bias correction is very efficient, it is advisable to improve the model results by better representing the ET processes through improved plant/crop computations, improved agricultural management practices, or improved meteorological data.
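Method 3 above (bias-correcting modeled ET against the RS ensemble mean) amounts to a simple scaling step; the sketch below shows a monthly multiplicative version outside the model code, with placeholder numbers. In the study the correction was applied inside SWAT during simulation.

```python
import numpy as np

# Placeholder monthly ET (mm/month) for one subbasin:
model_et = np.array([60., 65., 80., 95., 110., 120., 115., 105., 90., 75., 65., 60.])
# Ensemble mean of RS products (e.g. MOD16, SSEBop, ALEXI, CMRSET) for the same months:
rs_et    = np.array([75., 80., 95., 115., 135., 150., 140., 125., 105., 90., 78., 72.])

# Multiplicative bias-correction factor per month (an additive form is also possible).
factor = rs_et / model_et

# In practice the factors are derived over a calibration period and then
# applied to other years/simulations, e.g.:
other_year_model_et = model_et * 0.9
other_year_corrected = other_year_model_et * factor
print(np.round(other_year_corrected, 1))
```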
A Prototype for the Support of Integrated Software Process Development and Improvement
NASA Astrophysics Data System (ADS)
Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian
An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process can lead to this efficiency. This paper therefore proposes a software process maintenance framework consisting of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.
ARM - Midlatitude Continental Convective Clouds
Jensen, Mike; Bartholomew, Mary Jane; Genio, Anthony Del; Giangrande, Scott; Kollias, Pavlos
2012-01-19
Convective processes play a critical role in the Earth's energy balance through the redistribution of heat and moisture in the atmosphere and their link to the hydrological cycle. Accurate representation of convective processes in numerical models is vital to improving current and future simulations of the Earth's climate system. Despite improvements in computing power, current operational weather and global climate models are unable to resolve the natural temporal and spatial scales important to convective processes and therefore must turn to parameterization schemes to represent these processes. In turn, parameterization schemes in cloud-resolving models need to be evaluated for their generality and application to a variety of atmospheric conditions. Data from field campaigns with appropriate forcing descriptors have traditionally been used by modelers for evaluating and improving parameterization schemes.
ARM - Midlatitude Continental Convective Clouds (comstock-hvps)
Jensen, Mike; Comstock, Jennifer; Genio, Anthony Del; Giangrande, Scott; Kollias, Pavlos
2012-01-06
Convective processes play a critical role in the Earth's energy balance through the redistribution of heat and moisture in the atmosphere and their link to the hydrological cycle. Accurate representation of convective processes in numerical models is vital to improving current and future simulations of the Earth's climate system. Despite improvements in computing power, current operational weather and global climate models are unable to resolve the natural temporal and spatial scales important to convective processes and therefore must turn to parameterization schemes to represent these processes. In turn, parameterization schemes in cloud-resolving models need to be evaluated for their generality and application to a variety of atmospheric conditions. Data from field campaigns with appropriate forcing descriptors have traditionally been used by modelers for evaluating and improving parameterization schemes.
Improving operational anodising process performance using simulation approach
NASA Astrophysics Data System (ADS)
Liong, Choong-Yeun; Ghazali, Syarah Syahidah
2015-10-01
The use of aluminium is very widespread, especially in the transportation, electrical and electronics, architectural, automotive and engineering applications sectors. The anodizing process is therefore an important process for aluminium, making it durable, attractive and weather resistant. This research is focused on the anodizing process operations in the manufacturing and supplying of aluminium extrusion. The data required for the development of the model were collected from the observations and interviews conducted in the study. To study the current system, the processes involved in anodizing are modeled using Arena 14.5 simulation software. They consist of five main processes, namely degreasing, etching, desmut, anodizing, and sealing, together with 16 other processes. The results obtained were analyzed to identify the problems or bottlenecks that occurred and to propose improvement methods that can be implemented on the original model. Based on the comparisons made between the improvement methods, productivity could be increased by reallocating the workers and reducing loading time.
Health care managers' views on and approaches to implementing models for improving care processes.
Andreasson, Jörgen; Eriksson, Andrea; Dellve, Lotta
2016-03-01
To develop a deeper understanding of health-care managers' views on and approaches to the implementation of models for improving care processes. In health care, there are difficulties in implementing models for improving care processes that have been decided on by upper management. Leadership approaches to this implementation can affect the outcome. In-depth interviews with first- and second-line managers in Swedish hospitals were conducted and analysed using grounded theory. 'Coaching for participation' emerged as a central theme for managers in handling top-down initiated process development. The vertical approach in this coaching addresses how managers attempt to sustain unit integrity through adapting and translating orders from top management. The horizontal approach in the coaching refers to managers' strategies for motivating and engaging their employees in implementation work. Implementation models for improving care processes require a coaching leadership built on close manager-employee interaction, mindfulness regarding the pace of change at the unit level, managers with the competence to share responsibility with their teams and engaged employees with the competence to share responsibility for improving the care processes, and organisational structures that support process-oriented work. Implications for nursing management are the importance of giving nurse managers knowledge of change management. © 2015 John Wiley & Sons Ltd.
Teaching the NIATx Model of Process Improvement as an Evidence-Based Process
ERIC Educational Resources Information Center
Evans, Alyson C.; Rieckmann, Traci; Fitzgerald, Maureen M.; Gustafson, David H.
2007-01-01
Process Improvement (PI) is an approach for helping organizations to identify and resolve inefficient and ineffective processes through problem solving and pilot testing change. Use of PI in improving client access, retention and outcomes in addiction treatment is on the rise through the teaching of the Network for the Improvement of Addiction…
2017-08-01
The Performance Improvement of the Lagrangian Particle Dispersion Model (LPDM) Using Graphics Processing Unit (GPU) Computing, by Leelinda P. Dawson. Approved for public release; distribution unlimited. CUDA provides access to the GPU for general-purpose processing and is designed to work easily with multiple programming languages, including Fortran.
Process-Improvement Cost Model for the Emergency Department.
Dyas, Sheila R; Greenfield, Eric; Messimer, Sherri; Thotakura, Swati; Gholston, Sampson; Doughty, Tracy; Hays, Mary; Ivey, Richard; Spalding, Joseph; Phillips, Robin
2015-01-01
The objective of this report is to present a simplified, activity-based costing approach for hospital emergency departments (EDs) to use with Lean Six Sigma cost-benefit analyses. The cost model complexity is reduced by removing diagnostic and condition-specific costs, thereby revealing the underlying process activities' cost inefficiencies. Examples are provided for evaluating the cost savings from reducing discharge delays and the cost impact of keeping patients in the ED (boarding) after the decision to admit has been made. The process-improvement cost model provides a needed tool in selecting, prioritizing, and validating Lean process-improvement projects in the ED and other areas of patient care that involve multiple dissimilar diagnoses.
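A minimal sketch of the simplified activity-based costing idea, with hypothetical ED activities, durations, and cost rates: each activity's cost is its time multiplied by a resource cost rate, and an improvement scenario (here, halving boarding time) is compared against the baseline.

```python
# Hypothetical ED process activities: (minutes per patient, staff cost rate $/min)
activities = {
    "triage":            (10, 1.2),
    "physician_eval":    (25, 3.5),
    "treatment":         (40, 2.0),
    "discharge_process": (20, 1.2),
    "boarding_wait":     (120, 0.8),   # waiting for an inpatient bed after the decision to admit
}

def episode_cost(acts):
    """Sum of time-driven activity costs for one patient visit."""
    return sum(minutes * rate for minutes, rate in acts.values())

baseline = episode_cost(activities)

# Improvement scenario: a Lean project cuts boarding time in half.
improved = dict(activities)
improved["boarding_wait"] = (60, 0.8)

print(f"baseline cost per visit : ${baseline:.2f}")
print(f"improved cost per visit : ${episode_cost(improved):.2f}")
print(f"savings per visit       : ${baseline - episode_cost(improved):.2f}")
```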
NASA Astrophysics Data System (ADS)
Caldararu, Silvia; Purves, Drew W.; Smith, Matthew J.
2017-04-01
Improving international food security under a changing climate and increasing human population will be greatly aided by improving our ability to modify, understand and predict crop growth. What we predominantly have at our disposal are either process-based models of crop physiology or statistical analyses of yield datasets, both of which suffer from various sources of error. In this paper, we present a generic process-based crop model (PeakN-crop v1.0) which we parametrise using a Bayesian model-fitting algorithm against three different data sources: space-based vegetation indices, eddy covariance productivity measurements, and regional crop yields. We show that the model parametrised without data, based on prior knowledge of the parameters, can largely capture the observed behaviour, but the data-constrained model both greatly improves the model fit and reduces prediction uncertainty. We investigate the extent to which each dataset contributes to the model performance and show that while all data improve on the prior model fit, the satellite-based data and crop yield estimates are particularly important for reducing model error and uncertainty. Despite these improvements, we conclude that there are still significant knowledge gaps in terms of available data for model parametrisation, but our study can help indicate the necessary data collection to improve our predictions of crop yields and crop responses to environmental changes.
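To illustrate the Bayesian model-fitting step in miniature, the sketch below fits a toy logistic-growth "crop" model to one synthetic data stream with a random-walk Metropolis sampler. The real PeakN-crop v1.0 model, its parameters, and the three observational datasets are far richer; everything here is a placeholder.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy crop biomass model: logistic growth with rate r and capacity K.
def biomass(t, r, K):
    return K / (1.0 + (K - 1.0) * np.exp(-r * t))

# Synthetic "observations" (e.g. a productivity or vegetation-index proxy).
t_obs = np.linspace(0, 120, 25)
obs = biomass(t_obs, r=0.08, K=12.0) + rng.normal(scale=0.5, size=t_obs.size)

def log_post(theta):
    r, K = theta
    if r <= 0 or K <= 1:                      # flat priors on plausible ranges
        return -np.inf
    resid = obs - biomass(t_obs, r, K)
    return -0.5 * np.sum((resid / 0.5) ** 2)  # Gaussian likelihood, sigma assumed known

# Random-walk Metropolis sampler.
theta = np.array([0.05, 10.0])
lp = log_post(theta)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(scale=[0.005, 0.2])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta.copy())

chain = np.array(chain[5000:])                # discard burn-in
print("posterior mean r, K:", chain.mean(axis=0))
print("posterior std  r, K:", chain.std(axis=0))
```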
Studying the Accuracy of Software Process Elicitation: The User Articulated Model
ERIC Educational Resources Information Center
Crabtree, Carlton A.
2010-01-01
Process models are often the basis for demonstrating improvement and compliance in software engineering organizations. A descriptive model is a type of process model describing the human activities in software development that actually occur. The purpose of a descriptive model is to provide a documented baseline for further process improvement…
The maturing of the quality improvement paradigm in the SEL
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1993-01-01
The Software Engineering Laboratory uses a paradigm for improving the software process and product, called the quality improvement paradigm. This paradigm has evolved over the past 18 years, along with our software development processes and product. Since 1976, when we first began the SEL, we have learned a great deal about improving the software process and product, making a great many mistakes along the way. Quality improvement paradigm, as it is currently defined, can be broken up into six steps: characterize the current project and its environment with respect to the appropriate models and metrics; set the quantifiable goals for successful project performance and improvement; choose the appropriate process model and supporting methods and tools for this project; execute the processes, construct the products, and collect, validate, and analyze the data to provide real-time feedback for corrective action; analyze the data to evaluate the current practices, determine problems, record findings, and make recommendations for future project improvements; and package the experience gained in the form of updated and refined models and other forms of structured knowledge gained from this and prior projects and save it in an experience base to be reused on future projects.
Cook, David J; Thompson, Jeffrey E; Suri, Rakesh; Prinsen, Sharon K
2014-01-01
The absence of standardization in surgical care process, exemplified in a "solution shop" model, can lead to unwarranted variation, increased cost, and reduced quality. A comprehensive effort was undertaken to improve quality of care around indwelling bladder catheter use following surgery by creating a "focused factory" model within the cardiac surgical practice. Baseline compliance with Surgical Care Improvement Inf-9, removal of urinary catheter by the end of surgical postoperative day 2, was determined. Comparison of baseline data to postintervention results showed clinically important reductions in the duration of indwelling bladder catheters as well as marked reduction in practice variation. Following the intervention, Surgical Care Improvement Inf-9 guidelines were met in 97% of patients. Although clinical quality improvement was notable, the process to accomplish this-identification of patients suitable for standardized pathways, protocol application, and electronic systems to support the standardized practice model-has potentially greater relevance than the specific clinical results. © 2013 by the American College of Medical Quality.
Effective Report Preparation: Streamlining the Reporting Process. AIR 1999 Annual Forum Paper.
ERIC Educational Resources Information Center
Dalrymple, Margaret; Wang, Mindy; Frost, Jacquelyn
This paper describes the processes and techniques used to improve and streamline the standard student reports used at Purdue University (Indiana). Various models for analyzing reporting processes are described, especially the model used in the study, the Shewhart (Deming) Cycle, a method that aids in continuous analysis and improvement through a…
Improving surgeon utilization in an orthopedic department using simulation modeling
Simwita, Yusta W; Helgheim, Berit I
2016-01-01
Purpose: Worldwide, more than two billion people lack appropriate access to surgical services due to the mismatch between existing human resources and patient demand. Improving utilization of the existing workforce capacity can reduce the gap between surgical demand and available workforce capacity. In this paper, the authors use discrete-event simulation to explore the care process at an orthopedic department; the main focus is improving the utilization of surgeons while minimizing patient wait time. Methods: The authors collaborated with orthopedic department personnel to map the current operations of the orthopedic care process in order to identify factors that contribute to poor surgeon utilization and high patient waiting time. An observational approach was used to collect data. The developed model was validated by comparing the simulation output with actual patient data collected from the studied orthopedic care process. The authors then developed a proposed scenario to show how surgeon utilization could be improved. Results: The simulation results showed that if ancillary services could be performed before the start of clinic examination services, the orthopedic care process could be greatly improved, that is, surgeon utilization improved and patient waiting time reduced. The results also demonstrate that with improved surgeon utilization, up to a 55% increase in future demand can be accommodated without patients exceeding current waiting times at this clinic, thus improving patient access to health care services. Conclusion: This study shows how simulation modeling can be used to improve health care processes. The study was limited to a single care process; however, the findings can be applied to improve other orthopedic care processes with similar operational characteristics. PMID: 29355193
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo; Lang, Stephen E.; Zeng, Xiping; Li, Xiaowen; Matsui, Toshi; Mohr, Karen; Posselt, Derek; Chern, Jiundar; Peters-Lidard, Christa; Norris, Peter M.;
2014-01-01
Convection is the primary transport process in the Earth's atmosphere. About two-thirds of the Earth's rainfall and severe floods derive from convection. In addition, two-thirds of the global rain falls in the tropics, while the associated latent heat release accounts for three-fourths of the total heat energy for the Earth's atmosphere. Cloud-resolving models (CRMs) have been used to improve our understanding of cloud and precipitation processes and phenomena from micro-scale to cloud-scale and mesoscale as well as their interactions with radiation and surface processes. CRMs use sophisticated and realistic representations of cloud microphysical processes and can reasonably well resolve the time evolution, structure, and life cycles of clouds and cloud systems. CRMs also allow for explicit interaction between clouds, outgoing longwave (cooling) and incoming solar (heating) radiation, and ocean and land surface processes. Observations are required to initialize CRMs and to validate their results. The Goddard Cumulus Ensemble model (GCE) has been developed and improved at NASA/Goddard Space Flight Center over the past three decades. It is amulti-dimensional non-hydrostatic CRM that can simulate clouds and cloud systems in different environments. Early improvements and testing were presented in Tao and Simpson (1993) and Tao et al. (2003a). A review on the application of the GCE to the understanding of precipitation processes can be found in Simpson and Tao (1993) and Tao (2003). In this paper, recent model improvements (microphysics, radiation and land surface processes) are described along with their impact and performance on cloud and precipitation events in different geographic locations via comparisons with observations. In addition, recent advanced applications of the GCE are presented that include understanding the physical processes responsible for diurnal variation, examining the impact of aerosols (cloud condensation nuclei or CCN and ice nuclei or IN) on precipitation processes, utilizing a satellite simulator to improve the microphysics, providing better simulations for satellite-derived latent heating retrieval, and coupling with a general circulation model to improve the representation of precipitation processes.
Tetteh, Hassan A
2012-01-01
Kaizen is a proven management technique that has a practical application for health care in the context of health care reform and the 2010 Institute of Medicine landmark report on the future of nursing. Compounded productivity is the unique benefit of kaizen, and its principles are change, efficiency, performance of key essential steps, and the elimination of waste through small and continuous process improvements. The kaizen model offers specific instruction for perioperative nurses to achieve process improvement in a five-step framework that includes teamwork, personal discipline, improved morale, quality circles, and suggestions for improvement. Published by Elsevier Inc.
Creating State Accountability Systems That Help Schools Improve
ERIC Educational Resources Information Center
Elgart, Mark A.
2016-01-01
Organizational leaders from nearly every sector have been using continuous improvement models and improvement science for years to improve products, services, and processes. Though continuous improvement processes are not new in education, they are relatively new in the state policy arena. In a continuous improvement system, educators use data,…
NASA Astrophysics Data System (ADS)
Jamaluddin, Z.; Razali, A. M.; Mustafa, Z.
2015-02-01
The purpose of this paper is to examine the relationship between quality management practices (QMPs) and organisational performance for the manufacturing industry in Malaysia. In this study, a QMPs and organisational performance framework is developed according to a comprehensive literature review covering hard and soft quality factors in a manufacturing process environment. A total of 11 hypotheses have been put forward to test the relationships amongst six constructs: management commitment, training, process management, quality tools, continuous improvement and organisational performance. The model is analysed using Structural Equation Modeling (SEM) with AMOS software version 18.0 and Maximum Likelihood (ML) estimation. A total of 480 questionnaires were distributed, and 210 questionnaires were valid for analysis. The results of the modeling analysis using ML estimation indicate that the fit statistics of the QMPs and organisational performance model for the manufacturing industry are admissible. It was found that management commitment has a significant impact on training and process management. Similarly, training had a significant effect on quality tools, process management and continuous improvement. Furthermore, quality tools have a significant influence on process management and continuous improvement. Likewise, process management also has a significant impact on continuous improvement, and continuous improvement has a significant influence on organisational performance. However, the results of the study also show that there is no significant relationship between management commitment and quality tools, or between management commitment and continuous improvement. The results of the study can be used by managers to prioritise the implementation of QMPs. For instance, practices found to have a positive impact on organisational performance can be recommended to managers so that they can allocate resources to improve these practices and obtain better performance.
1996-04-01
THE ARMED SERVICES AND MODEL EMPLOYER STATUS FOR CHILD SUPPORT ENFORCEMENT: A PROPOSAL TO IMPROVE SERVICE OF PROCESS. A thesis presented to The Judge Advocate General's School, United States Army, by Major Alan L. Cook. ABSTRACT: On February 27, 1995, President Clinton issued Executive Order 12953, "Actions Required of all Executive Agencies to Facilitate Payment of Child...
Oh, Hong-Choon; Toh, Hong-Guan; Giap Cheong, Eddy Seng
2011-11-01
Using the classical process improvement framework of Plan-Do-Study-Act (PDSA), the diagnostic radiology department of a tertiary hospital identified several patient cycle time reduction strategies. Experimentation of these strategies (which included procurement of new machines, hiring of new staff, redesign of queue system, etc.) through pilot scale implementation was impractical because it might incur substantial expenditure or be operationally disruptive. With this in mind, simulation modeling was used to test these strategies via performance of "what if" analyses. Using the output generated by the simulation model, the team was able to identify a cost-free cycle time reduction strategy, which subsequently led to a reduction of patient cycle time and achievement of a management-defined performance target. As healthcare professionals work continually to improve healthcare operational efficiency in response to rising healthcare costs and patient expectation, simulation modeling offers an effective scientific framework that can complement established process improvement framework like PDSA to realize healthcare process enhancement. © 2011 National Association for Healthcare Quality.
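The study above used a discrete-event simulation for its "what if" analyses; a much lighter-weight way to sketch the same question is an M/M/c queueing approximation, shown below. The arrival rate, scan duration and machine counts are illustrative assumptions, not data from the hospital.

```python
from math import factorial

def erlang_c_wait(arrival_rate, service_rate, servers):
    """Mean waiting time in queue for an M/M/c system (same time units as rates)."""
    a = arrival_rate / service_rate              # offered load
    rho = a / servers                            # utilisation, must be < 1
    if rho >= 1:
        raise ValueError("system is unstable (utilisation >= 1)")
    series = sum(a**k / factorial(k) for k in range(servers))
    last = a**servers / (factorial(servers) * (1 - rho))
    p_wait = last / (series + last)              # Erlang C: probability of queueing
    return p_wait / (servers * service_rate - arrival_rate)

# Illustrative numbers: 6 patients/hour arrive, a scan takes 20 min (3/hour/machine)
lam, mu = 6.0, 3.0
for machines in (3, 4):                          # "what if" we add one machine?
    wq = erlang_c_wait(lam, mu, machines)
    cycle = wq + 1.0 / mu
    print(f"{machines} machines: wait {60*wq:5.1f} min, cycle time {60*cycle:5.1f} min")
```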
NASA Astrophysics Data System (ADS)
Song, Yanpo; Peng, Xiaoqi; Tang, Ying; Hu, Zhikun
2013-07-01
To improve the operation level of copper converters, an approach to optimal decision-making modeling for the copper matte converting process based on data mining is studied. In view of the characteristics of the process data, such as noise and small sample size, a new robust improved ANN (artificial neural network) modeling method is proposed. Taking into account the application purpose of the decision-making model, three new evaluation indexes named support, confidence and relative confidence are proposed. Using real production data and the methods mentioned above, an optimal decision-making model for the blowing time of the S1 period (the first slag-producing period) is developed. Simulation results show that this model can significantly improve the converting quality of the S1 period, increasing the optimal probability from about 70% to about 85%.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Frank; Dennis, John; MacCready, Parker
This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation. The main computational objectives were: 1. To develop computationally efficient, but physically based, parameterizations of estuary and continental shelf mixing processes for use in an Earth System Model (CESM). 2. To develop a two-way nested regional modeling framework in order to dynamically downscale the climate response of particular coastal ocean regions and to upscale the impact of the regional coastal processes to the global climate in an Earth System Model (CESM). 3. To develop computational infrastructure to enhance the efficiency of data transfer between specific sources and destinations, i.e., a point-to-point communication capability (used in objective 1), within POP, the ocean component of CESM.
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1992-01-01
The concepts of quality improvements have permeated many businesses. It is clear that the nineties will be the quality era for software and there is a growing need to develop or adapt quality improvement approaches to the software business. Thus we must understand software as an artifact and software as a business. Since the business we are dealing with is software, we must understand the nature of software and software development. The software discipline is evolutionary and experimental; it is a laboratory science. Software is development not production. The technologies of the discipline are human based. There is a lack of models that allow us to reason about the process and the product. All software is not the same; process is a variable, goals are variable, etc. Packaged, reusable experiences require additional resources in the form of organization, processes, people, etc. There have been a variety of organizational frameworks proposed to improve quality for various businesses. The ones discussed in this presentation include: Plan-Do-Check-Act, a quality improvement process based upon a feedback cycle for optimizing a single process model/production line; the Experience Factory/Quality Improvement Paradigm, continuous improvements through the experimentation, packaging, and reuse of experiences based upon a business's needs; Total Quality Management, a management approach to long term success through customer satisfaction based on the participation of all members of an organization; the SEI capability maturity model, a staged process improvement based upon assessment with regard to a set of key process areas until you reach a level 5 which represents continuous process improvement; and Lean (software) Development, a principle supporting the concentration of the production on 'value added' activities and the elimination or reduction of 'not value added' activities.
Improvement Guides for I.A. Curriculum
ERIC Educational Resources Information Center
Ritz, John M.; Wright, Lawrence S.
1977-01-01
Describes a project to revise "The Wisconsin Guide to Local Curriculum Improvement in Industrial Education, K-12", originally prepared in 1973. Four figures from the guide are included: (1) model of a field objective, (2) curriculum planning model, (3) instructional development process, and (4) process for developing objectives. (MF)
West, Sarah Katherine
2016-01-01
This article aims to summarize the successes and future implications for a nurse practitioner-driven committee on process improvement in trauma. The trauma nurse practitioner is uniquely positioned to recognize the need for clinical process improvement and enact change within the clinical setting. Application of the Strong Model of Advanced Practice proves to actively engage the trauma nurse practitioner in process improvement initiatives. Through enhancing nurse practitioner professional engagement, the committee aims to improve health care delivery to the traumatically injured patient. A retrospective review of the committee's first year reveals trauma nurse practitioner success in the domains of direct comprehensive care, support of systems, education, and leadership. The need for increased trauma nurse practitioner involvement has been identified for the domains of research and publication.
NASA Technical Reports Server (NTRS)
Goad, Clyde C.; Chadwell, C. David
1993-01-01
GEODYNII is a conventional batch least-squares differential corrector computer program with deterministic models of the physical environment. Conventional algorithms were used to process differenced phase and pseudorange data to determine eight-day Global Positioning System (GPS) orbits with several meter accuracy. However, random physical processes drive the errors whose magnitudes prevent improving the GPS orbit accuracy. To improve the orbit accuracy, these random processes should be modeled stochastically. The conventional batch least-squares algorithm cannot accommodate stochastic models; only a stochastic estimation algorithm, such as a sequential filter/smoother, is suitable. Also, GEODYNII cannot currently model the correlation among data values. Differenced pseudorange, and especially differenced phase, are precise data types that can be used to improve the GPS orbit precision. To overcome these limitations and improve the accuracy of GPS orbits computed using GEODYNII, we proposed to develop a sequential stochastic filter/smoother processor by using GEODYNII as a type of trajectory preprocessor. Our proposed processor is now completed. It contains a correlated double difference range processing capability, first order Gauss Markov models for the solar radiation pressure scale coefficient and y-bias acceleration, and a random walk model for the tropospheric refraction correction. The development approach was to interface the standard GEODYNII output files (measurement partials and variationals) with software modules containing the stochastic estimator, the stochastic models, and a double differenced phase range processing routine. Thus, no modifications to the original GEODYNII software were required. A schematic of the development is shown. The observational data are edited in the preprocessor and the data are passed to GEODYNII as one of its standard data types. A reference orbit is determined using GEODYNII as a batch least-squares processor and the GEODYNII measurement partial (FTN90) and variational (FTN80, V-matrix) files are generated. These two files along with a control statement file and a satellite identification and mass file are passed to the filter/smoother to estimate time-varying parameter states at each epoch, improved satellite initial elements, and improved estimates of constant parameters.
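The filter/smoother described above uses first-order Gauss-Markov models for the solar radiation pressure scale coefficient and y-bias acceleration, and a random walk for the tropospheric refraction correction. The sketch below shows how such stochastic parameter models are commonly discretized; the correlation time, noise levels and epoch spacing are illustrative assumptions, not GEODYNII settings.

```python
import numpy as np

def first_order_gauss_markov(n_steps, dt, tau, sigma, rng):
    """Discrete first-order Gauss-Markov sequence with correlation time tau
    and steady-state standard deviation sigma."""
    phi = np.exp(-dt / tau)                      # state transition factor
    q = sigma**2 * (1.0 - phi**2)                # process-noise variance per step
    x = np.zeros(n_steps)
    for k in range(1, n_steps):
        x[k] = phi * x[k - 1] + rng.normal(0.0, np.sqrt(q))
    return x

def random_walk(n_steps, dt, q_rate, rng):
    """Random-walk sequence with spectral density q_rate (units^2 per second)."""
    steps = rng.normal(0.0, np.sqrt(q_rate * dt), size=n_steps - 1)
    return np.concatenate(([0.0], np.cumsum(steps)))

rng = np.random.default_rng(1)
dt = 300.0                                        # assumed 5-minute epochs
srp_scale = first_order_gauss_markov(2000, dt, tau=6 * 3600.0, sigma=0.02, rng=rng)
tropo = random_walk(2000, dt, q_rate=1e-8, rng=rng)
print("SRP scale std:", srp_scale.std(), " tropo drift:", tropo[-1])
```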
NASA Astrophysics Data System (ADS)
Mukhopadhyay, P.; Phani Murali Krishna, R.; Goswami, Bidyut B.; Abhik, S.; Ganai, Malay; Mahakur, M.; Khairoutdinov, Marat; Dudhia, Jimmy
2016-05-01
In spite of significant improvements in numerical model physics, resolution and numerics, general circulation models (GCMs) find it difficult to simulate realistic seasonal and intraseasonal variability over the global tropics and particularly over the Indian summer monsoon (ISM) region. The bias is mainly attributed to the improper representation of physical processes. Among all the processes, cloud and convective processes appear to play a major role in modulating model bias. In recent times, the NCEP CFSv2 model has been adopted under the Monsoon Mission for dynamical monsoon forecasts over the Indian region. Analyses of climate free runs of CFSv2 at two resolutions, T126 and T382, show largely similar biases in simulating seasonal rainfall, in capturing intraseasonal variability at different scales over the global tropics, and in capturing tropical waves. Thus, the biases of CFSv2 indicate a deficiency in the model's parameterization of cloud and convective processes. With this background, and given the need to improve model fidelity, two approaches have been adopted. Firstly, in the superparameterization approach, 32 cloud-resolving models, each with a horizontal resolution of 4 km, are embedded in each GCM (CFSv2) grid and the conventional sub-grid-scale convective parameterization is deactivated. This is done to demonstrate the role of resolving cloud processes which otherwise remain unresolved. The superparameterized CFSv2 (SP-CFS) is developed at a coarser resolution (T62). The model is integrated for six and a half years in climate free-run mode, initialised from 16 May 2008. The analyses reveal that SP-CFS simulates a significantly improved mean state compared to the default CFS. The systematic biases of too little rainfall over the Indian land mass and a colder troposphere have been substantially reduced. Most importantly, the convectively coupled equatorial waves and the eastward-propagating MJO are simulated with more fidelity in SP-CFS. The reason for this improvement in the model mean state is the systematic improvement in the moisture field, temperature profile and moist instability. The model also better simulates the cloud-rainfall relation. This initiative demonstrates the role of cloud processes in the mean state of a coupled GCM. As the superparameterization approach is computationally expensive, in another approach the conventional Simplified Arakawa Schubert (SAS) scheme is replaced by a revised SAS scheme (RSAS) and the old, simplified cloud scheme of Zhao-Carr (1997) is replaced by WSM6 in CFSv2 (hereafter CFS-CR). The primary objective of these modifications is to improve the distribution of convective rain in the model by using RSAS and of the grid-scale (large-scale, non-convective) rain by WSM6. WSM6 computes the tendencies of six classes of hydrometeors (water vapour, cloud water, ice, snow, graupel and rain water) at each model grid point and contributes to the low, middle and high cloud fractions. By incorporating WSM6, for the first time in a global climate model, we are able to show a reasonable simulation of the vertical and spatial distributions of cloud ice and cloud liquid water compared to CloudSat observations. CFS-CR also shows improvement in simulating the annual rainfall cycle and intraseasonal variability over the ISM region. These improvements in CFS-CR are likely associated with an improved distribution of convective and stratiform rainfall in the model.
These initiatives clearly address a long-standing issue of resolving cloud processes in climate models and demonstrate that improved cloud and convective process parameterizations can eventually reduce systematic biases and improve model fidelity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crapps, Justin M.; Clarke, Kester D.; Katz, Joel D.
2012-06-06
We use experimentation and finite element modeling to study a Hot Isostatic Press (HIP) manufacturing process for U-10Mo Monolithic Fuel Plates. Finite element simulations are used to identify the material properties affecting the process and improve the process geometry. Accounting for the high temperature material properties and plasticity is important to obtain qualitative agreement between model and experimental results. The model allows us to improve the process geometry and provide guidance on selection of material and finish conditions for the process strongbacks. We conclude that the HIP can must be fully filled to provide uniform normal stress across the bonding interface.
DOT National Transportation Integrated Search
2005-01-01
This effort demonstrates business process modeling to describe the integration of particular planning and programming activities of a state highway agency. The motivations to document planning and programming activities are that: (i) resources for co...
Software Engineering Program: Software Process Improvement Guidebook
NASA Technical Reports Server (NTRS)
1996-01-01
The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.
Anatomically constrained neural network models for the categorization of facial expression
NASA Astrophysics Data System (ADS)
McMenamin, Brenton W.; Assadi, Amir H.
2004-12-01
In humans, facial expression recognition is performed by the amygdala, which uses parallel processing streams to identify expressions quickly and accurately. Additionally, it is possible that a feedback mechanism plays a role in this process as well. A model with a similar parallel structure and feedback mechanisms could be used to improve current facial recognition algorithms, for which varied expressions are a source of error. An anatomically constrained artificial neural-network model was created that uses this parallel processing architecture and feedback to categorize facial expressions. The presence of a feedback mechanism was not found to significantly improve performance for models with parallel architecture. However, the use of parallel processing streams significantly improved accuracy over a similar network that did not have parallel architecture. Further investigation is necessary to determine the benefits of using parallel streams and feedback mechanisms in more advanced object recognition tasks.
Anatomically constrained neural network models for the categorization of facial expression
NASA Astrophysics Data System (ADS)
McMenamin, Brenton W.; Assadi, Amir H.
2005-01-01
In humans, facial expression recognition is performed by the amygdala, which uses parallel processing streams to identify expressions quickly and accurately. Additionally, it is possible that a feedback mechanism plays a role in this process as well. A model with a similar parallel structure and feedback mechanisms could be used to improve current facial recognition algorithms, for which varied expressions are a source of error. An anatomically constrained artificial neural-network model was created that uses this parallel processing architecture and feedback to categorize facial expressions. The presence of a feedback mechanism was not found to significantly improve performance for models with parallel architecture. However, the use of parallel processing streams significantly improved accuracy over a similar network that did not have parallel architecture. Further investigation is necessary to determine the benefits of using parallel streams and feedback mechanisms in more advanced object recognition tasks.
Mingguang, Zhang; Juncheng, Jiang
2008-10-30
Overpressure is an important cause of domino effects in accidents involving chemical process equipment. Damage probability and the relative threshold value are two necessary parameters in QRA of this phenomenon. Some simple models have been proposed based on scarce data or oversimplified assumptions. Hence, more data about damage to chemical process equipment were gathered and analyzed, a quantitative relationship between damage probability and damage degree of equipment was built, and reliable probit models were developed for specific categories of chemical process equipment. Finally, the improvements over present models were evidenced through comparison with other models in the literature, taking into account such parameters as consistency between models and data and depth of quantitativeness in QRA.
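The abstract above develops probit models relating overpressure to equipment damage probability. A common form of such a model is Pr = Φ(a + b·ln Δp), sketched below; the coefficients are hypothetical placeholders for one equipment category, not the values fitted in the paper, and conventions for the probit intercept vary between sources.

```python
from math import erf, exp, log, sqrt
from statistics import NormalDist

def damage_probability(overpressure_pa, a, b):
    """Probit-style model: damage probability = Phi(a + b * ln(overpressure))."""
    y = a + b * log(overpressure_pa)             # probit value
    return 0.5 * (1.0 + erf(y / sqrt(2.0)))      # standard normal CDF

def threshold_overpressure(target_prob, a, b):
    """Overpressure that yields a given damage probability (model inverse)."""
    y = NormalDist().inv_cdf(target_prob)
    return exp((y - a) / b)

# Hypothetical coefficients for one equipment category (illustration only)
a, b = -18.0, 1.8
for dp in (5e3, 2e4, 5e4, 1e5):                  # static overpressure in Pa
    print(f"{dp:8.0f} Pa -> damage probability {damage_probability(dp, a, b):.3f}")
print("Overpressure for 1% damage probability:",
      round(threshold_overpressure(0.01, a, b)), "Pa")
```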
SEIPS-based process modeling in primary care.
Wooldridge, Abigail R; Carayon, Pascale; Hundt, Ann Schoofs; Hoonakker, Peter L T
2017-04-01
Process mapping, often used as part of the human factors and systems engineering approach to improve care delivery and outcomes, should be expanded to represent the complex, interconnected sociotechnical aspects of health care. Here, we propose a new sociotechnical process modeling method to describe and evaluate processes, using the SEIPS model as the conceptual framework. The method produces a process map and supplementary table, which identify work system barriers and facilitators. In this paper, we present a case study applying this method to three primary care processes. We used purposeful sampling to select staff (care managers, providers, nurses, administrators and patient access representatives) from two clinics to observe and interview. We show the proposed method can be used to understand and analyze healthcare processes systematically and identify specific areas of improvement. Future work is needed to assess usability and usefulness of the SEIPS-based process modeling method and further refine it. Copyright © 2016 Elsevier Ltd. All rights reserved.
SEIPS-Based Process Modeling in Primary Care
Wooldridge, Abigail R.; Carayon, Pascale; Hundt, Ann Schoofs; Hoonakker, Peter
2016-01-01
Process mapping, often used as part of the human factors and systems engineering approach to improve care delivery and outcomes, should be expanded to represent the complex, interconnected sociotechnical aspects of health care. Here, we propose a new sociotechnical process modeling method to describe and evaluate processes, using the SEIPS model as the conceptual framework. The method produces a process map and supplementary table, which identify work system barriers and facilitators. In this paper, we present a case study applying this method to three primary care processes. We used purposeful sampling to select staff (care managers, providers, nurses, administrators and patient access representatives) from two clinics to observe and interview. We show the proposed method can be used to understand and analyze healthcare processes systematically and identify specific areas of improvement. Future work is needed to assess usability and usefulness of the SEIPS-based process modeling method and further refine it. PMID:28166883
NASA Astrophysics Data System (ADS)
Shen, Yan; Ge, Jin-ming; Zhang, Guo-qing; Yu, Wen-bin; Liu, Rui-tong; Fan, Wei; Yang, Ying-xuan
2018-01-01
This paper explores the problem of signal processing in optical current transformers (OCTs). Based on the noise characteristics of OCTs, such as overlapping signals, noise frequency bands, low signal-to-noise ratios, and difficulties in acquiring statistical features of noise power, an improved standard Kalman filtering algorithm was proposed for direct current (DC) signal processing. The state-space model of the OCT DC measurement system is first established, and then mixed noise can be processed by adding mixed noise into measurement and state parameters. According to the minimum mean squared error criterion, state predictions and update equations of the improved Kalman algorithm could be deduced based on the established model. An improved central difference Kalman filter was proposed for alternating current (AC) signal processing, which improved the sampling strategy and noise processing of colored noise. Real-time estimation and correction of noise were achieved by designing AC and DC noise recursive filters. Experimental results show that the improved signal processing algorithms had a good filtering effect on the AC and DC signals with mixed noise of OCT. Furthermore, the proposed algorithm was able to achieve real-time correction of noise during the OCT filtering process.
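The improved algorithms above are specific to OCT noise characteristics and are not reproduced here; the sketch below is only a textbook scalar Kalman filter for a near-constant DC level in noise, illustrating the state-space setting the paper builds on. The signal level and noise variances are illustrative assumptions.

```python
import numpy as np

def scalar_kalman_dc(measurements, q, r, x0=0.0, p0=1.0):
    """Textbook scalar Kalman filter for a (nearly) constant DC level.

    State model:       x_k = x_{k-1} + w_k,   w_k ~ N(0, q)
    Measurement model: z_k = x_k + v_k,       v_k ~ N(0, r)
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # predict
        p = p + q
        # update
        k = p / (p + r)              # Kalman gain
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(0)
true_dc = 2.5                                    # amperes (illustrative)
z = true_dc + rng.normal(0.0, 0.2, size=500)     # noisy OCT samples
est = scalar_kalman_dc(z, q=1e-6, r=0.04)
print("last estimate:", est[-1], " error of raw mean:", abs(z.mean() - true_dc))
```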
Fractal modeling of fluidic leakage through metal sealing surfaces
NASA Astrophysics Data System (ADS)
Zhang, Qiang; Chen, Xiaoqian; Huang, Yiyong; Chen, Yong
2018-04-01
This paper investigates the fluidic leak rate through metal sealing surfaces by developing fractal models for the contact process and leakage process. An improved model is established to describe the seal-contact interface of two rough metal surfaces. The contact model divides the deformed regions by classifying the asperities of different characteristic lengths into the elastic, elastic-plastic and plastic regimes. Using the improved contact model, the leakage channel under the contact surface is mathematically modeled based on fractal theory. The leakage model obtains the leak rate using fluid transport theory in porous media, considering that the pore-forming percolation channels can be treated as a combination of filled tortuous capillaries. The effects of fractal structure, surface material and gasket size on the contact process and leakage process are analyzed through numerical simulations for sealed ring gaskets.
2017-06-01
This research expands the modeling and simulation (M and S) body of knowledge through the development of an Implicit Model Development Process (IMDP)... When augmented to traditional Model Development Processes (MDP), the IMDP enables the development of models that can address a broader array of... where a broader, more holistic approach to defining a model's referent is achieved. Next, the IMDP codifies the process for implementing the improved model
Lu, Lingbo; Li, Jingshan; Gisler, Paula
2011-06-01
Radiology tests, such as MRI, CT-scan, X-ray and ultrasound, are cost intensive and insurance pre-approvals are necessary to get reimbursement. In some cases, tests may be denied for payments by insurance companies due to lack of pre-approvals, inaccurate or missing necessary information. This can lead to substantial revenue losses for the hospital. In this paper, we present a simulation study of a centralized scheduling process for outpatient radiology tests at a large community hospital (Central Baptist Hospital in Lexington, Kentucky). Based on analysis of the central scheduling process, a simulation model of information flow in the process has been developed. Using such a model, the root causes of financial losses associated with errors and omissions in this process were identified and analyzed, and their impacts were quantified. In addition, "what-if" analysis was conducted to identify potential process improvement strategies in the form of recommendations to the hospital leadership. Such a model provides a quantitative tool for continuous improvement and process control in radiology outpatient test scheduling process to reduce financial losses associated with process error. This method of analysis is also applicable to other departments in the hospital.
A novel double loop control model design for chemical unstable processes.
Cong, Er-Ding; Hu, Ming-Hui; Tu, Shan-Tung; Xuan, Fu-Zhen; Shao, Hui-He
2014-03-01
In this manuscript, based on the Smith predictor control scheme for unstable processes in industry, an improved double loop control model is proposed for chemical unstable processes. The inner loop stabilizes the integrating or unstable process and transforms the original process into a stable first-order plus pure dead-time process. The outer loop enhances the performance of the set-point response, and a disturbance controller is designed to enhance the performance of the disturbance response. The improved control system is simple, with exact physical meaning, and its characteristic equation is easy to stabilize. The three controllers are designed separately in the improved scheme, so each controller is easy to design and good control performance can be obtained for each closed-loop transfer function separately. The robust stability of the proposed control scheme is analyzed. Finally, case studies illustrate that the improved method can give better system performance than existing design methods. © 2013 ISA. Published by ISA. All rights reserved.
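The paper's controller designs are not given in the abstract; the sketch below only illustrates the double-loop idea numerically under assumed dynamics: an inner proportional loop stabilizes an unstable first-order process with dead time, and an outer PI loop then tracks the set point. The process parameters and gains are illustrative, not the paper's tuning rules.

```python
import numpy as np

# Toy unstable first-order process with dead time:  dy/dt = a*y + K*u(t - L)
a, K, L = 0.1, 1.0, 0.5          # illustrative process parameters
dt, T = 0.01, 60.0
n, delay = int(T / dt), int(L / dt)

kc = 0.8                         # inner proportional gain (stabilizing, since K*kc > a)
kp, ki = 0.4, 0.1                # outer PI gains for set-point tracking (illustrative)

y, integ = 0.0, 0.0
u_hist = np.zeros(n + delay)     # buffer implementing the dead time
out = np.zeros(n)

for k in range(n):
    err = 1.0 - y                          # unit set-point step
    integ += err * dt
    u_outer = kp * err + ki * integ        # outer loop: tracking performance
    u = u_outer - kc * y                   # inner loop: stabilizing feedback
    u_hist[k + delay] = u                  # control reaches the process only after L
    y += dt * (a * y + K * u_hist[k])      # Euler step of the process
    out[k] = y

print(f"final output: {out[-1]:.3f}, peak: {out.max():.3f}")
```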
Effect of Time Varying Gravity on DORIS processing for ITRF2013
NASA Astrophysics Data System (ADS)
Zelensky, N. P.; Lemoine, F. G.; Chinn, D. S.; Beall, J. W.; Melachroinos, S. A.; Beckley, B. D.; Pavlis, D.; Wimert, J.
2013-12-01
Computations are under way to develop a new time series of DORIS SINEX solutions to contribute to the development of the new realization of the terrestrial reference frame (cf. ITRF2013). One of the improvements envisaged is the application of improved models of time-variable gravity in the background orbit modeling. At GSFC we have developed a time series of spherical harmonics to degree and order 5 (using the GOC02S model as a base), based on the processing of SLR and DORIS data to 14 satellites from 1993 to 2013. This is compared with the standard approach used in ITRF2008, based on the static model EIGEN-GL04S1, which included secular variations in only a few select coefficients. Previous work on altimeter satellite POD (cf. TOPEX/Poseidon, Jason-1, Jason-2) has shown that the standard model is not adequate and that orbit improvements are observed with the application of more detailed models of time-variable gravity. In this study, we quantify the impact of TVG modeling on DORIS satellite POD and ascertain the impact on DORIS station positions estimated weekly from 1993 to 2013. The numerous recent improvements to SLR and DORIS processing at GSFC include a more complete compliance to IERS2010 standards, improvements to SLR/DORIS measurement modeling, and improved non-conservative force modeling for DORIS satellites. These improvements will affect gravity coefficient estimates, POD, and the station solutions. Tests evaluate the impact of time varying gravity on tracking data residuals, station consistency, and the geocenter and scale reference frame parameters.
Characterizing and Assessing a Large-Scale Software Maintenance Organization
NASA Technical Reports Server (NTRS)
Briand, Lionel; Melo, Walcelio; Seaman, Carolyn; Basili, Victor
1995-01-01
One important component of a software process is the organizational context in which the process is enacted. This component is often missing or incomplete in current process modeling approaches. One technique for modeling this perspective is the Actor-Dependency (AD) Model. This paper reports on a case study which used this approach to analyze and assess a large software maintenance organization. Our goal was to identify the approach's strengths and weaknesses while providing practical recommendations for improvement and research directions. The AD model was found to be very useful in capturing the important properties of the organizational context of the maintenance process, and aided in the understanding of the flaws found in this process. However, a number of opportunities for extending and improving the AD model were identified. Among others, there is a need to incorporate quantitative information to complement the qualitative model.
Cutting, Elizabeth M; Overby, Casey L; Banchero, Meghan; Pollin, Toni; Kelemen, Mark; Shuldiner, Alan R; Beitelshees, Amber L
Delivering genetic test results to clinicians is a complex process. It involves many actors and multiple steps, requiring all of these to work together in order to create an optimal course of treatment for the patient. We used information gained from focus groups in order to illustrate the current process of delivering genetic test results to clinicians. We propose a business process model and notation (BPMN) representation of this process for a Translational Pharmacogenomics Project being implemented at the University of Maryland Medical Center, so that personalized medicine program implementers can identify areas to improve genetic testing processes. We found that the current process could be improved to reduce input errors, better inform and notify clinicians about the implications of certain genetic tests, and make results more easily understood. We demonstrate our use of BPMN to improve this important clinical process for CYP2C19 genetic testing in patients undergoing invasive treatment of coronary heart disease.
Cutting, Elizabeth M.; Overby, Casey L.; Banchero, Meghan; Pollin, Toni; Kelemen, Mark; Shuldiner, Alan R.; Beitelshees, Amber L.
2015-01-01
Delivering genetic test results to clinicians is a complex process. It involves many actors and multiple steps, requiring all of these to work together in order to create an optimal course of treatment for the patient. We used information gained from focus groups in order to illustrate the current process of delivering genetic test results to clinicians. We propose a business process model and notation (BPMN) representation of this process for a Translational Pharmacogenomics Project being implemented at the University of Maryland Medical Center, so that personalized medicine program implementers can identify areas to improve genetic testing processes. We found that the current process could be improved to reduce input errors, better inform and notify clinicians about the implications of certain genetic tests, and make results more easily understood. We demonstrate our use of BPMN to improve this important clinical process for CYP2C19 genetic testing in patients undergoing invasive treatment of coronary heart disease. PMID:26958179
The TAME Project: Towards improvement-oriented software environments
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Rombach, H. Dieter
1988-01-01
Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.
NASA Astrophysics Data System (ADS)
Rosyidi, C. N.; Jauhari, WA; Suhardi, B.; Hamada, K.
2016-02-01
Quality improvement must be performed in a company to maintain its product competitiveness in the market. The goal of such improvement is to increase customer satisfaction and the profitability of the company. In current practice, a company needs several suppliers to provide the components used in the assembly process of a final product, hence quality improvement of the final product must involve the suppliers. In this paper, an optimization model to allocate variance reduction is developed. Variance reduction is an important element of quality improvement for both the manufacturer and the suppliers. To improve the quality of the suppliers' components, the manufacturer must invest an amount of its financial resources in the suppliers' learning processes. The objective function of the model minimizes the total cost, which consists of the investment cost and both internal and external quality costs. A learning curve determines how the suppliers' employees respond to the learning processes in reducing component variance.
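The abstract above does not give the model's exact functional forms; the sketch below shows one plausible shape of such an allocation problem: choose each supplier's variance reduction to minimize an investment cost that grows with the reduction (through a learning-curve-like exponent) plus a quadratic quality-loss cost that falls with the remaining variance. All coefficients and the specific cost forms are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative data for three suppliers (not from the paper)
sigma0 = np.array([0.30, 0.25, 0.40])       # current component standard deviations
invest = np.array([800.0, 1200.0, 600.0])   # investment cost scale per supplier
learn = np.array([0.8, 1.1, 0.9])           # learning-curve exponents (cost of reduction)
loss_k = 5000.0                              # quadratic (Taguchi-style) loss coefficient

def total_cost(x):
    """x[i] in [0, 1): fraction by which supplier i's standard deviation is reduced."""
    sigma = sigma0 * (1.0 - x)
    investment = np.sum(invest * (1.0 / (1.0 - x)) ** learn - invest)  # steep as x -> 1
    quality_loss = loss_k * np.sum(sigma ** 2)   # variances add across components
    return investment + quality_loss

res = minimize(total_cost, x0=np.full(3, 0.1), bounds=[(0.0, 0.9)] * 3)
print("optimal reduction fractions:", np.round(res.x, 3))
print("minimum total cost:", round(float(res.fun), 1))
```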
Application of agent-based system for bioprocess description and process improvement.
Gao, Ying; Kipling, Katie; Glassey, Jarka; Willis, Mark; Montague, Gary; Zhou, Yuhong; Titchener-Hooker, Nigel J
2010-01-01
Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making either to maintain process consistency or to identify optimal operating conditions. To predict the whole bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze the bioprocesses based on a whole process understanding and considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases leading to the capacity for fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components co-operate with each other in performing their tasks. These include the description of the whole process behavior, evaluating process operating conditions, monitoring of the operating processes, predicting critical process performance, and providing guidance to decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, the system can be applied to identify abnormal process operation events and then to provide suggestions as to how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. The implementation of the agent-based approach is illustrated via selected application scenarios, which demonstrate how such a framework may enable the better integration of process operations by providing a plant-wide process description to facilitate process improvement. Copyright 2009 American Institute of Chemical Engineers
NASA Astrophysics Data System (ADS)
Norton, Alexander J.; Rayner, Peter J.; Koffi, Ernest N.; Scholze, Marko
2018-04-01
The synthesis of model and observational information using data assimilation can improve our understanding of the terrestrial carbon cycle, a key component of the Earth's climate-carbon system. Here we provide a data assimilation framework for combining observations of solar-induced chlorophyll fluorescence (SIF) and a process-based model to improve estimates of terrestrial carbon uptake or gross primary production (GPP). We then quantify and assess the constraint SIF provides on the uncertainty in global GPP through model process parameters in an error propagation study. By incorporating 1 year of SIF observations from the GOSAT satellite, we find that the parametric uncertainty in global annual GPP is reduced by 73 % from ±19.0 to ±5.2 Pg C yr-1. This improvement is achieved through strong constraint of leaf growth processes and weak to moderate constraint of physiological parameters. We also find that the inclusion of uncertainty in shortwave down-radiation forcing has a net-zero effect on uncertainty in GPP when incorporated into the SIF assimilation framework. This study demonstrates the powerful capacity of SIF to reduce uncertainties in process-based model estimates of GPP and the potential for improving our predictive capability of this uncertain carbon flux.
2006-06-01
research will cover an overview of business process engineering (BPR) and operation management. The focus will be on the basic process of BPR, inventory... management and improvement of the process of business operation management to appropriately provide a basic model for the Indonesian Air Force in... discuss the operation management aspects of inventory management and process improvement, including Economic Order Quantity, Material Requirement
Modelling the impacts of pests and diseases on agricultural systems.
Donatelli, M; Magarey, R D; Bregaglio, S; Willocquet, L; Whish, J P M; Savary, S
2017-07-01
The improvement and application of pest and disease models to analyse and predict yield losses, including those due to climate change, is still a challenge for the scientific community. Applied modelling of crop diseases and pests has mostly targeted the development of support capabilities to schedule scouting or pesticide applications. There is a need for research to both broaden the scope and evaluate the capabilities of pest and disease models. Key research questions not only involve the assessment of the potential effects of climate change on known pathosystems, but also on new pathogens which could alter the (still incompletely documented) impacts of pests and diseases on agricultural systems. Yield loss data collected in various current environments may no longer represent an adequate reference to develop tactical, decision-oriented models for plant diseases and pests and their impacts, because of the ongoing changes in climate patterns. Process-based agricultural simulation modelling, on the other hand, appears to represent a viable methodology to estimate the impacts of these potential effects. A new generation of tools based on state-of-the-art knowledge and technologies is needed to allow systems analysis including key processes and their dynamics over a suitable range of environmental variables. This paper offers a brief overview of the current state of development in coupling pest and disease models to crop models, and discusses technical and scientific challenges. We propose a five-stage roadmap to improve the simulation of the impacts caused by plant diseases and pests: i) improve the quality and availability of data for model inputs; ii) improve the quality and availability of data for model evaluation; iii) improve the integration with crop models; iv) improve the processes for model evaluation; and v) develop a community of plant pest and disease modellers.
Process of Continual Improvement in a School of Nursing.
ERIC Educational Resources Information Center
Norman, Linda D.; Lutenbacher, Melanie
1996-01-01
Vanderbilt University School of Nursing used the Batalden model of systems improvement to change its program. The model analyzes services and products, customers, social community need, and customer knowledge to approach improvements in a systematic way. (JOW)
The Iterative Research Cycle: Process-Based Model Evaluation
NASA Astrophysics Data System (ADS)
Vrugt, J. A.
2014-12-01
The ever increasing pace of computational power, along with continued advances in measurement technologies and improvements in process understanding has stimulated the development of increasingly complex physics based models that simulate a myriad of processes at different spatial and temporal scales. Reconciling these high-order system models with perpetually larger volumes of field data is becoming more and more difficult, particularly because classical likelihood-based fitting methods lack the power to detect and pinpoint deficiencies in the model structure. In this talk I will give an overview of our latest research on process-based model calibration and evaluation. This approach, rooted in Bayesian theory, uses summary metrics of the calibration data rather than the data itself to help detect which component(s) of the model is (are) malfunctioning and in need of improvement. A few case studies involving hydrologic and geophysical models will be used to demonstrate the proposed methodology.
NASA Astrophysics Data System (ADS)
Jackson-Blake, L.
2014-12-01
Process-based catchment water quality models are increasingly used as tools to inform land management. However, for such models to be reliable they need to be well calibrated and shown to reproduce key catchment processes. Calibration can be challenging for process-based models, which tend to be complex and highly parameterised. Calibrating a large number of parameters generally requires a large amount of monitoring data, but even in well-studied catchments, streams are often only sampled at a fortnightly or monthly frequency. The primary aim of this study was therefore to investigate how the quality and uncertainty of model simulations produced by one process-based catchment model, INCA-P (the INtegrated CAtchment model of Phosphorus dynamics), were improved by calibration to higher frequency water chemistry data. Two model calibrations were carried out for a small rural Scottish catchment: one using 18 months of daily total dissolved phosphorus (TDP) concentration data, another using a fortnightly dataset derived from the daily data. To aid comparability, calibrations were carried out automatically using the MCMC-DREAM algorithm. Using daily rather than fortnightly data resulted in improved simulation of the magnitude of peak TDP concentrations, in turn resulting in improved model performance statistics. Marginal posteriors were better constrained by the higher frequency data, resulting in a large reduction in parameter-related uncertainty in simulated TDP (the 95% credible interval decreased from 26 to 6 μg/l). The number of parameters that could be reliably auto-calibrated was lower for the fortnightly data, leading to the recommendation that parameters should not be varied spatially for models such as INCA-P unless there is solid evidence that this is appropriate, or there is a real need to do so for the model to fulfil its purpose. Secondary study aims were to highlight the subjective elements involved in auto-calibration and suggest practical improvements that could make models such as INCA-P more suited to auto-calibration and uncertainty analyses. Two key improvements include model simplification, so that all model parameters can be included in an analysis of this kind, and better documenting of recommended ranges for each parameter, to help in choosing sensible priors.
NASA Astrophysics Data System (ADS)
Kobayashi, Takashi; Komoda, Norihisa
Traditional business process design methods, of which the use case is the most typical, provide no useful framework for designing the activity sequence. Therefore, design efficiency and quality vary widely according to the designer's experience and skill. In this paper, to solve this problem, we propose a model of business events and their state transitions (a basic business event model) based on the language/action perspective, a result from the cognitive science domain. In business process design using this model, we determine event occurrence conditions so that the events synchronize with one another. We also propose a design pattern for determining event occurrence conditions (a business event improvement strategy). Lastly, we apply the business process design method based on the business event model and the business event improvement strategy to the credit card issue process and estimate its effect.
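A minimal sketch of the business-event idea follows: each event moves through a small set of states, and an event's occurrence condition requires the events it depends on to have completed, which is the synchronization the abstract refers to. The events and states below are invented for a simplified credit-card issue example and only illustrate the mechanism.

```python
# Each business event moves through a small set of states; an event may only
# start once the events it depends on have reached their "completed" state.
TRANSITIONS = {"requested": "in_progress", "in_progress": "completed"}

# Hypothetical events for a simplified credit-card issue process
DEPENDS_ON = {
    "receive_application": [],
    "credit_check": ["receive_application"],
    "issue_card": ["credit_check"],
    "notify_customer": ["issue_card"],
}

state = {e: "requested" for e in DEPENDS_ON}

def can_start(event):
    """Occurrence condition: all prerequisite events must be completed."""
    return all(state[d] == "completed" for d in DEPENDS_ON[event])

def advance(event):
    if state[event] == "requested" and not can_start(event):
        raise RuntimeError(f"{event} is not synchronized with its prerequisites")
    state[event] = TRANSITIONS.get(state[event], state[event])

# Drive every event through its transitions in dependency order
for event in ["receive_application", "credit_check", "issue_card", "notify_customer"]:
    advance(event)   # requested -> in_progress
    advance(event)   # in_progress -> completed
    print(event, "->", state[event])
```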
Evaluating Process Improvement Courses of Action Through Modeling and Simulation
2017-09-16
changes to a process is time consuming and has the potential to overlook stochastic effects. By modeling a process as a Numerical Design Structure Matrix...
ERIC Educational Resources Information Center
De Corte, Erik; Verschaffel, Lieven
Design and results of an investigation attempting to analyze and improve children's solution processes in elementary addition and subtraction problems are described. As background for the study, a conceptual model was developed based on previous research. One dimension of the model relates to the characteristics of the tasks (numerical versus word…
ERIC Educational Resources Information Center
Tichnor-Wagner, Ariel; Allen, Danielle; Socol, Allison Rose; Cohen-Vogel, Lora; Rutledge, Stacey A.; Xing, Qi W.
2018-01-01
Background/Context: This study examines the implementation of an academic and social-emotional learning innovation called Personalization for Academic and Social-Emotional Learning, or PASL. The innovation was designed, tested, and implemented using a continuous-improvement model. The model emphasized a top-and-bottom process in which…
2012-06-01
...checking leads to an improvement in the quality and success of enterprise software development. Business Process Modeling Notation (BPMN) is an emerging standard that allows business processes to be captured in a standardized format. BPMN lacks formal semantics which leaves many of its features
Microphysics, Radiation and Surface Processes in the Goddard Cumulus Ensemble (GCE) Model
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo; Starr, David (Technical Monitor)
2002-01-01
One of the most promising methods to test the representation of cloud processes used in climate models is to use observations together with Cloud Resolving Models (CRMs). The CRMs use more sophisticated and realistic representations of cloud microphysical processes, and they can resolve reasonably well the time evolution, structure, and life cycles of clouds and cloud systems (size about 2-200 km). The CRMs also allow explicit interaction of clouds with out-going longwave (cooling) and in-coming solar (heating) radiation. Observations can provide the initial conditions and validation for CRM results. The Goddard Cumulus Ensemble (GCE) Model, a CRM, has been developed and improved at NASA/Goddard Space Flight Center over the past two decades. The GCE model has been used to understand the following: 1) water and energy cycles and their roles in the tropical climate system; 2) the vertical redistribution of ozone and trace constituents by individual clouds and well organized convective systems over various spatial scales; 3) the relationship between the vertical distribution of latent heating (phase change of water) and the large-scale (pre-storm) environment; 4) the validity of assumptions used in the representation of cloud processes in climate and global circulation models; and 5) the representation of cloud microphysical processes and their interaction with radiative forcing over tropical and midlatitude regions. Four-dimensional cloud and latent heating fields simulated from the GCE model have been provided to the TRMM Science Data and Information System (TSDIS) to develop and improve algorithms for retrieving rainfall and latent heating rates for TRMM and the NASA Earth Observing System (EOS). More than 90 refereed papers using the GCE model have been published in the last two decades. Also, more than 10 national and international universities are currently using the GCE model for research and teaching. In this talk, five major GCE improvements are presented: (1) ice microphysics, (2) longwave and shortwave radiative transfer processes, (3) land surface processes, (4) ocean surface fluxes, and (5) ocean mixed layer processes. The performance of these new GCE improvements will be examined. Observations are used for model validation.
Generalized Processing Tree Models: Jointly Modeling Discrete and Continuous Variables.
Heck, Daniel W; Erdfelder, Edgar; Kieslich, Pascal J
2018-05-24
Multinomial processing tree models assume that discrete cognitive states determine observed response frequencies. Generalized processing tree (GPT) models extend this conceptual framework to continuous variables such as response times, process-tracing measures, or neurophysiological variables. GPT models assume finite-mixture distributions, with weights determined by a processing tree structure, and continuous components modeled by parameterized distributions such as Gaussians with separate or shared parameters across states. We discuss identifiability, parameter estimation, model testing, a modeling syntax, and the improved precision of GPT estimates. Finally, a GPT version of the feature comparison model of semantic categorization is applied to computer-mouse trajectories.
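A minimal sketch of the GPT idea described above: a latent processing-tree state determines which parameterized continuous distribution generated an observation, so the data likelihood is a finite mixture weighted by the tree's branch probabilities. The two-state response-time example and all parameter values below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def gpt_loglik(rt, p_detect, mu_detect, mu_guess, sd):
    """Log-likelihood of response times under a two-state processing tree:
    with probability p_detect the latent 'detect' state generates RTs from
    N(mu_detect, sd); otherwise a 'guess' state generates N(mu_guess, sd)."""
    dens = (p_detect * norm.pdf(rt, mu_detect, sd)
            + (1.0 - p_detect) * norm.pdf(rt, mu_guess, sd))
    return np.sum(np.log(dens))

# Simulate data from the model, then evaluate the likelihood at two parameter sets
rng = np.random.default_rng(7)
n = 1000
states = rng.random(n) < 0.7                       # true p_detect = 0.7
rt = np.where(states, rng.normal(0.6, 0.1, n), rng.normal(0.9, 0.1, n))

print("true params :", gpt_loglik(rt, 0.7, 0.6, 0.9, 0.1))
print("wrong params:", gpt_loglik(rt, 0.3, 0.6, 0.9, 0.1))  # should be lower
```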
Analysis and Modeling of Ground Operations at Hub Airports
NASA Technical Reports Server (NTRS)
Atkins, Stephen (Technical Monitor); Andersson, Kari; Carr, Francis; Feron, Eric; Hall, William D.
2000-01-01
Building simple and accurate models of hub airports can considerably help one understand airport dynamics, and may provide quantitative estimates of operational airport improvements. In this paper, three models are proposed to capture the dynamics of busy hub airport operations. Two simple queuing models are introduced to capture the taxi-out and taxi-in processes. An integer programming model aimed at representing airline decision-making attempts to capture the dynamics of the aircraft turnaround process. These models can be applied for predictive purposes. They may also be used to evaluate control strategies for improving overall airport efficiency.
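The paper's queuing models are not specified in the abstract; the sketch below is only a simple single-server FIFO runway queue illustrating the taxi-out idea: taxi-out time equals an unimpeded taxi time plus the queueing delay at the runway. The pushback pattern, taxi time and runway service time are illustrative assumptions.

```python
import numpy as np

def taxi_out_times(pushback_times, unimpeded_taxi, runway_service_time):
    """Single-server FIFO runway queue: each departure reaches the runway
    after its unimpeded taxi time and then waits for the runway to be free."""
    runway_free = 0.0
    taxi_out = []
    for t in sorted(pushback_times):
        at_runway = t + unimpeded_taxi
        takeoff = max(at_runway, runway_free) + runway_service_time
        runway_free = takeoff
        taxi_out.append(takeoff - t)
    return np.array(taxi_out)

rng = np.random.default_rng(3)
# A departure bank: 40 pushbacks over 30 minutes (illustrative hub surge)
pushbacks = np.cumsum(rng.exponential(30 / 40, size=40))
tot = taxi_out_times(pushbacks, unimpeded_taxi=8.0, runway_service_time=1.5)
print(f"mean taxi-out {tot.mean():.1f} min, max {tot.max():.1f} min")
```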
An interval programming model for continuous improvement in micro-manufacturing
NASA Astrophysics Data System (ADS)
Ouyang, Linhan; Ma, Yizhong; Wang, Jianjun; Tu, Yiliu; Byun, Jai-Hyun
2018-03-01
Continuous quality improvement in micro-manufacturing processes relies on optimization strategies that relate an output performance to a set of machining parameters. However, when determining the optimal machining parameters in a micro-manufacturing process, the economics of continuous quality improvement and decision makers' preference information are typically neglected. This article proposes an economic continuous improvement strategy based on an interval programming model. The proposed strategy differs from previous studies in two ways. First, an interval programming model is proposed to measure the quality level, where decision makers' preference information is considered in order to determine the weight of location and dispersion effects. Second, the proposed strategy is a more flexible approach since it considers the trade-off between the quality level and the associated costs, and leaves engineers a larger decision space through adjusting the quality level. The proposed strategy is compared with its conventional counterparts using an Nd:YLF laser beam micro-drilling process.
Pre-engineering Spaceflight Validation of Environmental Models and the 2005 HZETRN Simulation Code
NASA Technical Reports Server (NTRS)
Nealy, John E.; Cucinotta, Francis A.; Wilson, John W.; Badavi, Francis F.; Dachev, Ts. P.; Tomov, B. T.; Walker, Steven A.; DeAngelis, Giovanni; Blattnig, Steve R.; Atwell, William
2006-01-01
The HZETRN code has been identified by NASA for engineering design in the next phase of space exploration highlighting a return to the Moon in preparation for a Mars mission. In response, a new series of algorithms beginning with 2005 HZETRN, will be issued by correcting some prior limitations and improving control of propagated errors along with established code verification processes. Code validation processes will use new/improved low Earth orbit (LEO) environmental models with a recently improved International Space Station (ISS) shield model to validate computational models and procedures using measured data aboard ISS. These validated models will provide a basis for flight-testing the designs of future space vehicles and systems of the Constellation program in the LEO environment.
NASA Astrophysics Data System (ADS)
Zheng, Fei; Zhu, Jiang
2017-04-01
How to design a reliable ensemble prediction strategy with considering the major uncertainties of a forecasting system is a crucial issue for performing an ensemble forecast. In this study, a new stochastic perturbation technique is developed to improve the prediction skills of El Niño-Southern Oscillation (ENSO) through using an intermediate coupled model. We first estimate and analyze the model uncertainties from the ensemble Kalman filter analysis results through assimilating the observed sea surface temperatures. Then, based on the pre-analyzed properties of model errors, we develop a zero-mean stochastic model-error model to characterize the model uncertainties mainly induced by the missed physical processes of the original model (e.g., stochastic atmospheric forcing, extra-tropical effects, Indian Ocean Dipole). Finally, we perturb each member of an ensemble forecast at each step by the developed stochastic model-error model during the 12-month forecasting process, and add the zero-mean perturbations into the physical fields to mimic the presence of missing processes and high-frequency stochastic noises. The impacts of stochastic model-error perturbations on ENSO deterministic predictions are examined by performing two sets of 21-yr hindcast experiments, which are initialized from the same initial conditions and differentiated by whether they consider the stochastic perturbations. The comparison results show that the stochastic perturbations have a significant effect on improving the ensemble-mean prediction skills during the entire 12-month forecasting process. This improvement occurs mainly because the nonlinear terms in the model can form a positive ensemble-mean from a series of zero-mean perturbations, which reduces the forecasting biases and then corrects the forecast through this nonlinear heating mechanism.
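The core perturbation step can be sketched as follows; the red-noise error model, field shapes, and parameter values are illustrative assumptions and do not reproduce the intermediate coupled model or its EnKF-derived error statistics.

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb_ensemble(ensemble, error_std, alpha=0.9, noise_state=None):
    """Add zero-mean, temporally correlated perturbations to each member.

    ensemble    : array (n_members, n_grid) of a physical field, e.g. SST
    error_std   : array (n_grid,) of model-error standard deviations
                  (assumed here to be pre-estimated from analysis statistics)
    alpha       : autocorrelation of the red-noise error model
    noise_state : previous noise field, carried between forecast steps
    """
    n_members, n_grid = ensemble.shape
    if noise_state is None:
        noise_state = np.zeros((n_members, n_grid))
    white = rng.standard_normal((n_members, n_grid)) * error_std
    noise_state = alpha * noise_state + np.sqrt(1 - alpha**2) * white
    return ensemble + noise_state, noise_state

# One forecast step for a 5-member ensemble on a 100-point grid (toy values).
ens = np.zeros((5, 100))
ens, state = perturb_ensemble(ens, error_std=np.full(100, 0.3))
```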
Simulation Modeling of Software Development Processes
NASA Technical Reports Server (NTRS)
Calavaro, G. F.; Basili, V. R.; Iazeolla, G.
1996-01-01
A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.
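A compact sketch of the timed Petri net idea appears below; the places, transitions, and durations (a requirements-design-code flow) are hypothetical and do not correspond to the authors' model specification.

```python
import heapq

# Places hold tokens (work products); transitions consume and produce tokens
# after a fixed delay (activity duration in person-days, toy values).
places = {"requirements": 3, "design_done": 0, "code_done": 0}
transitions = [
    {"name": "design", "inputs": ["requirements"], "outputs": ["design_done"], "delay": 5.0},
    {"name": "code",   "inputs": ["design_done"],  "outputs": ["code_done"],   "delay": 8.0},
]

clock, events = 0.0, []            # events: (completion_time, name, outputs)
while True:
    fired = False
    for t in transitions:          # fire every currently enabled transition once
        if all(places[p] > 0 for p in t["inputs"]):
            for p in t["inputs"]:
                places[p] -= 1
            heapq.heappush(events, (clock + t["delay"], t["name"], t["outputs"]))
            fired = True
    if not fired and not events:
        break
    if events:                     # advance the clock to the next completion
        clock, name, outputs = heapq.heappop(events)
        for p in outputs:
            places[p] += 1
print("time-to-market (toy):", clock, "final marking:", places)
```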
Hung, Sheng-Hui; Wang, Pa-Chun; Lin, Hung-Chun; Chen, Hung-Ying; Su, Chao-Ton
2015-01-01
Specimen handling is a critical patient safety issue. Problematic handling process, such as misidentification (of patients, surgical site, and specimen counts), specimen loss, or improper specimen preparation can lead to serious patient harms and lawsuits. Value stream map (VSM) is a tool used to find out non-value-added works, enhance the quality, and reduce the cost of the studied process. On the other hand, healthcare failure mode and effect analysis (HFMEA) is now frequently employed to avoid possible medication errors in healthcare process. Both of them have a goal similar to Six Sigma methodology for process improvement. This study proposes a model that integrates VSM and HFMEA into the framework, which mainly consists of define, measure, analyze, improve, and control (DMAIC), of Six Sigma. A Six Sigma project for improving the process of surgical specimen handling in a hospital was conducted to demonstrate the effectiveness of the proposed model.
Observations and Modeling of the Green Ocean Amazon 2014/15. CHUVA Field Campaign Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Machado, L. A. T.
2016-03-01
The physical processes inside clouds are one of the most unknown components of weather and climate systems. A description of cloud processes through the use of standard meteorological parameters in numerical models has to be strongly improved to accurately describe the characteristics of hydrometeors, latent heating profiles, radiative balance, air entrainment, and cloud updrafts and downdrafts. Numerical models have been improved to run at higher spatial resolutions where it is necessary to explicitly describe these cloud processes. For instance, to analyze the effects of global warming in a given region it is necessary to perform simulations taking into account all of the cloud processes described above. Another important application that requires this knowledge is satellite precipitation estimation. The analysis will be performed focusing on the microphysical evolution and cloud life cycle, different precipitation estimation algorithms, the development of thunderstorms and lightning formation, processes in the boundary layer, and cloud microphysical modeling. This project intends to extend the knowledge of these cloud processes to reduce the uncertainties in precipitation estimation, mainly from warm clouds, and, consequently, improve knowledge of the water and energy budget and cloud microphysics.
The Climate Variability & Predictability (CVP) Program at NOAA - Recent Program Advancements
NASA Astrophysics Data System (ADS)
Lucas, S. E.; Todd, J. F.
2015-12-01
The Climate Variability & Predictability (CVP) Program supports research aimed at providing process-level understanding of the climate system through observation, modeling, analysis, and field studies. This vital knowledge is needed to improve climate models and predictions so that scientists can better anticipate the impacts of future climate variability and change. To achieve its mission, the CVP Program supports research carried out at NOAA and other federal laboratories, NOAA Cooperative Institutes, and academic institutions. The Program also coordinates its sponsored projects with major national and international scientific bodies including the World Climate Research Programme (WCRP), the International and U.S. Climate Variability and Predictability (CLIVAR/US CLIVAR) Program, and the U.S. Global Change Research Program (USGCRP). The CVP program sits within NOAA's Climate Program Office (http://cpo.noaa.gov/CVP). The CVP Program currently supports multiple projects in areas that are aimed at improved representation of physical processes in global models. Some of the topics that are currently funded include: i) Improved Understanding of Intraseasonal Tropical Variability - DYNAMO field campaign and post-field projects, and the new climate model improvement teams focused on MJO processes; ii) Climate Process Teams (CPTs, co-funded with NSF) with projects focused on Cloud macrophysical parameterization and its application to aerosol indirect effects, and Internal-Wave Driven Mixing in Global Ocean Models; iii) Improved Understanding of Tropical Pacific Processes, Biases, and Climatology; iv) Understanding Arctic Sea Ice Mechanism and Predictability; v) AMOC Mechanisms and Decadal Predictability. Recent results from CVP-funded projects will be summarized. Additional information can be found at http://cpo.noaa.gov/CVP.
NASA Technical Reports Server (NTRS)
Browder, Joan A.; May, L. Nelson, Jr.; Rosenthal, Alan; Baumann, Robert H.; Gosselink, James G.
1987-01-01
A stochastic spatial computer model addressing coastal resource problems in Louisiana is being refined and validated using thematic mapper (TM) imagery. The TM images of brackish marsh sites were processed and data were tabulated on spatial parameters from TM images of the salt marsh sites. The Fisheries Image Processing Systems (FIPS) was used to analyze the TM scene. Activities were concentrated on improving the structure of the model and developing a structure and methodology for calibrating the model with spatial-pattern data from the TM imagery.
NASA Astrophysics Data System (ADS)
Limatahu, I.; Sutoyo, S.; Wasis; Prahani, B. K.
2018-03-01
In previous research, the CCDSR (Condition, Construction, Development, Simulation, and Reflection) learning model was developed to improve science process skills for pre-service physics teachers. This research aims to analyze the effectiveness of the CCDSR learning model in improving the skills of creating Science Process Skill (SPS) lesson plans and worksheets among pre-service physics teachers in the 2016/2017 academic year. The study used a one-group pre-test and post-test design with 12 pre-service physics teachers in Physics Education at the University of Khairun. Data were collected through tests and observation. The teachers' skills in creating SPS lesson plans and worksheets were measured with the Science Process Skill Evaluation Sheet (SPSES), and the data were analyzed using the Wilcoxon t-test and n-gain. The CCDSR learning model consists of 5 phases: (1) Condition, (2) Construction, (3) Development, (4) Simulation, and (5) Reflection. The results showed a significant increase in the skills of creating SPS lesson plans and worksheets at α = 5%, with an average n-gain in the moderate category. Thus, the CCDSR learning model is effective for improving pre-service physics teachers' skills in creating SPS lesson plans and worksheets.
da Silveira, Christian L; Mazutti, Marcio A; Salau, Nina P G
2016-07-08
Process modeling can lead to advantages such as improved process control, reduced process costs, and improved product quality. This work proposes a solid-state fermentation distributed parameter model composed of seven differential equations with seventeen parameters to represent the process. Parameter estimation combined with a parameter identifiability analysis (PIA) is performed to build an accurate model with optimal parameters. Statistical tests were made to verify model accuracy with the estimated parameters under different assumptions. The results showed that the model assuming substrate inhibition better represents the process. It was also shown that eight of the seventeen original model parameters were non-identifiable, and better results were obtained when these parameters were removed from the estimation procedure. Therefore, PIA can be useful to the estimation procedure, since it may reduce the number of parameters to be estimated. Further, PIA improved the model results, showing itself to be an important step. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:905-917, 2016. © 2016 American Institute of Chemical Engineers.
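A simplified version of such an identifiability screen is sketched below as a local sensitivity analysis on a generic model function; the finite-difference scheme, tolerance, and toy model are assumptions and do not reproduce the PIA procedure used in the paper.

```python
import numpy as np

def sensitivity_matrix(model, params, t, eps=1e-4):
    """Finite-difference sensitivities of model outputs w.r.t. each parameter."""
    y0 = model(params, t)
    S = np.empty((len(y0), len(params)))
    for j, p in enumerate(params):
        pert = params.copy()
        pert[j] = p * (1 + eps) if p != 0 else eps
        S[:, j] = (model(pert, t) - y0) / (pert[j] - p)
    return S

def poorly_identifiable(S, tol=1e-8):
    """Flag parameters whose sensitivity columns are negligible or
    (near-)linearly dependent, a common symptom of non-identifiability."""
    norms = np.linalg.norm(S, axis=0)
    rank = np.linalg.matrix_rank(S, tol=tol)
    return norms < tol, S.shape[1] - rank   # negligible columns, rank deficiency

# Toy two-parameter exponential model where k1 and k2 appear only as a sum,
# so one of them is structurally non-identifiable.
model = lambda p, t: np.exp(-(p[0] + p[1]) * t)
t = np.linspace(0, 5, 20)
S = sensitivity_matrix(model, np.array([0.4, 0.1]), t)
print(poorly_identifiable(S))
```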
Shafer, Michael S.; Dembo, Richard; del Mar Vega-Debién, Graciela; Pankow, Jennifer; Duvall, Jamieson L.; Belenko, Steven; Frisman, Linda K.; Visher, Christy A.; Pich, Michele; Patterson, Yvonne
2014-01-01
Objectives. We tested a modified Network for the Improvement of Addiction Treatment (NIATx) process improvement model to implement improved HIV services (prevention, testing, and linkage to treatment) for offenders under correctional supervision. Methods. As part of the Criminal Justice Drug Abuse Treatment Studies, Phase 2, the HIV Services and Treatment Implementation in Corrections study conducted 14 cluster-randomized trials in 2011 to 2013 at 9 US sites, where one correctional facility received training in HIV services and coaching in a modified NIATx model and the other received only HIV training. The outcome measure was the odds of successful delivery of an HIV service. Results. The results were significant at the .05 level, and the point estimate for the odds ratio was 2.14. Although overall the results were heterogeneous, the experiments that focused on implementing HIV prevention interventions had a 95% confidence interval that exceeded the no-difference point. Conclusions. Our results demonstrate that a modified NIATx process improvement model can effectively implement improved rates of delivery of some types of HIV services in correctional environments. PMID:25322311
Pearson, Frank S; Shafer, Michael S; Dembo, Richard; Del Mar Vega-Debién, Graciela; Pankow, Jennifer; Duvall, Jamieson L; Belenko, Steven; Frisman, Linda K; Visher, Christy A; Pich, Michele; Patterson, Yvonne
2014-12-01
We tested a modified Network for the Improvement of Addiction Treatment (NIATx) process improvement model to implement improved HIV services (prevention, testing, and linkage to treatment) for offenders under correctional supervision. As part of the Criminal Justice Drug Abuse Treatment Studies, Phase 2, the HIV Services and Treatment Implementation in Corrections study conducted 14 cluster-randomized trials in 2011 to 2013 at 9 US sites, where one correctional facility received training in HIV services and coaching in a modified NIATx model and the other received only HIV training. The outcome measure was the odds of successful delivery of an HIV service. The results were significant at the .05 level, and the point estimate for the odds ratio was 2.14. Although overall the results were heterogeneous, the experiments that focused on implementing HIV prevention interventions had a 95% confidence interval that exceeded the no-difference point. Our results demonstrate that a modified NIATx process improvement model can effectively implement improved rates of delivery of some types of HIV services in correctional environments.
Manothum, Aniruth; Rukijkanpanich, Jittra; Thawesaengskulthai, Damrong; Thampitakkul, Boonwa; Chaikittiporn, Chalermchai; Arphorn, Sara
2009-01-01
The purpose of this study was to evaluate the implementation of an Occupational Health and Safety Management Model for informal sector workers in Thailand. The studied model was characterized by participatory approaches to preliminary assessment, observation of informal business practices, group discussion and participation, and the use of environmental measurements and samples. This model consisted of four processes: capacity building, risk analysis, problem solving, and monitoring and control. The participants consisted of four local labor groups from different regions, including wood carving, hand-weaving, artificial flower making, and batik processing workers. The results demonstrated that, as a result of applying the model, the working conditions of the informal sector workers had improved to meet necessary standards. This model encouraged the use of local networks, which led to cooperation within the groups to create appropriate technologies to solve their problems. The authors suggest that this model could effectively be applied elsewhere to improve informal sector working conditions on a broader scale.
The Context, Process, and Outcome Evaluation Model for Organisational Health Interventions
Fridrich, Annemarie; Jenny, Gregor J.; Bauer, Georg F.
2015-01-01
To facilitate evaluation of complex, organisational health interventions (OHIs), this paper aims at developing a context, process, and outcome (CPO) evaluation model. It builds on previous model developments in the field and advances them by clearly defining and relating generic evaluation categories for OHIs. Context is defined as the underlying frame that influences and is influenced by an OHI. It is further differentiated into the omnibus and discrete contexts. Process is differentiated into the implementation process, as the time-limited enactment of the original intervention plan, and the change process of individual and collective dynamics triggered by the implementation process. These processes lead to proximate, intermediate, and distal outcomes, as all results of the change process that are meaningful for various stakeholders. Research questions that might guide the evaluation of an OHI according to the CPO categories and a list of concrete themes/indicators and methods/sources applied within the evaluation of an OHI project at a hospital in Switzerland illustrate the model's applicability in structuring evaluations of complex OHIs. In conclusion, the model supplies a common language and a shared mental model for improving communication between researchers and company members and will improve the comparability and aggregation of evaluation study results. PMID:26557665
The Context, Process, and Outcome Evaluation Model for Organisational Health Interventions.
Fridrich, Annemarie; Jenny, Gregor J; Bauer, Georg F
2015-01-01
To facilitate evaluation of complex, organisational health interventions (OHIs), this paper aims at developing a context, process, and outcome (CPO) evaluation model. It builds on previous model developments in the field and advances them by clearly defining and relating generic evaluation categories for OHIs. Context is defined as the underlying frame that influences and is influenced by an OHI. It is further differentiated into the omnibus and discrete contexts. Process is differentiated into the implementation process, as the time-limited enactment of the original intervention plan, and the change process of individual and collective dynamics triggered by the implementation process. These processes lead to proximate, intermediate, and distal outcomes, as all results of the change process that are meaningful for various stakeholders. Research questions that might guide the evaluation of an OHI according to the CPO categories and a list of concrete themes/indicators and methods/sources applied within the evaluation of an OHI project at a hospital in Switzerland illustrate the model's applicability in structuring evaluations of complex OHIs. In conclusion, the model supplies a common language and a shared mental model for improving communication between researchers and company members and will improve the comparability and aggregation of evaluation study results.
An overview of the model integration process: From pre-integration assessment to testing
Integration of models requires linking models which can be developed using different tools, methodologies, and assumptions. We performed a literature review with the aim of improving our understanding of the model integration process, and also presenting better strategies for buildin...
Investigating outliers to improve conceptual models of bedrock aquifers
NASA Astrophysics Data System (ADS)
Worthington, Stephen R. H.
2018-06-01
Numerical models play a prominent role in hydrogeology, with simplifying assumptions being inevitable when implementing these models. However, there is a risk of oversimplification, where important processes become neglected. Such processes may be associated with outliers, and consideration of outliers can lead to an improved scientific understanding of bedrock aquifers. Using rigorous logic to investigate outliers can help to explain fundamental scientific questions such as why there are large variations in permeability between different bedrock lithologies.
Weled, Barry J; Adzhigirey, Lana A; Hodgman, Tudy M; Brilli, Richard J; Spevetz, Antoinette; Kline, Andrea M; Montgomery, Vicki L; Puri, Nitin; Tisherman, Samuel A; Vespa, Paul M; Pronovost, Peter J; Rainey, Thomas G; Patterson, Andrew J; Wheeler, Derek S
2015-07-01
In 2001, the Society of Critical Care Medicine published practice model guidelines that focused on the delivery of critical care and the roles of different ICU team members. An exhaustive review of the additional literature published since the last guideline has demonstrated that both the structure and process of care in the ICU are important for achieving optimal patient outcomes. Since the publication of the original guideline, several authorities have recognized that improvements in the processes of care, ICU structure, and the use of quality improvement science methodologies can beneficially impact patient outcomes and reduce costs. Herein, we summarize findings of the American College of Critical Care Medicine Task Force on Models of Critical Care: 1) An intensivist-led, high-performing, multidisciplinary team dedicated to the ICU is an integral part of effective care delivery; 2) Process improvement is the backbone of achieving high-quality ICU outcomes; 3) Standardized protocols including care bundles and order sets to facilitate measurable processes and outcomes should be used and further developed in the ICU setting; and 4) Institutional support for comprehensive quality improvement programs as well as tele-ICU programs should be provided.
Benchmarking novel approaches for modelling species range dynamics
Zurell, Damaris; Thuiller, Wilfried; Pagel, Jörn; Cabral, Juliano S; Münkemüller, Tamara; Gravel, Dominique; Dullinger, Stefan; Normand, Signe; Schiffers, Katja H.; Moore, Kara A.; Zimmermann, Niklaus E.
2016-01-01
Increasing biodiversity loss due to climate change is one of the most vital challenges of the 21st century. To anticipate and mitigate biodiversity loss, models are needed that reliably project species’ range dynamics and extinction risks. Recently, several new approaches to model range dynamics have been developed to supplement correlative species distribution models (SDMs), but applications clearly lag behind model development. Indeed, no comparative analysis has been performed to evaluate their performance. Here, we build on process-based, simulated data for benchmarking five range (dynamic) models of varying complexity including classical SDMs, SDMs coupled with simple dispersal or more complex population dynamic models (SDM hybrids), and a hierarchical Bayesian process-based dynamic range model (DRM). We specifically test the effects of demographic and community processes on model predictive performance. Under current climate, DRMs performed best, although only marginally. Under climate change, predictive performance varied considerably, with no clear winners. Yet, all range dynamic models improved predictions under climate change substantially compared to purely correlative SDMs, and the population dynamic models also predicted reasonable extinction risks for most scenarios. When benchmarking data were simulated with more complex demographic and community processes, simple SDM hybrids including only dispersal often proved most reliable. Finally, we found that structural decisions during model building can have great impact on model accuracy, but prior system knowledge on important processes can reduce these uncertainties considerably. Our results reassure the clear merit in using dynamic approaches for modelling species’ response to climate change but also emphasise several needs for further model and data improvement. We propose and discuss perspectives for improving range projections through combination of multiple models and for making these approaches operational for large numbers of species. PMID:26872305
Benchmarking novel approaches for modelling species range dynamics.
Zurell, Damaris; Thuiller, Wilfried; Pagel, Jörn; Cabral, Juliano S; Münkemüller, Tamara; Gravel, Dominique; Dullinger, Stefan; Normand, Signe; Schiffers, Katja H; Moore, Kara A; Zimmermann, Niklaus E
2016-08-01
Increasing biodiversity loss due to climate change is one of the most vital challenges of the 21st century. To anticipate and mitigate biodiversity loss, models are needed that reliably project species' range dynamics and extinction risks. Recently, several new approaches to model range dynamics have been developed to supplement correlative species distribution models (SDMs), but applications clearly lag behind model development. Indeed, no comparative analysis has been performed to evaluate their performance. Here, we build on process-based, simulated data for benchmarking five range (dynamic) models of varying complexity including classical SDMs, SDMs coupled with simple dispersal or more complex population dynamic models (SDM hybrids), and a hierarchical Bayesian process-based dynamic range model (DRM). We specifically test the effects of demographic and community processes on model predictive performance. Under current climate, DRMs performed best, although only marginally. Under climate change, predictive performance varied considerably, with no clear winners. Yet, all range dynamic models improved predictions under climate change substantially compared to purely correlative SDMs, and the population dynamic models also predicted reasonable extinction risks for most scenarios. When benchmarking data were simulated with more complex demographic and community processes, simple SDM hybrids including only dispersal often proved most reliable. Finally, we found that structural decisions during model building can have great impact on model accuracy, but prior system knowledge on important processes can reduce these uncertainties considerably. Our results reassure the clear merit in using dynamic approaches for modelling species' response to climate change but also emphasize several needs for further model and data improvement. We propose and discuss perspectives for improving range projections through combination of multiple models and for making these approaches operational for large numbers of species. © 2016 John Wiley & Sons Ltd.
Business Models for Training and Performance Improvement Departments
ERIC Educational Resources Information Center
Carliner, Saul
2004-01-01
Although typically applied to entire enterprises, the concept of business models applies to training and performance improvement groups. Business models are "the method by which firm[s] build and use [their] resources to offer.. value." Business models affect the types of projects, services offered, skills required, business processes, and type of…
Multiphase porous media modelling: A novel approach to predicting food processing performance.
Khan, Md Imran H; Joardder, M U H; Kumar, Chandan; Karim, M A
2018-03-04
The development of a physics-based model of food processing is essential to improve the quality of processed food and optimize energy consumption. Food materials, particularly plant-based food materials, are complex in nature as they are porous and have hygroscopic properties. A multiphase porous media model for simultaneous heat and mass transfer can provide a realistic understanding of transport processes and thus can help to optimize energy consumption and improve food quality. Although the development of a multiphase porous media model for food processing is a challenging task because of its complexity, many researchers have attempted it. The primary aim of this paper is to present a comprehensive review of the multiphase models available in the literature for different methods of food processing, such as drying, frying, cooking, baking, heating, and roasting. A critical review of the parameters that should be considered for multiphase modelling is presented which includes input parameters, material properties, simulation techniques and the hypotheses. A discussion on the general trends in outcomes, such as moisture saturation, temperature profile, pressure variation, and evaporation patterns, is also presented. The paper concludes by considering key issues in the existing multiphase models and future directions for development of multiphase models.
A Binaural Grouping Model for Predicting Speech Intelligibility in Multitalker Environments
Colburn, H. Steven
2016-01-01
Spatially separating speech maskers from target speech often leads to a large intelligibility improvement. Modeling this phenomenon has long been of interest to binaural-hearing researchers for uncovering brain mechanisms and for improving signal-processing algorithms in hearing-assistive devices. Much of the previous binaural modeling work focused on the unmasking enabled by binaural cues at the periphery, and little quantitative modeling has been directed toward the grouping or source-separation benefits of binaural processing. In this article, we propose a binaural model that focuses on grouping, specifically on the selection of time-frequency units that are dominated by signals from the direction of the target. The proposed model uses Equalization-Cancellation (EC) processing with a binary decision rule to estimate a time-frequency binary mask. EC processing is carried out to cancel the target signal and the energy change between the EC input and output is used as a feature that reflects target dominance in each time-frequency unit. The processing in the proposed model requires little computational resources and is straightforward to implement. In combination with the Coherence-based Speech Intelligibility Index, the model is applied to predict the speech intelligibility data measured by Marrone et al. The predicted speech reception threshold matches the pattern of the measured data well, even though the predicted intelligibility improvements relative to the colocated condition are larger than some of the measured data, which may reflect the lack of internal noise in this initial version of the model. PMID:27698261
A Binaural Grouping Model for Predicting Speech Intelligibility in Multitalker Environments.
Mi, Jing; Colburn, H Steven
2016-10-03
Spatially separating speech maskers from target speech often leads to a large intelligibility improvement. Modeling this phenomenon has long been of interest to binaural-hearing researchers for uncovering brain mechanisms and for improving signal-processing algorithms in hearing-assistive devices. Much of the previous binaural modeling work focused on the unmasking enabled by binaural cues at the periphery, and little quantitative modeling has been directed toward the grouping or source-separation benefits of binaural processing. In this article, we propose a binaural model that focuses on grouping, specifically on the selection of time-frequency units that are dominated by signals from the direction of the target. The proposed model uses Equalization-Cancellation (EC) processing with a binary decision rule to estimate a time-frequency binary mask. EC processing is carried out to cancel the target signal and the energy change between the EC input and output is used as a feature that reflects target dominance in each time-frequency unit. The processing in the proposed model requires little computational resources and is straightforward to implement. In combination with the Coherence-based Speech Intelligibility Index, the model is applied to predict the speech intelligibility data measured by Marrone et al. The predicted speech reception threshold matches the pattern of the measured data well, even though the predicted intelligibility improvements relative to the colocated condition are larger than some of the measured data, which may reflect the lack of internal noise in this initial version of the model. © The Author(s) 2016.
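The selection rule described above can be caricatured as follows; the toy time-frequency grid, the naive diotic-target assumption, and the threshold are illustrative only, and the actual EC implementation with interaural delays/gains and the Coherence-based Speech Intelligibility Index is not reproduced here.

```python
import numpy as np

def ec_binary_mask(left, right, threshold_db=3.0):
    """Toy EC-based mask: cancel the target and keep T-F units where the
    energy drop after cancellation is large (target-dominated units).

    left, right : arrays (n_time, n_freq) of band envelopes at the two ears
    """
    # Equalization-Cancellation step, here naively assuming the target is
    # diotic (identical at both ears), so plain subtraction cancels it.
    residual = left - right
    energy_in = left**2 + right**2
    energy_out = residual**2 + 1e-12
    gain_db = 10 * np.log10(energy_in / energy_out)   # energy-change feature
    return gain_db > threshold_db                     # time-frequency binary mask

left = np.random.rand(100, 32)
right = left + 0.1 * np.random.rand(100, 32)          # small interaural mismatch
print(ec_binary_mask(left, right).mean())             # fraction of units kept
```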
Probabilistic models of eukaryotic evolution: time for integration
Lartillot, Nicolas
2015-01-01
In spite of substantial work and recent progress, a global and fully resolved picture of the macroevolutionary history of eukaryotes is still under construction. This concerns not only the phylogenetic relations among major groups, but also the general characteristics of the underlying macroevolutionary processes, including the patterns of gene family evolution associated with endosymbioses, as well as their impact on the sequence evolutionary process. All these questions raise formidable methodological challenges, calling for a more powerful statistical paradigm. In this direction, model-based probabilistic approaches have played an increasingly important role. In particular, improved models of sequence evolution accounting for heterogeneities across sites and across lineages have led to significant, although insufficient, improvement in phylogenetic accuracy. More recently, one main trend has been to move away from simple parametric models and stepwise approaches, towards integrative models explicitly considering the intricate interplay between multiple levels of macroevolutionary processes. Such integrative models are in their infancy, and their application to the phylogeny of eukaryotes still requires substantial improvement of the underlying models, as well as additional computational developments. PMID:26323768
Electromagnetic Modelling of MMIC CPWs for High Frequency Applications
NASA Astrophysics Data System (ADS)
Sinulingga, E. P.; Kyabaggu, P. B. K.; Rezazadeh, A. A.
2018-02-01
Realising the theoretical electrical characteristics of components through modelling can be carried out using computer-aided design (CAD) simulation tools. If the simulation model provides the expected characteristics, the fabrication process of Monolithic Microwave Integrated Circuit (MMIC) can be performed for experimental verification purposes. Therefore improvements can be suggested before mass fabrication takes place. This research concentrates on development of MMIC technology by providing accurate predictions of the characteristics of MMIC components using an improved Electromagnetic (EM) modelling technique. The knowledge acquired from the modelling and characterisation process in this work can be adopted by circuit designers for various high frequency applications.
System approach to modeling of industrial technologies
NASA Astrophysics Data System (ADS)
Toropov, V. S.; Toropov, E. S.
2018-03-01
The authors presented a system of methods for modeling and improving industrial technologies. The system consists of information and software. The information part is structured information about industrial technologies. The structure has its template. The template has several essential categories used to improve the technological process and eliminate weaknesses in the process chain. The base category is the physical effect that takes place when the technical process proceeds. The programming part of the system can apply various methods of creative search to the content stored in the information part of the system. These methods pay particular attention to energy transformations in the technological process. The system application will allow us to systematize the approach to improving technologies and obtaining new technical solutions.
Electron Transport Modeling of Molecular Nanoscale Bridges Used in Energy Conversion Schemes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunietz, Barry D
2016-08-09
The goal of the research program is to reliably describe electron transport and transfer processes at the molecular level. Such insight is essential for improving molecular applications of solar and thermal energy conversion. We develop electronic structure models to study (1) photoinduced electron transfer and transport processes in organic semiconducting materials, and (2) charge and heat transport through molecular bridges. We seek fundamental understanding of key processes, which lead to design new experiments and ultimately to achieve systems with improved properties.
Chemistry-Transport Modeling of the Satellite Observed Distribution of Tropical Tropospheric Ozone
NASA Technical Reports Server (NTRS)
Peters, Wouter; Krol, Maarten; Dentener, Frank; Thompson, Anne M.; Leloeveld, Jos; Bhartia, P. K. (Technical Monitor)
2002-01-01
We have compared the 14-year record of satellite derived tropical tropospheric ozone columns (TTOC) from the NIMBUS-7 Total Ozone Mapping Spectrometer (TOMS) to TTOC calculated by a chemistry-transport model (CTM). An objective measure of error, based on the zonal distribution of TTOC in the tropics, is applied to perform this comparison systematically. In addition, the sensitivity of the model to several key processes in the tropics is quantified to select directions for future improvements. The comparisons indicate a widespread, systematic (20%) discrepancy over the tropical Atlantic Ocean, which maximizes during austral Spring. Although independent evidence from ozonesondes shows that some of the disagreement is due to satellite over-estimate of TTOC, the Atlantic mismatch is largely due to a misrepresentation of seasonally recurring processes in the model. Only minor differences between the model and observations over the Pacific occur, mostly due to interannual variability not captured by the model. Although chemical processes determine the TTOC extent, dynamical processes dominate the TTOC distribution, as the use of actual meteorology pertaining to the year of observations always leads to a better agreement with TTOC observations than using a random year or a climatology. The modeled TTOC is remarkably insensitive to many model parameters due to efficient feedbacks in the ozone budget. Nevertheless, the simulations would profit from an improved biomass burning calendar, as well as from an increase in NOX abundances in free tropospheric biomass burning plumes. The model showed the largest response to lightning NOX emissions, but systematic improvements could not be found. The use of multi-year satellite derived tropospheric data to systematically test and improve a CTM is a promising new addition to existing methods of model validation, and is a first step to integrating tropospheric satellite observations into global ozone modeling studies. Conversely, the CTM may suggest improvements to evolving satellite retrievals for tropospheric ozone.
NASA Earth Science Research Results for Improved Regional Crop Yield Prediction
NASA Astrophysics Data System (ADS)
Mali, P.; O'Hara, C. G.; Shrestha, B.; Sinclair, T. R.; G de Goncalves, L. G.; Salado Navarro, L. R.
2007-12-01
National agencies such as USDA Foreign Agricultural Service (FAS), Production Estimation and Crop Assessment Division (PECAD) work specifically to analyze and generate timely crop yield estimates that help define national as well as global food policies. The USDA/FAS/PECAD utilizes a Decision Support System (DSS) called CADRE (Crop Condition and Data Retrieval Evaluation) mainly through an automated database management system that integrates various meteorological datasets, crop and soil models, and remote sensing data; providing significant contribution to the national and international crop production estimates. The "Sinclair" soybean growth model has been used inside CADRE DSS as one of the crop models. This project uses Sinclair model (a semi-mechanistic crop growth model) for its potential to be effectively used in a geo-processing environment with remote-sensing-based inputs. The main objective of this proposed work is to verify, validate and benchmark current and future NASA earth science research results for the benefit in the operational decision making process of the PECAD/CADRE DSS. For this purpose, the NASA South American Land Data Assimilation System (SALDAS) meteorological dataset is tested for its applicability as a surrogate meteorological input in the Sinclair model meteorological input requirements. Similarly, NASA sensor MODIS products is tested for its applicability in the improvement of the crop yield prediction through improving precision of planting date estimation, plant vigor and growth monitoring. The project also analyzes simulated Visible/Infrared Imager/Radiometer Suite (VIIRS, a future NASA sensor) vegetation product for its applicability in crop growth prediction to accelerate the process of transition of VIIRS research results for the operational use of USDA/FAS/PECAD DSS. The research results will help in providing improved decision making capacity to the USDA/FAS/PECAD DSS through improved vegetation growth monitoring from high spatial and temporal resolution remote sensing datasets; improved time-series meteorological inputs required for crop growth models; and regional prediction capability through geo-processing-based yield modeling.
Liu, Tongzhu; Shen, Aizong; Hu, Xiaojian; Tong, Guixian; Gu, Wei
2017-06-01
We aimed to apply a collaborative business intelligence (BI) system to the hospital supply, processing and distribution (SPD) logistics management model. We searched the Engineering Village database, China National Knowledge Infrastructure (CNKI) and Google for articles (published from 2011 to 2016), books, Web pages, etc., to understand SPD- and BI-related theories and the state of recent research. To apply collaborative BI technology to the hospital SPD logistics management model, we leveraged data mining techniques to discover knowledge from complex data and collaborative techniques to improve business process theory. For the application of the BI system, we: (i) proposed a layered structure of a collaborative BI system for intelligent management in hospital logistics; (ii) built a data warehouse for the collaborative BI system; (iii) improved data mining techniques such as support vector machines (SVM) and the swarm intelligence firefly algorithm to solve key problems in the hospital logistics collaborative BI system; and (iv) researched collaborative techniques oriented to data and business process optimization to improve the business processes of hospital logistics management. A proper combination of the SPD model and the BI system will improve the management of logistics in hospitals. Successful implementation of the study requires: (i) innovating and improving the traditional SPD model and making appropriate implementation plans and schedules for the application of the BI system according to the actual situations of hospitals; (ii) the collaborative participation of internal hospital departments, including information, logistics, nursing, medical and financial departments; and (iii) timely responses from external suppliers.
Kirkham, R; Boyle, J A; Whitbread, C; Dowden, M; Connors, C; Corpus, S; McCarthy, L; Oats, J; McIntyre, H D; Moore, E; O'Dea, K; Brown, A; Maple-Brown, L
2017-08-03
Australian Aboriginal and Torres Strait Islander women have high rates of gestational and pre-existing type 2 diabetes in pregnancy. The Northern Territory (NT) Diabetes in Pregnancy Partnership was established to enhance systems and services to improve health outcomes. It has three arms: a clinical register, developing models of care and a longitudinal birth cohort. This study used a process evaluation to report on health professional's perceptions of models of care and related quality improvement activities since the implementation of the Partnership. Changes to models of care were documented according to goals and aims of the Partnership and reviewed annually by the Partnership Steering group. A 'systems assessment tool' was used to guide six focus groups (49 healthcare professionals). Transcripts were coded and analysed according to pre-identified themes of orientation and guidelines, education, communication, logistics and access, and information technology. Key improvements since implementation of the Partnership include: health professional relationships, communication and education; and integration of quality improvement activities. Focus groups with 49 health professionals provided in depth information about how these activities have impacted their practice and models of care for diabetes in pregnancy. Co-ordination of care was reported to have improved, however it was also identified as an opportunity for further development. Recommendations included a central care coordinator, better integration of information technology systems and ongoing comprehensive quality improvement processes. The Partnership has facilitated quality improvement through supporting the development of improved systems that enhance models of care. Persisting challenges exist for delivering care to a high risk population however improvements in formal processes and structures, as demonstrated in this work thus far, play an important role in work towards improving health outcomes.
Improved compliance by BPM-driven workflow automation.
Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin
2014-12-01
Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is improved transparency of processes and alignment of process diagrams with technical code. First experiences of using the business process model and notation (BPMN) show that easy-to-read graphical process models can achieve and support standardization of laboratory workflows. Model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs for quality assurance. BPMN 2.0, as an automation language able to control every kind of activity or subprocess, is directed at complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (e.g., screening procedures or analytical processes). That means that, with the BPM standard, a communication method for sharing process knowledge across laboratories is also available. © 2014 Society for Laboratory Automation and Screening.
An Evaluation of Understandability of Patient Journey Models in Mental Health.
Percival, Jennifer; McGregor, Carolyn
2016-07-28
There is a significant trend toward implementing health information technology to reduce administrative costs and improve patient care. Unfortunately, little awareness exists of the challenges of integrating information systems with existing clinical practice. The systematic integration of clinical processes with information system and health information technology can benefit the patients, staff, and the delivery of care. This paper presents a comparison of the degree of understandability of patient journey models. In particular, the authors demonstrate the value of a relatively new patient journey modeling technique called the Patient Journey Modeling Architecture (PaJMa) when compared with traditional manufacturing based process modeling tools. The paper also presents results from a small pilot case study that compared the usability of 5 modeling approaches in a mental health care environment. Five business process modeling techniques were used to represent a selected patient journey. A mix of both qualitative and quantitative methods was used to evaluate these models. Techniques included a focus group and survey to measure usability of the various models. The preliminary evaluation of the usability of the 5 modeling techniques has shown increased staff understanding of the representation of their processes and activities when presented with the models. Improved individual role identification throughout the models was also observed. The extended version of the PaJMa methodology provided the most clarity of information flows for clinicians. The extended version of PaJMa provided a significant improvement in the ease of interpretation for clinicians and increased the engagement with the modeling process. The use of color and its effectiveness in distinguishing the representation of roles was a key feature of the framework not present in other modeling approaches. Future research should focus on extending the pilot case study to a more diversified group of clinicians and health care support workers.
Necpálová, Magdalena; Anex, Robert P.; Fienen, Michael N.; Del Grosso, Stephen J.; Castellano, Michael J.; Sawyer, John E.; Iqbal, Javed; Pantoja, Jose L.; Barker, Daniel W.
2015-01-01
The ability of biogeochemical ecosystem models to represent agro-ecosystems depends on their correct integration with field observations. We report simultaneous calibration of 67 DayCent model parameters using multiple observation types through inverse modeling using the PEST parameter estimation software. Parameter estimation reduced the total sum of weighted squared residuals by 56% and improved model fit to crop productivity, soil carbon, volumetric soil water content, soil temperature, N2O, and soil NO3− compared to the default simulation. Inverse modeling substantially reduced predictive model error relative to the default model for all model predictions, except for soil NO3− and NH4+. Post-processing analyses provided insights into parameter–observation relationships based on parameter correlations, sensitivity and identifiability. Inverse modeling tools are shown to be a powerful way to systematize and accelerate the process of biogeochemical model interrogation, improving our understanding of model function and the underlying ecosystem biogeochemical processes that they represent.
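Outside of PEST itself, the weighted least-squares objective behind such a calibration can be sketched with generic tools; the surrogate model, observations, and weights below are placeholders, not the DayCent setup.

```python
import numpy as np
from scipy.optimize import least_squares

def weighted_residuals(params, model, observations, weights):
    """Residuals of multiple observation types, each scaled by its weight so
    that yield, soil carbon, N2O, etc. contribute on comparable scales."""
    simulated = model(params)
    return weights * (simulated - observations)

# Toy surrogate: two parameters mapped to three 'observation groups'.
model = lambda p: np.array([p[0] + p[1], 2 * p[0], 0.5 * p[1]])
obs = np.array([3.0, 2.2, 0.9])
w = np.array([1.0, 0.5, 2.0])

fit = least_squares(weighted_residuals, x0=[1.0, 1.0], args=(model, obs, w))
print(fit.x, "sum of weighted squared residuals:", 2 * fit.cost)
```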
Scheiblauer, Johannes; Scheiner, Stefan; Joksch, Martin; Kavsek, Barbara
2018-09-14
A combined experimental/theoretical approach is presented, for improving the predictability of Saccharomyces cerevisiae fermentations. In particular, a mathematical model was developed explicitly taking into account the main mechanisms of the fermentation process, allowing for continuous computation of key process variables, including the biomass concentration and the respiratory quotient (RQ). For model calibration and experimental validation, batch and fed-batch fermentations were carried out. Comparison of the model-predicted biomass concentrations and RQ developments with the corresponding experimentally recorded values shows a remarkably good agreement for both batch and fed-batch processes, confirming the adequacy of the model. Furthermore, sensitivity studies were performed, in order to identify model parameters whose variations have significant effects on the model predictions: our model responds with significant sensitivity to the variations of only six parameters. These studies provide a valuable basis for model reduction, as also demonstrated in this paper. Finally, optimization-based parametric studies demonstrate how our model can be utilized for improving the efficiency of Saccharomyces cerevisiae fermentations. Copyright © 2018 Elsevier Ltd. All rights reserved.
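A heavily simplified stand-in for such a fermentation model is sketched below, using Monod growth on a single substrate and constant-yield gas exchange; the parameter names and values are illustrative assumptions and are not the calibrated model of the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy parameters (illustrative values, not the calibrated ones).
MU_MAX, K_S = 0.4, 0.5        # 1/h, g/L
Y_XS = 0.5                    # g biomass per g substrate
Y_CO2, Y_O2 = 0.8, 0.7        # mol gas per g substrate (assumed yields)

def fermentation(t, y):
    X, S = y                                  # biomass, substrate (g/L)
    mu = MU_MAX * S / (K_S + S)               # Monod specific growth rate
    return [mu * X, -mu * X / Y_XS]

sol = solve_ivp(fermentation, [0, 20], [0.1, 20.0], dense_output=True)
X, S = sol.y
uptake = MU_MAX * S / (K_S + S) * X / Y_XS        # substrate consumption rate
rq = (Y_CO2 * uptake) / (Y_O2 * uptake + 1e-12)   # respiratory quotient
print("final biomass:", X[-1], "RQ (constant in this toy model):", rq[-1])
```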
ERIC Educational Resources Information Center
Boerner, Kathrin; Jopp, Daniela
2007-01-01
This article focuses on the common and unique contributions of three major life-span theories in addressing improvement/maintenance and reorientation, which represent central processes of coping with major life change and loss. For this purpose, we review and compare the dual-process model of assimilative and accommodative coping, the model of…
Process Improvement Should Link to Security: SEPG 2007 Security Track Recap
2007-09-01
the Systems Security Engineering Capability Maturity Model (SSE- CMM / ISO 21827) and its use in system software developments ...software development life cycle ( SDLC )? 6. In what ways should process improvement support security in the SDLC ? 1.2 10BPANEL RESOURCES For each... project management, and support practices through the use of the capability maturity models including the CMMI and the Systems Security
Predicting concrete corrosion of sewers using artificial neural network.
Jiang, Guangming; Keller, Jurg; Bond, Philip L; Yuan, Zhiguo
2016-04-01
Corrosion is often a major failure mechanism for concrete sewers and under such circumstances the sewer service life is largely determined by the progression of microbially induced concrete corrosion. The modelling of sewer processes has become possible due to the improved understanding of in-sewer transformation. Recent systematic studies about the correlation between the corrosion processes and sewer environment factors should be utilized to improve the prediction capability of service life by sewer models. This paper presents an artificial neural network (ANN)-based approach for modelling the concrete corrosion processes in sewers. The approach included predicting the time for the corrosion to initiate and then predicting the corrosion rate after the initiation period. The ANN model was trained and validated with long-term (4.5 years) corrosion data obtained in laboratory corrosion chambers, and further verified with field measurements in real sewers across Australia. The trained model estimated the corrosion initiation time and corrosion rates very close to those measured in Australian sewers. The ANN model performed better than a multiple regression model also developed on the same dataset. Additionally, the ANN model can serve as a prediction framework for sewer service life, which can be progressively improved and expanded by including corrosion rates measured in different sewer conditions. Furthermore, the proposed methodology holds promise to facilitate the construction of analytical models associated with corrosion processes of concrete sewers. Copyright © 2016 Elsevier Ltd. All rights reserved.
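A bare-bones analogue of such an ANN is shown below using a scikit-learn multilayer perceptron on made-up environmental features; the real model, its inputs, and its training data are those described in the paper, not these.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Made-up training set: [H2S concentration (ppm), relative humidity (%),
# temperature (deg C)] -> corrosion rate (mm/year).
X = np.array([[5, 90, 25], [20, 95, 30], [2, 80, 20], [50, 99, 32], [10, 85, 28]])
y = np.array([0.5, 2.1, 0.2, 4.8, 1.0])

ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0))
ann.fit(X, y)
print(ann.predict([[15, 92, 29]]))   # predicted corrosion rate for a new sewer condition
```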
Turner, Tari; Green, Sally; Tovey, David; McDonald, Steve; Soares-Weiser, Karla; Pestridge, Charlotte; Elliott, Julian
2017-08-01
Producing high-quality, relevant systematic reviews and keeping them up to date is challenging. Cochrane is a leading provider of systematic reviews in health. For Cochrane to continue to contribute to improvements in heath, Cochrane Reviews must be rigorous, reliable and up to date. We aimed to explore existing models of Cochrane Review production and emerging opportunities to improve the efficiency and sustainability of these processes. To inform discussions about how to best achieve this, we conducted 26 interviews and an online survey with 106 respondents. Respondents highlighted the importance and challenge of creating reliable, timely systematic reviews. They described the challenges and opportunities presented by current production models, and they shared what they are doing to improve review production. They particularly highlighted significant challenges with increasing complexity of review methods; difficulty keeping authors on board and on track; and the length of time required to complete the process. Strong themes emerged about the roles of authors and Review Groups, the central actors in the review production process. The results suggest that improvements to Cochrane's systematic review production models could come from improving clarity of roles and expectations, ensuring continuity and consistency of input, enabling active management of the review process, centralising some review production steps; breaking reviews into smaller "chunks", and improving approaches to building capacity of and sharing information between authors and Review Groups. Respondents noted the important role new technologies have to play in enabling these improvements. The findings of this study will inform the development of new Cochrane Review production models and may provide valuable data for other systematic review producers as they consider how best to produce rigorous, reliable, up-to-date reviews.
Process improvement as an investment: Measuring its worth
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Jeletic, Kellyann
1993-01-01
This paper discusses return on investment (ROI) generated from software process improvement programs. It details the steps needed to compute ROI and compares these steps from the perspective of two process improvement approaches: the widely known Software Engineering Institute's capability maturity model and the approach employed by NASA's Software Engineering Laboratory (SEL). The paper then describes the specific investments made in the SEL over the past 18 years and discusses the improvements gained from this investment by the production organization in the SEL.
Development and application of an acceptance testing model
NASA Technical Reports Server (NTRS)
Pendley, Rex D.; Noonan, Caroline H.; Hall, Kenneth R.
1992-01-01
The process of acceptance testing large software systems for NASA has been analyzed, and an empirical planning model of the process constructed. This model gives managers accurate predictions of the staffing needed, the productivity of a test team, and the rate at which the system will pass. Applying the model to a new system shows a high level of agreement between the model and actual performance. The model also gives managers an objective measure of process improvement.
NASA Astrophysics Data System (ADS)
Hidy, Dóra; Barcza, Zoltán; Marjanović, Hrvoje; Zorana Ostrogović Sever, Maša; Dobor, Laura; Gelybó, Györgyi; Fodor, Nándor; Pintér, Krisztina; Churkina, Galina; Running, Steven; Thornton, Peter; Bellocchi, Gianni; Haszpra, László; Horváth, Ferenc; Suyker, Andrew; Nagy, Zoltán
2016-12-01
The process-based biogeochemical model Biome-BGC was enhanced to improve its ability to simulate carbon, nitrogen, and water cycles of various terrestrial ecosystems under contrasting management activities. Biome-BGC version 4.1.1 was used as a base model. Improvements included addition of new modules such as the multilayer soil module, implementation of processes related to soil moisture and nitrogen balance, soil-moisture-related plant senescence, and phenological development. Vegetation management modules with annually varying options were also implemented to simulate management practices of grasslands (mowing, grazing), croplands (ploughing, fertilizer application, planting, harvesting), and forests (thinning). New carbon and nitrogen pools have been defined to simulate yield and soft stem development of herbaceous ecosystems. The model version containing all developments is referred to as Biome-BGCMuSo (Biome-BGC with multilayer soil module; in this paper, Biome-BGCMuSo v4.0 is documented). Case studies on a managed forest, cropland, and grassland are presented to demonstrate the effect of model developments on the simulation of plant growth as well as on carbon and water balance.
On the next generation of reliability analysis tools
NASA Technical Reports Server (NTRS)
Babcock, Philip S., IV; Leong, Frank; Gai, Eli
1987-01-01
The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
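To hint at what an automatically generated fault-occurrence model reduces to numerically, the sketch below solves a toy continuous-time Markov reliability model for a duplex system. The states, rates, and mission time are illustrative assumptions, not drawn from the paper.

```python
# A toy Markov reliability model, sketched to illustrate the kind of model that
# could be generated automatically from a system description. Rates are arbitrary.
import numpy as np
from scipy.linalg import expm

lam = 1e-4   # assumed per-hour failure rate of each of two redundant units
# States: 0 = both units up, 1 = one unit up, 2 = system failed (absorbing)
Q = np.array([[-2 * lam, 2 * lam, 0.0],
              [0.0,      -lam,    lam],
              [0.0,       0.0,    0.0]])

t = 1000.0                       # mission time in hours (assumed)
p0 = np.array([1.0, 0.0, 0.0])   # start with both units working
p_t = p0 @ expm(Q * t)           # state probabilities at time t
print("Reliability R(t) =", p_t[0] + p_t[1])
```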
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-26
... Definition To Address Advanced Fuel Designs,'' Using the Consolidated Line Item Improvement Process AGENCY...-specific adoption using the Consolidated Line Item Improvement Process (CLIIP). Additionally, the NRC staff..., which may be more reactive at shutdown temperatures above 68[emsp14][deg]F. This STS improvement is part...
Selker, Harry P.; Leslie, Laurel K.
2015-01-01
Abstract There is growing appreciation that process improvement holds promise for improving quality and efficiency across the translational research continuum but frameworks for such programs are not often described. The purpose of this paper is to present a framework and case examples of a Research Process Improvement Program implemented at Tufts CTSI. To promote research process improvement, we developed online training seminars, workshops, and in‐person consultation models to describe core process improvement principles and methods, demonstrate the use of improvement tools, and illustrate the application of these methods in case examples. We implemented these methods, as well as relational coordination theory, with junior researchers, pilot funding awardees, our CTRC, and CTSI resource and service providers. The program focuses on capacity building to address common process problems and quality gaps that threaten the efficient, timely and successful completion of clinical and translational studies. PMID:26332869
Modelling of additive manufacturing processes: a review and classification
NASA Astrophysics Data System (ADS)
Stavropoulos, Panagiotis; Foteinopoulos, Panagis
2018-03-01
Additive manufacturing (AM) is a very promising technology; however, there are a number of open issues related to the different AM processes. The literature on modelling the existing AM processes is reviewed and classified. A categorization of the different AM processes in process groups, according to the process mechanism, has been conducted and the most important issues are stated. Suggestions are made as to which approach is more appropriate according to the key performance indicator desired to be modelled and a discussion is included as to the way that future modelling work can better contribute to improving today's AM process understanding.
Orbit Determination for the Lunar Reconnaissance Orbiter Using an Extended Kalman Filter
NASA Technical Reports Server (NTRS)
Slojkowski, Steven; Lowe, Jonathan; Woodburn, James
2015-01-01
Orbit determination (OD) analysis results are presented for the Lunar Reconnaissance Orbiter (LRO) using a commercially available Extended Kalman Filter, Analytical Graphics' Orbit Determination Tool Kit (ODTK). Process noise models for lunar gravity and solar radiation pressure (SRP) are described and OD results employing the models are presented. Definitive accuracy using ODTK meets mission requirements and is better than that achieved using the operational LRO OD tool, the Goddard Trajectory Determination System (GTDS). Results demonstrate that a Vasicek stochastic model produces better estimates of the coefficient of solar radiation pressure than a Gauss-Markov model, and prediction accuracy using a Vasicek model meets mission requirements over the analysis span. Modeling the effect of antenna motion on range-rate tracking considerably improves residuals and filter-smoother consistency. Inclusion of off-axis SRP process noise and generalized process noise improves filter performance for both definitive and predicted accuracy. Definitive accuracy from the smoother is better than achieved using GTDS and is close to that achieved by precision OD methods used to generate definitive science orbits. Use of a multi-plate dynamic spacecraft area model with ODTK's force model plugin capability provides additional improvements in predicted accuracy.
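The distinction between the two stochastic models compared above can be sketched with a simple discrete simulation: a first-order Gauss-Markov process reverts toward zero (so it is typically applied to the deviation from a nominal value), whereas a Vasicek process reverts toward a non-zero long-run mean. The parameter values below are illustrative assumptions and are unrelated to ODTK's implementation.

```python
# Sketch of the two stochastic models, stepped with a simple Euler-Maruyama scheme.
# Step size, correlation time, diffusion, and mean are assumed values.
import numpy as np

dt, n = 60.0, 5000            # 60 s steps (assumed)
tau, sigma = 86400.0, 0.02    # correlation time (s) and diffusion (assumed)
mu = 1.3                      # Vasicek long-run mean for the SRP coefficient (assumed)

rng = np.random.default_rng(1)
gm = np.empty(n)
va = np.empty(n)
gm[0] = va[0] = 1.3
for k in range(n - 1):
    dw = rng.normal(0.0, np.sqrt(dt))
    # First-order Gauss-Markov: exponentially correlated, reverts toward zero
    gm[k + 1] = gm[k] - (gm[k] / tau) * dt + sigma * dw
    # Vasicek: same structure, but reverts toward the non-zero mean mu
    va[k + 1] = va[k] + ((mu - va[k]) / tau) * dt + sigma * dw
print("final Gauss-Markov value:", gm[-1], "final Vasicek value:", va[-1])
```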
NASA Astrophysics Data System (ADS)
Haili, Hasnawati; Maknun, Johar; Siahaan, Parsaoran
2017-08-01
Physics is a subject closely related to students' daily experience. Therefore, before studying it formally in class, students already have prior knowledge and visualizations of natural phenomena, and can extend these on their own. The learning process in class should aim to detect, process, construct, and use students' mental models, so that those mental models agree with and are built on the correct concepts. A previous study held in MAN 1 Muna found that, in the learning process, the teacher did not pay attention to students' mental models; as a consequence, the learning process had not tried to build students' mental modelling ability (MMA). The purpose of this study is to describe the improvement of students' MMA as an effect of a problem-solving-based learning model with a multiple representations approach. The study uses a pre-experimental, one-group pretest-posttest design, conducted in class XI IPA of MAN 1 Muna in 2016/2017. Data were collected using a problem-solving test on the concept of the kinetic theory of gases and interviews to assess students' MMA. The result of this study is a classification of students' MMA into 3 categories: High Mental Modelling Ability (H-MMA) for 7
Steinfeld, Bradley; Scott, Jennifer; Vilander, Gavin; Marx, Larry; Quirk, Michael; Lindberg, Julie; Koerner, Kelly
2015-10-01
To effectively implement evidence-based practices (EBP) in behavioral health care, an organization needs to have operating structures and processes that can address core EBP implementation factors and stages. Lean, a widely used quality improvement process, can potentially address the factors crucial to successful implementation of EBP. This article provides an overview of Lean and the relationship between Lean process improvement steps, and EBP implementation models. Examples of how Lean process improvement methodologies can be used to help plan and carry out implementation of EBP in mental health delivery systems are presented along with limitations and recommendations for future research and clinical application.
Graham, Emily B.; Knelman, Joseph E.; Schindlbacher, Andreas; Siciliano, Steven; Breulmann, Marc; Yannarell, Anthony; Beman, J. M.; Abell, Guy; Philippot, Laurent; Prosser, James; Foulquier, Arnaud; Yuste, Jorge C.; Glanville, Helen C.; Jones, Davey L.; Angel, Roey; Salminen, Janne; Newton, Ryan J.; Bürgmann, Helmut; Ingram, Lachlan J.; Hamer, Ute; Siljanen, Henri M. P.; Peltoniemi, Krista; Potthast, Karin; Bañeras, Lluís; Hartmann, Martin; Banerjee, Samiran; Yu, Ri-Qing; Nogaro, Geraldine; Richter, Andreas; Koranda, Marianne; Castle, Sarah C.; Goberna, Marta; Song, Bongkeun; Chatterjee, Amitava; Nunes, Olga C.; Lopes, Ana R.; Cao, Yiping; Kaisermann, Aurore; Hallin, Sara; Strickland, Michael S.; Garcia-Pausas, Jordi; Barba, Josep; Kang, Hojeong; Isobe, Kazuo; Papaspyrou, Sokratis; Pastorelli, Roberta; Lagomarsino, Alessandra; Lindström, Eva S.; Basiliko, Nathan; Nemergut, Diana R.
2016-01-01
Microorganisms are vital in mediating the earth’s biogeochemical cycles; yet, despite our rapidly increasing ability to explore complex environmental microbial communities, the relationship between microbial community structure and ecosystem processes remains poorly understood. Here, we address a fundamental and unanswered question in microbial ecology: ‘When do we need to understand microbial community structure to accurately predict function?’ We present a statistical analysis investigating the value of environmental data and microbial community structure independently and in combination for explaining rates of carbon and nitrogen cycling processes within 82 global datasets. Environmental variables were the strongest predictors of process rates but left 44% of variation unexplained on average, suggesting the potential for microbial data to increase model accuracy. Although only 29% of our datasets were significantly improved by adding information on microbial community structure, we observed improvement in models of processes mediated by narrow phylogenetic guilds via functional gene data, and conversely, improvement in models of facultative microbial processes via community diversity metrics. Our results also suggest that microbial diversity can strengthen predictions of respiration rates beyond microbial biomass parameters, as 53% of models were improved by incorporating both sets of predictors compared to 35% by microbial biomass alone. Our analysis represents the first comprehensive analysis of research examining links between microbial community structure and ecosystem function. Taken together, our results indicate that a greater understanding of microbial communities informed by ecological principles may enhance our ability to predict ecosystem process rates relative to assessments based on environmental variables and microbial physiology. PMID:26941732
Graham, Emily B; Knelman, Joseph E; Schindlbacher, Andreas; Siciliano, Steven; Breulmann, Marc; Yannarell, Anthony; Beman, J M; Abell, Guy; Philippot, Laurent; Prosser, James; Foulquier, Arnaud; Yuste, Jorge C; Glanville, Helen C; Jones, Davey L; Angel, Roey; Salminen, Janne; Newton, Ryan J; Bürgmann, Helmut; Ingram, Lachlan J; Hamer, Ute; Siljanen, Henri M P; Peltoniemi, Krista; Potthast, Karin; Bañeras, Lluís; Hartmann, Martin; Banerjee, Samiran; Yu, Ri-Qing; Nogaro, Geraldine; Richter, Andreas; Koranda, Marianne; Castle, Sarah C; Goberna, Marta; Song, Bongkeun; Chatterjee, Amitava; Nunes, Olga C; Lopes, Ana R; Cao, Yiping; Kaisermann, Aurore; Hallin, Sara; Strickland, Michael S; Garcia-Pausas, Jordi; Barba, Josep; Kang, Hojeong; Isobe, Kazuo; Papaspyrou, Sokratis; Pastorelli, Roberta; Lagomarsino, Alessandra; Lindström, Eva S; Basiliko, Nathan; Nemergut, Diana R
2016-01-01
Microorganisms are vital in mediating the earth's biogeochemical cycles; yet, despite our rapidly increasing ability to explore complex environmental microbial communities, the relationship between microbial community structure and ecosystem processes remains poorly understood. Here, we address a fundamental and unanswered question in microbial ecology: 'When do we need to understand microbial community structure to accurately predict function?' We present a statistical analysis investigating the value of environmental data and microbial community structure independently and in combination for explaining rates of carbon and nitrogen cycling processes within 82 global datasets. Environmental variables were the strongest predictors of process rates but left 44% of variation unexplained on average, suggesting the potential for microbial data to increase model accuracy. Although only 29% of our datasets were significantly improved by adding information on microbial community structure, we observed improvement in models of processes mediated by narrow phylogenetic guilds via functional gene data, and conversely, improvement in models of facultative microbial processes via community diversity metrics. Our results also suggest that microbial diversity can strengthen predictions of respiration rates beyond microbial biomass parameters, as 53% of models were improved by incorporating both sets of predictors compared to 35% by microbial biomass alone. Our analysis represents the first comprehensive analysis of research examining links between microbial community structure and ecosystem function. Taken together, our results indicate that a greater understanding of microbial communities informed by ecological principles may enhance our ability to predict ecosystem process rates relative to assessments based on environmental variables and microbial physiology.
Using task analysis to improve the requirements elicitation in health information system.
Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa
2007-01-01
This paper describes the application of task analysis within the design process of a Web-based information system for managing clinical information in hemophilia care, in order to improve the requirements elicitation and, consequently, to validate the domain model obtained in a previous phase of the design process (system analysis). The use of task analysis in this case proved to be a practical and efficient way to improve the requirements engineering process by involving users in the design process.
Systems approach to managing educational quality in the engineering classroom
NASA Astrophysics Data System (ADS)
Grygoryev, Kostyantyn
Today's competitive environment in post-secondary education requires universities to demonstrate the quality of their programs in order to attract financing, and student and academic talent. Despite significant efforts devoted to improving the quality of higher education, systematic, continuous performance measurement and management still have not reached the level where educational outputs and outcomes are actually produced---the classroom. An engineering classroom is a complex environment in which educational inputs are transformed by educational processes into educational outputs and outcomes. By treating a classroom as a system, one can apply tools such as Structural Equation Modeling, Statistical Process Control, and System Dynamics in order to discover cause-and-effect relationships among the classroom variables, control the classroom processes, and evaluate the effect of changes to the course organization, content, and delivery, on educational processes and outcomes. Quality improvement is best achieved through the continuous, systematic application of efforts and resources. Improving classroom processes and outcomes is an iterative process that starts with identifying opportunities for improvement, designing the action plan, implementing the changes, and evaluating their effects. Once the desired objectives are achieved, the quality improvement cycle may start again. The goal of this research was to improve the educational processes and outcomes in an undergraduate engineering management course taught at the University of Alberta. The author was involved with the course, first, as a teaching assistant, and, then, as a primary instructor. The data collected from the course over four years were used to create, first, a static and, then, a dynamic model of a classroom system. By using model output and qualitative feedback from students, changes to the course organization and content were introduced. These changes led to a lower perceived course workload and increased the students' satisfaction with the instructor, but the students' overall satisfaction with the course did not change significantly, and their attitude toward the course subject actually became more negative. This research brought performance measurement to the level of a classroom, created a dynamic model of the classroom system based on the cause-and-effect relationships discovered by using statistical analysis, and used a systematic, continuous improvement approach to modify the course in order to improve selected educational processes and outcomes.
A simulation study on garment manufacturing process
NASA Astrophysics Data System (ADS)
Liong, Choong-Yeun; Rahim, Nur Azreen Abdul
2015-02-01
The garment industry is an important industry that continues to evolve in order to meet consumers' high demands, so elements of innovation and improvement are important. In this work, research studies were conducted at a local company in order to model the sewing process of clothes manufacturing using simulation modeling. Clothes manufacturing at the company involves 14 main processes: connecting the pattern, center sewing and side neatening, pocket sewing, backside sewing, attaching the front and back, sleeve preparation, attaching the sleeves and overlocking, collar preparation, collar sewing, bottom-edge sewing, buttonhole sewing, removing excess thread, marking buttons, and button cross sewing. These fourteen processes are operated by only six tailors, with the last four processes done by a single tailor. Data collection was conducted by on-site observation, and the probability distribution of processing time for each process was determined using @Risk's BestFit. A simulation model was then developed in Arena software based on the data collected, and an animated simulation model was built to facilitate understanding and to verify that the model represents the actual system. With such a model, what-if analyses and different operating scenarios can be experimented with virtually. The animation and improvement models will be presented in further work.
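A discrete-event model of this kind can also be sketched outside Arena; the toy SimPy version below strings a few sewing stations in series with made-up triangular processing times and a shared pool of six tailors. The station names, times, garment count, and the pooling of tailors are simplifying assumptions, not the study's model.

```python
# Not the Arena model from the study: a minimal SimPy sketch of a sewing line.
# Processing times (minutes) are invented triangular distributions.
import random
import simpy

STATIONS = [("pattern joining", (2, 3, 5)),
            ("sleeve attachment", (3, 4, 6)),
            ("buttonhole sewing", (1, 2, 3))]

def garment(env, name, tailors):
    for station, (lo, mode, hi) in STATIONS:
        with tailors.request() as req:     # wait for a free tailor
            yield req
            yield env.timeout(random.triangular(lo, hi, mode))
    print(f"{name} finished at t = {env.now:.1f} min")

env = simpy.Environment()
tailors = simpy.Resource(env, capacity=6)  # six tailors, as in the study
for i in range(10):
    env.process(garment(env, f"garment-{i}", tailors))
env.run()
```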
State of the art in pathology business process analysis, modeling, design and optimization.
Schrader, Thomas; Blobel, Bernd; García-Rojo, Marcial; Daniel, Christel; Słodkowska, Janina
2012-01-01
For analyzing current workflows and processes, for improving them, for quality management and quality assurance, for integrating hardware and software components, but also for education, training and communication between different domains' experts, modeling business process in a pathology department is inevitable. The authors highlight three main processes in pathology: general diagnostic, cytology diagnostic, and autopsy. In this chapter, those processes are formally modeled and described in detail. Finally, specialized processes such as immunohistochemistry and frozen section have been considered.
[Advance in researches on the effect of forest on hydrological process].
Zhang, Zhiqiang; Yu, Xinxiao; Zhao, Yutao; Qin, Yongsheng
2003-01-01
According to the effects of forest on hydrological processes, forest hydrology can be divided into three related aspects: experimental research on the effects of forest change on water quantity and water quality; mechanistic study of the effects of forest change on the hydrological cycle; and the establishment and application of physically based, distributed forest hydrological models for resource management and engineering construction. Long-term field experimental research (orientation experiments) can not only supply first-hand data for forest hydrological models, but also clarify precipitation-runoff mechanisms. Research on runoff mechanisms is valuable for the development and improvement of physically based hydrological models; the models, in turn, can guide further experimental and runoff-mechanism research. A review of the above three aspects is presented in this paper.
Kadakia, Ekta; Shah, Lipa; Amiji, Mansoor M
2017-07-01
Nanoemulsions have shown potential in delivering drugs across epithelial and endothelial cell barriers, which express efflux transporters; however, their transport mechanisms are not entirely understood. Our goal was to investigate the cellular permeability of nanoemulsion-encapsulated drugs and apply mathematical modeling to elucidate transport mechanisms and sensitive nanoemulsion attributes. Transport studies were performed in Caco-2 cells, using fish oil nanoemulsions and a model substrate, rhodamine-123. Permeability data were modeled using a semi-mechanistic approach capturing the following cellular processes: endocytotic uptake of the nanoemulsion, release of rhodamine-123 from the nanoemulsion, and efflux and passive permeability of rhodamine-123 in aqueous solution. Nanoemulsions not only improved the permeability of rhodamine-123, but were also less sensitive to efflux transporters. The model captured bidirectional permeability results and identified sensitive processes, such as the release of the nanoemulsion-encapsulated drug and cellular uptake of the nanoemulsion. Mathematical description of the cellular processes improved our understanding of the transport mechanisms; for example, nanoemulsions do not inhibit efflux to improve drug permeability. Instead, their endocytotic uptake results in higher intracellular drug concentrations, thereby increasing the concentration gradient and transcellular permeability across biological barriers. Modeling results indicated that optimizing nanoemulsion attributes such as droplet size and intracellular drug release rate may further improve drug permeability.
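A compartmental caricature of the processes named above can be written down as a small ODE system, as in the sketch below. The states, rate constants, and time span are assumptions for illustration and do not reproduce the published semi-mechanistic model.

```python
# Illustrative compartmental sketch of nanoemulsion uptake, intracellular drug
# release, efflux, and passive transfer; all rate constants are assumed.
from scipy.integrate import solve_ivp

k_uptake, k_release = 0.5, 0.3   # 1/h: NE endocytosis, intracellular drug release (assumed)
k_efflux, k_passive = 0.2, 0.4   # 1/h: efflux back out, passive transfer to basolateral side (assumed)

def rhs(t, y):
    ne_apical, ne_cell, d_cell, d_basal = y
    return [-k_uptake * ne_apical,                      # nanoemulsion left on the apical side
            k_uptake * ne_apical - k_release * ne_cell, # internalized nanoemulsion
            k_release * ne_cell - (k_efflux + k_passive) * d_cell,  # free intracellular drug
            k_passive * d_cell]                         # drug reaching the basolateral side

sol = solve_ivp(rhs, (0, 4), [1.0, 0.0, 0.0, 0.0])      # 4 h transport study (assumed)
print("fraction transported after 4 h:", sol.y[3, -1])
```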
Airport security inspection process model and optimization based on GSPN
NASA Astrophysics Data System (ADS)
Mao, Shuainan
2018-04-01
To improve the efficiency of the airport security inspection process, a Generalized Stochastic Petri Net is used to establish a model of the security inspection process. The model is used to analyze the bottleneck problem of the airport security inspection process, and a solution to the bottleneck is given: adding a place for passengers to remove their outerwear and an additional X-ray detector, which can significantly improve efficiency and reduce waiting time.
Computer-Based Enhancements for the Improvement of Learning.
ERIC Educational Resources Information Center
Tennyson, Robert D.
The third of four symposium papers argues that, if instructional methods are to improve learning, they must have two aspects: a direct trace to a specific learning process, and empirical support that demonstrates their significance. Focusing on the tracing process, the paper presents an information processing model of learning that can be used by…
NASA Astrophysics Data System (ADS)
Lee, Y. J.; Bonfanti, C. E.; Trailovic, L.; Etherton, B.; Govett, M.; Stewart, J.
2017-12-01
At present, only a fraction of all satellite observations are ultimately used for model assimilation. The satellite data assimilation process is computationally expensive, and data are often reduced in resolution to allow timely incorporation into the forecast. This problem is only exacerbated by the recent launch of the Geostationary Operational Environmental Satellite (GOES)-16 and future satellites providing a several-orders-of-magnitude increase in data volume. At the NOAA Earth System Research Laboratory (ESRL) we are researching the use of machine learning to improve the initial selection of satellite data to be used in the model assimilation process. In particular, we are investigating the use of deep learning, which is being applied to many image processing and computer vision problems with great success. Through our research, we are using convolutional neural networks to find and mark regions of interest (ROIs), leading to intelligent extraction of observations from satellite observing systems. These targeted observations will be used to improve the quality of data selected for model assimilation and ultimately improve the impact of satellite data on weather forecasts. Our preliminary efforts to identify the ROIs are focused in two areas: applying and comparing state-of-the-art convolutional neural network models using analysis data from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) weather model, and using these results as a starting point to optimize a convolutional neural network model for pattern recognition on the higher-resolution water vapor data from GOES-WEST and other satellites. This presentation will provide an introduction to our convolutional neural network model for identifying and processing these ROIs, along with the challenges of data preparation, model training, and parameter optimization.
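As a minimal sketch of the kind of convolutional model involved (not the ESRL model), the fully convolutional network below maps a single-channel field, such as a water-vapour tile, to a per-pixel region-of-interest probability. The architecture, tile size, and input are assumptions.

```python
# Toy fully convolutional network producing a per-pixel ROI probability map.
# Layer sizes and the random input tile are illustrative assumptions.
import torch
import torch.nn as nn

class ROINet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),            # per-pixel ROI logit
        )

    def forward(self, x):
        return self.net(x)

model = ROINet()
field = torch.randn(1, 1, 128, 128)        # stand-in for a single-channel satellite tile
roi_prob = torch.sigmoid(model(field))     # probability each pixel is "of interest"
print(roi_prob.shape)                      # torch.Size([1, 1, 128, 128])
```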
Simulating pedestrian flow by an improved two-process cellular automaton model
NASA Astrophysics Data System (ADS)
Jin, Cheng-Jie; Wang, Wei; Jiang, Rui; Dong, Li-Yun
In this paper, we study the pedestrian flow with an Improved Two-Process (ITP) cellular automaton model, which is originally proposed by Blue and Adler. Simulations of pedestrian counterflow have been conducted, under both periodic and open boundary conditions. The lane formation phenomenon has been reproduced without using the place exchange rule. We also present and discuss the flow-density and velocity-density relationships of both uni-directional flow and counterflow. By the comparison with the Blue-Adler model, we find the ITP model has higher values of maximum flow, critical density and completely jammed density under different conditions.
A process proof test for model concepts: Modelling the meso-scale
NASA Astrophysics Data System (ADS)
Hellebrand, Hugo; Müller, Christoph; Matgen, Patrick; Fenicia, Fabrizio; Savenije, Huub
In hydrological modelling the use of detailed soil data is sometimes troublesome, since these data are often hard to obtain and, if available at all, difficult to interpret and process in a way that makes them meaningful for the model at hand. Intuitively, understanding and mapping the dominant runoff processes in the soil show high potential for improving hydrological models. In this study a labour-intensive methodology to assess dominant runoff processes is simplified in such a way that detailed soil maps are no longer needed. Nonetheless, there is an ongoing debate on how to integrate this type of information into hydrological models. In this study, dominant runoff processes (DRP) are mapped for meso-scale basins using the permeability of the substratum, land use information and the slope in a GIS. During a field campaign the processes are validated, and for each DRP assumptions are made concerning its water storage capacity. The latter is done by combining soil data obtained during the field campaign with soil data obtained from the literature. Second, several parsimoniously parameterized conceptual hydrological models are used that incorporate certain aspects of the DRP. The results of these models are compared with a benchmark model, in which the soil is represented by only one lumped parameter, to test the contribution of the DRP in hydrological models. The proposed methodology is tested for 15 meso-scale river basins located in Luxembourg. The main goal of this study is to investigate whether integrating dominant runoff processes, which have high information content concerning soil characteristics, with hydrological models allows the improvement of simulation results with a view to regionalization and predictions in ungauged basins. The regionalization procedure gave no clear results; the calibration procedure and the well-mixed discharge signal of the calibration basins are considered major causes for this, and they made the deconvolution of discharge signals of meso-scale basins problematic. The results also suggest that DRP could very well display some sort of uniqueness of place, which was not foreseen in the methods from which they were derived. Furthermore, a strong seasonal influence on model performance was observed, implying a seasonal dependence of the DRP. When comparing the performance of the DRP models and the benchmark model, no real distinction was found. To improve the performance of the DRP models used in this study, and of conceptual models in general, there is a need for an improved identification of the mechanisms that cause the different dominant runoff processes at the meso-scale. To achieve this, more orthogonal data could be used for a better conceptualization of the DRPs, and model concepts should then be adapted accordingly.
Four decades of modeling methane cycling in terrestrial ecosystems: Where we are heading?
NASA Astrophysics Data System (ADS)
Xu, X.; Yuan, F.; Hanson, P. J.; Wullschleger, S. D.; Thornton, P. E.; Tian, H.; Riley, W. J.; Song, X.; Graham, D. E.; Song, C.
2015-12-01
A modeling approach to methane (CH4) is widely used to quantify the budget, investigate spatial and temporal variabilities, and understand the mechanistic processes and environmental controls on CH4 fluxes across spatial and temporal scales. Moreover, CH4 models are an important tool for integrating CH4 data from multiple sources, such as laboratory-based incubation and molecular analysis, field observational experiments, remote sensing, and aircraft-based measurements across a variety of terrestrial ecosystems. We reviewed 39 terrestrial CH4 models to characterize their strengths and weaknesses and to design a roadmap for future model improvement and application. We found that: (1) the focus of CH4 models has shifted from theoretical studies to site- and regional-level applications over the past four decades, expressed as dramatic increases in CH4 model development for regional budget quantification; (2) large discrepancies exist among models in terms of representing CH4 processes and their environmental controls; and (3) significant data-model and model-model mismatches are partially attributed to different representations of wetland characterization and inundation dynamics. Special attention should be paid to three efforts in future improvements and applications of fully mechanistic CH4 models: (1) CH4 models should be improved to represent the mechanisms underlying land-atmosphere CH4 exchange, with emphasis on improving and validating individual CH4 processes over depth and horizontal space; (2) models should be developed that are capable of simulating CH4 fluxes across space and time (particularly hot moments and hot spots); and (3) efforts should be invested to develop model benchmarking frameworks that can easily be used for model improvement, evaluation, and integration with data from molecular to global scales. A newly developed microbial functional group-based CH4 model (CLM-Microbe) was further used to demonstrate the features of mechanistic representation and integration with multiple sources of observational data.
Re-engineering pre-employment check-up systems: a model for improving health services.
Rateb, Said Abdel Hakim; El Nouman, Azza Abdel Razek; Rateb, Moshira Abdel Hakim; Asar, Mohamed Naguib; El Amin, Ayman Mohammed; Gad, Saad abdel Aziz; Mohamed, Mohamed Salah Eldin
2011-01-01
The purpose of this paper is to develop a model for improving health services provided by the pre-employment medical fitness check-up system affiliated to Egypt's Health Insurance Organization (HIO). Operations research, notably system re-engineering, is used in six randomly selected centers and findings before and after re-engineering are compared. The re-engineering model follows a systems approach, focusing on three areas: structure, process and outcome. The model is based on six main components: electronic booking, standardized check-up processes, protected medical documents, advanced archiving through an electronic content management (ECM) system, infrastructure development, and capacity building. The model originates mainly from customer needs and expectations. The centers' monthly customer flow increased significantly after re-engineering. The mean time spent per customer cycle improved after re-engineering--18.3 +/- 5.5 minutes as compared to 48.8 +/- 14.5 minutes before. Appointment delay was also significantly decreased from an average 18 to 6.2 days. Both beneficiaries and service providers were significantly more satisfied with the services after re-engineering. The model proves that re-engineering program costs are exceeded by increased revenue. Re-engineering in this study involved multiple structure and process elements. The literature review did not reveal similar re-engineering healthcare packages. Therefore, each element was compared separately. This model is highly recommended for improving service effectiveness and efficiency. This research is the first in Egypt to apply the re-engineering approach to public health systems. Developing user-friendly models for service improvement is an added value.
NASA Astrophysics Data System (ADS)
Hunter, Jason M.; Maier, Holger R.; Gibbs, Matthew S.; Foale, Eloise R.; Grosvenor, Naomi A.; Harders, Nathan P.; Kikuchi-Miller, Tahali C.
2018-05-01
Salinity modelling in river systems is complicated by a number of processes, including in-stream salt transport and various mechanisms of saline accession that vary dynamically as a function of water level and flow, often at different temporal scales. Traditionally, salinity models in rivers have either been process- or data-driven. The primary problem with process-based models is that in many instances, not all of the underlying processes are fully understood or able to be represented mathematically. There are also often insufficient historical data to support model development. The major limitation of data-driven models, such as artificial neural networks (ANNs) in comparison, is that they provide limited system understanding and are generally not able to be used to inform management decisions targeting specific processes, as different processes are generally modelled implicitly. In order to overcome these limitations, a generic framework for developing hybrid process and data-driven models of salinity in river systems is introduced and applied in this paper. As part of the approach, the most suitable sub-models are developed for each sub-process affecting salinity at the location of interest based on consideration of model purpose, the degree of process understanding and data availability, which are then combined to form the hybrid model. The approach is applied to a 46 km reach of the Murray River in South Australia, which is affected by high levels of salinity. In this reach, the major processes affecting salinity include in-stream salt transport, accession of saline groundwater along the length of the reach and the flushing of three waterbodies in the floodplain during overbank flows of various magnitudes. Based on trade-offs between the degree of process understanding and data availability, a process-driven model is developed for in-stream salt transport, an ANN model is used to model saline groundwater accession and three linear regression models are used to account for the flushing of the different floodplain storages. The resulting hybrid model performs very well on approximately 3 years of daily validation data, with a Nash-Sutcliffe efficiency (NSE) of 0.89 and a root mean squared error (RMSE) of 12.62 mg L-1 (over a range from approximately 50 to 250 mg L-1). Each component of the hybrid model results in noticeable improvements in model performance corresponding to the range of flows for which they are developed. The predictive performance of the hybrid model is significantly better than that of a benchmark process-driven model (NSE = -0.14, RMSE = 41.10 mg L-1, Gbench index = 0.90) and slightly better than that of a benchmark data-driven (ANN) model (NSE = 0.83, RMSE = 15.93 mg L-1, Gbench index = 0.36). Apart from improved predictive performance, the hybrid model also has advantages over the ANN benchmark model in terms of increased capacity for improving system understanding and greater ability to support management decisions.
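The structure of such a hybrid can be sketched schematically: each salinity-affecting process gets the sub-model type best matched to the available process understanding and data, and the contributions are combined into a single prediction. The functions, flow threshold, and coefficients below are placeholders, not the published model.

```python
# Schematic of the hybrid idea only; every function body is a placeholder.
def instream_transport(flow, upstream_salinity):
    # process-driven routing of the upstream salt load (placeholder)
    return 0.9 * upstream_salinity

def groundwater_accession(flow, water_level, ann):
    # data-driven (ANN) estimate of saline groundwater inflow; `ann` is any
    # fitted regressor with a predict method (placeholder)
    return ann.predict([[flow, water_level]])[0]

def floodplain_flushing(flow, coeffs):
    # simple regression triggered only during overbank flows (placeholder threshold)
    return coeffs[0] + coeffs[1] * flow if flow > 50_000 else 0.0

def hybrid_salinity(flow, water_level, upstream_salinity, ann, coeffs):
    return (instream_transport(flow, upstream_salinity)
            + groundwater_accession(flow, water_level, ann)
            + floodplain_flushing(flow, coeffs))

class _DummyANN:                       # stand-in for a trained ANN
    def predict(self, X):
        return [5.0 for _ in X]        # invented groundwater-driven salinity (mg/L)

print(hybrid_salinity(flow=60_000, water_level=2.1, upstream_salinity=80.0,
                      ann=_DummyANN(), coeffs=(1.0, 1e-4)))
```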
Evaluation of an urban canopy model in a tropical city: the role of tree evapotranspiration
NASA Astrophysics Data System (ADS)
Liu, Xuan; Li, Xian-Xiang; Harshan, Suraj; Roth, Matthias; Velasco, Erik
2017-09-01
A single layer urban canopy model (SLUCM) with enhanced hydrologic processes is evaluated in a tropical city, Singapore. The evaluation was performed using an 11-month offline simulation with the coupled Noah land surface model/SLUCM over a compact low-rise residential area. Various hydrological processes are considered, including anthropogenic latent heat release and evaporation from impervious urban facets. Results show that the prediction of energy fluxes, in particular latent heat flux, is improved when these processes are included. However, the simulated latent heat flux is still underestimated by ∼40%. Considering Singapore’s high green cover ratio, the tree evapotranspiration process is introduced into the model, which significantly improves the simulated latent heat flux. In particular, the systematic error of the model is greatly reduced, and becomes lower than the unsystematic error in some seasons. The effect of tree evapotranspiration on the urban surface energy balance is further demonstrated during an unusual dry spell. The present study demonstrates that even at sites with relatively low (11%) tree coverage, ignoring evapotranspiration from trees may cause serious underestimation of the latent heat flux and atmospheric humidity. The improved model is also transferable to other tropical or temperate regions to study the impact of tree evapotranspiration on urban climate.
Performance improvement CME for quality: challenges inherent to the process.
Vakani, Farhan Saeed; O'Beirne, Ronan
2015-01-01
The purpose of this paper is to discuss perspectives on the real-time challenges of a three-staged Performance Improvement Continuing Medical Education (PI-CME) model, an innovative and potentially important approach for future CME, and to inform providers so that they think, prepare, and act proactively. In this discussion, the challenges associated with adopting the American Medical Association's three-staged PI-CME model are reported. Few institutions in the USA are using a three-staged performance improvement model and customizing it to their own healthcare context for a specific target audience; those that do integrate traditional CME methods with performance and quality initiatives and link them with CME credits. Overall, the US health system is interested in a structured PI-CME model with the potential to improve physicians' practice behaviors. Given the dearth of evidence on applying this structured performance improvement methodology to the design of CME activities, and the lack of clarity on the challenges inherent to the process that learners and providers encounter, this paper establishes the all-important first step of setting out the challenges for a three-staged PI-CME model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allgor, R.J.; Feehery, W.F.; Tolsma, J.E.
The batch process development problem serves as good candidate to guide the development of process modeling environments. It demonstrates that very robust numerical techniques are required within an environment that can collect, organize, and maintain the data and models required to address the batch process development problem. This paper focuses on improving the robustness and efficiency of the numerical algorithms required in such a modeling environment through the development of hybrid numerical and symbolic strategies.
Study of the 5E Instructional Model to Improve the Instructional Design Process of Novice Teachers
ERIC Educational Resources Information Center
Hu, Jiuhua; Gao, Chong; Liu, Yang
2017-01-01
This study investigated the effects of 5E instructional model on the teaching processes of novice teachers. First, we conducted a teaching design training project based on the 5E model for 40 novice teachers, and compared pre-texts of the teachers' teaching process from before the training with post-texts obtained immediately following the…
NASA Astrophysics Data System (ADS)
Kuras, P. K.; Weiler, M.; Alila, Y.; Spittlehouse, D.; Winkler, R.
2006-12-01
Hydrologic models have been increasingly used in forest hydrology to overcome the limitations of paired watershed experiments, where vegetative recovery and natural variability obscure the inferences and conclusions that can be drawn from such studies. Models, however, are also plagued by uncertainty stemming from a limited understanding of hydrological processes in forested catchments and parameter equifinality is a common concern. This has created the necessity to improve our understanding of how hydrological systems work, through the development of hydrological measures, analyses and models that address the question: are we getting the right answers for the right reasons? Hence, physically-based, spatially-distributed hydrologic models should be validated with high-quality experimental data describing multiple concurrent internal catchment processes under a range of hydrologic regimes. The distributed hydrology soil vegetation model (DHSVM) frequently used in forest management applications is an example of a process-based model used to address the aforementioned circumstances, and this study takes a novel approach at collectively examining the ability of a pre-calibrated model application to realistically simulate outlet flows along with the spatial-temporal variation of internal catchment processes including: continuous groundwater dynamics at 9 locations, stream and road network flow at 67 locations for six individual days throughout the freshet, and pre-melt season snow distribution. Model efficiency was improved over prior evaluations due to continuous efforts in improving the quality of meteorological data in the watershed. Road and stream network flows were very well simulated for a range of hydrological conditions, and the spatial distribution of the pre-melt season snowpack was in general agreement with observed values. The model was effective in simulating the spatial variability of subsurface flow generation, except at locations where strong stream-groundwater interactions existed, as the model is not capable of simulating such processes and subsurface flows always drain to the stream network. The model has proven overall to be quite capable in realistically simulating internal catchment processes in the watershed, which creates more confidence in future model applications exploring the effects of various forest management scenarios on the watershed's hydrological processes.
Cognitive Components Underpinning the Development of Model-Based Learning
Potter, Tracey C.S.; Bryce, Nessa V.; Hartley, Catherine A.
2016-01-01
Reinforcement learning theory distinguishes “model-free” learning, which fosters reflexive repetition of previously rewarded actions, from “model-based” learning, which recruits a mental model of the environment to flexibly select goal-directed actions. Whereas model-free learning is evident across development, recruitment of model-based learning appears to increase with age. However, the cognitive processes underlying the development of model-based learning remain poorly characterized. Here, we examined whether age-related differences in cognitive processes underlying the construction and flexible recruitment of mental models predict developmental increases in model-based choice. In a cohort of participants aged 9–25, we examined whether the abilities to infer sequential regularities in the environment (“statistical learning”), maintain information in an active state (“working memory”) and integrate distant concepts to solve problems (“fluid reasoning”) predicted age-related improvements in model-based choice. We found that age-related improvements in statistical learning performance did not mediate the relationship between age and model-based choice. Ceiling performance on our working memory assay prevented examination of its contribution to model-based learning. However, age-related improvements in fluid reasoning statistically mediated the developmental increase in the recruitment of a model-based strategy. These findings suggest that gradual development of fluid reasoning may be a critical component process underlying the emergence of model-based learning. PMID:27825732
Cognitive components underpinning the development of model-based learning.
Potter, Tracey C S; Bryce, Nessa V; Hartley, Catherine A
2017-06-01
Reinforcement learning theory distinguishes "model-free" learning, which fosters reflexive repetition of previously rewarded actions, from "model-based" learning, which recruits a mental model of the environment to flexibly select goal-directed actions. Whereas model-free learning is evident across development, recruitment of model-based learning appears to increase with age. However, the cognitive processes underlying the development of model-based learning remain poorly characterized. Here, we examined whether age-related differences in cognitive processes underlying the construction and flexible recruitment of mental models predict developmental increases in model-based choice. In a cohort of participants aged 9-25, we examined whether the abilities to infer sequential regularities in the environment ("statistical learning"), maintain information in an active state ("working memory") and integrate distant concepts to solve problems ("fluid reasoning") predicted age-related improvements in model-based choice. We found that age-related improvements in statistical learning performance did not mediate the relationship between age and model-based choice. Ceiling performance on our working memory assay prevented examination of its contribution to model-based learning. However, age-related improvements in fluid reasoning statistically mediated the developmental increase in the recruitment of a model-based strategy. These findings suggest that gradual development of fluid reasoning may be a critical component process underlying the emergence of model-based learning. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woods, Sarah
2015-12-01
The dual objectives of this project were to improve our basic understanding of the processes that control cirrus microphysical properties and to improve the representation of these processes in parameterizations. A major effort in the proposed research was to integrate, calibrate, and better understand the uncertainties in all of these measurements.
Drupsteen, Linda; Groeneweg, Jop; Zwetsloot, Gerard I J M
2013-01-01
Many incidents have occurred because organisations have failed to learn from lessons of the past. This means that there is room for improvement in the way organisations analyse incidents, generate measures to remedy identified weaknesses and prevent reoccurrence: the learning from incidents process. To improve that process, it is necessary to gain insight into the steps of this process and to identify factors that hinder learning (bottlenecks). This paper presents a model that enables organisations to analyse the steps in a learning from incidents process and to identify the bottlenecks. The study describes how this model is used in a survey and in 3 exploratory case studies in The Netherlands. The results show that there is limited use of learning potential, especially in the evaluation stage. To improve learning, an approach that considers all steps is necessary.
LIU, Tongzhu; SHEN, Aizong; HU, Xiaojian; TONG, Guixian; GU, Wei
2017-01-01
Background: We aimed to apply a collaborative business intelligence (BI) system to the hospital supply, processing and distribution (SPD) logistics management model. Methods: We searched the Engineering Village database, the China National Knowledge Infrastructure (CNKI) and Google for articles (published from 2011 to 2016), books, Web pages, etc., to understand SPD- and BI-related theories and the current state of research. We applied collaborative BI technology to the hospital SPD logistics management model by leveraging data mining techniques to discover knowledge from complex data and collaborative techniques to improve business process theory. Results: For the application of the BI system, we: (i) proposed a layered structure of a collaborative BI system for intelligent management of hospital logistics; (ii) built a data warehouse for the collaborative BI system; (iii) improved data mining techniques such as support vector machines (SVM) and the swarm-intelligence firefly algorithm to solve key problems in the hospital logistics collaborative BI system; and (iv) researched collaborative techniques oriented to data and business process optimization to improve the business processes of hospital logistics management. Conclusion: A proper combination of the SPD model and a BI system will improve the management of logistics in hospitals. Successful implementation requires: (i) innovating and improving the traditional SPD model and making appropriate implementation plans and schedules for the application of the BI system according to the actual situation of each hospital; (ii) the collaborative participation of internal hospital departments, including the information, logistics, nursing, medical and financial departments; and (iii) timely response from external suppliers. PMID:28828316
Fong, Allan; Harriott, Nicole; Walters, Donna M; Foley, Hanan; Morrissey, Richard; Ratwani, Raj R
2017-08-01
Many healthcare providers have implemented patient safety event reporting systems to better understand and improve patient safety. Reviewing and analyzing these reports is often time consuming and resource intensive because of both the quantity of reports and length of free-text descriptions in the reports. Natural language processing (NLP) experts collaborated with clinical experts on a patient safety committee to assist in the identification and analysis of medication related patient safety events. Different NLP algorithmic approaches were developed to identify four types of medication related patient safety events and the models were compared. Well performing NLP models were generated to categorize medication related events into pharmacy delivery delays, dispensing errors, Pyxis discrepancies, and prescriber errors with receiver operating characteristic areas under the curve of 0.96, 0.87, 0.96, and 0.81 respectively. We also found that modeling the brief without the resolution text generally improved model performance. These models were integrated into a dashboard visualization to support the patient safety committee review process. We demonstrate the capabilities of various NLP models and the use of two text inclusion strategies at categorizing medication related patient safety events. The NLP models and visualization could be used to improve the efficiency of patient safety event data review and analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
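One plausible way to set up such a categoriser (not the authors' models) is a bag-of-words pipeline over the brief text only, following the "brief without resolution" strategy noted above. The example reports, labels, and single binary category below are invented.

```python
# Sketch: classify event-report briefs into one medication-related category
# (one-vs-rest), scored by ROC AUC. Data are invented for illustration.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

briefs = ["medication arrived two hours late from pharmacy",
          "wrong strength dispensed for patient order",
          "pyxis count did not match expected quantity",
          "prescriber entered duplicate order for warfarin"] * 25
labels = [1, 0, 0, 0] * 25        # 1 = pharmacy delivery delay

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                    LogisticRegression(max_iter=1000))
print("ROC AUC:", cross_val_score(clf, briefs, labels, cv=5, scoring="roc_auc").mean())
```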
Animated-simulation modeling facilitates clinical-process costing.
Zelman, W N; Glick, N D; Blackmore, C C
2001-09-01
Traditionally, the finance department has assumed responsibility for assessing process costs in healthcare organizations. To enhance process-improvement efforts, however, many healthcare providers need to include clinical staff in process cost analysis. Although clinical staff often use electronic spreadsheets to model the cost of specific processes, PC-based animated-simulation tools offer two major advantages over spreadsheets: they allow clinicians to interact more easily with the costing model so that it more closely represents the process being modeled, and they represent cost output as a cost range rather than as a single cost estimate, thereby providing more useful information for decision making.
Using CASE to Exploit Process Modeling in Technology Transfer
NASA Technical Reports Server (NTRS)
Renz-Olar, Cheryl
2003-01-01
A successful business will be one that has processes in place to run that business. Creating processes, reengineering processes, and continually improving processes can be accomplished through extensive modeling. Casewise(R) Corporate Modeler(TM) CASE is a computer aided software engineering tool that will enable the Technology Transfer Department (TT) at NASA Marshall Space Flight Center (MSFC) to capture these abilities. After successful implementation of CASE, it could then go on to be applied in other departments at MSFC and other centers at NASA. The success of a business process is dependent upon the players working as a team and continuously improving the process. A good process fosters customer satisfaction as well as internal satisfaction in the organizational infrastructure. CASE provides a method for business process success through functions consisting of systems and processes business models; specialized diagrams; matrix management; simulation; report generation and publishing; and, linking, importing, and exporting documents and files. The software has an underlying repository or database to support these functions. The Casewise. manual informs us that dynamics modeling is a technique used in business design and analysis. Feedback is used as a tool for the end users and generates different ways of dealing with the process. Feedback on this project resulted from collection of issues through a systems analyst interface approach of interviews with process coordinators and Technical Points of Contact (TPOCs).
ERIC Educational Resources Information Center
Oncu, Elif Celebi
2016-01-01
The main objective of this study was improving university students' from different faculties creativity thinking through a creativity education process. The education process took twelve weeks' time. As pretest, Torrance test of creative thinking (TTCT) figural form was used. Participants were 24 university students from different faculties who…
He, Yan-Lin; Xu, Yuan; Geng, Zhi-Qiang; Zhu, Qun-Xiong
2016-03-01
In this paper, a hybrid robust model based on an improved functional link neural network integrated with partial least squares (IFLNN-PLS) is proposed. First, an improved functional link neural network with a small norm of expanded weights and high input-output correlation (SNEWHIOC-FLNN) is proposed to enhance the generalization performance of the FLNN. Unlike the traditional FLNN, the expanded variables of the original inputs are not used directly as inputs in the proposed SNEWHIOC-FLNN model; instead, the original inputs are attached to expanded weights with small norm. As a result, the correlation coefficient between some of the expanded variables and the outputs is enhanced, and the larger the correlation coefficient, the more relevant the expanded variable tends to be. Finally, the expanded variables with larger correlation coefficients are selected as the inputs to improve the performance of the traditional FLNN. To test the proposed SNEWHIOC-FLNN model, three UCI (University of California, Irvine) regression datasets, Housing, Concrete Compressive Strength (CCS), and Yacht Hydrodynamics (YHD), are selected. A hybrid model based on the improved FLNN integrated with partial least squares (IFLNN-PLS) is then built. In the IFLNN-PLS model, the connection weights are calculated using the partial least squares method rather than the error back-propagation algorithm. Lastly, IFLNN-PLS is developed as an intelligent measurement model for accurately predicting the key variables in the Purified Terephthalic Acid (PTA) process and the High Density Polyethylene (HDPE) process. Simulation results illustrate that IFLNN-PLS can significantly improve prediction performance. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
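The general idea of a functional-link expansion whose output mapping is fitted by partial least squares can be sketched as below. This is not the authors' IFLNN-PLS algorithm (it omits the expanded-weight and correlation-based selection steps), and the synthetic data stand in for the UCI and process datasets.

```python
# Sketch: trigonometric functional-link expansion of the inputs, with the
# output mapping fitted by partial least squares instead of back-propagation.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=8, noise=5.0, random_state=0)

def functional_expansion(X):
    # expand each original input with sin/cos functional links
    return np.hstack([X, np.sin(np.pi * X), np.cos(np.pi * X)])

X_tr, X_te, y_tr, y_te = train_test_split(functional_expansion(X), y, random_state=0)
pls = PLSRegression(n_components=10).fit(X_tr, y_tr)
print("held-out R^2:", pls.score(X_te, y_te))
```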
USDA-ARS's Scientific Manuscript database
Topography exerts critical controls on many hydrologic, geomorphologic, and environmental biophysical processes. Unfortunately many watershed modeling systems use topography only to define basin boundaries and stream channels and do not explicitly account for the topographic controls on processes su...
Innovative model of business process reengineering at machine building enterprises
NASA Astrophysics Data System (ADS)
Nekrasov, R. Yu; Tempel, Yu A.; Tempel, O. A.
2017-10-01
The paper provides consideration of business process reengineering viewed as a managerial innovation accepted by present-day machine building enterprises, as well as ways to improve its procedure. A developed innovative model of reengineering measures is described and is based on the process approach and other principles of company management.
Improvement of the model for surface process of tritium release from lithium oxide
NASA Astrophysics Data System (ADS)
Yamaki, Daiju; Iwamoto, Akira; Jitsukawa, Shiro
2000-12-01
Among the various tritium transport processes in lithium ceramics, the importance and the detailed mechanism of surface reactions remain to be elucidated. A dynamic adsorption and desorption model for tritium desorption from lithium ceramics, especially Li2O, was constructed. From the experimental results, it was considered that both H2 and H2O are dissociatively adsorbed on Li2O and generate OH- on the surface. In the first model, developed in 1994, it was assumed that the dissociative adsorption of either H2 or H2O on Li2O generates two OH- on the surface. However, recent calculation results show that the generation of one OH- and one H- is more stable than that of two OH- by the dissociative adsorption of H2. Therefore, the assumption of H2 adsorption and desorption in the first model is revised and the tritium release behavior from the Li2O surface is evaluated again by using the improved model. The tritium residence time on the Li2O surface is calculated using the improved model, and the results are compared with the experimental results. The calculation results using the improved model agree better with the experimental results than those using the first model.
NASA Astrophysics Data System (ADS)
Nasution, Derlina; Syahreni Harahap, Putri; Harahap, Marabangun
2018-03-01
This research aims to: (1) develop physics learning instruments (lesson plan, worksheet, student's book, teacher's guide book, and test instrument) for a scientific inquiry learning model based on Batak culture, in order to improve students' science process skills and curiosity; (2) describe the quality of the developed high school physics learning instruments using the scientific inquiry learning model based on Batak culture (lesson plan, worksheet, student's book, teacher's guide book, and test instrument) with respect to improving students' science process skills and curiosity. This is development research. The physics learning instruments were developed using a model adapted from the development model of Thiagarajan, Semmel, and Semmel. The stages, carried out until valid, practical, and effective physics learning instruments were obtained, include: (1) the definition phase, (2) the planning phase, and (3) the development phase. Testing included expert validation, small-group trials, and limited classroom trials. The limited classroom trials were conducted in a grade X MIA class at SMAN 1 Padang Bolak. This research produced: 1) physics learning instruments on static fluids for high school grade 10 (lesson plan, worksheet, student's book, teacher's guide book, and test instrument) of a quality worthy of use in the learning process; 2) learning instruments whose components each meet the criteria of being valid, practical, and effective for improving students' science process skills and curiosity.
Computer modeling of lung cancer diagnosis-to-treatment process
Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U.; Yu, Xinhua; Faris, Nick
2015-01-01
We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging, and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the data and procedures necessary to develop a DES model for the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed formulas. Markov chain models and their applications in healthcare are introduced, and the approach used to derive a lung cancer diagnosis process model is presented. Similarly, the procedure to derive closed formulas evaluating the diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed. PMID:26380181
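As a hedged illustration of the analytical side of such a model, the sketch below treats a diagnosis-to-treatment pathway as an absorbing Markov chain and computes expected time to treatment from the fundamental matrix. The states and weekly transition probabilities are invented for the example and are not taken from the paper.

```python
# Illustrative absorbing Markov chain for a diagnosis-to-treatment pathway.
# Expected number of steps (here, weeks) to treatment comes from the fundamental matrix.
import numpy as np

states = ["referral", "imaging", "biopsy", "staging", "treatment"]  # last is absorbing
P = np.array([
    [0.20, 0.70, 0.05, 0.00, 0.05],
    [0.00, 0.30, 0.60, 0.05, 0.05],
    [0.00, 0.00, 0.25, 0.65, 0.10],
    [0.00, 0.00, 0.00, 0.30, 0.70],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])

Q = P[:4, :4]                        # transitions among transient states
N = np.linalg.inv(np.eye(4) - Q)     # fundamental matrix
expected_steps = N.sum(axis=1)       # expected weeks before absorption into "treatment"
for s, t in zip(states[:4], expected_steps):
    print(f"expected weeks from {s} to treatment: {t:.1f}")
```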
Data Assimilation at FLUXNET to Improve Models towards Ecological Forecasting (Invited)
NASA Astrophysics Data System (ADS)
Luo, Y.
2009-12-01
Dramatically increased volumes of data from observational and experimental networks such as FLUXNET call for transformation of ecological research to increase its emphasis on quantitative forecasting. Ecological forecasting will also meet the societal need to develop better strategies for natural resource management in a world of ongoing global change. Traditionally, ecological forecasting has been based on process-based models, informed by data in largely ad hoc ways. Although most ecological models incorporate some representation of mechanistic processes, today’s ecological models are generally not adequate to quantify real-world dynamics and provide reliable forecasts with accompanying estimates of uncertainty. A key tool to improve ecological forecasting is data assimilation, which uses data to inform initial conditions and to help constrain a model during simulation to yield results that approximate reality as closely as possible. In an era with dramatically increased availability of data from observational and experimental networks, data assimilation is a key technique that helps convert the raw data into ecologically meaningful products so as to accelerate our understanding of ecological processes, test ecological theory, forecast changes in ecological services, and better serve the society. This talk will use examples to illustrate how data from FLUXNET have been assimilated with process-based models to improve estimates of model parameters and state variables; to quantify uncertainties in ecological forecasting arising from observations, models and their interactions; and to evaluate information contributions of data and model toward short- and long-term forecasting of ecosystem responses to global change.
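The essence of assimilating flux data into a process model is constraining model parameters (and their uncertainty) with observations. The toy sketch below fits the parameters of a simple temperature-driven respiration model to synthetic flux-like observations with batch least squares; it is not FLUXNET code, and the model form, parameter names (R10, Q10), and noise level are assumptions standing in for a full assimilation system.

```python
# Toy parameter-estimation illustration: constrain R10 and Q10 of a respiration model
# against eddy-flux-like observations, with crude uncertainty from the Jacobian.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
temp = rng.uniform(0, 30, size=200)                      # air temperature, deg C
true_r10, true_q10 = 2.5, 2.0
obs = true_r10 * true_q10 ** ((temp - 10.0) / 10.0) + rng.normal(0, 0.3, temp.size)

def residuals(theta):
    r10, q10 = theta
    model = r10 * q10 ** ((temp - 10.0) / 10.0)
    return model - obs

fit = least_squares(residuals, x0=[1.0, 1.5], bounds=([0.1, 1.0], [10.0, 4.0]))
r10_hat, q10_hat = fit.x
J = fit.jac                                              # Jacobian at the optimum
cov = np.linalg.inv(J.T @ J) * np.var(fit.fun, ddof=2)   # approximate parameter covariance
print("R10 =", round(r10_hat, 2), "+/-", round(float(np.sqrt(cov[0, 0])), 2))
print("Q10 =", round(q10_hat, 2), "+/-", round(float(np.sqrt(cov[1, 1])), 2))
```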
Optimization Control of the Color-Coating Production Process for Model Uncertainty
He, Dakuo; Wang, Zhengsong; Yang, Le; Mao, Zhizhong
2016-01-01
Optimized control of the color-coating production process (CCPP) aims at reducing production costs and improving economic efficiency while meeting quality requirements. However, because optimization control of the CCPP is hampered by model uncertainty, a strategy that considers model uncertainty is proposed. Previous work has introduced a mechanistic model of CCPP based on process analysis to simulate the actual production process and generate process data. The partial least squares method is then applied to develop predictive models of film thickness and economic efficiency. To manage the model uncertainty, the robust optimization approach is introduced to improve the feasibility of the optimized solution. Iterative learning control is then utilized to further refine the model uncertainty. The constrained film thickness is transformed into one of the tracked targets to overcome the drawback that traditional iterative learning control cannot address constraints. The goal setting of economic efficiency is updated continuously according to the film thickness setting until this reaches its desired value. Finally, fuzzy parameter adjustment is adopted to ensure that the economic efficiency and film thickness converge rapidly to their optimized values under the constraint conditions. The effectiveness of the proposed optimization control strategy is validated by simulation results. PMID:27247563
Optimization Control of the Color-Coating Production Process for Model Uncertainty.
He, Dakuo; Wang, Zhengsong; Yang, Le; Mao, Zhizhong
2016-01-01
Optimized control of the color-coating production process (CCPP) aims at reducing production costs and improving economic efficiency while meeting quality requirements. However, because optimization control of the CCPP is hampered by model uncertainty, a strategy that considers model uncertainty is proposed. Previous work has introduced a mechanistic model of CCPP based on process analysis to simulate the actual production process and generate process data. The partial least squares method is then applied to develop predictive models of film thickness and economic efficiency. To manage the model uncertainty, the robust optimization approach is introduced to improve the feasibility of the optimized solution. Iterative learning control is then utilized to further refine the model uncertainty. The constrained film thickness is transformed into one of the tracked targets to overcome the drawback that traditional iterative learning control cannot address constraints. The goal setting of economic efficiency is updated continuously according to the film thickness setting until this reaches its desired value. Finally, fuzzy parameter adjustment is adopted to ensure that the economic efficiency and film thickness converge rapidly to their optimized values under the constraint conditions. The effectiveness of the proposed optimization control strategy is validated by simulation results.
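One ingredient of the strategy described above, iterative learning control, can be sketched in a few lines: across repeated coating runs, the input profile is corrected with the previous run's tracking error so the film thickness converges to its setpoint despite a repeating disturbance. The plant gain, learning gain, and disturbance below are invented, and the robust-optimization and fuzzy-adjustment layers of the paper are omitted.

```python
# Minimal iterative learning control (ILC) sketch for a repeated coating run.
import numpy as np

n_steps, n_runs = 50, 8
setpoint = np.full(n_steps, 20.0)            # target film thickness (um), illustrative
u = np.zeros(n_steps)                        # initial input profile
plant_gain, learning_gain = 0.8, 0.6

def plant(u, rng):
    """Toy static plant with a run-to-run repeating disturbance and measurement noise."""
    disturbance = 2.0 * np.sin(np.linspace(0, np.pi, u.size))
    return plant_gain * u + disturbance + rng.normal(0, 0.1, u.size)

rng = np.random.default_rng(2)
for k in range(n_runs):
    y = plant(u, rng)
    error = setpoint - y
    u = u + learning_gain * error            # ILC update: u_{k+1} = u_k + L * e_k
    print(f"run {k}: RMS tracking error = {np.sqrt(np.mean(error**2)):.2f}")
```

Convergence here follows because |1 - plant_gain * learning_gain| < 1, so the run-to-run error contracts.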
NASA Technical Reports Server (NTRS)
Colle, Brian A.; Molthan, Andrew L.
2013-01-01
The representation of clouds in climate and weather models is a driver in forecast uncertainty. Cloud microphysics parameterizations are challenged by having to represent a diverse range of ice species. Key characteristics of predicted ice species include habit and fall speed, and complex interactions that result from mixed-phased processes like riming. Our proposed activity leverages Global Precipitation Measurement (GPM) Mission ground validation studies to improve parameterizations
Visual Perceptual Learning and Models.
Dosher, Barbara; Lu, Zhong-Lin
2017-09-15
Visual perceptual learning through practice or training can significantly improve performance on visual tasks. Originally seen as a manifestation of plasticity in the primary visual cortex, perceptual learning is more readily understood as improvements in the function of brain networks that integrate processes, including sensory representations, decision, attention, and reward, and balance plasticity with system stability. This review considers the primary phenomena of perceptual learning, theories of perceptual learning, and perceptual learning's effect on signal and noise in visual processing and decision. Models, especially computational models, play a key role in behavioral and physiological investigations of the mechanisms of perceptual learning and for understanding, predicting, and optimizing human perceptual processes, learning, and performance. Performance improvements resulting from reweighting or readout of sensory inputs to decision provide a strong theoretical framework for interpreting perceptual learning and transfer that may prove useful in optimizing learning in real-world applications.
Validation of X1 motorcycle model in industrial plant layout by using WITNESS(TM) simulation software
NASA Astrophysics Data System (ADS)
Hamzas, M. F. M. A.; Bareduan, S. A.; Zakaria, M. Z.; Tan, W. J.; Zairi, S.
2017-09-01
This paper demonstrates a case study on simulation, modelling and analysis for X1 Motorcycles Model. In this research, a motorcycle assembly plant has been selected as a main place of research study. Simulation techniques by using Witness software were applied to evaluate the performance of the existing manufacturing system. The main objective is to validate the data and find out the significant impact on the overall performance of the system for future improvement. The process of validation starts when the layout of the assembly line was identified. All components are evaluated to validate whether the data is significance for future improvement. Machine and labor statistics are among the parameters that were evaluated for process improvement. Average total cycle time for given workstations is used as criterion for comparison of possible variants. From the simulation process, the data used are appropriate and meet the criteria for two-sided assembly line problems.
A CPT for Improving Turbulence and Cloud Processes in the NCEP Global Models
NASA Astrophysics Data System (ADS)
Krueger, S. K.; Moorthi, S.; Randall, D. A.; Pincus, R.; Bogenschutz, P.; Belochitski, A.; Chikira, M.; Dazlich, D. A.; Swales, D. J.; Thakur, P. K.; Yang, F.; Cheng, A.
2016-12-01
Our Climate Process Team (CPT) is based on the premise that the NCEP (National Centers for Environmental Prediction) global models can be improved by installing an integrated, self-consistent description of turbulence, clouds, deep convection, and the interactions between clouds and radiative and microphysical processes. The goal of our CPT is to unify the representation of turbulence and subgrid-scale (SGS) cloud processes and to unify the representation of SGS deep convective precipitation and grid-scale precipitation as the horizontal resolution decreases. We aim to improve the representation of small-scale phenomena by implementing a PDF-based SGS turbulence and cloudiness scheme that replaces the boundary layer turbulence scheme, the shallow convection scheme, and the cloud fraction schemes in the GFS (Global Forecast System) and CFS (Climate Forecast System) global models. We intend to improve the treatment of deep convection by introducing a unified parameterization that scales continuously between the simulation of individual clouds when and where the grid spacing is sufficiently fine and the behavior of a conventional parameterization of deep convection when and where the grid spacing is coarse. We will endeavor to improve the representation of the interactions of clouds, radiation, and microphysics in the GFS/CFS by using the additional information provided by the PDF-based SGS cloud scheme. The team is evaluating the impacts of the model upgrades with metrics used by the NCEP short-range and seasonal forecast operations.
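To make the PDF-based idea concrete, the sketch below shows a generic assumed-Gaussian subgrid cloud calculation: cloud fraction is the probability that subgrid total water exceeds saturation, and mean condensate is the corresponding truncated-Gaussian mean. This is a textbook-style illustration under an assumed Gaussian PDF, not the CPT's actual scheme, and the input values are invented.

```python
# Generic assumed-PDF cloud diagnostic: Gaussian subgrid distribution of the
# saturation excess s = qt - qs gives cloud fraction P(s > 0) and mean condensate E[max(s, 0)].
import numpy as np
from scipy.stats import norm

def pdf_cloud(qt_mean, qs, sigma_s):
    """qt_mean: grid-mean total water; qs: saturation mixing ratio; sigma_s: subgrid std (kg/kg)."""
    s_mean = qt_mean - qs
    cloud_fraction = 1.0 - norm.cdf(0.0, loc=s_mean, scale=sigma_s)
    z = s_mean / sigma_s
    ql_mean = s_mean * norm.cdf(z) + sigma_s * norm.pdf(z)   # truncated-Gaussian mean
    return cloud_fraction, ql_mean

cf, ql = pdf_cloud(qt_mean=8.0e-3, qs=8.2e-3, sigma_s=0.5e-3)   # illustrative values
print(f"cloud fraction = {cf:.2f}, mean condensate = {ql*1e3:.3f} g/kg")
```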
Measuring, managing and maximizing refinery performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bascur, O.A.; Kennedy, J.P.
1996-01-01
Implementing continuous quality improvement is a confluence of total quality management, people empowerment, performance indicators and information engineering. Supporting information technologies allow a refiner to narrow the gap between management objectives and the process control level. Dynamic performance monitoring benefits come from production cost savings, improved communications and enhanced decision making. A refinery workgroup information flow model helps automate continuous improvement of processes, performance and the organization. The paper discusses the rethinking of refinery operations, dynamic performance monitoring, continuous process improvement, the knowledge coordinator and repository manager, an integrated plant operations workflow, and successful implementation.
An assembly process model based on object-oriented hierarchical time Petri Nets
NASA Astrophysics Data System (ADS)
Wang, Jiapeng; Liu, Shaoli; Liu, Jianhua; Du, Zenghui
2017-04-01
In order to improve the versatility, accuracy and integrity of the assembly process model of complex products, an assembly process model based on object-oriented hierarchical time Petri Nets is presented. A complete assembly process information model including assembly resources, assembly inspection, time, structure and flexible parts is established, and this model describes the static and dynamic data involved in the assembly process. Through the analysis of three-dimensional assembly process information, the assembly information is hierarchically divided from the whole, the local to the details and the subnet model of different levels of object-oriented Petri Nets is established. The communication problem between Petri subnets is solved by using message database, and it reduces the complexity of system modeling effectively. Finally, the modeling process is presented, and a five layer Petri Nets model is established based on the hoisting process of the engine compartment of a wheeled armored vehicle.
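The basic execution rule behind any place/transition Petri net, a transition fires when every input place holds enough tokens, can be shown with a minimal sketch. The toy "assemble then inspect" net below ignores time and the object-oriented hierarchy and message database of the paper; places, transitions, and token counts are invented for illustration.

```python
# Minimal place/transition Petri net execution for a toy assembly step.
from dataclasses import dataclass

@dataclass
class Transition:
    name: str
    inputs: dict   # place -> tokens consumed
    outputs: dict  # place -> tokens produced

marking = {"parts_ready": 2, "fixture_free": 1, "assembled": 0, "inspected": 0}
transitions = [
    Transition("assemble", {"parts_ready": 2, "fixture_free": 1},
               {"assembled": 1, "fixture_free": 1}),
    Transition("inspect", {"assembled": 1}, {"inspected": 1}),
]

def enabled(t, m):
    return all(m[p] >= n for p, n in t.inputs.items())

def fire(t, m):
    for p, n in t.inputs.items():
        m[p] -= n
    for p, n in t.outputs.items():
        m[p] += n

progress = True
while progress:                       # run until no transition is enabled
    progress = False
    for t in transitions:
        if enabled(t, marking):
            fire(t, marking)
            print(f"fired {t.name}: {marking}")
            progress = True
```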
Docimo, A B; Pronovost, P J; Davis, R O; Concordia, E B; Gabrish, C M; Adessa, M S; Bessman, E
2000-09-01
In 1998 the emergency department (ED) Work Group at Johns Hopkins Bayview Medical Center (Baltimore) worked to reinvigorate the fast-track program within the ED to improve throughput for patients with minor illnesses and injuries who present for care. There had been two prior unsuccessful attempts to overhaul the fast-track process. The work group used a change model intended to improve both processes and relationships for complex organizational problems that span departments and functional units. Before the first work group meeting, the work group evaluated the institutional commitment to address the issue. The next step was to find data to fully understand the issues and establish a baseline for evaluating improvements--for example, patients with minor illnesses and injuries had excessively long total ED (registration to discharge) times: 5 hours 57 minutes on average for nonacute patients. ONLINE AND OFFLINE MEETINGS: The work group identified process problems, but relationship barriers became evident as the new processes were discussed. Yet offline work was needed to minimize the potential for online surprises. The work group leaders met separately in small groups with nursing staff, lab staff, x-ray staff, registrars, and physician's assistants to inform them of data, obtain input about process changes, and address any potential concerns. The group conducted four tests of change (using Plan-Do-Study-Act cycles) to eliminate the root causes of slow turnaround identified previously. Total ED time decreased to an average of 1 hour 47 minutes; the practice of placing nonacute patients in fast track before all higher-acuity patients were seen gained acceptance. The percentage of higher-acuity patients sent to fast track decreased from 17% of all patients seen in fast track in January 1998 to 8.5% by February 1999. Patients with minor illnesses and injuries no longer had to wait behind higher-acuity patients just to be registered. The average wait for registration decreased from 42 minutes in January 1998 to 14 minutes in February 1999. Physician's assistant, nursing, and technician staff all report improved working relationships and feeling a team spirit. The offline component of the integrated model helped to improve organizational relationships and dialogue among team members, thereby facilitating the effectiveness of online efforts to improve processes. This model has also been applied to improve patient registration (revenue recovery) and the emergency transfer and admissions process.
Wang, Jie-sheng; Han, Shuang; Shen, Na-na
2014-01-01
For predicting the key technology indicators (concentrate grade and tailings recovery rate) of the flotation process, an echo state network (ESN) based fusion soft-sensor model optimized by an improved glowworm swarm optimization (GSO) algorithm is proposed. Firstly, the color features (saturation and brightness) and texture features (angular second moment, sum entropy, inertia moment, etc.) based on the grey-level co-occurrence matrix (GLCM) are adopted to describe the visual characteristics of the flotation froth image. Then the kernel principal component analysis (KPCA) method is used to reduce the dimensionality of the high-dimensional input vector composed of the flotation froth image characteristics and process data, and to extract the nonlinear principal components in order to reduce the ESN dimension and network complexity. The ESN soft-sensor model of the flotation process is optimized by the GSO algorithm with a congestion factor. Simulation results show that the model has better generalization and prediction accuracy and meets the online soft-sensor requirements of real-time control in the flotation process. PMID:24982935
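The front end of this pipeline, GLCM texture features compressed with kernel PCA, can be sketched as follows. The froth images are synthetic random frames, the chosen GLCM properties and KPCA settings are assumptions, and the GSO-optimised ESN itself is omitted; function names follow the scikit-image and scikit-learn APIs as I understand them.

```python
# Sketch: GLCM texture features per froth-like image, then kernel PCA compression.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(3)
frames = [rng.integers(0, 256, size=(64, 64), dtype=np.uint8) for _ in range(20)]

def texture_features(img):
    glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["ASM", "contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

X = np.vstack([texture_features(f) for f in frames])   # one feature row per image
kpca = KernelPCA(n_components=3, kernel="rbf", gamma=0.1)
X_low = kpca.fit_transform(X)
print("raw feature dim:", X.shape[1], "-> reduced dim:", X_low.shape[1])
```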
Graham, Emily B.; Knelman, Joseph E.; Schindlbacher, Andreas; ...
2016-02-24
In this study, microorganisms are vital in mediating the earth’s biogeochemical cycles; yet, despite our rapidly increasing ability to explore complex environmental microbial communities, the relationship between microbial community structure and ecosystem processes remains poorly understood. Here, we address a fundamental and unanswered question in microbial ecology: ‘When do we need to understand microbial community structure to accurately predict function?’ We present a statistical analysis investigating the value of environmental data and microbial community structure independently and in combination for explaining rates of carbon and nitrogen cycling processes within 82 global datasets. Environmental variables were the strongest predictors of process rates but left 44% of variation unexplained on average, suggesting the potential for microbial data to increase model accuracy. Although only 29% of our datasets were significantly improved by adding information on microbial community structure, we observed improvement in models of processes mediated by narrow phylogenetic guilds via functional gene data, and conversely, improvement in models of facultative microbial processes via community diversity metrics. Our results also suggest that microbial diversity can strengthen predictions of respiration rates beyond microbial biomass parameters, as 53% of models were improved by incorporating both sets of predictors compared to 35% by microbial biomass alone. Our analysis represents the first comprehensive analysis of research examining links between microbial community structure and ecosystem function. Taken together, our results indicate that a greater understanding of microbial communities informed by ecological principles may enhance our ability to predict ecosystem process rates relative to assessments based on environmental variables and microbial physiology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graham, Emily B.; Knelman, Joseph E.; Schindlbacher, Andreas
In this study, microorganisms are vital in mediating the earth’s biogeochemical cycles; yet, despite our rapidly increasing ability to explore complex environmental microbial communities, the relationship between microbial community structure and ecosystem processes remains poorly understood. Here, we address a fundamental and unanswered question in microbial ecology: ‘When do we need to understand microbial community structure to accurately predict function?’ We present a statistical analysis investigating the value of environmental data and microbial community structure independently and in combination for explaining rates of carbon and nitrogen cycling processes within 82 global datasets. Environmental variables were the strongest predictors of process rates but left 44% of variation unexplained on average, suggesting the potential for microbial data to increase model accuracy. Although only 29% of our datasets were significantly improved by adding information on microbial community structure, we observed improvement in models of processes mediated by narrow phylogenetic guilds via functional gene data, and conversely, improvement in models of facultative microbial processes via community diversity metrics. Our results also suggest that microbial diversity can strengthen predictions of respiration rates beyond microbial biomass parameters, as 53% of models were improved by incorporating both sets of predictors compared to 35% by microbial biomass alone. Our analysis represents the first comprehensive analysis of research examining links between microbial community structure and ecosystem function. Taken together, our results indicate that a greater understanding of microbial communities informed by ecological principles may enhance our ability to predict ecosystem process rates relative to assessments based on environmental variables and microbial physiology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crater, Jason; Galleher, Connor; Lievense, Jeff
NREL is developing an advanced aerobic bubble column model using Aspen Custom Modeler (ACM). The objective of this work is to integrate the new fermentor model with existing techno-economic models in Aspen Plus and Excel to establish a new methodology for guiding process design. To assist this effort, NREL has contracted Genomatica to critique and make recommendations for improving NREL's bioreactor model and large-scale aerobic bioreactor design for biologically producing lipids at commercial scale. Genomatica has highlighted a few areas for improving the functionality and effectiveness of the model. Genomatica recommends using a compartment model approach with an integrated black-box kinetic model of the production microbe. We also suggest including calculations for stirred tank reactors to extend the model's functionality and adaptability for future process designs. Genomatica also suggests making several modifications to NREL's large-scale lipid production process design. The recommended process modifications are based on Genomatica's internal techno-economic assessment experience and are focused primarily on minimizing capital and operating costs. These recommendations include selecting/engineering a thermotolerant yeast strain with lipid excretion; using bubble column fermentors; increasing the size of production fermentors; reducing the number of vessels; employing semi-continuous operation; and recycling cell mass.
Quantitative CMMI Assessment for Offshoring through the Analysis of Project Management Repositories
NASA Astrophysics Data System (ADS)
Sunetnanta, Thanwadee; Nobprapai, Ni-On; Gotel, Olly
The nature of distributed teams and the existence of multiple sites in offshore software development projects pose a challenging setting for software process improvement. Often, the improvement and appraisal of software processes is achieved through a turnkey solution where best practices are imposed or transferred from a company’s headquarters to its offshore units. In so doing, successful project health checks and monitoring for quality on software processes requires strong project management skills, well-built onshore-offshore coordination, and often needs regular onsite visits by software process improvement consultants from the headquarters’ team. This paper focuses on software process improvement as guided by the Capability Maturity Model Integration (CMMI) and proposes a model to evaluate the status of such improvement efforts in the context of distributed multi-site projects without some of this overhead. The paper discusses the application of quantitative CMMI assessment through the collection and analysis of project data gathered directly from project repositories to facilitate CMMI implementation and reduce the cost of such implementation for offshore-outsourced software development projects. We exemplify this approach to quantitative CMMI assessment through the analysis of project management data and discuss the future directions of this work in progress.
NASA Astrophysics Data System (ADS)
Bommel, P.; Bautista Solís, P.; Leclerc, G.
2016-12-01
We implemented a participatory process with water stakeholders for improving resilience to drought at the watershed scale, and for reducing water pollution disputes in drought-prone Northwestern Costa Rica. The purpose is to facilitate co-management in a rural watershed impacted by recurrent droughts related to ENSO. The process involved designing "ContaMiCuenca", a hybrid agent-based model where users can specify the decisions of their agents. We followed a Companion Modeling approach (www.commod.org) and organized 10 workshops that included research techniques such as participatory diagnostics, actor-resources-interaction and UML diagrams, multi-agent model design, and interactive simulation sessions. We collectively assessed the main water issues in the watershed, prioritized their importance, defined the objectives of the process, and pilot-tested ContaMiCuenca for environmental education with adults and children. Simulation sessions resulted in debates about the need to improve the model's accuracy, arguably making it more relevant for decision-making. This helped identify notable knowledge gaps in groundwater pollution and aquifer dynamics that need to be addressed in order to improve our collective learning. Significant mismatches among participants' expectations, objectives, and agendas considerably slowed down the participatory process. The main issue may originate in participants expecting technical solutions from a positivist science, as constantly promoted in the region by dole-out initiatives, which is incompatible with the constructivist stance of participatory modellers. This requires much closer interaction of community members with modellers, which may be hard to attain in the current research practice and institutional context. Nevertheless, overcoming these constraints is necessary for a true involvement of water stakeholders to achieve community-based decisions that facilitate integrated water management. Our findings provide significant guidance for improving the trans-generational engagement of stakeholders in participatory modeling processes in a context of limited technical skills and information, mismatched research expectations, and poor multi-stakeholder interaction for decision-making.
Validating a model that predicts daily growth and feed quality of New Zealand dairy pastures.
Woodward, S J
2001-09-01
The Pasture Quality (PQ) model is a simple, mechanistic, dynamical system model that was designed to capture the essential biological processes in grazed grass-clover pasture, and to be optimised to derive improved grazing strategies for New Zealand dairy farms. While the individual processes represented in the model (photosynthesis, tissue growth, flowering, leaf death, decomposition, worms) were based on experimental data, this did not guarantee that the assembled model would accurately predict the behaviour of the system as a whole (i.e., pasture growth and quality). Validation of the whole model was thus a priority, since any strategy derived from the model could impact a farm business in the order of thousands of dollars per annum if adopted. This paper describes the process of defining performance criteria for the model, obtaining suitable data to test the model, and carrying out the validation analysis. The validation process highlighted a number of weaknesses in the model, which will lead to the model being improved. As a result, the model's utility will be enhanced. Furthermore, validation was found to have an unexpected additional benefit, in that despite the model's poor initial performance, support was generated for the model among field scientists involved in the wider project.
Developing the Mathematics Learning Management Model for Improving Creative Thinking in Thailand
ERIC Educational Resources Information Center
Sriwongchai, Arunee; Jantharajit, Nirat; Chookhampaeng, Sumalee
2015-01-01
The study purposes were: 1) To study current states and problems of relevant secondary students in developing mathematics learning management model for improving creative thinking, 2) To evaluate the effectiveness of model about: a) efficiency of learning process, b) comparisons of pretest and posttest on creative thinking and achievement of…
A Model Schedule for a Capital Improvement Program.
ERIC Educational Resources Information Center
Oates, Arnold D.; Burch, A. Lee
The Model Schedule for a Capital Improvement Program described in this paper encourages school leaders to consider a more holistic view of the planning process. It is intended to assist those responsible for educational facility planning, who must assure that all important and relevant tasks are accomplished in a timely manner. The model's six…
Nuclear emergency management procedures in Europe
NASA Astrophysics Data System (ADS)
Carter, Emma
The Chernobyl accident brought to the fore the need for decision-making in nuclear emergency management to be transparent and consistent across Europe. A range of systems to support decision-making in future emergencies have since been developed, but, by and large, with little consultation with potential decision makers and limited understanding of the emergency management procedures across Europe and how they differ. In nuclear emergency management, coordination, communication and information sharing are of paramount importance. There are many key players with their own technical expertise, and several key activities occur in parallel, across different locations. Business process modelling can facilitate understanding through the representation of processes, aid transparency and structure the analysis, comparison and improvement of processes. This work has been conducted as part of a European Fifth Framework Programme project EVATECH, whose aim was to improve decision support methods, models and processes taking into account stakeholder expectations and concerns. It has involved the application of process modelling to document and compare the emergency management processes in four European countries. It has also involved a multidisciplinary approach taking a socio-technical perspective. The use of process modelling did indeed facilitate understanding and provided a common platform, which was not previously available, to consider emergency management processes. This thesis illustrates the structured analysis approach that process modelling enables. Firstly, through an individual analysis for the United Kingdom (UK) model that illustrated the potential benefits for a country. These are for training purposes, to build reflexive shared mental models, to aid coordination and for process improvement. Secondly, through a comparison of the processes in Belgium, Germany, Slovak Republic and the UK. In this comparison of the four processes we observed that the four process models are substantially different in their organisational structure and identified differences in the management of advice, where decisions are made and the communication network style. Another key aspect of this work is that through the structured analysis conducted we were able to develop a framework for the evaluation of DSS from the perspective of process. This work concludes reflecting on the challenges, which the European off-site nuclear emergency community face and suggest direction for future work, with particular reference to a recent conference on the capabilities and challenges of offsite nuclear emergency management, the Salzburg Symposium 2003.
Risk Quantification of Systems Engineering Documents Improves Probability of DOD Project Success
2009-09-01
...comprehensive risk model for DoD milestone review documentation as well as recommended changes to the Capability Maturity Model Integration (CMMI) Project Planning and Risk Management process areas. The intent is to... Keywords: Milestone Documentation, Project Planning, Rational Frame, Political Frame, CMMI Project Planning Process Area, CMMI Risk Management Process Area.
Using simulation modeling to improve patient flow at an outpatient orthopedic clinic.
Rohleder, Thomas R; Lewkonia, Peter; Bischak, Diane P; Duffy, Paul; Hendijani, Rosa
2011-06-01
We report on the use of discrete event simulation modeling to support process improvements at an orthopedic outpatient clinic. The clinic was effective in treating patients, but waiting time and congestion in the clinic created patient dissatisfaction and staff morale issues. The modeling helped to identify improvement alternatives including optimized staffing levels, better patient scheduling, and an emphasis on staff arriving promptly. Quantitative results from the modeling provided motivation to implement the improvements. Statistical analysis of data taken before and after the implementation indicate that waiting time measures were significantly improved and overall patient time in the clinic was reduced.
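A minimal discrete event simulation in the spirit of this study can be written with SimPy; the sketch below estimates mean patient waiting time for two staffing levels. The arrival rate, exam time, staffing options, and clinic day length are all invented, and the model omits scheduling, triage, and the other clinic details the study would include.

```python
# Illustrative SimPy DES of an outpatient clinic: exponential arrivals and exam times,
# a shared pool of clinicians, and mean waiting time as the performance measure.
import random
import simpy

ARRIVAL_MEAN, EXAM_MEAN, SIM_MINUTES = 10.0, 18.0, 8 * 60   # minutes, illustrative
waits = []

def patient(env, clinicians):
    arrived = env.now
    with clinicians.request() as req:
        yield req                                   # wait for a free clinician
        waits.append(env.now - arrived)
        yield env.timeout(random.expovariate(1.0 / EXAM_MEAN))

def arrivals(env, clinicians):
    while True:
        yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
        env.process(patient(env, clinicians))

def run(n_clinicians):
    global waits
    waits = []
    random.seed(42)
    env = simpy.Environment()
    clinicians = simpy.Resource(env, capacity=n_clinicians)
    env.process(arrivals(env, clinicians))
    env.run(until=SIM_MINUTES)
    return sum(waits) / len(waits) if waits else 0.0

for staff in (2, 3):
    print(f"{staff} clinicians -> mean wait {run(staff):.1f} min")
```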
Reducing RN Vacancy Rate: A Nursing Recruitment Office Process Improvement Project.
Hisgen, Stephanie A; Page, Nancy E; Thornlow, Deirdre K; Merwin, Elizabeth I
2018-06-01
The aim of this study was to reduce the RN vacancy rate at an academic medical center by improving the hiring process in the Nursing Recruitment Office. Inability to fill RN positions can lead to higher vacancy rates and negatively impact staff and patient satisfaction, quality outcomes, and the organization's bottom line. The Model for Improvement was used to design and implement a process improvement project to improve the hiring process from time of interview through the position being filled. Number of days to interview and check references decreased significantly, but no change in overall time to hire and time to fill positions was noted. RN vacancy rate also decreased significantly. Nurse manager satisfaction with the hiring process increased significantly. Redesigning the recruitment process supported operational efficiencies of the organization related to RN recruitment.
Devos, Olivier; Downey, Gerard; Duponchel, Ludovic
2014-04-01
Classification is an important task in chemometrics. For several years now, support vector machines (SVMs) have proven to be powerful for infrared spectral data classification. However, such methods require optimisation of parameters in order to control the risk of overfitting and the complexity of the boundary. Furthermore, it is established that the prediction ability of classification models can be improved using pre-processing in order to remove unwanted variance in the spectra. In this paper we propose a new methodology based on a genetic algorithm (GA) for the simultaneous optimisation of SVM parameters and pre-processing (GENOPT-SVM). The method has been tested for the discrimination of the geographical origin of Italian olive oil (Ligurian and non-Ligurian) on the basis of near infrared (NIR) or mid infrared (FTIR) spectra. Different classification models (PLS-DA, SVM with mean-centred data, GENOPT-SVM) have been tested and statistically compared using McNemar's statistical test. For the two datasets, SVM with optimised pre-processing gives models with higher accuracy than the one obtained with PLS-DA on pre-processed data. In the case of the NIR dataset, most of this accuracy improvement (86.3% compared with 82.8% for PLS-DA) occurred using only a single pre-processing step. For the FTIR dataset, three optimised pre-processing steps are required to obtain an SVM model with a significant accuracy improvement (82.2%) compared to the one obtained with PLS-DA (78.6%). Furthermore, this study demonstrates that even SVM models have to be developed on the basis of well-corrected spectral data in order to obtain higher classification rates. Copyright © 2013 Elsevier Ltd. All rights reserved.
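The joint search over a pre-processing choice and SVM hyper-parameters can be sketched with a very small, mutation-only evolutionary loop standing in for the full GA of GENOPT-SVM. The spectra are synthetic, the three pre-processing options (none, SNV, Savitzky-Golay first derivative), the search ranges, and the population settings are all assumptions for illustration.

```python
# Sketch of GA-style joint optimisation of pre-processing and SVM (C, gamma).
import numpy as np
from scipy.signal import savgol_filter
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(4)
X = rng.normal(size=(120, 200))                       # stand-in for NIR spectra
y = (X[:, 50:60].mean(axis=1) > 0).astype(int)        # synthetic class label

PREPROC = [
    lambda S: S,                                                                    # none
    lambda S: (S - S.mean(axis=1, keepdims=True)) / S.std(axis=1, keepdims=True),   # SNV
    lambda S: savgol_filter(S, 15, polyorder=2, deriv=1, axis=1),                   # 1st derivative
]

def fitness(ind):
    prep, logc, logg = ind
    clf = SVC(C=2.0 ** logc, gamma=2.0 ** logg)
    return cross_val_score(clf, PREPROC[int(prep)](X), y, cv=5).mean()

def mutate(ind):
    prep, logc, logg = ind
    if rng.random() < 0.2:
        prep = rng.integers(len(PREPROC))
    return (prep, logc + rng.normal(0, 1), logg + rng.normal(0, 1))

pop = [(rng.integers(len(PREPROC)), rng.uniform(-5, 10), rng.uniform(-10, 0))
       for _ in range(12)]
for gen in range(5):                                  # small evolutionary loop
    parents = sorted(pop, key=fitness, reverse=True)[:4]   # truncation selection
    children = [mutate(p) for p in parents for _ in range(2)]
    pop = parents + children
best = max(pop, key=fitness)
print("best CV accuracy:", round(fitness(best), 3), "pre-processing index:", int(best[0]))
```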
ERIC Educational Resources Information Center
Neumann, Yoram; Neumann, Edith; Lewis, Shelia
2017-01-01
This study integrated the Spiral Curriculum approach into the Robust Learning Model as part of a continuous improvement process that was designed to improve educational effectiveness and then assessed the differences between the initial and integrated models as well as the predictability of the first course in the integrated learning model on a…
ERIC Educational Resources Information Center
Miller, John
1994-01-01
Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)
Class Model Development Using Business Rules
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Gudas, Saulius
New developments in the area of computer-aided system engineering (CASE) greatly improve processes of the information systems development life cycle (ISDLC). Much effort is put into quality improvement issues, but IS development projects still suffer from the poor quality of models during the system analysis and design cycles. To some degree, the quality of models developed using CASE tools can be assured using various automated model comparison and syntax checking procedures. It is also reasonable to check these models against the business domain knowledge, but the domain knowledge stored in the repository of a CASE tool (enterprise model) is insufficient (Gudas et al. 2004). Involvement of business domain experts in these processes is complicated because non-IT people often find it difficult to understand models that were developed by IT professionals using some specific modeling language.
Evaluating and improving a model of nursing care delivery: a process of partnership.
Hall, Catherine; McCutcheon, Helen; Deuter, Kate; Matricciani, Lisa
2012-01-01
Evaluating and improving a model of nursing care is a fundamental part of clinical practice improvement. While Australian nurses are showing increasing interest in improving models of care delivery, more research is needed that addresses and articulates the processes attendant upon evaluating, re-designing and implementing improvements to the provision of nursing care. Providing nurses with an open opportunity to plan, act, observe and reflect on their practice promotes successful partnerships between academics and clinicians. The aim of this study was to evaluate and improve the model of nursing care delivery to patients in a general surgical ward using participatory action research. Researchers conducted non-participant observations (n = 9) of two hours duration across the 24 h period. Focus groups (n = 3) were used to share non-participant observation data with staff, providing them with an opportunity to reflect on their practice and explore possible solutions. Data was collected in 2008-2009. Two main problem areas were identified as impeding the nurses' ability to provide care to patients: (i) practices and behaviours of nurses and (ii) infrastructure and physical layout of the ward. An overview of issues within each problem area is presented. Shifting the focus of task-centred care towards a more patient-centred care approach, results directly in improvements in resource utilisation, improved cost-effectiveness and job satisfaction for nursing staff. New ways of thinking about nursing processes and systems, workflow design and skill allocation will guide hospital administrators and managers in the effective and efficient allocation of nursing work in similar settings.
Takaki, Koki; Wade, Andrew J; Collins, Chris D
2015-11-01
The aim of this study was to assess and improve the accuracy of biotransfer models for organic pollutants (PCBs, PCDD/Fs, PBDEs, PFCAs, and pesticides) into cow's milk and beef used in human exposure assessment. Metabolic rate in cattle is known to be a key parameter for this biotransfer; however, few experimental data and no simulation methods are currently available. In this research, metabolic rate was estimated using existing QSAR biodegradation models of microorganisms (BioWIN) and fish (EPI-HL and IFS-HL). This simulated metabolic rate was then incorporated into the mechanistic cattle biotransfer models (RAIDAR, ACC-HUMAN, OMEGA, and CKow). The goodness-of-fit tests showed that the RAIDAR, ACC-HUMAN, and OMEGA model performances were significantly improved using either of the QSARs when comparing the new model outputs to observed data. The CKow model is the only one that separates the processes in the gut and liver. This model showed the lowest residual error of all the models tested when the BioWIN model was used to represent the ruminant metabolic process in the gut and the two fish QSARs were used to represent the metabolic process in the liver. Our testing included EUSES and CalTOX, which are KOW-regression models that are widely used in regulatory assessment. New regressions based on the simulated rates of the two metabolic processes are also proposed as an alternative to KOW-regression models for a screening risk assessment. The modified CKow model is more physiologically realistic, but has equivalent usability to existing KOW-regression models for estimating cattle biotransfer of organic pollutants. Copyright © 2015. Published by Elsevier Ltd.
Strengthening organizations to implement evidence-based clinical practices.
VanDeusen Lukas, Carol; Engle, Ryann L; Holmes, Sally K; Parker, Victoria A; Petzel, Robert A; Nealon Seibert, Marjorie; Shwartz, Michael; Sullivan, Jennifer L
2010-01-01
Despite recognition that implementation of evidence-based clinical practices (EBPs) usually depends on the structure and processes of the larger health care organizational context, the dynamics of implementation are not well understood. This project's aim was to deepen that understanding by implementing and evaluating an organizational model hypothesized to strengthen the ability of health care organizations to facilitate EBPs. CONCEPTUAL MODEL: The model posits that implementation of EBPs will be enhanced through the presence of three interacting components: active leadership commitment to quality, robust clinical process redesign incorporating EBPs into routine operations, and use of management structures and processes to support and align redesign. In a mixed-methods longitudinal comparative case study design, seven medical centers in one network in the Department of Veterans Affairs participated in an intervention to implement the organizational model over 3 years. The network was selected randomly from three interested in using the model. The target EBP was hand-hygiene compliance. Measures included ratings of implementation fidelity, observed hand-hygiene compliance, and factors affecting model implementation drawn from interviews. Analyses support the hypothesis that greater fidelity to the organizational model was associated with higher compliance with hand-hygiene guidelines. High-fidelity sites showed larger effect sizes for improvement in hand-hygiene compliance than lower-fidelity sites. Adherence to the organizational model was in turn affected by factors in three categories: urgency to improve, organizational environment, and improvement climate. Implementation of EBPs, particularly those that cut across multiple processes of care, is a complex process with many possibilities for failure. The results provide the basis for a refined understanding of relationships among components of the organizational model and factors in the organizational context affecting them. This understanding suggests practical lessons for future implementation efforts and contributes to theoretical understanding of the dynamics of the implementation of EBPs.
Improvements in continuum modeling for biomolecular systems
NASA Astrophysics Data System (ADS)
Yu, Qiao; Ben-Zhuo, Lu
2016-01-01
Modeling of biomolecular systems plays an essential role in understanding biological processes, such as ionic flow across channels, protein modification or interaction, and cell signaling. The continuum model described by the Poisson-Boltzmann (PB)/Poisson-Nernst-Planck (PNP) equations has made great contributions towards simulation of these processes. However, the model has shortcomings in its commonly used form and cannot capture (or cannot accurately capture) some important physical properties of the biological systems. Considerable efforts have been made to improve the continuum model to account for discrete particle interactions and to make progress in numerical methods to provide accurate and efficient simulations. This review will summarize recent main improvements in continuum modeling for biomolecular systems, with focus on the size-modified models, the coupling of the classical density functional theory and the PNP equations, the coupling of polar and nonpolar interactions, and numerical progress. Project supported by the National Natural Science Foundation of China (Grant No. 91230106) and the Chinese Academy of Sciences Program for Cross & Cooperative Team of the Science & Technology Innovation.
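For reference, the equations being improved upon are usually written in forms along the following lines; sign and unit conventions vary between papers, so this should be read as a commonly used form rather than the review's exact notation.

```latex
% Poisson-Boltzmann equation for the electrostatic potential \phi:
-\nabla \cdot \bigl(\epsilon(\mathbf{r}) \nabla \phi\bigr)
  = \rho_f(\mathbf{r}) + \sum_i q_i c_i^{\infty} \, e^{-q_i \phi / k_B T}

% Poisson-Nernst-Planck system coupling ion concentrations c_i to \phi:
\partial_t c_i = \nabla \cdot \Bigl[ D_i \Bigl( \nabla c_i
  + \frac{q_i c_i}{k_B T} \nabla \phi \Bigr) \Bigr], \qquad
-\nabla \cdot \bigl(\epsilon \nabla \phi\bigr) = \rho_f + \sum_i q_i c_i
```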
A Robust Multi-Scale Modeling System for the Study of Cloud and Precipitation Processes
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo
2012-01-01
During the past decade, numerical weather and global non-hydrostatic models have started using more complex microphysical schemes originally developed for high-resolution cloud-resolving models (CRMs) with 1-2 km or less horizontal resolutions. These microphysical schemes affect the dynamics through the release of latent heat (buoyancy loading and pressure gradient), the radiation through the cloud coverage (vertical distribution of cloud species), and surface processes through rainfall (both amount and intensity). Recently, several major improvements of ice microphysical processes (or schemes) have been developed for a cloud-resolving model (the Goddard Cumulus Ensemble, GCE, model) and a regional-scale model (the Weather Research and Forecasting, WRF, model). These improvements include an improved 3-ICE (cloud ice, snow and graupel) scheme (Lang et al. 2010); a 4-ICE (cloud ice, snow, graupel and hail) scheme; a spectral bin microphysics scheme; and two different two-moment microphysics schemes. The performance of these schemes has been evaluated by using observational data from TRMM and other major field campaigns. In this talk, we will present the high-resolution (1 km) GCE and WRF model simulations and compare the simulated model results with observations from recent field campaigns [i.e., midlatitude continental spring season (MC3E; 2010), high-latitude cold season (C3VP, 2007; GCPEx, 2012), and tropical oceanic (TWP-ICE, 2006)].
An improved approximate-Bayesian model-choice method for estimating shared evolutionary history
2014-01-01
Background To understand biological diversification, it is important to account for large-scale processes that affect the evolutionary history of groups of co-distributed populations of organisms. Such events predict temporally clustered divergences times, a pattern that can be estimated using genetic data from co-distributed species. I introduce a new approximate-Bayesian method for comparative phylogeographical model-choice that estimates the temporal distribution of divergences across taxa from multi-locus DNA sequence data. The model is an extension of that implemented in msBayes. Results By reparameterizing the model, introducing more flexible priors on demographic and divergence-time parameters, and implementing a non-parametric Dirichlet-process prior over divergence models, I improved the robustness, accuracy, and power of the method for estimating shared evolutionary history across taxa. Conclusions The results demonstrate the improved performance of the new method is due to (1) more appropriate priors on divergence-time and demographic parameters that avoid prohibitively small marginal likelihoods for models with more divergence events, and (2) the Dirichlet-process providing a flexible prior on divergence histories that does not strongly disfavor models with intermediate numbers of divergence events. The new method yields more robust estimates of posterior uncertainty, and thus greatly reduces the tendency to incorrectly estimate models of shared evolutionary history with strong support. PMID:24992937
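The role of the Dirichlet-process prior can be made concrete with a small sketch: a Chinese-restaurant-process draw assigns taxon pairs to shared divergence events, so models with intermediate numbers of events retain prior mass rather than being strongly disfavoured. The number of taxon pairs and the concentration parameter below are illustrative, and this is a stand-alone illustration of the prior rather than code from the method.

```python
# Chinese-restaurant-process (Dirichlet-process) prior over divergence-event assignments.
import numpy as np

def crp_partition(n_pairs, alpha, rng):
    """Assign each taxon pair to a divergence event under a CRP(alpha) prior."""
    assignments = [0]
    for i in range(1, n_pairs):
        counts = np.bincount(assignments)
        probs = np.append(counts, alpha) / (i + alpha)   # join an existing event vs. open a new one
        assignments.append(rng.choice(len(probs), p=probs))
    return assignments

rng = np.random.default_rng(5)
draws = [len(set(crp_partition(9, alpha=1.5, rng=rng))) for _ in range(5000)]
values, counts = np.unique(draws, return_counts=True)
for v, c in zip(values, counts):
    print(f"P(number of divergence events = {v}) ~ {c/len(draws):.3f}")
```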
ERIC Educational Resources Information Center
Scheiter, Katharina; Schubert, Carina; Schüler, Anne
2018-01-01
Background: When learning with text and pictures, learners often fail to adequately process the materials, which can be explained as a failure to self-regulate one's learning by choosing adequate cognitive learning processes. Eye movement modelling examples (EMME) showing how to process multimedia instruction have improved elementary school…
A problem-solving routine for improving hospital operations.
Ghosh, Manimay; Sobek Ii, Durward K
2015-01-01
The purpose of this paper is to examine empirically why a systematic problem-solving routine can play an important role in the process improvement efforts of hospitals. Data on 18 process improvement cases were collected through semi-structured interviews, reports and other documents, and artifacts associated with the cases. The data were analyzed using a grounded theory approach. Adherence to all the steps of the problem-solving routine correlated to greater degrees of improvement across the sample. Analysis resulted in two models. The first partially explains why hospital workers tended to enact short-term solutions when faced with process-related problems; and tended not seek longer-term solutions that prevent problems from recurring. The second model highlights a set of self-reinforcing behaviors that are more likely to address problem recurrence and result in sustained process improvement. The study was conducted in one hospital setting. Hospital managers can improve patient care and increase operational efficiency by adopting and diffusing problem-solving routines that embody three key characteristics. This paper offers new insights on why caregivers adopt short-term approaches to problem solving. Three characteristics of an effective problem-solving routine in a healthcare setting are proposed.
Development of Pangasius steaks by improved sous-vide technology and its process optimization.
Kumari, Namita; Singh, Chongtham Baru; Kumar, Raushan; Martin Xavier, K A; Lekshmi, Manjusha; Venkateshwarlu, Gudipati; Balange, Amjad K
2016-11-01
The present study embarked on the objective of optimizing an improved sous-vide processing condition for the development of ready-to-cook Pangasius steaks with extended shelf-life using response surface methodology. For the development of the improved sous-vide cooked product, Pangasius steaks were treated with additional hurdles in various combinations for optimization. Based on the study, a suitable combination of chitosan and spices was selected, which enhanced the antimicrobial and oxidative stability of the product. The Box-Behnken experimental design with 15 trials per model was adopted for designing the experiment to determine the effect of the independent variables, namely chitosan concentration (X1), cooking time (X2) and cooking temperature (X3), on the dependent variable, i.e. TBARS value (Y1). From the RSM-generated model, the optimum conditions for sous-vide processing of Pangasius steaks were 1.08% chitosan concentration, 70.93 °C cooking temperature and 16.48 min cooking time, and the predicted minimum value under the multiple-response optimal condition was Y = 0.855 mg MDA/kg of fish. The high correlation coefficient (R2 = 0.975) between the model and the experimental data showed that the model was able to efficiently predict processing conditions for the development of sous-vide processed Pangasius steaks. This research may help processing industries and Pangasius fish farmers as it provides an alternative low-cost technology for the proper utilization of Pangasius.
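The response-surface step itself amounts to fitting a full quadratic model to the designed runs and minimising the predicted response within the factor bounds. The sketch below does exactly that with invented, Box-Behnken-style design points and responses; none of the numbers come from the paper, and the factor bounds are assumptions.

```python
# Sketch: fit a quadratic response surface and minimise predicted TBARS within bounds.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from scipy.optimize import minimize

# Columns: chitosan (%), temperature (deg C), time (min); response: TBARS (mg MDA/kg). Invented data.
X = np.array([[0.5, 60, 10], [0.5, 80, 10], [1.5, 60, 10], [1.5, 80, 10],
              [0.5, 70, 20], [1.5, 70, 20], [1.0, 60, 20], [1.0, 80, 20],
              [1.0, 70, 10], [1.0, 70, 15], [1.0, 70, 20], [1.0, 70, 15]])
y = np.array([1.40, 1.30, 1.10, 1.00, 1.20, 0.90, 1.10, 1.00, 1.00, 0.90, 0.95, 0.92])

poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), y)

def predicted_tbars(x):
    return model.predict(poly.transform(x.reshape(1, -1)))[0]

bounds = [(0.5, 1.5), (60, 80), (10, 20)]
opt = minimize(predicted_tbars, x0=np.array([1.0, 70, 15]), bounds=bounds)
print("optimum (chitosan %, temp, time):", np.round(opt.x, 2),
      "predicted TBARS:", round(float(opt.fun), 3))
```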
Daudelin, Denise H; Selker, Harry P; Leslie, Laurel K
2015-12-01
There is growing appreciation that process improvement holds promise for improving quality and efficiency across the translational research continuum but frameworks for such programs are not often described. The purpose of this paper is to present a framework and case examples of a Research Process Improvement Program implemented at Tufts CTSI. To promote research process improvement, we developed online training seminars, workshops, and in-person consultation models to describe core process improvement principles and methods, demonstrate the use of improvement tools, and illustrate the application of these methods in case examples. We implemented these methods, as well as relational coordination theory, with junior researchers, pilot funding awardees, our CTRC, and CTSI resource and service providers. The program focuses on capacity building to address common process problems and quality gaps that threaten the efficient, timely and successful completion of clinical and translational studies. © 2015 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc.
Al-Kuwaiti, Ahmed; Homa, Karen; Maruthamuthu, Thennarasu
2016-01-01
A performance improvement model was developed that focuses on the analysis and interpretation of performance indicator (PI) data using statistical process control and benchmarking. PIs are suitable for comparison with benchmarks only if the data fall within the statistically accepted limits, that is, if they show only random variation. Specifically, if there is no significant special-cause variation over a period of time, then the data are ready to be benchmarked. The proposed Define, Measure, Control, Internal Threshold, and Benchmark model is adapted from the Define, Measure, Analyze, Improve, Control (DMAIC) model. The model consists of the following five steps: Step 1. Define the process; Step 2. Monitor and measure the variation over the period of time; Step 3. Check the variation of the process; if stable (no significant variation), go to Step 4; otherwise, control variation with the help of an action plan; Step 4. Develop an internal threshold and compare the process with it; Step 5.1. Compare the process with an internal benchmark; and Step 5.2. Compare the process with an external benchmark. The steps are illustrated through the use of health care-associated infection (HAI) data collected for 2013 and 2014 from the Infection Control Unit, King Fahd Hospital, University of Dammam, Saudi Arabia. Monitoring variation is an important strategy in understanding and learning about a process. In the example, HAI was monitored for variation in 2013, and the need for a more predictable process prompted controlling variation through an action plan. The action plan was successful, as noted by the shift in the 2014 data compared to the historical average; in addition, the variation was reduced. The model is subject to limitations: for example, it cannot be used without benchmarks, which need to be calculated the same way with similar patient populations, and it focuses only on the "Analyze" part of the DMAIC model.
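A minimal sketch of the Step 3 stability check, assuming an individuals control chart with limits derived from the average moving range (the abstract does not specify the chart type or data, so the values below are hypothetical):

```python
# Sketch of the "check for special-cause variation before benchmarking" step:
# benchmark the HAI rate only if no point falls outside 3-sigma limits
# estimated from the average moving range of an individuals chart.
import numpy as np

def stable(rates):
    rates = np.asarray(rates, dtype=float)
    mr = np.abs(np.diff(rates))                 # moving ranges between successive points
    sigma = mr.mean() / 1.128                   # d2 constant for subgroups of size 2
    center = rates.mean()
    lcl, ucl = center - 3 * sigma, center + 3 * sigma
    return bool(np.all((rates >= lcl) & (rates <= ucl))), (lcl, center, ucl)

monthly_hai = [2.1, 1.8, 2.4, 2.0, 1.9, 2.2, 2.3, 1.7, 2.0, 2.1, 1.9, 2.2]  # hypothetical
ok, limits = stable(monthly_hai)
print("ready to benchmark" if ok else "control variation first", limits)
```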
NASA Astrophysics Data System (ADS)
Lee, K. David; Colony, Mike
2011-06-01
Modeling and simulation has been established as a cost-effective means of supporting the development of requirements, exploring doctrinal alternatives, assessing system performance, and performing design trade-off analysis. The Army's constructive simulation for the evaluation of equipment effectiveness in small combat unit operations is currently limited to representation of situation awareness without inclusion of the many uncertainties associated with real-world combat environments. The goal of this research is to provide an ability to model situation awareness and decision process uncertainties in order to improve evaluation of the impact of battlefield equipment on ground soldier and small combat unit decision processes. Our Army Probabilistic Inference and Decision Engine (Army-PRIDE) system provides this required uncertainty modeling through the application of two critical techniques that allow Bayesian network technology to be applied to real-time applications: the Object-Oriented Bayesian Network methodology and the Object-Oriented Inference technique. In this research, we implement decision process and situation awareness models for a reference scenario using Army-PRIDE and demonstrate its ability to model a variety of uncertainty elements, including confidence of source, information completeness, and information loss. We also demonstrate that Army-PRIDE improves the realism of the current constructive simulation's decision processes through Monte Carlo simulation.
Improving the process of process modelling by the use of domain process patterns
NASA Astrophysics Data System (ADS)
Koschmider, Agnes; Reijers, Hajo A.
2015-01-01
The use of business process models has become prevalent in a wide area of enterprise applications. But while their popularity is expanding, concerns are growing with respect to their proper creation and maintenance. An obvious way to boost the efficiency of creating high-quality business process models would be to reuse relevant parts of existing models. At this point, however, limited support exists to guide process modellers towards the usage of appropriate model content. In this paper, a set of content-oriented patterns is presented, which is extracted from a large set of process models from the order management and manufacturing production domains. The patterns are derived using a newly proposed set of algorithms, which are being discussed in this paper. The authors demonstrate how such Domain Process Patterns, in combination with information on their historic usage, can support process modellers in generating new models. To support the wider dissemination and development of Domain Process Patterns within and beyond the studied domains, an accompanying website has been set up.
Experimental Evaluation of a Serious Game for Teaching Software Process Modeling
ERIC Educational Resources Information Center
Chaves, Rafael Oliveira; von Wangenheim, Christiane Gresse; Furtado, Julio Cezar Costa; Oliveira, Sandro Ronaldo Bezerra; Santos, Alex; Favero, Eloi Luiz
2015-01-01
Software process modeling (SPM) is an important area of software engineering because it provides a basis for managing, automating, and supporting software process improvement (SPI). Teaching SPM is a challenging task, mainly because it lays great emphasis on theory and offers few practical exercises. Furthermore, as yet few teaching approaches…
An Evaluation of Understandability of Patient Journey Models in Mental Health
2016-01-01
Background There is a significant trend toward implementing health information technology to reduce administrative costs and improve patient care. Unfortunately, little awareness exists of the challenges of integrating information systems with existing clinical practice. The systematic integration of clinical processes with information system and health information technology can benefit the patients, staff, and the delivery of care. Objectives This paper presents a comparison of the degree of understandability of patient journey models. In particular, the authors demonstrate the value of a relatively new patient journey modeling technique called the Patient Journey Modeling Architecture (PaJMa) when compared with traditional manufacturing based process modeling tools. The paper also presents results from a small pilot case study that compared the usability of 5 modeling approaches in a mental health care environment. Method Five business process modeling techniques were used to represent a selected patient journey. A mix of both qualitative and quantitative methods was used to evaluate these models. Techniques included a focus group and survey to measure usability of the various models. Results The preliminary evaluation of the usability of the 5 modeling techniques has shown increased staff understanding of the representation of their processes and activities when presented with the models. Improved individual role identification throughout the models was also observed. The extended version of the PaJMa methodology provided the most clarity of information flows for clinicians. Conclusions The extended version of PaJMa provided a significant improvement in the ease of interpretation for clinicians and increased the engagement with the modeling process. The use of color and its effectiveness in distinguishing the representation of roles was a key feature of the framework not present in other modeling approaches. Future research should focus on extending the pilot case study to a more diversified group of clinicians and health care support workers. PMID:27471006
Yu, Wenxi; Liu, Yang; Ma, Zongwei; Bi, Jun
2017-08-01
Using satellite-based aerosol optical depth (AOD) measurements and statistical models to estimate ground-level PM2.5 is a promising way to fill in the areas that are not covered by ground PM2.5 monitors. The statistical models used in previous studies are primarily Linear Mixed Effects (LME) and Geographically Weighted Regression (GWR) models. In this study, we developed a new regression model between PM2.5 and AOD using Gaussian processes in a Bayesian hierarchical setting. Gaussian processes model the stochastic nature of the spatial random effects, where the mean surface and the covariance function are specified. The spatial stochastic process is incorporated under the Bayesian hierarchical framework to explain the variation of PM2.5 concentrations together with other factors, such as AOD and spatial and non-spatial random effects. We evaluate the results of our model and compare them with those of other, conventional statistical models (GWR and LME) by within-sample model fitting and out-of-sample validation (cross validation, CV). The results show that our model possesses a CV result (R² = 0.81) that reflects higher accuracy than that of GWR and LME (0.74 and 0.48, respectively). Our results indicate that Gaussian process models have the potential to improve the accuracy of satellite-based PM2.5 estimates.
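As a hedged illustration of the modeling idea (a simplified, non-hierarchical Gaussian-process regression on synthetic data, not the authors' Bayesian implementation):

```python
# Simplified stand-in for the described model: a spatial GP regression of PM2.5
# on AOD and coordinates. No MCMC, no temporal random effects; data are synthetic.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
lon, lat = rng.uniform(115, 120, 200), rng.uniform(30, 35, 200)
aod = rng.uniform(0.1, 1.2, 200)
pm25 = 15 + 60 * aod + 5 * np.sin(lon) + rng.normal(0, 5, 200)   # synthetic "truth"

X = np.column_stack([lon, lat, aod])
kernel = 1.0 * RBF(length_scale=[1.0, 1.0, 0.3]) + WhiteKernel(noise_level=5.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, pm25)

print("in-sample R^2:", gp.score(X, pm25))
mean, sd = gp.predict(np.array([[117.5, 32.0, 0.6]]), return_std=True)
print("predicted PM2.5:", mean[0], "+/-", sd[0])
```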
INHALATION EXPOSURE AND INTAKE DOSE MODEL IMPROVEMENTS
This presentation highlights recent human exposure model improvements and products developed by the EMRB in coordination with scientists in the OAQPS and provides insight into how these products are used by the OAQPS in its regulatory process. Besides providing a status report of...
Reusing models of actors and services in smart homecare to improve sustainability.
Walderhaug, Ståle; Stav, Erlend; Mikalsen, Marius
2008-01-01
Industrial countries are faced with a growing elderly population. Homecare systems with assistive smart house technology enable the elderly to live independently at home. Development of such smart homecare systems is complex and expensive, and there is no common reference model that can facilitate service reuse. This paper proposes reusable actor and service models based on a model-driven development process in which end-user organizations and domain healthcare experts from four European countries have been involved. The models, specified using UML, can be reused actively as assets in the system design and development process and can reduce development costs and improve interoperability and sustainability of systems. The models are being evaluated in the European IST project MPOWER.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lah, J; Shin, D; Kim, G
Purpose: To show how tolerance design and tolerancing approaches can be used to predict and improve the site-specific range in the patient QA process when implementing Six Sigma. Methods: In this study, patient QA plans were selected according to six site-treatment groups: head & neck (94 cases), spine (76 cases), lung (89 cases), liver (53 cases), pancreas (55 cases), and prostate (121 cases), treated between 2007 and 2013. We evaluated a Six Sigma model that determines allowable deviations in design parameters and process variables in patient-specific QA; where possible, tolerances may be loosened and then customized if necessary to meet the functional requirements. The Six Sigma problem-solving methodology follows the DMAIC phases: Define a problem or improvement opportunity, Measure process performance, Analyze the process to determine the root causes of poor performance, Improve the process by fixing root causes, and Control the improved process to hold the gains. Results: The process capability for patient-specific range QA is 0.65 with a tolerance criterion of only ±1 mm. Our results suggested a tolerance level of ±2–3 mm for prostate and liver cases and ±5 mm for lung cases. We found that customizing the tolerance between calculated and measured range reduced patient QA plan failures, and almost all sites had failure rates of less than 1%. The average QA time also improved from 2 hr to less than 1 hr overall, including the planning and conversion process, depth-dose measurement, and evaluation. Conclusion: The objective of tolerance design is to achieve optimization beyond that obtained through QA process improvement and statistical analysis function detailing, in order to implement a Six Sigma capable design.
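A small sketch of the capability calculation behind a reported Cp, using hypothetical range-difference data and alternative tolerance bands (the data and parameter values are assumptions, not the clinic's measurements):

```python
# Process capability for range QA under different tolerance bands:
# Cp = (USL - LSL) / (6*sigma), Cpk accounts for off-center mean.
import numpy as np

def cp_cpk(deviations_mm, tol_mm):
    d = np.asarray(deviations_mm, dtype=float)   # measured minus calculated range
    usl, lsl = +tol_mm, -tol_mm
    mu, sigma = d.mean(), d.std(ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

devs = np.random.default_rng(1).normal(0.1, 0.5, 100)   # hypothetical QA deviations (mm)
for tol in (1.0, 2.0, 3.0):
    cp, cpk = cp_cpk(devs, tol)
    print("tolerance +/-%.0f mm -> Cp = %.2f, Cpk = %.2f" % (tol, cp, cpk))
```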
Verifying and Validating Proposed Models for FSW Process Optimization
NASA Technical Reports Server (NTRS)
Schneider, Judith
2008-01-01
This slide presentation reviews Friction Stir Welding (FSW) and the attempts to model the process in order to optimize and improve the process. The studies are ongoing to validate and refine the model of metal flow in the FSW process. There are slides showing the conventional FSW process, a couple of weld tool designs and how the design interacts with the metal flow path. The two basic components of the weld tool are shown, along with geometries of the shoulder design. Modeling of the FSW process is reviewed. Other topics include (1) Microstructure features, (2) Flow Streamlines, (3) Steady-state Nature, and (4) Grain Refinement Mechanisms
Signori, Marcos R; Garcia, Renato
2010-01-01
This paper presents a model that aids Clinical Engineering in dealing with Risk Management in the Healthcare Technological Process. The healthcare technological setting is complex and supported by three basic entities: infrastructure (IS), healthcare technology (HT), and human resources (HR). An Enterprise Architecture framework, MODAF (Ministry of Defence Architecture Framework), was used to model this process for risk management. Thus, a new model was created to contribute to risk management in the HT process from the Clinical Engineering viewpoint. This architecture model can support and improve the decision-making process of Clinical Engineering for Risk Management in the Healthcare Technological process.
Ogrinc, Greg; Hoffman, Kimberly G.; Stevenson, Katherine M.; Shalaby, Marc; Beard, Albertine S.; Thörne, Karin E.; Coleman, Mary T.; Baum, Karyn D.
2016-01-01
Problem Current models of health care quality improvement do not explicitly describe the role of health professions education. The authors propose the Exemplary Care and Learning Site (ECLS) model as an approach to achieving continual improvement in care and learning in the clinical setting. Approach From 2008–2012, an iterative, interactive process was used to develop the ECLS model and its core elements—patients and families informing process changes; trainees engaging both in care and the improvement of care; leaders knowing, valuing, and practicing improvement; data transforming into useful information; and health professionals competently engaging both in care improvement and teaching about care improvement. In 2012–2013, a three-part feasibility test of the model, including a site self-assessment, an independent review of each site’s ratings, and implementation case stories, was conducted at six clinical teaching sites (in the United States and Sweden). Outcomes Site leaders reported the ECLS model provided a systematic approach toward improving patient (and population) outcomes, system performance, and professional development. Most sites found it challenging to incorporate the patients and families element. The trainee element was strong at four sites. The leadership and data elements were self-assessed as the most fully developed. The health professionals element exhibited the greatest variability across sites. Next Steps The next test of the model should be prospective, linked to clinical and educational outcomes, to evaluate whether it helps care delivery teams, educators, and patients and families take action to achieve better patient (and population) outcomes, system performance, and professional development. PMID:26760058
Process Improvements in Training Device Acceptance Testing: A Study in Total Quality Management
1990-12-12
As a study in Total Quality Management, a small group of Government and industry specialists examined the existing training device acceptance test process for potential improvements. The agreed-to mission of the Air Force/Industry partnership was to continuously identify and promote implementable approaches to minimize the cost and time required for acceptance testing while ensuring that validated performance supports the user training requirements. Application of a Total Quality process improvement model focused on the customers and their requirements, analyzed how work was accomplished, and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthew R. Kumjian; Giangrande, Scott E.; Mishra, Subashree
Polarimetric radar observations are increasingly used to understand cloud microphysical processes, which is critical for improving their representation in cloud and climate models. In particular, there has been recent focus on improving representations of ice collection processes (e.g., aggregation, riming), as these influence precipitation rate, heating profiles, and ultimately cloud life cycles. However, distinguishing these processes using conventional polarimetric radar observations is difficult, as they produce similar fingerprints. This necessitates improved analysis techniques and integration of complementary data sources. The Midlatitude Continental Convective Clouds Experiment (MC3E) provided such an opportunity.
Process-aware EHR BPM systems: two prototypes and a conceptual framework.
Webster, Charles; Copenhaver, Mark
2010-01-01
Systematic methods to improve the effectiveness and efficiency of electronic health record-mediated processes will be key to EHRs playing an important role in the positive transformation of healthcare. Business process management (BPM) systematically optimizes process effectiveness, efficiency, and flexibility. Therefore BPM offers relevant ideas and technologies. We provide a conceptual model based on EHR productivity and negative feedback control that links EHR and BPM domains, describe two EHR BPM prototype modules, and close with the argument that typical EHRs must become more process-aware if they are to take full advantage of BPM ideas and technology. A prediction: Future extensible clinical groupware will coordinate delivery of EHR functionality to teams of users by combining modular components with executable process models whose usability (effectiveness, efficiency, and user satisfaction) will be systematically improved using business process management techniques.
Matrix approaches to assess terrestrial nitrogen scheme in CLM4.5
NASA Astrophysics Data System (ADS)
Du, Z.
2017-12-01
Terrestrial carbon (C) and nitrogen (N) cycles have been commonly represented by a series of balance equations to track their influxes into and effluxes out of individual pools in earth system models (ESMs). This representation matches our understanding of C and N cycle processes well but makes it difficult to track model behaviors. To overcome these challenges, we developed a matrix approach, which reorganizes the series of terrestrial C and N balance equations in the CLM4.5 into two matrix equations based on the original representation of C and N cycle processes and mechanisms. The matrix approach would consequently help improve the comparability of models and data, evaluate impacts of additional model components, facilitate benchmark analyses, model intercomparisons, and data-model fusion, and improve model predictive power.
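A generic sketch of the form such a matrix reorganization typically takes (the exact CLM4.5 operators and pool structure differ in detail):

```latex
% Generic matrix form of a pool-based C (or N) balance equation.
\[
  \frac{\mathrm{d}\mathbf{X}(t)}{\mathrm{d}t}
    = \mathbf{B}\,u(t) + \mathbf{A}\,\boldsymbol{\xi}(t)\,\mathbf{K}\,\mathbf{X}(t)
\]
% X(t): vector of pool sizes; u(t): external input (e.g., carbon influx from
% photosynthesis or N deposition); B: allocation (input partitioning) vector;
% A: transfer-coefficient matrix between pools; xi(t): diagonal matrix of
% environmental scalars; K: diagonal matrix of baseline turnover rates.
```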
Dynamic one-dimensional modeling of secondary settling tanks and design impacts of sizing decisions.
Li, Ben; Stenstrom, Michael K
2014-03-01
As one of the most significant components in the activated sludge process (ASP), secondary settling tanks (SSTs) can be investigated with mathematical models to optimize design and operation. This paper takes a new look at the one-dimensional (1-D) SST model by analyzing the impacts of numerical problems, especially on process robustness. An improved SST model with the Yee-Roe-Davis technique as the PDE solver is proposed and compared with the widely used Takács model to show its improvement in numerical solution quality. The improved and Takács models are coupled with a bioreactor model to reevaluate the ASP design basis and several popular control strategies for economic plausibility, contaminant removal efficiency and system robustness. The time to failure due to a rising sludge blanket during overloading, as a key robustness indicator, is analyzed to demonstrate the differences caused by numerical issues in SST models. The calculated results indicate that the Takács model significantly underestimates time to failure, thus leading to a conservative design. Copyright © 2013 Elsevier Ltd. All rights reserved.
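For orientation, a sketch of the double-exponential (Takács-type) settling velocity function that 1-D SST flux models of this kind commonly use; the parameter values below are textbook-order-of-magnitude assumptions rather than those of the paper:

```python
# Double-exponential settling velocity used in the gravity flux term of 1-D SST models.
import numpy as np

def settling_velocity(X, v0=474.0, v0_max=250.0, rh=5.76e-4, rp=2.86e-3, Xmin=12.0):
    """Settling velocity (m/d) as a function of solids concentration X (g/m^3)."""
    vs = v0 * (np.exp(-rh * (X - Xmin)) - np.exp(-rp * (X - Xmin)))
    return np.clip(vs, 0.0, v0_max)            # non-negative, capped at a maximum velocity

X = np.linspace(0.0, 12000.0, 7)               # g/m^3
print(settling_velocity(X))                    # these velocities feed the 1-D flux term G = vs * X
```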
A Maturity Model: Does It Provide a Path for Online Course Design?
ERIC Educational Resources Information Center
Neuhauser, Charlotte
2004-01-01
Maturity models are successfully used by organizations attempting to improve their processes, products, and delivery. As more faculty include online course design and teaching, a maturity model of online course design may serve as a tool in planning and assessing their courses for improvement based on best practices. This article presents such a…
Naughton, Colleen C; Zhang, Qiong; Mihelcic, James R
2017-01-15
This study improves the global application of methods and analyses, especially Life Cycle Assessment (LCA), that properly incorporate the environmental impacts of firewood and a social sustainability indicator (human energy) as tools for sustainable human development. Specifically, shea butter production processes, common throughout sub-Saharan Africa and crucial to food security, environmental sustainability, and women's empowerment, are analyzed. Many economic activities in the world rely on firewood for energy and labor that are not included in traditional LCAs. Human energy (entirely from women) contributed 25-100% of shea butter production processes (2000-6100 kJ/kg of shea butter), and mechanized production processes had reduced human energy without considerably greater total energy. Firewood accounted for 94-100% of total embodied energy (103 and 172 MJ/kg of shea butter for the improved and traditional shea butter production processes, respectively) and global warming potential, and 18-100% of the human toxicity of the production processes. Implementation of the improved cookstoves modeled in this study could reduce: (1) global warming potential by 78% (from 18 to 4.1 kg CO2-eq/kg and 11 to 2.4 kg CO2-eq/kg of shea butter for the traditional and improved processes, respectively), (2) the embodied energy of using firewood by 52% (from 170 to 82 MJ/kg and 103 to 49 MJ/kg for the traditional and improved processes, respectively), and (3) human toxicity by 83% for the non-mechanized traditional and improved processes (from 0.041 to 0.0071 1,4-DB eq/kg and 0.025 to 0.0042 1,4-DB eq/kg, respectively). In addition, this is the first study to compare Economic Input-Output Life Cycle Assessment (EIO-LCA) and process-based LCA in a developing country and to evaluate five traditional and improved shea butter production processes over different impact categories. Overall, this study developed a framework to evaluate and improve processes for achievement of the United Nations' Sustainable Development Goals for 2030, particularly to obtain food security. Copyright © 2016 Elsevier B.V. All rights reserved.
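A quick arithmetic check of the quoted reductions (values taken directly from the abstract):

```python
# Percent reduction from the traditional process to the improved-cookstove case.
pairs = {
    "global warming potential (kg CO2-eq/kg)": (18, 4.1),
    "embodied energy of firewood (MJ/kg)":     (170, 82),
    "human toxicity (1,4-DB eq/kg)":           (0.041, 0.0071),
}
for name, (before, after) in pairs.items():
    print(f"{name}: {100 * (1 - after / before):.0f}% reduction")
# -> roughly 77-78%, 52% and 83%, matching the figures quoted above.
```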
Improving the Horizontal Transport in the Lower Troposphere with Four Dimensional Data Assimilation
The physical processes involved in air quality modeling are governed by dynamically-generated meteorological model fields. This research focuses on reducing the uncertainty in the horizontal transport in the lower troposphere by improving the four dimensional data assimilation (F...
Emotional processing during experiential treatment of depression.
Pos, Alberta E; Greenberg, Leslie S; Goldman, Rhonda N; Korman, Lorne M
2003-12-01
This study explored the importance of early and late emotional processing to change in depressive and general symptomology, self-esteem, and interpersonal problems for 34 clients who received 16-20 sessions of experiential treatment for depression. The independent contribution to outcome of the early working alliance was also explored. Early and late emotional processing predicted reductions in reported symptoms and gains in self-esteem. More important, emotional-processing skill significantly improved during treatment. Hierarchical regression models demonstrated that late emotional processing both mediated the relationship between clients' early emotional processing capacity and outcome and was the sole emotional-processing variable that independently predicted improvement. After controlling for emotional processing, the working alliance added an independent contribution to explaining improvement in reported symptomology only. (c) 2003 APA
Design of experiments enhanced statistical process control for wind tunnel check standard testing
NASA Astrophysics Data System (ADS)
Phillips, Ben D.
The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that, by utilizing design of experiments methodology in conjunction with the current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research can be applied to any wind tunnel check standard testing program.
The contribution of temporary storage and executive processes to category learning.
Wang, Tengfei; Ren, Xuezhu; Schweizer, Karl
2015-09-01
Three distinctly different working memory processes, temporary storage, mental shifting and inhibition, were proposed to account for individual differences in category learning. A sample of 213 participants completed a classic category learning task and two working memory tasks that were experimentally manipulated to tap specific working memory processes. Fixed-links models were used to decompose data of the category learning task into two independent components representing basic performance and improvement in performance in category learning. Processes of working memory were also represented by fixed-links models. In the next step, the three working memory processes were linked to components of category learning. Results from the modeling analyses indicated that temporary storage had a significant effect on basic performance and shifting had a moderate effect on improvement in performance. In contrast, inhibition showed no effect on any component of the category learning task. These results suggest that temporary storage and the shifting process play different roles in the course of acquiring new categories. Copyright © 2015 Elsevier B.V. All rights reserved.
Brink-Huis, Anita; van Achterberg, Theo; Schoonhoven, Lisette
2008-08-01
This paper reports a review of the literature conducted to identify organisational models in cancer pain management that contain integrated care processes and to describe their effectiveness. Pain is experienced by 30-50% of cancer patients receiving treatment and by 70-90% of those with advanced disease. Efforts to improve pain management have been made through the development and dissemination of clinical guidelines. Early improvements in pain management were focussed on just one or two single processes, such as pain assessment and patient education. Little is known about organisational models with multiple integrated processes throughout the course of the disease trajectory and concerning all stages of the care process. Systematic review. The review involved a systematic search of the literature published between 1986 and 2006. Subject-specific keywords used to describe patients, disease, pain management interventions and integrated care processes, relevant for this review, were selected using the thesaurus of the databases. Institutional models, clinical pathways and consultation services are three alternative models for the integration of care processes in cancer pain management. A clinical pathway is a comprehensive institutionalisation model, whereas a pain consultation service is a 'stand-alone' model that can be integrated in a clinical pathway. Positive patient and process outcomes have been described for all three models, although the level of evidence is generally low. Evaluation of the quality of pain management must involve standardised measurements of both patient and process outcomes. We recommend the development of policies for referrals to a pain consultation service. These policies can be integrated within a clinical pathway. To evaluate the effectiveness of pain management models, standardised outcome measures are needed.
Hu, Yue-Hua; Kitching, Roger L.; Lan, Guo-Yu; Zhang, Jiao-Lin; Sha, Li-Qing; Cao, Min
2014-01-01
We have investigated the processes of community assembly using size classes of trees. Specifically, our work examined (1) whether point process models incorporating an effect of size class produce more realistic summary outcomes than do models without this effect; and (2) which of three selected models, incorporating, respectively, environmental effects, dispersal, and the joint effect of both, is most useful in explaining species-area relationships (SARs) and point dispersion patterns. For this evaluation we used tree species data from the 50-ha forest dynamics plot on Barro Colorado Island, Panama, and the comparable 20-ha plot at Bubeng, Southwest China. Our results demonstrated that incorporating a size-class effect dramatically improved the SAR estimation at both plots when the dispersal-only model was used. The joint-effect model produced a similar improvement but only for the 50-ha plot in Panama. The point pattern results were not improved by the incorporation of size-class effects using any of the three models. Our results indicate that dispersal is likely to be a key process determining both SARs and point patterns. The environment-only model and the joint-effects model were effective at the species level and the community level, respectively. We conclude that it is critical to use multiple summary characteristics when modelling spatial patterns at the species and community levels if a comprehensive understanding of the ecological processes that shape species' distributions is sought; without this, results may have inherent biases. By influencing dispersal, the effect of size class contributes to species assembly and enhances our understanding of species coexistence. PMID:25251538
Modeling veterans healthcare administration disclosure processes :
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beyeler, Walter E; DeMenno, Mercy B.; Finley, Patrick D.
As with other large healthcare organizations, medical adverse events at the Department of Veterans Affairs (VA) facilities can expose patients to unforeseen negative risks. VHA leadership recognizes that properly handled disclosure of adverse events can minimize potential harm to patients and negative consequences for the effective functioning of the organization. The work documented here seeks to help improve the disclosure process by situating it within the broader theoretical framework of issues management, and to identify opportunities for process improvement through modeling disclosure and reactions to disclosure. The computational model will allow a variety of disclosure actions to be tested across a range of incident scenarios. Our conceptual model will be refined in collaboration with domain experts, especially by continuing to draw on insights from the VA Study of the Communication of Adverse Large-Scale Events (SCALE) project researchers.
ERIC Educational Resources Information Center
Spanbauer, Stanley J.
The Measurement and Costing Model (MCM) described in this book was developed and tested at Fox Valley Technical College (FVTC), Wisconsin, to enhance the college's quality improvement process and to serve as a guide to other institutions interested in improving their quality. The book presents a description of the model and outlines seven steps…
Song, Zirui; Rose, Sherri; Chernew, Michael E.; Safran, Dana Gelb
2018-01-01
As population-based payment models become increasingly common, it is crucial to understand how such payment models affect health disparities. We evaluated health care quality and spending among enrollees in areas with lower versus higher socioeconomic status in Massachusetts before and after providers entered into the Alternative Quality Contract, a two-sided population-based payment model with substantial incentives tied to quality. We compared changes in process measures, outcome measures, and spending between enrollees in areas with lower and higher socioeconomic status from 2006 to 2012 (outcome measures were measured after the intervention only). Quality improved for all enrollees in the Alternative Quality Contract after their provider organizations entered the contract. Process measures improved 1.2 percentage points per year more among enrollees in areas with lower socioeconomic status than among those in areas with higher socioeconomic status. Outcome measure improvement was no different between the subgroups; neither were changes in spending. Larger or comparable improvements in quality among enrollees in areas with lower socioeconomic status suggest a potential narrowing of disparities. Strong pay-for-performance incentives within a population-based payment model could encourage providers to focus on improving quality for more disadvantaged populations. PMID:28069849
Quality management benchmarking: FDA compliance in pharmaceutical industry.
Jochem, Roland; Landgraf, Katja
2010-01-01
By analyzing and comparing industry and business best practice, processes can be optimized and become more successful mainly because efficiency and competitiveness increase. This paper aims to focus on some examples. Case studies are used to show knowledge exchange in the pharmaceutical industry. Best practice solutions were identified in two companies using a benchmarking method and five-stage model. Despite large administrations, there is much potential regarding business process organization. This project makes it possible for participants to fully understand their business processes. The benchmarking method gives an opportunity to critically analyze value chains (a string of companies or players working together to satisfy market demands for a special product). Knowledge exchange is interesting for companies that like to be global players. Benchmarking supports information exchange and improves competitive ability between different enterprises. Findings suggest that the five-stage model improves efficiency and effectiveness. Furthermore, the model increases the chances for reaching targets. The method gives security to partners that did not have benchmarking experience. The study identifies new quality management procedures. Process management and especially benchmarking is shown to support pharmaceutical industry improvements.
Modeling the Hydrologic Processes of a Permeable Pavement System
A permeable pavement system can capture stormwater to reduce runoff volume and flow rate, improve onsite groundwater recharge, and enhance pollutant controls within the site. A new unit process model for evaluating the hydrologic performance of a permeable pavement system has be...
ERIC Educational Resources Information Center
Dmitrienko, Nadezhda; Ershova, Svetlana; Konovalenko, Tatiana; Kutsova, Elvira; Yurina, Elena
2015-01-01
The article points out that the process of mastering foreign language stimulates students' personal, professional and cultural growth, improving linguistic, communicative competences and viability levels. A proposed pedagogical technology of modeling different communicative situations has a serious synergetic potential for students' self organized…
Software quality and process improvement in scientific simulation codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ambrosiano, J.; Webster, R.
1997-11-01
This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. This study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. This study is descriptive rather than scientific.
ERIC Educational Resources Information Center
Heck, Ronald H.; Hallinger, Philip
2010-01-01
Researchers have persisted in framing leadership as the driver for change and performance improvement in schools despite convincing theoretical commentary that proposes leadership as a process of reciprocal interaction. Although conceptualizing leadership as a reciprocal process offers leverage for understanding leadership effects on learning,…
Beyond Theory: Improving Public Relations Writing through Computer Technology.
ERIC Educational Resources Information Center
Neff, Bonita Dostal
Computer technology (primarily word processing) enables the student of public relations writing to improve the writing process through increased flexibility in writing, enhanced creativity, increased support of management skills and team work. A new instructional model for computer use in public relations courses at Purdue University Calumet…
Raymond, Nancy C; Wyman, Jean F; Dighe, Satlaj; Harwood, Eileen M; Hang, Mikow
2018-06-01
Process evaluation is an important tool in quality improvement efforts. This article illustrates how a systematic and continuous evaluation process can be used to improve the quality of faculty career development programs by using the University of Minnesota's Building Interdisciplinary Research Careers in Women's Health (BIRCWH) K12 program as an exemplar. Data from a rigorous process evaluation incorporating quantitative and qualitative measurements were analyzed and reviewed by the BIRCWH program leadership on a regular basis. Examples are provided of how this evaluation model and processes were used to improve many aspects of the program, thereby improving scholar, mentor, and advisory committee members' satisfaction and scholar outcomes. A rigorous evaluation plan can increase the effectiveness and impact of a research career development plan.
Rolling scheduling of electric power system with wind power based on improved NNIA algorithm
NASA Astrophysics Data System (ADS)
Xu, Q. S.; Luo, C. J.; Yang, D. J.; Fan, Y. H.; Sang, Z. X.; Lei, H.
2017-11-01
This paper puts forth a rolling modification strategy for the day-ahead scheduling of an electric power system with wind power, which takes the operation cost increment of units and the curtailed wind power of the grid as dual modification functions. Additionally, an improved Nondominated Neighbor Immune Algorithm (NNIA) is proposed for its solution. The proposed rolling scheduling model further improves the operation cost of the system in the intra-day generation process, enhances the system's capacity to accommodate wind power, and modifies the key transmission section power flow in a rolling manner to satisfy the security constraints of the power grid. The improved NNIA algorithm defines an antibody preference relation model based on the equal incremental rate, regulation deviation constraints, and the maximum and minimum technical outputs of units. The model can noticeably guide the direction of antibody evolution, significantly speed up the convergence of the algorithm to the final solution, and enhance its local search capability.
Kinetic models in industrial biotechnology - Improving cell factory performance.
Almquist, Joachim; Cvijovic, Marija; Hatzimanikatis, Vassily; Nielsen, Jens; Jirstrand, Mats
2014-07-01
An increasing number of industrial bioprocesses capitalize on living cells by using them as cell factories that convert sugars into chemicals. These processes range from the production of bulk chemicals in yeasts and bacteria to the synthesis of therapeutic proteins in mammalian cell lines. One of the tools in the continuous search for improved performance of such production systems is the development and application of mathematical models. To be of value for industrial biotechnology, mathematical models should be able to assist in the rational design of cell factory properties or of the production processes in which they are utilized. Kinetic models are particularly suitable towards this end because they are capable of representing the complex biochemistry of cells in a more complete way compared to most other types of models. They can, at least in principle, be used to understand, predict, and evaluate in detail the effects of adding, removing, or modifying molecular components of a cell factory and to support the design of the bioreactor or fermentation process. However, several challenges still remain before kinetic modeling will reach the degree of maturity required for routine application in industry. Here we review the current status of kinetic cell factory modeling. Emphasis is on modeling methodology concepts, including model network structure, kinetic rate expressions, parameter estimation, optimization methods, identifiability analysis, model reduction, and model validation, but several applications of kinetic models for the improvement of cell factories are also discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Yurko, Joseph P.; Buongiorno, Jacopo; Youngblood, Robert
2015-05-28
System codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the code and thereby improves its support for decision-making. The work reported here implements several improvements on classic calibration techniques afforded by modern analysis techniques. The key innovation has come from the development of code surrogate model (or code emulator) construction and prediction algorithms. Use of a fast emulator makes the calibration processes used here with Markov Chain Monte Carlo (MCMC) sampling feasible. This study uses Gaussian Process (GP) based emulators, which have been used previously to emulate computer codes in the nuclear field. The present work describes the formulation of an emulator that incorporates GPs into a factor analysis-type or pattern recognition-type model. This "function factorization" Gaussian Process (FFGP) model allows overcoming limitations present in standard GP emulators, thereby improving both the accuracy and speed of the emulator-based calibration process. Calibration of a friction-factor example using a Method of Manufactured Solution is performed to illustrate key properties of the FFGP-based process.
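A hedged sketch of the general emulator-plus-MCMC calibration idea (a standard GP surrogate and a simple Metropolis sampler on a synthetic friction-factor stand-in, not the FFGP formulation itself):

```python
# Train a GP emulator on a few runs of an "expensive" code, then calibrate one
# input parameter against an observation by Metropolis sampling on the emulator.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_code(theta):                # stand-in for the system code
    return 0.3164 * theta ** -0.25        # Blasius-like friction factor

train_theta = np.linspace(2e3, 1e5, 12)[:, None]
gp = GaussianProcessRegressor(1.0 * RBF(2e4), normalize_y=True)
gp.fit(train_theta, expensive_code(train_theta.ravel()))

obs, obs_sigma = 0.025, 0.001             # "measured" friction factor and its uncertainty
def log_post(theta):
    if not 2e3 < theta < 1e5:             # flat prior on the admissible range
        return -np.inf
    pred = gp.predict(np.array([[theta]]))[0]
    return -0.5 * ((pred - obs) / obs_sigma) ** 2

rng, theta, chain = np.random.default_rng(2), 5e4, []
for _ in range(5000):                     # Metropolis sampling on the fast emulator
    prop = theta + rng.normal(0, 2e3)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
print("posterior mean of the calibrated parameter:", np.mean(chain[1000:]))
```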
NASA Astrophysics Data System (ADS)
Kleinen, Thomas; Brovkin, Victor; Munhoven, Guy
2016-11-01
Trends in the atmospheric concentration of CO2 during three recent interglacials - the Holocene, the Eemian and Marine Isotope Stage (MIS) 11 - are investigated using an earth system model of intermediate complexity, which we extended with process-based modules to consider two slow carbon cycle processes - peat accumulation and shallow-water CaCO3 sedimentation (coral reef formation). For all three interglacials, model simulations considering peat accumulation and shallow-water CaCO3 sedimentation substantially improve the agreement between model results and ice core CO2 reconstructions in comparison to a carbon cycle set-up neglecting these processes. This enables us to model the trends in atmospheric CO2, with modelled trends similar to the ice core data, forcing the model only with orbital and sea level changes. During the Holocene, anthropogenic CO2 emissions are required to match the observed rise in atmospheric CO2 after 3 ka BP but are not relevant before this time. Our model experiments show a considerable improvement in the modelled CO2 trends by the inclusion of the slow carbon cycle processes, allowing us to explain the CO2 evolution during the Holocene and two recent interglacials consistently using an identical model set-up.
School Improvement Model to Foster Student Learning
ERIC Educational Resources Information Center
Rulloda, Rudolfo Barcena
2011-01-01
Many classroom teachers are still using traditional teaching methods. Traditional teaching methods are a one-way learning process, in which teachers introduce subject contents such as language arts, English, mathematics, science, and reading separately. However, the school improvement model takes into account that all students have…
Voigt, Wieland; Hoellthaler, Josef; Magnani, Tiziana; Corrao, Vito; Valdagni, Riccardo
2014-01-01
Multidisciplinary care of prostate cancer is increasingly offered in specialised cancer centres. It requires the optimisation of medical and operational processes and the integration of the different medical and non-medical stakeholders. To develop a standardised operational process assessment tool, based on the Capability Maturity Model Integration (CMMI), able to implement multidisciplinary care and improve process quality and efficiency. Information for model development was derived from medical experts, clinical guidelines, best practice elements of renowned cancer centres, and the scientific literature. Data were organised in a hierarchically structured model consisting of 5 categories, 30 key process areas, 172 requirements, and more than 1500 criteria. Compliance with the requirements was assessed through structured on-site surveys covering all relevant clinical and management processes. Comparison with best practice standards allowed improvements to be recommended. 'Act On Oncology' (AoO) was applied in a pilot study on a prostate cancer unit in Europe. Several best practice elements, such as multidisciplinary clinics or advanced organisational measures for patient scheduling, were observed. Substantial opportunities were found in other areas such as centre management and infrastructure. As first improvements, the evaluated centre administration described and formalised the organisation of the prostate cancer unit with defined personnel assignments and clinical activities, and a formal agreement is being worked on to have structured access to First-Aid Posts. In the pilot study, the AoO approach was feasible for identifying opportunities for process improvements. Measures were derived that might increase operational process quality and efficiency.
NASA Astrophysics Data System (ADS)
Aburas, Maher Milad; Ho, Yuek Ming; Ramli, Mohammad Firuz; Ash'aari, Zulfa Hanan
2017-07-01
The creation of an accurate simulation of future urban growth is considered one of the most important challenges in urban studies that involve spatial modeling. The purpose of this study is to improve the simulation capability of an integrated CA-Markov Chain (CA-MC) model using CA-MC based on the Analytical Hierarchy Process (AHP) and CA-MC based on Frequency Ratio (FR), both applied in Seremban, Malaysia, as well as to compare the performance and accuracy of the traditional and hybrid models. Various physical, socio-economic, utilities, and environmental criteria were used as predictors, including elevation, slope, soil texture, population density, distance to commercial area, distance to educational area, distance to residential area, distance to industrial area, distance to roads, distance to highway, distance to railway, distance to power line, distance to stream, and land cover. For calibration, three models were applied to simulate urban growth trends in 2010; the actual data of 2010 were used for model validation utilizing the Relative Operating Characteristic (ROC) and Kappa coefficient methods. Consequently, future urban growth maps of 2020 and 2030 were created. The validation findings confirm that the integration of the CA-MC model with the FR model and employing the significant driving forces of urban growth in the simulation process have resulted in improved simulation capability of the CA-MC model. This study has provided a novel approach for improving the CA-MC model based on FR, which will provide powerful support to planners and decision-makers in the development of future sustainable urban planning.
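An illustrative sketch of the frequency-ratio weighting that the hybrid CA-MC(FR) approach relies on, with hypothetical rasters and class breaks (not the study data):

```python
# FR per driver class = (% of urban cells in class) / (% of all cells in class).
# The resulting suitability surface then feeds a CA conversion step whose total
# growth is fixed by the Markov-chain transition area.
import numpy as np

def frequency_ratio(driver, urban_mask, bins):
    classes = np.digitize(driver, bins)
    fr = np.ones(len(bins) + 1)
    for c in range(len(bins) + 1):
        in_class = classes == c
        share_all = in_class.mean()
        share_urb = (in_class & urban_mask).sum() / max(urban_mask.sum(), 1)
        fr[c] = share_urb / share_all if share_all > 0 else 0.0
    return fr[classes]                      # map each cell to the FR of its class

rng = np.random.default_rng(3)
dist_to_road = rng.uniform(0, 5000, (200, 200))                    # hypothetical raster (m)
urban_2010 = dist_to_road + rng.normal(0, 800, dist_to_road.shape) < 1500

suitability = frequency_ratio(dist_to_road, urban_2010, bins=[500, 1000, 2000, 3500])
grow_n = 2000                               # cells to convert, e.g. from the Markov transition area
ranked = np.argsort(np.where(urban_2010, -np.inf, suitability).ravel())[::-1][:grow_n]
urban_2020 = urban_2010.copy().ravel()
urban_2020[ranked] = True
print("urban cells 2010 -> 2020:", int(urban_2010.sum()), "->", int(urban_2020.sum()))
```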
Investigation of Biogrout processes by numerical analysis at pore scale
NASA Astrophysics Data System (ADS)
Bergwerff, Luke; van Paassen, Leon A.; Picioreanu, Cristian; van Loosdrecht, Mark C. M.
2013-04-01
Biogrout is a soil improvement process that aims to improve the strength of sandy soils. The process is based on microbially induced calcite precipitation (MICP). In this study the main process is denitrification, facilitated by bacteria indigenous to the soil using substrates which can be derived from pretreated waste streams containing calcium salts of fatty acids and calcium nitrate, making it a cost-effective and environmentally friendly process. The goal of this research is to improve the understanding of the process by numerical analysis so that it may be refined and applied properly in varying applications, such as borehole stabilization, liquefaction prevention, levee fortification and mitigation of beach erosion. During the denitrification process many phases are present in the pore space, including a liquid phase containing solutes, crystals, bacteria forming biofilms, and gas bubbles. Owing to the number of phases and their dynamic changes (multiphase flow with (non-linear) reactive transport), there are many interactions, making the process very complex. To understand this complexity, the interactions between these phases are studied in a reductionist approach, increasing the complexity of the system by one phase at a time. The model will initially include flow, solute transport, and crystal nucleation and growth in 2D at the pore scale. The flow will be described by the Navier-Stokes equations. Initial study and simulations have revealed that describing crystal growth for this application on a fixed grid can introduce significant fundamental errors. Therefore a level set method will be employed to better describe the interface of developing crystals between sand grains. Afterwards the model will be expanded to 3D to provide more realistic flow, nucleation and clogging behaviour at the pore scale. Next, biofilms and, lastly, gas bubbles may be added to the model. From the results of these pore-scale models the behaviour of the system may be studied and eventually observations may be extrapolated to a larger continuum scale.
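A very reduced sketch of the level-set idea mentioned above, growing a crystal interface with a constant normal speed on a fixed grid (geometry and parameters are arbitrary; real Biogrout simulations couple this to flow and reactive transport):

```python
# The crystal surface is the zero contour of phi; evolve phi_t + V*|grad(phi)| = 0
# with a constant outward growth speed V using an explicit finite-difference step.
import numpy as np

n = 128
dx = 1.0 / (n - 1)
V, dt = 0.5, 0.5 * dx                                    # growth speed and CFL-safe step
xx, yy = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
phi = np.sqrt((xx - 0.5) ** 2 + (yy - 0.5) ** 2) - 0.05  # signed distance to a seed crystal

for _ in range(200):
    gx, gy = np.gradient(phi, dx)
    phi -= dt * V * np.sqrt(gx ** 2 + gy ** 2)

print("fraction of the domain occupied by crystal:", round(float((phi < 0).mean()), 3))
```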
NASA Astrophysics Data System (ADS)
Ahmad, Sabrina; Jalil, Intan Ermahani A.; Ahmad, Sharifah Sakinah Syed
2016-08-01
It is seldom technical issues that impede the process of eliciting software requirements. The involvement of multiple stakeholders usually leads to conflicts, and therefore the conflict detection and resolution effort is crucial. This paper presents a conceptual model to further improve current efforts: an improved conceptual model to assist the conflict detection and resolution effort, which extends the ability of the existing model and improves overall performance. The significance of the new model is that it enables the automation of conflict detection and of determining conflict severity levels with rule-based reasoning.
Model-centric approaches for the development of health information systems.
Tuomainen, Mika; Mykkänen, Juha; Luostarinen, Heli; Pöyhölä, Assi; Paakkanen, Esa
2007-01-01
Modeling is used increasingly in healthcare to increase shared knowledge, to improve processes, and to document the requirements of solutions related to health information systems (HIS). There are numerous modeling approaches intended to support these goals, but a careful assessment of their strengths, weaknesses and deficiencies is needed. In this paper, we compare three model-centric approaches in the context of HIS development: the Model-Driven Architecture, Business Process Modeling with BPMN and BPEL, and the HL7 Development Framework. The comparison reveals that all these approaches are viable candidates for the development of HIS. However, they have distinct strengths and abstraction levels, they require local and project-specific adaptation, and they offer varying levels of automation. In addition, the illustration of solutions to the end users must be improved.
Reviews and syntheses: Four decades of modeling methane cycling in terrestrial ecosystems
NASA Astrophysics Data System (ADS)
Xu, Xiaofeng; Yuan, Fengming; Hanson, Paul J.; Wullschleger, Stan D.; Thornton, Peter E.; Riley, William J.; Song, Xia; Graham, David E.; Song, Changchun; Tian, Hanqin
2016-06-01
Over the past 4 decades, a number of numerical models have been developed to quantify the magnitude, investigate the spatial and temporal variations, and understand the underlying mechanisms and environmental controls of methane (CH4) fluxes within terrestrial ecosystems. These CH4 models are also used for integrating multi-scale CH4 data, such as laboratory-based incubation and molecular analysis, field observational experiments, remote sensing, and aircraft-based measurements across a variety of terrestrial ecosystems. Here we summarize 40 terrestrial CH4 models to characterize their strengths and weaknesses and to suggest a roadmap for future model improvement and application. Our key findings are that (1) the focus of CH4 models has shifted from theoretical to site- and regional-level applications over the past 4 decades, (2) large discrepancies exist among models in terms of representing CH4 processes and their environmental controls, and (3) significant data-model and model-model mismatches are partially attributed to different representations of landscape characterization and inundation dynamics. Three areas for future improvements and applications of terrestrial CH4 models are that (1) CH4 models should more explicitly represent the mechanisms underlying land-atmosphere CH4 exchange, with an emphasis on improving and validating individual CH4 processes over depth and horizontal space, (2) models should be developed that are capable of simulating CH4 emissions across highly heterogeneous spatial and temporal scales, particularly hot moments and hotspots, and (3) efforts should be invested to develop model benchmarking frameworks that can easily be used for model improvement, evaluation, and integration with data from molecular to global scales. These improvements in CH4 models would be beneficial for the Earth system models and further simulation of climate-carbon cycle feedbacks.
Graphic model of the processes involved in the production of casegood furniture
Kristen G. Hoff; Subhash C. Sarin; R. Bruce Anderson; R. Bruce Anderson
1992-01-01
Imports from foreign furniture manufacturers are on the rise, and American manufacturers must take advantage of recent technological advances to regain their lost market share. To facilitate the implementation of these technologies for improving productivity and quality, a graphic model of the wood furniture production process is presented using the IDEF modeling...
Application of overlay modeling and control with Zernike polynomials in an HVM environment
NASA Astrophysics Data System (ADS)
Ju, JaeWuk; Kim, MinGyu; Lee, JuHan; Nabeth, Jeremy; Jeon, Sanghuck; Heo, Hoyoung; Robinson, John C.; Pierson, Bill
2016-03-01
Shrinking technology nodes and smaller process margins require improved photolithography overlay control. Generally, overlay measurement results are modeled with Cartesian polynomial functions for both intra-field and inter-field models, and the model coefficients are sent to an advanced process control (APC) system operating in an XY Cartesian basis. Dampened overlay corrections, typically via an exponentially or linearly weighted moving average in time, are then retrieved from the APC system to apply on the scanner in XY Cartesian form for subsequent lot exposure. The goal of the above method is to process lots with corrections that target the least possible overlay misregistration in steady state as well as in change-point situations. In this study, we model overlay errors on product using Zernike polynomials with the same fitting capability as the process of reference (POR) to represent the wafer-level terms, and use the standard Cartesian polynomials to represent the field-level terms. APC calculations for wafer-level correction are performed in the Zernike basis, while field-level calculations use the standard XY Cartesian basis. Finally, weighted wafer-level correction terms are converted to XY Cartesian space in order to be applied on the scanner, along with field-level corrections, for future wafer exposures. Since Zernike polynomials have the property of being orthogonal in the unit disk, we are able to reduce the amount of collinearity between terms and improve overlay stability. Our real-time Zernike modeling and feedback evaluation was performed on a 20-lot dataset in a high volume manufacturing (HVM) environment. The measured on-product results were compared to the POR and showed a 7% reduction in overlay variation, including a 22% reduction in term variation. This led to an on-product raw overlay Mean + 3Sigma X&Y improvement of 5% and resulted in a 0.1% yield improvement.
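A minimal, assumption-laden sketch of fitting wafer-level overlay data to a few low-order Zernike terms by least squares (synthetic data; not the fab's APC implementation):

```python
# Fit wafer-level overlay error (dx) to a small orthogonal Zernike-style basis
# on the unit disk; the fitted coefficients are what an APC loop would damp
# and feed back before conversion to Cartesian scanner corrections.
import numpy as np

def zernike_basis(x, y):
    """A few low-order Zernike terms evaluated at normalized wafer coordinates."""
    r2 = x**2 + y**2
    return np.column_stack([
        np.ones_like(x),          # piston
        x, y,                     # tilt terms
        2 * x * y,                # astigmatism 45
        2 * r2 - 1,               # defocus-like term
        x**2 - y**2,              # astigmatism 0
    ])

rng = np.random.default_rng(4)
x, y = rng.uniform(-1, 1, (2, 300))
keep = x**2 + y**2 <= 1.0                                 # keep points inside the unit disk
x, y = x[keep], y[keep]
dx_overlay = 2.0 + 1.5 * x - 0.8 * (2 * (x**2 + y**2) - 1) + rng.normal(0, 0.3, x.size)

B = zernike_basis(x, y)
coeffs, *_ = np.linalg.lstsq(B, dx_overlay, rcond=None)
print("fitted Zernike coefficients (nm, synthetic):", np.round(coeffs, 2))
```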
Computational data sciences for assessment and prediction of climate extremes
NASA Astrophysics Data System (ADS)
Ganguly, A. R.
2011-12-01
Climate extremes may be defined inclusively as severe weather events or large shifts in global or regional weather patterns which may be caused or exacerbated by natural climate variability or climate change. This area of research arguably represents one of the largest knowledge gaps in climate science which is relevant for informing resource managers and policy makers. While physics-based climate models are essential in view of non-stationary and nonlinear dynamical processes, their current pace of uncertainty reduction may not be adequate for urgent stakeholder needs. The structure of the models may in some cases preclude reduction of uncertainty for critical processes at scales or for the extremes of interest. On the other hand, methods based on complex networks, extreme value statistics, machine learning, and space-time data mining have demonstrated significant promise to improve scientific understanding and generate enhanced predictions. When combined with conceptual process understanding at multiple spatiotemporal scales and designed to handle massive data, interdisciplinary data science methods and algorithms may complement or supplement physics-based models. Specific examples from the prior literature and our ongoing work suggest how data-guided improvements may be possible, for example, in the context of ocean meteorology, climate oscillators, teleconnections, and atmospheric process understanding, which in turn can improve projections of regional climate, precipitation extremes and tropical cyclones in a useful and interpretable fashion. A community-wide effort is motivated to develop and adapt computational data science tools for translating climate model simulations to information relevant for adaptation and policy, as well as for improving our scientific understanding of climate extremes from both observed and model-simulated data.
Systematic study of 16O-induced fusion with the improved quantum molecular dynamics model
NASA Astrophysics Data System (ADS)
Wang, Ning; Zhao, Kai; Li, Zhuxia
2014-11-01
The heavy-ion fusion reactions with 16O bombarding on 62Ni, 65Cu, 74Ge, 148Nd, 180Hf, 186W, 208Pb, and 238U are systematically investigated with the improved quantum molecular dynamics model. The fusion cross sections at energies near and above the Coulomb barriers can be reasonably well reproduced by using this semiclassical microscopic transport model with the parameter sets SkP* and IQ3a. The dynamical nucleus-nucleus potentials and the influence of Fermi constraint on the fusion process are also studied simultaneously. In addition to the mean field, the Fermi constraint also plays a key role for the reliable description of the fusion process and for improving the stability of fragments in heavy-ion collisions.
McFadden, Kathleen L; Stock, Gregory N; Gowen, Charles R
2014-10-01
Successful amelioration of medical errors represents a significant problem in the health care industry. There is a need for greater understanding of the factors that lead to improved process quality and patient safety outcomes in hospitals. We present a research model that shows how transformational leadership, safety climate, and continuous quality improvement (CQI) initiatives are related to objective quality and patient safety outcome measures. The proposed framework is tested using structural equation modeling, based on data collected for 204 hospitals, and supplemented with objective outcome data from the Centers for Medicare and Medicaid Services. The results provide empirical evidence that a safety climate, which is connected to the chief executive officer's transformational leadership style, is related to CQI initiatives, which are linked to improved process quality. A unique finding of this study is that, although CQI initiatives are positively associated with improved process quality, they are also associated with higher hospital-acquired condition rates, a measure of patient safety. Likewise, safety climate is directly related to improved patient safety outcomes. The notion that patient safety climate and CQI initiatives are not interchangeable or universally beneficial is an important contribution to the literature. The results confirm the importance of using CQI to effectively enhance process quality in hospitals, and patient safety climate to improve patient safety outcomes. The overall pattern of findings suggests that simultaneous implementation of CQI initiatives and patient safety climate produces greater combined benefits.
McFadden, Kathleen L; Stock, Gregory N; Gowen, Charles R
2015-01-01
Successful amelioration of medical errors represents a significant problem in the health care industry. There is a need for greater understanding of the factors that lead to improved process quality and patient safety outcomes in hospitals. We present a research model that shows how transformational leadership, safety climate, and continuous quality improvement (CQI) initiatives are related to objective quality and patient safety outcome measures. The proposed framework is tested using structural equation modeling, based on data collected for 204 hospitals, and supplemented with objective outcome data from the Centers for Medicare and Medicaid Services. The results provide empirical evidence that a safety climate, which is connected to the chief executive officer's transformational leadership style, is related to CQI initiatives, which are linked to improved process quality. A unique finding of this study is that, although CQI initiatives are positively associated with improved process quality, they are also associated with higher hospital-acquired condition rates, a measure of patient safety. Likewise, safety climate is directly related to improved patient safety outcomes. The notion that patient safety climate and CQI initiatives are not interchangeable or universally beneficial is an important contribution to the literature. The results confirm the importance of using CQI to effectively enhance process quality in hospitals, and patient safety climate to improve patient safety outcomes. The overall pattern of findings suggests that simultaneous implementation of CQI initiatives and patient safety climate produces greater combined benefits.
Developing a Data Driven Process-Based Model for Remote Sensing of Ecosystem Production
NASA Astrophysics Data System (ADS)
Elmasri, B.; Rahman, A. F.
2010-12-01
Estimating ecosystem carbon fluxes at various spatial and temporal scales is essential for quantifying the global carbon cycle. Numerous models have been developed for this purpose using several environmental variables as well as vegetation indices derived from remotely sensed data. Here we present a data driven modeling approach for gross primary production (GPP) that is based on the process-based model BIOME-BGC. The proposed model was run using available remote sensing data and it does not depend on look-up tables. Furthermore, this approach combines the merits of both empirical and process models, and empirical models were used to estimate certain input variables such as light use efficiency (LUE). This was achieved by applying remotely sensed data to the mathematical equations that represent biophysical photosynthesis processes in the BIOME-BGC model. Moreover, a new spectral index for estimating maximum photosynthetic activity, the maximum photosynthetic rate index (MPRI), is also developed and presented here. This new index is based on the ratio between the near infrared and the green bands (ρ858.5/ρ555). The model was tested and validated against the MODIS GPP product and flux measurements from two eddy covariance flux towers located at Morgan Monroe State Forest (MMSF) in Indiana and Harvard Forest in Massachusetts. Satellite data acquired by the Advanced Microwave Scanning Radiometer (AMSR-E) and MODIS were used. The data driven model showed a strong correlation between the predicted and measured GPP at the two eddy covariance flux tower sites. This methodology produced better predictions of GPP than did the MODIS GPP product. Moreover, the proportion of error in the predicted GPP for MMSF and Harvard Forest was dominated by unsystematic errors, suggesting that the results are unbiased. The analysis indicated that maintenance respiration is one of the main factors that dominate the overall model outcome errors, and improvement in maintenance respiration estimation will result in improved GPP predictions. Although there might be room for improvement in our model outcomes through improved parameterization, our results suggest that such a methodology for running the BIOME-BGC model based entirely on routinely available data can produce good predictions of GPP.
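The MPRI described above is a simple band ratio, so a minimal sketch is enough to show how it would be computed from surface reflectances; the function name and the example reflectance values below are assumptions for illustration only.

```python
import numpy as np

def mpri(nir_858, green_555):
    """Maximum photosynthetic rate index as the NIR/green band ratio (rho_858.5 / rho_555)."""
    nir = np.asarray(nir_858, dtype=float)
    green = np.asarray(green_555, dtype=float)
    out = np.full_like(nir, np.nan)
    np.divide(nir, green, out=out, where=green > 0)   # leave NaN where the green band is zero
    return out

# Example surface reflectances (unitless, illustrative values)
print(mpri([0.42, 0.38], [0.08, 0.10]))
```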
NASA Astrophysics Data System (ADS)
Orra, Kashfull; Choudhury, Sounak K.
2016-12-01
The purpose of this paper is to build an adaptive feedback linear control system to check the variation of the cutting force signal in order to improve tool life. The paper discusses the use of a transfer function approach to improve the mathematical modelling and adaptively control the process dynamics of the turning operation. The experimental results agree with the simulation model, with an error of less than 3%. The state-space model used in this paper successfully checks the adequacy of the control system through controllability and observability test matrices, confirming that the system can be transferred from one state to another by an appropriate input control in finite time. The proposed system can be applied to other machining processes under a varying range of cutting conditions to improve the efficiency and observability of the system.
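The controllability and observability checks mentioned above reduce to rank tests on the standard test matrices. The following sketch shows those tests with NumPy for an illustrative second-order system; the matrices are made up and are not the identified turning-process model.

```python
import numpy as np

def ctrb(A, B):
    """Controllability matrix [B, AB, A^2 B, ...]."""
    n = A.shape[0]
    return np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n)])

def obsv(A, C):
    """Observability matrix [C; CA; CA^2; ...]."""
    n = A.shape[0]
    return np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(n)])

# Illustrative second-order state-space model (not the paper's identified system)
A = np.array([[0.0, 1.0], [-4.0, -1.2]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

print("controllable:", np.linalg.matrix_rank(ctrb(A, B)) == A.shape[0])
print("observable:  ", np.linalg.matrix_rank(obsv(A, C)) == A.shape[0])
```

Full rank of both matrices is what guarantees that the state can be driven between arbitrary points in finite time and reconstructed from the output.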
Thermo-optical Modelling of Laser Matter Interactions in Selective Laser Melting Processes.
NASA Astrophysics Data System (ADS)
Vinnakota, Raj; Genov, Dentcho
Selective laser melting (SLM) is one of the most promising advanced manufacturing techniques, providing an ideal platform to manufacture components with zero geometric constraints. Coupling the electromagnetic and thermodynamic processes involved in SLM and developing a comprehensive theoretical model of them is of great importance, since it can provide significant improvements in the printing process by revealing the optimal parametric space related to applied laser power, scan velocity, powder material, layer thickness and porosity. Here, we present a self-consistent thermo-optical model which simultaneously solves Maxwell's equations and the heat transfer equation and provides insight into the electromagnetic energy released in the powder beds and the concurrent thermodynamics of the particles' temperature rise and onset of melting. The numerical calculations are compared with a developed analytical model of the SLM process, providing insight into the dynamics between laser-facilitated Joule heating and the radiation-mitigated rise in temperature. These results provide guidelines toward improved energy efficiency and optimization of SLM process scan rates. The current work is funded by the NSF EPSCoR CIMM project under award #OIA-1541079.
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo
2007-01-01
One of the most promising methods to test the representation of cloud processes used in climate models is to use observations together with cloud-resolving models (CRMs). CRMs use more sophisticated and realistic representations of cloud microphysical processes, and they can reasonably well resolve the time evolution, structure, and life cycles of clouds and cloud systems (with sizes ranging from about 2-200 km). CRMs also allow for explicit interaction between clouds, outgoing longwave (cooling) and incoming solar (heating) radiation, and ocean and land surface processes. Observations are required to initialize CRMs and to validate their results. This paper provides a brief discussion and review of the main characteristics of CRMs as well as some of their major applications. These include the use of CRMs to improve our understanding of: (1) convective organization, (2) cloud temperature and water vapor budgets, and convective momentum transport, (3) diurnal variation of precipitation processes, (4) radiative-convective quasi-equilibrium states, (5) cloud-chemistry interaction, (6) aerosol-precipitation interaction, and (7) improving moist processes in large-scale models. In addition, current and future developments and applications of CRMs will be presented.
NASA Astrophysics Data System (ADS)
Villamil-Otero, G.; Zhang, J.; Yao, Y.
2017-12-01
The Antarctic Peninsula (AP) has long been the focus of climate change studies due to its rapid environmental changes such as significantly increased glacier melt and retreat, and ice-shelf break-up. Progress has been continuously made in the use of regional modeling to simulate surface mass changes over ice sheets. Most efforts, however, focus on the ice sheets of Greenland, with considerably fewer studies in Antarctica. In this study the Weather Research and Forecasting (WRF) model, which has been applied to the Antarctic region for weather modeling, is adopted to capture the past and future surface mass balance changes over the AP. In order to enhance the capabilities of the WRF model in simulating surface mass balance over the ice surface, we implement various ice and snow processes within WRF and develop a new WRF suite (WRF-Ice). WRF-Ice includes a thermodynamic ice sheet model that improves the representation of internal melting and refreezing processes and the thermodynamic effects over the ice sheet. WRF-Ice also couples a thermodynamic sea ice model to improve the simulation of surface temperature and fluxes over sea ice. Lastly, complex snow processes are also taken into consideration, including the implementation of a snowdrift model that takes into account the redistribution of blowing snow as well as the thermodynamic impact of drifting snow sublimation on the lower atmospheric boundary layer. Intensive testing of these ice and snow processes is performed to assess the capability of WRF-Ice in simulating the surface mass balance changes over the AP.
Multidisciplinary drifting Observatory for the Study of Arctic Climate (MOSAiC)
NASA Astrophysics Data System (ADS)
Dethloff, Klaus; Rex, Markus; Shupe, Matthew
2016-04-01
The Multidisciplinary drifting Observatory for the Study of Arctic Climate (MOSAiC) is an international initiative under the International Arctic Science Committee (IASC) umbrella that aims to improve numerical model representations of sea ice, weather, and climate processes through coupled system observations and modeling activities that link the central Arctic atmosphere, sea ice, ocean, and the ecosystem. Observations of many critical parameters such as cloud properties, surface energy fluxes, atmospheric aerosols, small-scale sea-ice and oceanic processes, biological feedbacks with the sea ice and ocean, and others have never been made in the central Arctic in all seasons, and certainly not in a coupled system fashion. The primary objective of MOSAiC is to develop a better understanding of these important coupled-system processes so they can be more accurately represented in regional- and global-scale weather and climate models. Such enhancements will contribute to improved modeling of global climate and weather, and Arctic sea-ice predictive capabilities. The MOSAiC observations are an important opportunity to gather the high quality and comprehensive observations needed to improve numerical modeling of critical, scale-dependent processes impacting Arctic predictability given diminished sea ice coverage and increased model complexity. Model improvements are needed to understand the effects of a changing Arctic on mid-latitude weather and climate. MOSAiC is specifically designed to provide the multi-parameter, coordinated observations needed to improve sub-grid scale model parameterizations, especially with respect to thinner ice conditions. To facilitate, evaluate, and develop the needed model improvements, MOSAiC will employ a hierarchy of modeling approaches ranging from process model studies, to regional climate model intercomparisons, to operational forecasts and assimilation of real-time observations. Model evaluations prior to the field program will be used to identify specific gaps and parameterization needs. Preliminary modeling and operational forecasting will also be necessary to directly guide field planning and optimal implementation of field resources, and to support the safety of the project. The MOSAiC Observatory will be deployed in, and drift with, the Arctic sea-ice pack for at least a full annual cycle, starting in fall 2019 and ending in autumn 2020. Initial plans are for the drift to start in the newly forming autumn sea-ice in, or near, the East Siberian Sea. The specific location will be selected to allow for the observatory to follow the Transpolar Drift towards the North Pole and on to the Fram Strait. IASC has adopted MOSAiC as a key international activity, the German Alfred Wegener Institute has made the huge contribution of the icebreaker Polarstern to serve as the central drifting observatory for this year-long endeavor, and the US Department of Energy has committed a comprehensive atmospheric measurement suite. Many other nations and agencies have expressed interest in participation and in gaining access to this unprecedented observational dataset. International coordination is needed to support this groundbreaking endeavor.
A Study on Micropipetting Detection Technology of Automatic Enzyme Immunoassay Analyzer.
Shang, Zhiwu; Zhou, Xiangping; Li, Cheng; Tsai, Sang-Bing
2018-04-10
In order to improve the accuracy and reliability of micropipetting, a method of micropipette fault detection and calibration is proposed that combines dynamic pressure monitoring during the pipetting process with quantitative identification of pipetted volume through image processing. First, a normalized pressure model for the pipetting process is established from the kinematic model of the pipetting operation and corrected experimentally. The pipetting pressure and its first derivative are monitored in real time, a segmented double-threshold method is used as the pipetting fault evaluation criterion, and the pressure sensor data are processed by Kalman filtering, which improves the accuracy of fault diagnosis. When a fault occurs, an image of the pipette tip is captured by a camera, the boundary of the liquid region is extracted by a background contrast method, and the liquid volume in the tip is obtained from the geometric characteristics of the tip. The volume deviation is fed back to the automatic pipetting module and a correction is carried out. Titration tests show that combining the segmented pipetting kinematic model with the double-threshold pressure monitoring method allows real-time detection and classification of pipetting faults, and that closed-loop adjustment of the pipetting volume effectively improves the accuracy and reliability of the pipetting system.
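A minimal sketch of the pressure-monitoring idea, assuming a scalar constant-level Kalman filter and made-up thresholds, is given below; it is not the authors' algorithm, only an illustration of a segmented double-threshold check on the filtered pressure and its first derivative.

```python
import numpy as np

def kalman_smooth(z, q=1e-4, r=1e-2):
    """Scalar constant-level Kalman filter applied to the pipetting pressure signal."""
    x, p = float(z[0]), 1.0
    out = np.empty(len(z))
    for i, zi in enumerate(z):
        p += q                      # predict
        k = p / (p + r)             # Kalman gain
        x += k * (zi - x)           # update
        p *= (1 - k)
        out[i] = x
    return out

def detect_fault(pressure, dt, p_limits, dp_limits):
    """Double-threshold check on the filtered pressure and its first derivative."""
    p = kalman_smooth(np.asarray(pressure, dtype=float))
    dp = np.gradient(p, dt)
    p_bad = (p < p_limits[0]) | (p > p_limits[1])
    dp_bad = (dp < dp_limits[0]) | (dp > dp_limits[1])
    return np.flatnonzero(p_bad | dp_bad)   # sample indices flagged as faulty

# Illustrative aspiration trace with a sudden pressure drop mimicking a clogged tip
t = np.linspace(0, 1, 200)
trace = -2.0 * t + np.random.default_rng(1).normal(0, 0.05, t.size)
trace[120:] -= 3.0
print("first flagged sample:", detect_fault(trace, t[1] - t[0], (-4.0, 0.5), (-5.0, 5.0))[:1])
```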
Dai, Heng; Ye, Ming; Walker, Anthony P.; ...
2017-03-28
A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Heng; Ye, Ming; Walker, Anthony P.
A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
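To make the idea of a process sensitivity index concrete, the toy Monte Carlo sketch below lumps model choice and parameters for each process into a single sampled factor and estimates a first-order, variance-based index by binning. The two recharge models, two conductivity models, the toy output, and the binned estimator are illustrative simplifications, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 200_000

# Shared forcing and two competing model choices per process (all values illustrative)
precip = rng.normal(300.0, 30.0, N)                      # mm/yr
m_rech = rng.integers(0, 2, N)                           # recharge model choice
m_geo = rng.integers(0, 2, N)                            # geology model choice
recharge = np.where(m_rech == 0,
                    rng.uniform(0.1, 0.3, N) * precip,                      # R1: fraction of precipitation
                    np.maximum(precip - rng.uniform(180, 260, N), 0.0))     # R2: threshold excess
conductivity = np.where(m_geo == 0,
                        rng.lognormal(0.0, 0.3, N),      # K1
                        rng.lognormal(0.5, 0.6, N))      # K2
y = recharge / conductivity                              # toy output standing in for the prediction

def process_index(factor, y, bins=50):
    """Var(E[y | process]) / Var(y), with model choice and parameters lumped into 'factor'."""
    edges = np.quantile(factor, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, factor, side="right") - 1, 0, bins - 1)
    cond_mean = np.array([y[idx == b].mean() for b in range(bins)])
    weight = np.bincount(idx, minlength=bins) / len(y)
    return float(np.sum(weight * (cond_mean - y.mean()) ** 2) / y.var())

print("recharge process index:", round(process_index(recharge, y), 3))
print("geology process index: ", round(process_index(conductivity, y), 3))
```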
NASA Astrophysics Data System (ADS)
Korshunov, G. I.; Petrushevskaya, A. A.; Lipatnikov, V. A.; Smirnova, M. S.
2018-03-01
A strategy for assuring the quality of electronics is presented as most important. To provide quality, the sequence of processes is considered and modeled as a Markov chain. The improvement is supported by simple database means of design for manufacturing, allowing future step-by-step development. Phased automation of electronics design and digital manufacturing is assumed. MATLAB modelling results showed an increase in effectiveness; new tools and software should be more effective. A primary digital model is proposed to represent the product across the process sequence, from several processes up to the whole life cycle.
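Since the abstract states that the process sequence is modeled as a Markov chain, the sketch below shows one common calculation on such a chain: the expected number of visits to each stage before completion, obtained from the fundamental matrix. The stages and transition probabilities are invented for illustration, not taken from the paper.

```python
import numpy as np

# Illustrative Markov chain over four electronics process stages plus "rework" and "done".
# Transition probabilities are made up; each row sums to 1.
states = ["design", "layout", "fabrication", "test", "rework", "done"]
P = np.array([
    [0.0, 0.9, 0.00, 0.0, 0.10, 0.0],   # design
    [0.0, 0.0, 0.85, 0.0, 0.15, 0.0],   # layout
    [0.0, 0.0, 0.00, 0.9, 0.10, 0.0],   # fabrication
    [0.0, 0.0, 0.00, 0.0, 0.20, 0.8],   # test
    [0.0, 0.3, 0.30, 0.4, 0.00, 0.0],   # rework returns to an earlier stage
    [0.0, 0.0, 0.00, 0.0, 0.00, 1.0],   # done (absorbing)
])

# Expected visits to each transient stage before absorption: N = (I - Q)^-1
Q = P[:-1, :-1]
N = np.linalg.inv(np.eye(Q.shape[0]) - Q)
start = np.zeros(Q.shape[0])
start[0] = 1.0                                        # every product starts at "design"
print(dict(zip(states[:-1], np.round(start @ N, 2))))  # mean visits per stage
```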
Biagianti, Bruno; Fisher, Melissa; Neilands, Torsten B.; Loewy, Rachel; Vinogradov, Sophia
2016-01-01
BACKGROUND Individuals with schizophrenia who engage in targeted cognitive training (TCT) of the auditory system show generalized cognitive improvements. The high degree of variability in cognitive gains may be due to individual differences in the level of engagement of the underlying neural system target. METHODS 131 individuals with schizophrenia underwent 40 hours of TCT. We identified target engagement of auditory system processing efficiency by modeling subject-specific trajectories of auditory processing speed (APS) over time. Lowess analysis, mixed models repeated measures analysis, and latent growth curve modeling were used to examine whether APS trajectories were moderated by age and illness duration, and mediated improvements in cognitive outcome measures. RESULTS We observed significant improvements in APS from baseline to 20 hours of training (initial change), followed by a flat APS trajectory (plateau) at subsequent time-points. Participants showed inter-individual variability in the steepness of the initial APS change and in the APS plateau achieved and sustained between 20–40 hours. We found that participants who achieved the fastest APS plateau showed the greatest transfer effects to untrained cognitive domains. CONCLUSIONS There is a significant association between an individual's ability to generate and sustain auditory processing efficiency and their degree of cognitive improvement after TCT, independent of baseline neurocognition. APS plateau may therefore represent a behavioral measure of target engagement mediating treatment response. Future studies should examine the optimal plateau of auditory processing efficiency required to induce significant cognitive improvements, in the context of inter-individual differences in neural plasticity and sensory system efficiency that characterize schizophrenia. PMID:27617637
Ward, Marie; McDonald, Nick; Morrison, Rabea; Gaynor, Des; Nugent, Tony
2010-02-01
Aircraft maintenance is a highly regulated, safety critical, complex and competitive industry. There is a need to develop innovative solutions to address process efficiency without compromising safety and quality. This paper presents the case that in order to improve a highly complex system such as aircraft maintenance, it is necessary to develop a comprehensive and ecologically valid model of the operational system, which represents not just what is meant to happen, but what normally happens. This model then provides the backdrop against which to change or improve the system. A performance report, the Blocker Report, specific to aircraft maintenance and related to the model was developed gathering data on anything that 'blocks' task or check performance. A Blocker Resolution Process was designed to resolve blockers and improve the current check system. Significant results were obtained for the company in the first trial and implications for safety management systems and hazard identification are discussed. Statement of Relevance: Aircraft maintenance is a safety critical, complex, competitive industry with a need to develop innovative solutions to address process and safety efficiency. This research addresses this through the development of a comprehensive and ecologically valid model of the system linked with a performance reporting and resolution system.
Towards a model-based cognitive neuroscience of stopping - a neuroimaging perspective.
Sebastian, Alexandra; Forstmann, Birte U; Matzke, Dora
2018-07-01
Our understanding of the neural correlates of response inhibition has greatly advanced over the last decade. Nevertheless, the specific function of regions within this stopping network remains controversial. The traditional neuroimaging approach cannot capture many processes affecting stopping performance. Despite the shortcomings of the traditional neuroimaging approach and great progress in mathematical and computational models of stopping, model-based cognitive neuroscience approaches in human neuroimaging studies are largely lacking. To foster model-based approaches to ultimately gain a deeper understanding of the neural signature of stopping, we outline the most prominent models of response inhibition and recent advances in the field. We highlight how a model-based approach in clinical samples has improved our understanding of altered cognitive functions in these disorders. Moreover, we show how linking evidence-accumulation models and neuroimaging data improves the identification of neural pathways involved in the stopping process and helps to delineate these from neural networks of related but distinct functions. In conclusion, adopting a model-based approach is indispensable to identifying the actual neural processes underlying stopping. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
The Theoretical Basis of the Effective School Improvement Model (ESI)
ERIC Educational Resources Information Center
Scheerens, Jaap; Demeuse, Marc
2005-01-01
This article describes the process of theoretical reflection that preceded the development and empirical verification of a model of "effective school improvement". The focus is on basic mechanisms that could be seen as underlying "getting things in motion" and change in education systems. Four mechanisms are distinguished:…
Guiding and Modelling Quality Improvement in Higher Education Institutions
ERIC Educational Resources Information Center
Little, Daniel
2015-01-01
The article considers the process of creating quality improvement in higher education institutions from the point of view of current organisational theory and social-science modelling techniques. The author considers the higher education institution as a functioning complex of rules, norms and other organisational features and reviews the social…
Process Engineering with the Evolutionary Spiral Process Model. Version 01.00.06
1994-01-01
program. Process Definition and Modeling Guidebook (SPC-92041-CMC): provides methods for defining and documenting processes so they can be analyzed, modified... and Program Evaluation and Review Technique (PERT) support the activity of developing a project schedule. A variety of automated tools, such as... keep the organization from becoming disoriented during the improvement program (Curtis, Kellner, and Over 1992). Analyzing and documenting how
Mathematics: Program Assessment and Improvement Planning Manual.
ERIC Educational Resources Information Center
Whitman, Nancy C.; And Others
This document provides a model for assessing a school's mathematics program and planning for program improvement. A systematic process for instructional improvement focuses upon students' needs and the identification of successful instructional strategies to meet these needs. The improvement plan and the implementation of intervention strategies…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ho, P.; Johannes, J.; Kudriavtsev, V.
The use of computational modeling to improve equipment and process designs for chemical vapor deposition (CVD) reactors is becoming increasingly common. Commercial codes are available that facilitate the modeling of chemically-reacting flows, but chemical reaction mechanisms must be separately developed for each system of interest. One of the products of the Watkins-Johnson Company (WJ) is a reactor marketed to semiconductor manufacturers for the atmospheric-pressure chemical vapor deposition (APCVD) of silicon oxide films. In this process, TEOS (tetraethoxysilane, Si(OC2H5)4) and ozone (O3) are injected (in nitrogen and oxygen carrier gases) over hot silicon wafers that are being carried through the system on a moving belt. As part of their equipment improvement process, WJ is developing computational models of this tool. In this effort, they are collaborating with Sandia National Laboratories (SNL) to draw on Sandia's experience base in understanding and modeling the chemistry of CVD processes.
LES ARM Symbiotic Simulation and Observation (LASSO) Implementation Strategy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gustafson Jr., WI; Vogelmann, AM
2015-09-01
This document illustrates the design of the Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) workflow to provide a routine, high-resolution modeling capability to augment the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility's high-density observations. LASSO will create a powerful new capability for furthering ARM's mission to advance understanding of cloud, radiation, aerosol, and land-surface processes. The combined observational and modeling elements will enable a new level of scientific inquiry by connecting processes and context to observations and providing needed statistics for details that cannot be measured. The result will be improved process understanding that facilitates concomitant improvements in climate model parameterizations. The initial LASSO implementation will be for ARM's Southern Great Plains site in Oklahoma and will focus on shallow convection, which is poorly simulated by climate models due in part to clouds' typically small spatial scale compared to model grid spacing, and because the convection involves complicated interactions of microphysical and boundary layer processes.
Bachis, Giulia; Maruéjouls, Thibaud; Tik, Sovanna; Amerlinck, Youri; Melcer, Henryk; Nopens, Ingmar; Lessard, Paul; Vanrolleghem, Peter A
2015-01-01
Characterization and modelling of primary settlers have been largely neglected to date. However, whole plant and resource recovery modelling requires primary settler model development, as current models lack detail in describing the dynamics and the diversity of the removal process for different particulate fractions. This paper focuses on the improved modelling and experimental characterization of primary settlers. First, a new modelling concept based on the particle settling velocity distribution is proposed, which is then applied to the development of an improved primary settler model as well as to its characterization under addition of chemicals (chemically enhanced primary treatment, CEPT). This model is compared to two existing simple primary settler models (Otterpohl and Freund; Lessard and Beck), proving better than the first and statistically comparable to the second, but with easier calibration thanks to the ease with which wastewater characteristics can be translated into model parameters. Second, the changes in the activated sludge model (ASM)-based chemical oxygen demand fractionation between inlet and outlet induced by primary settling are investigated, showing that typical wastewater fractions are modified by primary treatment. As they clearly impact the downstream processes, both model improvements demonstrate the need for more detailed primary settler models in view of whole plant modelling.
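A minimal sketch of the settling velocity distribution concept is given below: for an ideal settler, each velocity class is removed in proportion to min(1, v_s/(Q/A)). The class fractions, velocities, and overflow rate are illustrative; the paper's dynamic model is considerably more detailed.

```python
import numpy as np

def primary_settler_removal(fractions, settling_velocities, overflow_rate):
    """TSS removal of an ideal settler from a discretised particle settling velocity distribution.

    fractions            mass fraction of each particle class (sums to 1)
    settling_velocities  settling velocity of each class (m/h)
    overflow_rate        surface overflow rate Q/A (m/h)
    """
    f = np.asarray(fractions, dtype=float)
    v = np.asarray(settling_velocities, dtype=float)
    return float(np.sum(f * np.minimum(1.0, v / overflow_rate)))

# Illustrative three-class distribution: slow, medium and fast-settling solids
print(primary_settler_removal([0.3, 0.4, 0.3], [0.2, 1.5, 6.0], 2.0))
```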
Hospital cost structure in the USA: what's behind the costs? A business case.
Chandra, Charu; Kumar, Sameer; Ghildayal, Neha S
2011-01-01
Hospital costs in the USA are a large part of the national GDP. Medical billing and supplies processes are significant and growing contributors to hospital operations costs in the USA. This article aims to identify cost drivers associated with these processes and to suggest improvements to reduce hospital costs. A Monte Carlo simulation model that uses @Risk software facilitates cost analysis and captures variability associated with the medical billing process (administrative) and medical supplies process (variable). The model produces estimated savings for implementing new processes. Significant waste exists across the entire medical supply process that needs to be eliminated. Annual savings, by implementing the improved process, have the potential to save several billion dollars annually in US hospitals. The other analysis in this study is related to hospital billing processes. Increased spending on hospital billing processes is not entirely due to hospital inefficiency. The study lacks concrete data for accurately measuring cost savings, but there is obviously room for improvement in the two US healthcare processes. This article only looks at two specific costs associated with medical supply and medical billing processes, respectively. This study facilitates awareness of escalating US hospital expenditures. Cost categories, namely, fixed, variable and administrative, are presented to identify the greatest areas for improvement. The study will be valuable to US Congress policy makers and US healthcare industry decision makers. Medical billing process, part of a hospital's administrative costs, and hospital supplies management processes are part of variable costs. These are the two major cost drivers of US hospitals' expenditures that were examined and analyzed.
Dual Processes in Decision Making and Developmental Neuroscience: A Fuzzy-Trace Model
ERIC Educational Resources Information Center
Reyna, Valerie F.; Brainerd, Charles J.
2011-01-01
From Piaget to the present, traditional and dual-process theories have predicted improvement in reasoning from childhood to adulthood, and improvement has been observed. However, developmental reversals--that reasoning biases emerge with development--have also been observed in a growing list of paradigms. We explain how fuzzy-trace theory predicts…
ERIC Educational Resources Information Center
Takanishi, Stacey M.
2012-01-01
NCLB policies in the United States focus schools' efforts on implementing effective instructional processes to improve student outcomes. This study looks more specifically at how schools are perceived to be implementing state required curricula and benchmarks and developing teaching and learning processes that support the teaching of state…
NASA Astrophysics Data System (ADS)
Dantec-Nédélec, S.; Ottlé, C.; Wang, T.; Guglielmo, F.; Maignan, F.; Delbart, N.; Valdayskikh, V.; Radchenko, T.; Nekrasova, O.; Zakharov, V.; Jouzel, J.
2017-06-01
The ORCHIDEE land surface model has recently been updated to improve the representation of high-latitude environments. The model now includes improved soil thermodynamics and the representation of permafrost physical processes (soil thawing and freezing), as well as a new snow model to improve the representation of the seasonal evolution of the snow pack and the resulting insulation effects. The model was evaluated against data from the experimental sites of the WSibIso-Megagrant project (www.wsibiso.ru). ORCHIDEE was applied in stand-alone mode, on two experimental sites located in the Yamal Peninsula in the northwestern part of Siberia. These sites are representative of circumpolar-Arctic tundra environments and differ by their respective fractions of shrub/tree cover and soil type. After performing a global sensitivity analysis to identify those parameters that have most influence on the simulation of energy and water transfers, the model was calibrated at local scale and evaluated against in situ measurements (vertical profiles of soil temperature and moisture, as well as active layer thickness) acquired during summer 2012. The results show how sensitivity analysis can identify the dominant processes and thereby reduce the parameter space for the calibration process. We also discuss the model performance at simulating the soil temperature and water content (i.e., energy and water transfers in the soil-vegetation-atmosphere continuum) and the contribution of the vertical discretization of the hydrothermal properties. This work clearly shows, at least at the two sites used for validation, that the new ORCHIDEE vertical discretization can represent the water and heat transfers through complex cryogenic Arctic soils—soils which present multiple horizons sometimes with peat inclusions. The improved model allows us to prescribe the vertical heterogeneity of the soil hydrothermal properties.
Improvement of Storm Forecasts Using Gridded Bayesian Linear Regression for Northeast United States
NASA Astrophysics Data System (ADS)
Yang, J.; Astitha, M.; Schwartz, C. S.
2017-12-01
Bayesian linear regression (BLR) is a post-processing technique in which regression coefficients are derived and used to correct raw forecasts based on pairs of observation-model values. This study presents the development and application of a gridded Bayesian linear regression (GBLR) as a new post-processing technique to improve numerical weather prediction (NWP) of rain and wind storm forecasts over northeast United States. Ten controlled variables produced from ten ensemble members of the National Center for Atmospheric Research (NCAR) real-time prediction system are used for a GBLR model. In the GBLR framework, leave-one-storm-out cross-validation is utilized to study the performances of the post-processing technique in a database composed of 92 storms. To estimate the regression coefficients of the GBLR, optimization procedures that minimize the systematic and random error of predicted atmospheric variables (wind speed, precipitation, etc.) are implemented for the modeled-observed pairs of training storms. The regression coefficients calculated for meteorological stations of the National Weather Service are interpolated back to the model domain. An analysis of forecast improvements based on error reductions during the storms will demonstrate the value of GBLR approach. This presentation will also illustrate how the variances are optimized for the training partition in GBLR and discuss the verification strategy for grid points where no observations are available. The new post-processing technique is successful in improving wind speed and precipitation storm forecasts using past event-based data and has the potential to be implemented in real-time.
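A minimal sketch of the station-level BLR correction step, assuming a conjugate Gaussian prior and synthetic observation-forecast pairs, is shown below; the predictors, priors, and the leave-one-storm-out and gridding machinery of the actual GBLR are not reproduced.

```python
import numpy as np

def blr_fit(X, y, prior_var=10.0, noise_var=1.0):
    """Conjugate Bayesian linear regression: posterior mean/covariance of the coefficients."""
    X = np.column_stack([np.ones(len(X)), X])          # intercept + raw forecast predictors
    prior_prec = np.eye(X.shape[1]) / prior_var
    post_cov = np.linalg.inv(prior_prec + X.T @ X / noise_var)
    post_mean = post_cov @ (X.T @ y / noise_var)
    return post_mean, post_cov

def blr_correct(x_new, post_mean):
    """Apply posterior-mean coefficients to a new raw forecast."""
    return np.column_stack([np.ones(len(x_new)), x_new]) @ post_mean

# Illustrative training pairs: raw ensemble-mean wind speed vs. observed wind speed (m/s)
rng = np.random.default_rng(3)
raw = rng.uniform(2, 25, 300)
obs = 0.85 * raw + 1.2 + rng.normal(0, 1.5, 300)       # synthetic station observations
coeffs, _ = blr_fit(raw.reshape(-1, 1), obs)
print("corrected 20 m/s forecast:", blr_correct(np.array([[20.0]]), coeffs))
```

In the gridded version, coefficients of this kind would be estimated at observation sites and interpolated back to the model grid before being applied.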
Model-data integration to improve the LPJmL dynamic global vegetation model
NASA Astrophysics Data System (ADS)
Forkel, Matthias; Thonicke, Kirsten; Schaphoff, Sibyll; Thurner, Martin; von Bloh, Werner; Dorigo, Wouter; Carvalhais, Nuno
2017-04-01
Dynamic global vegetation models show large uncertainties regarding the development of the land carbon balance under future climate change conditions. This uncertainty is partly caused by differences in how vegetation carbon turnover is represented in global vegetation models. Model-data integration approaches might help to systematically assess and improve model performances and thus to potentially reduce the uncertainty in terrestrial vegetation responses under future climate change. Here we present several applications of model-data integration with the LPJmL (Lund-Potsdam-Jena managed Lands) dynamic global vegetation model to systematically improve the representation of processes or to estimate model parameters. In a first application, we used global satellite-derived datasets of FAPAR (fraction of absorbed photosynthetic activity), albedo and gross primary production to estimate phenology- and productivity-related model parameters using a genetic optimization algorithm. Thereby we identified major limitations of the phenology module and implemented an alternative empirical phenology model. The new phenology module and optimized model parameters resulted in a better performance of LPJmL in representing global spatial patterns of biomass, tree cover, and the temporal dynamic of atmospheric CO2. Therefore, we used in a second application additionally global datasets of biomass and land cover to estimate model parameters that control vegetation establishment and mortality. The results demonstrate the ability to improve simulations of vegetation dynamics but also highlight the need to improve the representation of mortality processes in dynamic global vegetation models. In a third application, we used multiple site-level observations of ecosystem carbon and water exchange, biomass and soil organic carbon to jointly estimate various model parameters that control ecosystem dynamics. This exercise demonstrates the strong role of individual data streams on the simulated ecosystem dynamics which consequently changed the development of ecosystem carbon stocks and fluxes under future climate and CO2 change. In summary, our results demonstrate challenges and the potential of using model-data integration approaches to improve a dynamic global vegetation model.
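As an illustration of the parameter-estimation step, the sketch below calibrates a toy phenology curve against a synthetic FAPAR series using SciPy's differential evolution as a stand-in for the genetic optimization described; the toy model, parameter bounds, and data are assumptions, not LPJmL or the study's datasets.

```python
import numpy as np
from scipy.optimize import differential_evolution

days = np.arange(365)

def toy_phenology(params, t=days):
    """Toy growing-season FAPAR curve; stands in for a vegetation-model simulation."""
    onset, length, fmax = params
    season = np.clip((t - onset) / 30.0, 0, 1) * np.clip((onset + length - t) / 30.0, 0, 1)
    return fmax * season

# Synthetic "satellite FAPAR" to calibrate against (illustrative, not a real dataset)
rng = np.random.default_rng(7)
obs = toy_phenology([120, 150, 0.85]) + rng.normal(0, 0.03, days.size)

def cost(params):
    return np.mean((toy_phenology(params) - obs) ** 2)   # mean squared misfit

result = differential_evolution(cost, bounds=[(60, 180), (60, 240), (0.3, 1.0)], seed=1)
print("estimated onset/length/fmax:", np.round(result.x, 2))
```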
Comparison of simulation modeling and satellite techniques for monitoring ecological processes
NASA Technical Reports Server (NTRS)
Box, Elgene O.
1988-01-01
In 1985 improvements were made in the world climatic data base for modeling and predictive mapping; in individual process models and the overall carbon-balance models; and in the interface software for mapping the simulation results. Statistical analysis of the data base was begun. In 1986 mapping was shifted to NASA-Goddard. The initial approach involving pattern comparisons was modified to a more statistical approach. A major accomplishment was the expansion and improvement of a global data base of measurements of biomass and primary production, to complement the simulation data. The main accomplishments during 1987 included: production of a master tape with all environmental and satellite data and model results for the 1600 sites; development of a complete mapping system used for the initial color maps comparing annual and monthly patterns of Normalized Difference Vegetation Index (NDVI), actual evapotranspiration, net primary productivity, gross primary productivity, and net ecosystem production; collection of more biosphere measurements for eventual improvement of the biological models; and development of some initial monthly models for primary productivity, based on satellite data.
Development of a Rolling Process Design Tool for Use in Improving Hot Roll Slab Recovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Couch, R; Becker, R; Rhee, M
2004-09-24
Lawrence Livermore National Laboratory participated in a U.S. Department of Energy/Office of Industrial Technology sponsored research project, 'Development of a Rolling Process Design Tool for Use in Improving Hot Roll Slab Recovery', as Cooperative Agreement TC-02028 with the Alcoa Technical Center (ATC). The objective of the joint project with Alcoa is to develop a numerical modeling capability to optimize the hot rolling process used to produce aluminum plate. Product lost in the rolling process and subsequent recycling wastes resources consumed in the energy-intensive steps of remelting and reprocessing the ingot. The modeling capability developed by the project partners will be used to produce plate more efficiently and with reduced product loss.
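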
[Lean thinking and brain-dead patient assistance in the organ donation process].
Pestana, Aline Lima; dos Santos, José Luís Guedes; Erdmann, Rolf Hermann; da Silva, Elza Lima; Erdmann, Alacoque Lorenzini
2013-02-01
Organ donation is a complex process that challenges health system professionals and managers. This study aimed to introduce a theoretical model to organize brain-dead patient assistance and the organ donation process guided by the main lean thinking ideas, which enable production improvement through planning cycles and the development of a proper environment for successful implementation. Lean thinking may make the process of organ donation more effective and efficient and may contribute to improvements in information systematization and professional qualifications for excellence of assistance. The model is configured as a reference that is available for validation and implementation by health and nursing professionals and managers in the management of potential organ donors after brain death assistance and subsequent transplantation demands.
Improving Metal Casting Process
NASA Technical Reports Server (NTRS)
1998-01-01
Don Sirois, an Auburn University research associate, and Bruce Strom, a mechanical engineering Co-Op Student, are evaluating the dimensional characteristics of an aluminum automobile engine casting. More accurate metal casting processes may reduce the weight of some cast metal products used in automobiles, such as engines. Research in low gravity has taken an important first step toward making metal products used in homes, automobiles, and aircraft less expensive, safer, and more durable. Auburn University and industry are partnering with NASA to develop one of the first accurate computer model predictions of molten metals and molding materials used in a manufacturing process called casting. Ford Motor Company's casting plant in Cleveland, Ohio is using NASA-sponsored computer modeling information to improve the casting process of automobile and light-truck engine blocks.
Yang, Lei; Lu, Jun; Dai, Ming; Ren, Li-Jie; Liu, Wei-Zong; Li, Zhen-Zhou; Gong, Xue-Hao
2016-10-06
An ultrasonic image speckle noise removal method based on a total least squares model is proposed and applied to images of cardiovascular structures such as the carotid artery. On the basis of the least squares principle, a total least squares model is established for cardiac ultrasound speckle noise removal, an orthogonal projection transformation is applied to the model output, and denoising of the cardiac ultrasound speckle noise is realized. Experimental results show that the improved algorithm can greatly improve the resolution of the image and meet the needs of clinical diagnosis and treatment of the cardiovascular system of the head and neck. Furthermore, the success in imaging of carotid arteries has strong implications for neurological complications such as stroke.
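The core building block of such a method is the SVD-based total least squares solve; the sketch below shows that solve together with a toy use in which each pixel of a noisy row is predicted from its horizontal neighbours. This is an illustration only, not the paper's speckle-removal model.

```python
import numpy as np

def total_least_squares(A, b):
    """Classic TLS solution of A x ~ b via SVD (smallest right singular vector of [A b])."""
    C = np.column_stack([A, b])
    _, _, Vt = np.linalg.svd(C, full_matrices=False)
    v = Vt[-1]
    return -v[:-1] / v[-1]

# Toy use: predict each pixel from its left/right neighbours along one noisy image row,
# giving a smoothed (denoised) estimate of that row.
rng = np.random.default_rng(0)
row = np.sin(np.linspace(0, 3 * np.pi, 128)) + rng.normal(0, 0.2, 128)   # noisy "ultrasound" row
A = np.column_stack([row[:-2], row[2:]])    # left and right neighbours
b = row[1:-1]
x = total_least_squares(A, b)
denoised = A @ x
print("TLS neighbour weights:", np.round(x, 3))
print("residual std:", round(float(np.std(b - denoised)), 3))
```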
1990-12-01
studies for the continuing education of managers new to the TQM approach, for informing vendors of their responsibilities under a changed process, and... Department of Defense (DoD) is adopting a management approach known as Total Quality Management (TQM) in an effort to improve quality and productivity... individuals selected be highly knowledgeable about the operations in their shop or unit. The main function of PATs is to collect and summarize process data for
Moore, Lynne; Lavoie, André; Bourgeois, Gilles; Lapointe, Jean
2015-06-01
According to Donabedian's health care quality model, improvements in the structure of care should lead to improvements in clinical processes that should in turn improve patient outcome. This model has been widely adopted by the trauma community but has not yet been validated in a trauma system. The objective of this study was to assess the performance of an integrated trauma system in terms of structure, process, and outcome and evaluate the correlation between quality domains. Quality of care was evaluated for patients treated in a Canadian provincial trauma system (2005-2010; 57 centers, n = 63,971) using quality indicators (QIs) developed and validated previously. Structural performance was measured by transposing on-site accreditation visit reports onto an evaluation grid according to American College of Surgeons criteria. The composite process QI was calculated as the average sum of proportions of conformity to 15 process QIs derived from literature review and expert opinion. Outcome performance was measured using risk-adjusted rates of mortality, complications, and readmission as well as hospital length of stay (LOS). Correlation was assessed with Pearson's correlation coefficients. Statistically significant correlations were observed between structure and process QIs (r = 0.33), and process and outcome QIs (r = -0.33 for readmission, r = -0.27 for LOS). Significant positive correlations were also observed between outcome QIs (r = 0.37 for mortality-readmission; r = 0.39 for mortality-LOS and readmission-LOS; r = 0.45 for mortality-complications; r = 0.34 for readmission-complications; 0.63 for complications-LOS). Significant correlations between quality domains observed in this study suggest that Donabedian's structure-process-outcome model is a valid model for evaluating trauma care. Trauma centers that perform well in terms of structure also tend to perform well in terms of clinical processes, which in turn has a favorable influence on patient outcomes. Prognostic study, level III.
Improving Energy Efficiency for the Vehicle Assembly Industry: A Discrete Event Simulation Approach
NASA Astrophysics Data System (ADS)
Oumer, Abduaziz; Mekbib Atnaw, Samson; Kie Cheng, Jack; Singh, Lakveer
2016-11-01
This paper presents a Discrete Event Simulation (DES) model for investigating and improving energy efficiency in a vehicle assembly line. The car manufacturing industry is one of the highest energy consuming industries. Using the Rockwell Arena DES package, a detailed model was constructed for an actual vehicle assembly plant. The sources of energy considered in this research are electricity and fuel, which are the two main types of energy sources used in a typical vehicle assembly plant. The model depicts the performance measurement for process-specific energy measures of the painting, welding, and assembling processes. A sound energy efficiency model within this industry has a two-fold advantage: reducing CO2 emissions and lowering the costs associated with fuel and electricity consumption. The paper starts with an overview of challenges in energy consumption within the facilities of the automotive assembly line and highlights the parameters for energy efficiency. The results of the simulation model indicated improvements for energy saving objectives and reduced costs.
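A minimal discrete event simulation in the same spirit can be sketched with the SimPy library (standing in for the Rockwell Arena package used in the study); the stations, cycle times, and per-process energy figures below are invented for illustration.

```python
import random
import simpy

# Per-process electricity draw (kWh per vehicle) and mean cycle time (min) -- illustrative figures
STATIONS = {"welding": (0.9, 12), "painting": (1.6, 20), "assembly": (0.7, 25)}
energy_used = {name: 0.0 for name in STATIONS}

def car(env, stations):
    """One vehicle body moving through the three processes in sequence."""
    for stage, (kwh, minutes) in STATIONS.items():
        with stations[stage].request() as req:
            yield req
            yield env.timeout(random.expovariate(1.0 / minutes))  # stochastic cycle time
            energy_used[stage] += kwh

def source(env, stations, mean_interarrival=8.0):
    """Release a new body onto the line at random intervals."""
    while True:
        yield env.timeout(random.expovariate(1.0 / mean_interarrival))
        env.process(car(env, stations))

random.seed(1)
env = simpy.Environment()
stations = {name: simpy.Resource(env, capacity=2) for name in STATIONS}
env.process(source(env, stations))
env.run(until=8 * 60)   # one 8-hour shift, in minutes
print({k: round(v, 1) for k, v in energy_used.items()})   # kWh consumed per process
```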
Temperature modelling and prediction for activated sludge systems.
Lippi, S; Rosso, D; Lubello, C; Canziani, R; Stenstrom, M K
2009-01-01
Temperature is an important factor affecting biomass activity, which is critical to maintain efficient biological wastewater treatment, as well as physicochemical properties of mixed liquor such as dissolved oxygen saturation and settling velocity. Controlling temperature is not normally possible for treatment systems, but incorporating factors impacting temperature in the design process, such as the aeration system, surface to volume ratio, and tank geometry, can reduce the range of temperature extremes and improve the overall process performance. Determining how much these design or upgrade options affect the tank temperature requires a temperature model that can be used with existing design methodologies. This paper presents a new steady state temperature model developed by incorporating the best aspects of previously published models, introducing new functions for selected heat exchange paths and improving the method for predicting the effects of covering aeration tanks. Numerical improvements with embedded reference data provide simpler formulation, faster execution, and easier sensitivity analyses using an ordinary spreadsheet. The paper presents several cases to validate the model.
SU-C-207-04: Reconstruction Artifact Reduction in X-Ray Cone Beam CT Using a Treatment Couch Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lasio, G; Hu, E; Zhou, J
2015-06-15
Purpose: to mitigate artifacts induced by the presence of the RT treatment couch in on-board CBCT and improve image quality. Methods: a model of a Varian IGRT couch is constructed using a CBCT scan of the couch in air. The model is used to generate a set of forward projections (FP) of the treatment couch at specified gantry angles. The model couch forward projections are then used to process CBCT scan projections which contain the couch in addition to the scan object (Catphan phantom), in order to remove the attenuation component of the couch at any given gantry angle. Prior to pre-processing with the model FP, the Catphan projection data is normalized to an air scan with bowtie filter. The filtered Catphan projections are used to reconstruct the CBCT with an in-house FDK algorithm. The artifact reduction in the processed CBCT scan is assessed visually, and the image quality improvement is measured with the CNR over a few selected ROIs of the Catphan modules. Results: Sufficient match between the forward projected data and the x-ray projections is achieved to allow filtering in attenuation space. Visual improvement of the couch induced artifacts is achieved, with a moderate expense of CNR. Conclusion: Couch model-based correction of CBCT projection data has a potential for qualitative improvement of clinical CBCT scans, without requiring position specific correction data. The technique could be used to produce models of other artifact inducing devices, such as immobilization boards, and reduce their impact on patient CBCT images.
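A minimal sketch of the projection-space correction, assuming the couch model has already been forward projected into attenuation (line-integral) space, is shown below; the detector values are synthetic and the sketch ignores beam hardening, scatter, and geometry details.

```python
import numpy as np

def to_attenuation(intensity, air_norm):
    """Convert detector intensities to line integrals, normalised to the (bowtie) air scan."""
    return -np.log(np.clip(intensity / air_norm, 1e-6, None))

def remove_couch(scan_proj, air_scan, couch_forward_proj):
    """Subtract the modelled couch line integrals from a measured projection."""
    p = to_attenuation(scan_proj, air_scan)
    return p - couch_forward_proj          # couch model already in attenuation space

# Illustrative numbers for a single detector row at one gantry angle
air = np.full(8, 1.0e5)
object_li = np.array([0.0, 0.1, 0.4, 0.9, 0.9, 0.4, 0.1, 0.0])   # phantom line integrals
measured = air * np.exp(-(object_li + 0.2))                       # phantom plus couch
couch_fp = np.full(8, 0.2)                                         # forward projection of the couch model
print(np.round(remove_couch(measured, air, couch_fp), 2))         # recovers the phantom-only values
```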
Brink, Adrian J; Messina, Angeliki P; Feldman, Charles; Richards, Guy A; van den Bergh, Dena
2017-04-01
Few data exist on the implementation of process measures to facilitate adherence to peri-operative antibiotic prophylaxis (PAP) guidelines in Africa. To implement an improvement model for PAP utilizing existing resources, in order to achieve a reduction in surgical site infections (SSIs) across a heterogeneous group of 34 urban and rural South African hospitals. A pharmacist-driven, prospective audit and feedback strategy involving change management and improvement principles was utilized. This 2.5 year intervention involved a pre-implementation phase to test a PAP guideline and a 'toolkit' at pilot sites. Following antimicrobial stewardship committee and clinician endorsement, the model was introduced in all institutions and a survey of baseline SSI and compliance rates with four process measures (antibiotic choice, dose, administration time and duration) was performed. The post-implementation phase involved audit, intervention and monthly feedback to facilitate improvements in compliance. For 70 weeks of standardized measurements and feedback, 24 206 surgical cases were reviewed. There was a significant improvement in compliance with all process measures (composite compliance) from 66.8% (95% CI 64.8-68.7) to 83.3% (95% CI 80.8-85.8), representing a 24.7% increase ( P < 0.0001). The SSI rate decreased by 19.7% from a mean group rate of 2.46 (95% CI 2.18-2.73) pre-intervention to 1.97 post-intervention (95% CI 1.79-2.15) ( P = 0.0029). The implementation of process improvement initiatives and principles targeted to institutional needs utilizing pharmacists can effectively improve PAP guideline compliance and sustainable patient outcomes. © The Author 2016. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Dandoy, Christopher E; Hariharan, Selena; Weiss, Brian; Demmel, Kathy; Timm, Nathan; Chiarenzelli, Janis; Dewald, Mary Katherine; Kennebeck, Stephanie; Langworthy, Shawna; Pomales, Jennifer; Rineair, Sylvia; Sandfoss, Erin; Volz-Noe, Pamela; Nagarajan, Rajaram; Alessandrini, Evaline
2016-02-01
Timely delivery of antibiotics to febrile immunocompromised (F&I) paediatric patients in the emergency department (ED) and outpatient clinic reduces morbidity and mortality. The aim of this quality improvement initiative was to increase the percentage of F&I patients who received antibiotics within goal in the clinic and ED from 25% to 90%. Using the Model of Improvement, we performed Plan-Do-Study-Act cycles to design, test and implement high-reliability interventions to decrease time to antibiotics. Pre-arrival interventions were tested and implemented, followed by post-arrival interventions in the ED. Many processes were spread successfully to the outpatient clinic. The Chronic Care Model was used, in addition to active family engagement, to inform and improve processes. The study period was from January 2010 to January 2015. Pre-arrival planning improved our F&I time to antibiotics in the ED from 137 to 88 min. This was sustained until October 2012, when further interventions including a pre-arrival huddle decreased the median time to <50 min. Implementation of the various processes to the clinic delivery system increased the mean percentage of patients receiving antibiotics within 60 min to >90%. In September 2014, we implemented a rapid response team to improve reliable venous access in the ED, which increased our mean percentage of patients receiving timely antibiotics to its highest rate (95%). This stepwise approach with pre-arrival planning using the Chronic Care Model, followed by standardisation of processes, created a sustainable improvement of timely antibiotic delivery in F&I patients.
Process-oriented Observational Metrics for CMIP6 Climate Model Assessments
NASA Astrophysics Data System (ADS)
Jiang, J. H.; Su, H.
2016-12-01
Observational metrics based on satellite observations have been developed and effectively applied during post-CMIP5 model evaluation and improvement projects. As new physics and parameterizations continue to be included in models for the upcoming CMIP6, it is important to continue objective comparisons between observations and model results. This talk will summarize the process-oriented observational metrics and methodologies for constraining climate models with A-Train satellite observations in support of CMIP6 model assessments. We target parameters and processes related to atmospheric clouds and water vapor, which are critically important for Earth's radiative budget, climate feedbacks, and water and energy cycles, in order to reduce uncertainties in climate models.
Thermal analysis of void cavity for heat pipe receiver under microgravity
NASA Astrophysics Data System (ADS)
Gui, Xiaohong; Song, Xiange; Nie, Baisheng
2017-04-01
Based on a theoretical analysis of the PCM (Phase Change Material) solidification process, a model of improved void cavity distribution, in which the void tends toward the high-temperature region, is established. Numerical results are compared with NASA (National Aeronautics and Space Administration) results. The analysis shows that the outer wall temperature, the melting ratio of the PCM and the temperature gradient of the PCM canister differ considerably among void cavity distributions. The form of the void distribution has a great effect on the phase change process. Based on simulation results obtained with the improved void cavity distribution model, the phase change heat transfer process in the thermal storage container is analyzed. The main goal of the improved PCM canister design is to reduce the concentration of the void cavity by adding foam metal to the phase change material.
Stuit, Marco; Wortmann, Hans; Szirbik, Nick; Roodenburg, Jan
2011-12-01
In the healthcare domain, human collaboration processes (HCPs), which consist of interactions between healthcare workers from different (para)medical disciplines and departments, are of growing importance as healthcare delivery becomes increasingly integrated. Existing workflow-based process modelling tools for healthcare process management, which are the most commonly applied, are not suited for healthcare HCPs mainly due to their focus on the definition of task sequences instead of the graphical description of human interactions. This paper uses a case study of a healthcare HCP at a Dutch academic hospital to evaluate a novel interaction-centric process modelling method. The HCP under study is the care pathway performed by the head and neck oncology team. The evaluation results show that the method brings innovative, effective, and useful features. First, it collects and formalizes the tacit domain knowledge of the interviewed healthcare workers in individual interaction diagrams. Second, the method automatically integrates these local diagrams into a single global interaction diagram that reflects the consolidated domain knowledge. Third, the case study illustrates how the method utilizes a graphical modelling language for effective tree-based description of interactions, their composition and routing relations, and their roles. A process analysis of the global interaction diagram is shown to identify HCP improvement opportunities. The proposed interaction-centric method has wider applicability since interactions are the core of most multidisciplinary patient-care processes. A discussion argues that, although (multidisciplinary) collaboration is in many cases not optimal in the healthcare domain, it is increasingly considered a necessity to improve integration, continuity, and quality of care. The proposed method is helpful to describe, analyze, and improve the functioning of healthcare collaboration. Copyright © 2011 Elsevier Inc. All rights reserved.
Simulation and sensitivity analysis of carbon storage and fluxes in the New Jersey Pinelands
Zewei Miao; Richard G. Lathrop; Ming Xu; Inga P. La Puma; Kenneth L. Clark; John Hom; Nicholas Skowronski; Steve Van Tuyl
2011-01-01
A major challenge in modeling the carbon dynamics of vegetation communities is the proper parameterization and calibration of eco-physiological variables that are critical determinants of the ecosystem process-based model behavior. In this study, we improved and calibrated a biochemical process-based WxBGC model by using in situ AmeriFlux eddy covariance tower...
Active Interaction Mapping as a tool to elucidate hierarchical functions of biological processes.
Farré, Jean-Claude; Kramer, Michael; Ideker, Trey; Subramani, Suresh
2017-07-03
Increasingly, various 'omics data are contributing significantly to our understanding of novel biological processes, but it has not been possible to iteratively elucidate hierarchical functions in complex phenomena. We describe a general systems biology approach called Active Interaction Mapping (AI-MAP), which elucidates the hierarchy of functions for any biological process. Existing and new 'omics data sets can be iteratively added to create and improve hierarchical models which enhance our understanding of particular biological processes. The best datatypes to further improve an AI-MAP model are predicted computationally. We applied this approach to our understanding of general and selective autophagy, which are conserved in most eukaryotes, setting the stage for the broader application to other cellular processes of interest. In the particular application to autophagy-related processes, we uncovered and validated new autophagy and autophagy-related processes, expanded known autophagy processes with new components, integrated known non-autophagic processes with autophagy and predict other unexplored connections.
Barutta, Joaquin; Guex, Raphael; Ibáñez, Agustín
2010-06-01
From everyday cognition to scientific discovery, analogical processes play an important role: bringing connection, integration, and interrelation of information. Recently, a PFC model of analogy has been proposed to explain many cognitive processes and integrate general functional properties of PFC. We argue here that analogical processes do not suffice to explain the cognitive processes and functions of PFC. Moreover, the model does not satisfactorily integrate specific explanatory mechanisms required for the different processes involved. Its relevance would be improved if fewer cognitive phenomena were considered and more specific predictions and explanations about those processes were stated.
The Distinctive Vocation of Business Education in Catholic Universities
ERIC Educational Resources Information Center
Goodpaster, Kenneth E.; Maines, T. Dean
2012-01-01
Catholic business schools need a process to shape their operations intentionally in light of the Catholic moral tradition. Recent developments in Catholic health care suggest a model they might follow. This model uses a method known as the "Self-Assessment and Improvement Process" (SAIP), which helps leaders deploy moral principles…
USDA-ARS's Scientific Manuscript database
Watershed models typically are evaluated solely through comparison of in-stream water and nutrient fluxes with measured data using established performance criteria, whereas processes and responses within the interior of the watershed that govern these global fluxes often are neglected. Due to the l...
What Do HPT Consultants Do for Performance Analysis?
ERIC Educational Resources Information Center
Kang, Sung
2017-01-01
This study was conducted to contribute to the field of Human Performance Technology (HPT) through the validation of the performance analysis process of the International Society for Performance Improvement (ISPI) HPT model, the most representative and frequently utilized process model in the HPT field. The study was conducted using content…
Analyzing the Impact of a Data Analysis Process to Improve Instruction Using a Collaborative Model
ERIC Educational Resources Information Center
Good, Rebecca B.
2006-01-01
The Data Collaborative Model (DCM) assembles assessment literacy, reflective practices, and professional development into a four-component process. The sub-components include assessing students, reflecting over data, professional dialogue, professional development for the teachers, interventions for students based on data results, and re-assessing…
Hunt, Pete; Barrios, Lisa; Telljohann, Susan K; Mazyck, Donna
2015-11-01
The Whole School, Whole Community, Whole Child (WSCC) model shows the interrelationship between health and learning and the potential for improving educational outcomes by improving health outcomes. However, current descriptions do not explain how to implement the model. The existing literature, including scientific articles, programmatic guidance, and publications by national agencies and organizations, was reviewed and synthesized to describe an overview of interrelatedness of learning and health and the 10 components of the WSCC model. The literature suggests potential benefits of applying the WSCC model at the district and school level. But, the model lacks specific guidance as to how this might be made actionable. A collaborative approach to health and learning is suggested, including a 10-step systematic process to help schools and districts develop an action plan for improving health and education outcomes. Essential preliminary actions are suggested to minimize the impact of the challenges that commonly derail systematic planning processes and program implementation, such as lack of readiness, personnel shortages, insufficient resources, and competing priorities. All new models require testing and evidence to confirm their value. District and schools will need to test this model and put plans into action to show that significant, substantial, and sustainable health and academic outcomes can be achieved. © 2015 The Authors. Journal of School Health published by Wiley Periodicals, Inc. on behalf of American School Health Association.
Intercomparison of the community multiscale air quality model and CALGRID using process analysis.
O'Neill, Susan M; Lamb, Brian K
2005-08-01
This study was designed to examine the similarities and differences between two advanced photochemical air quality modeling systems: EPA Models-3/CMAQ and CALGRID/CALMET. Both modeling systems were applied to an ozone episode that occurred along the I-5 urban corridor in western Washington and Oregon during July 11-14, 1996. Both models employed the same modeling domain and used the same detailed gridded emission inventory. The CMAQ model was run using both the CB-IV and RADM2 chemical mechanisms, while CALGRID was used with the SAPRC-97 chemical mechanism. Output from the Mesoscale Meteorological Model (MM5) employed with observational nudging was used in both models. The two modeling systems, representing three chemical mechanisms and two sets of meteorological inputs, were evaluated in terms of statistical performance measures for both 1- and 8-h average observed ozone concentrations. The results showed that the different versions of the systems were more similar than different, and all versions performed well in the Portland region and downwind of Seattle but performed poorly in the more rural region north of Seattle. Improving the meteorological input into the CALGRID/CALMET system with planetary boundary layer (PBL) parameters from the Models-3/CMAQ meteorology preprocessor (MCIP) improved the performance of the CALGRID/CALMET system. The 8-h ensemble case was often the best performer of all the cases, indicating that the models perform better over longer analysis periods. The 1-h ensemble case, derived from all runs, was not necessarily an improvement over the five individual cases, but the standard deviation about the mean provided a measure of overall modeling uncertainty. Process analysis was applied to examine the contribution of the individual processes to the species conservation equation. The process analysis results indicated that the two modeling systems arrive at similar solutions by very different means. Transport rates are faster and exhibit greater fluctuations in the CMAQ cases than in the CALGRID cases, which lead to different placement of the urban ozone plumes. The CALGRID cases, which rely on the SAPRC-97 chemical mechanism, exhibited a greater diurnal production/loss cycle of ozone concentrations per hour compared to either the RADM2 or CB-IV chemical mechanisms in the CMAQ cases. These results demonstrate the need for specialized process field measurements to confirm whether we are modeling ozone with valid processes.
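The exact statistical performance measures used in the study are not listed in the abstract; the sketch below shows common choices for 1- and 8-h ozone evaluation (mean bias, RMSE, normalized mean bias, correlation) and a running 8-h average, with illustrative names only.

```python
import numpy as np

def performance_stats(obs, mod):
    """Common photochemical-model evaluation statistics (illustrative)."""
    obs, mod = np.asarray(obs, float), np.asarray(mod, float)
    bias = np.mean(mod - obs)                        # mean bias
    rmse = np.sqrt(np.mean((mod - obs) ** 2))        # root-mean-square error
    nmb = 100.0 * np.sum(mod - obs) / np.sum(obs)    # normalized mean bias (%)
    r = np.corrcoef(obs, mod)[0, 1]                  # correlation coefficient
    return {"bias": bias, "rmse": rmse, "nmb_pct": nmb, "r": r}

def rolling_8h_mean(hourly):
    """Running 8-h averages from hourly ozone, as used for 8-h metrics."""
    hourly = np.asarray(hourly, float)
    kernel = np.ones(8) / 8.0
    return np.convolve(hourly, kernel, mode="valid")
```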
Application of the SEIPS Model to Analyze Medication Safety in a Crisis Residential Center.
Steele, Maria L; Talley, Brenda; Frith, Karen H
2018-02-01
Medication safety and error reduction has been studied in acute and long-term care settings, but little research is found in the literature regarding mental health settings. Because mental health settings are complex, medication administration is vulnerable to a variety of errors from transcription to administration. The purpose of this study was to analyze critical factors related to a mental health work system structure and processes that threaten safe medication administration practices. The Systems Engineering Initiative for Patient Safety (SEIPS) model provides a framework to analyze factors affecting medication safety. The model approach analyzes the work system concepts of technology, tasks, persons, environment, and organization to guide the collection of data. In the study, the Lean methodology tools were used to identify vulnerabilities in the system that could be targeted later for improvement activities. The project director completed face-to-face interviews, asked nurses to record disruptions in a log, and administered a questionnaire to nursing staff. The project director also conducted medication chart reviews and recorded medication errors using a standardized taxonomy for errors that allowed categorization of the prevalent types of medication errors. Results of the study revealed disruptions during the medication process, pharmacology training needs, and documentation processes as the primary opportunities for improvement. The project engaged nurses to identify sustainable quality improvement strategies to improve patient safety. The mental health setting carries challenges for safe medication administration practices. Through analysis of the structure, process, and outcomes of medication administration, opportunities for quality improvement and sustainable interventions were identified, including minimizing the number of distractions during medication administration, training nurses on psychotropic medications, and improving the documentation system. A task force was created to analyze the descriptive data and to establish objectives aimed at improving efficiency of the work system and care process involved in medication administration at the end of the project. Copyright © 2017 Elsevier Inc. All rights reserved.
Patterson, Sara E.; Bolivar-Medina, Jenny L.; Falbel, Tanya G.; Hedtcke, Janet L.; Nevarez-McBride, Danielle; Maule, Andrew F.; Zalapa, Juan E.
2016-01-01
As the world population grows and resources and climate conditions change, crop improvement continues to be one of the most important challenges for agriculturalists. The yield and quality of many crops is affected by abscission or shattering, and environmental stresses often hasten or alter the abscission process. Understanding this process can not only lead to genetic improvement, but also changes in cultural practices and management that will contribute to higher yields, improved quality and greater sustainability. As plant scientists, we have learned significant amounts about this process through the study of model plants such as Arabidopsis, tomato, rice, and maize. While these model systems have provided significant valuable information, we are sometimes challenged to use this knowledge effectively as variables including the economic value of the crop, the uniformity of the crop, ploidy levels, flowering and crossing mechanisms, ethylene responses, cultural requirements, responses to changes in environment, and cellular and tissue specific morphological differences can significantly influence outcomes. The value of genomic resources for lesser-studied crops such as cranberries and grapes and the orphan crop fonio will also be considered. PMID:26858730
Patterson, Sara E; Bolivar-Medina, Jenny L; Falbel, Tanya G; Hedtcke, Janet L; Nevarez-McBride, Danielle; Maule, Andrew F; Zalapa, Juan E
2015-01-01
As the world population grows and resources and climate conditions change, crop improvement continues to be one of the most important challenges for agriculturalists. The yield and quality of many crops is affected by abscission or shattering, and environmental stresses often hasten or alter the abscission process. Understanding this process can not only lead to genetic improvement, but also changes in cultural practices and management that will contribute to higher yields, improved quality and greater sustainability. As plant scientists, we have learned significant amounts about this process through the study of model plants such as Arabidopsis, tomato, rice, and maize. While these model systems have provided significant valuable information, we are sometimes challenged to use this knowledge effectively as variables including the economic value of the crop, the uniformity of the crop, ploidy levels, flowering and crossing mechanisms, ethylene responses, cultural requirements, responses to changes in environment, and cellular and tissue specific morphological differences can significantly influence outcomes. The value of genomic resources for lesser-studied crops such as cranberries and grapes and the orphan crop fonio will also be considered.
Using Dispersed Modes During Model Correlation
NASA Technical Reports Server (NTRS)
Stewart, Eric C.; Hathcock, Megan L.
2017-01-01
The model correlation process for the modal characteristics of a launch vehicle is well established. After a test, parameters within the nominal model are adjusted to reflect structural dynamics revealed during testing. However, a full model correlation process for a complex structure can take months of man-hours and many computational resources. If the analyst only has weeks, or even days, of time in which to correlate the nominal model to the experimental results, then the traditional correlation process is not suitable. This paper describes using model dispersions to assist the model correlation process and decrease the overall cost of the process. The process creates thousands of model dispersions from the nominal model prior to the test and then compares each of them to the test data. Using mode shape and frequency error metrics, one dispersion is selected as the best match to the test data. This dispersion is further improved by using a commercial model correlation software. In the three examples shown in this paper, this dispersion based model correlation process performs well when compared to models correlated using traditional techniques and saves time in the post-test analysis.
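A minimal sketch of how one dispersion could be scored against test data using a frequency error plus a mode-shape similarity measure; the modal assurance criterion (MAC) is a common shape metric assumed here, and the weighting and function names are illustrative rather than those of the paper.

```python
import numpy as np

def mac(phi_test, phi_model):
    """Modal assurance criterion between two mode-shape vectors (0..1)."""
    num = abs(phi_test @ phi_model) ** 2
    return num / ((phi_test @ phi_test) * (phi_model @ phi_model))

def score_dispersion(test_freqs, test_shapes, disp_freqs, disp_shapes,
                     w_freq=1.0, w_shape=1.0):
    """Composite error metric: lower means a better match to the test data."""
    freq_err = np.mean(np.abs(np.asarray(disp_freqs) - test_freqs) / test_freqs)
    shape_err = np.mean([1.0 - mac(t, d) for t, d in zip(test_shapes, disp_shapes)])
    return w_freq * freq_err + w_shape * shape_err

# best_dispersion = min(dispersions, key=lambda d: score_dispersion(test_f, test_phi,
#                                                                   d.freqs, d.shapes))
```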
Gillison, Fiona; Stathi, Afroditi; Reddy, Prasuna; Perry, Rachel; Taylor, Gordon; Bennett, Paul; Dunbar, James; Greaves, Colin
2015-01-16
Process evaluation is important for improving theories of behavior change and behavioral intervention methods. The present study reports on the process outcomes of a pilot test of the theoretical model (the Process Model for Lifestyle Behavior Change; PMLBC) underpinning an evidence-informed, theory-driven, group-based intervention designed to promote healthy eating and physical activity for people with high cardiovascular risk. 108 people at high risk of diabetes or heart disease were randomized to a group-based weight management intervention targeting diet and physical activity plus usual care, or to usual care. The intervention comprised nine group-based sessions designed to promote motivation, social support, self-regulation and understanding of the behavior change process. Weight loss, diet, physical activity and theoretically defined mediators of change were measured pre-intervention, and after four and 12 months. The intervention resulted in significant improvements in fiber intake (M between-group difference = 5.7 g/day, p < .001) but not fat consumption (-2.3 g/day, p = 0.13), which were predictive of weight loss at both four months (M between-group difference = -1.98 kg, p < .01; R² = 0.2, p < 0.005), and 12 months (M difference = -1.85 kg, p = 0.1; R² = 0.1, p < 0.01). The intervention was successful in improving the majority of specified mediators of behavior change, and the predicted mechanisms of change specified in the PMLBC were largely supported. Improvements in self-efficacy and understanding of the behavior change process were associated with engagement in coping planning and self-monitoring activities, and successful dietary change at four and 12 months. While participants reported improvements in motivational and social support variables, there was no effect of these, or of the intervention overall, on physical activity. The data broadly support the theoretical model for supporting some dietary changes, but not for physical activity. Systematic intervention design allowed us to identify where improvements to the intervention may be implemented to promote change in all proposed mediators. More work is needed to explore effective mechanisms within interventions to promote physical activity behavior.
Improved thermodynamic modeling of the no-vent fill process and correlation with experimental data
NASA Technical Reports Server (NTRS)
Taylor, William J.; Chato, David J.
1991-01-01
The United States' plans to establish a permanent manned presence in space and to explore the Solar System created the need to efficiently handle large quantities of subcritical cryogenic fluids, particularly propellants such as liquid hydrogen and liquid oxygen, in low- to zero-gravity environments. One of the key technologies to be developed for fluid handling is the ability to transfer the cryogens between storage and spacecraft tanks. The no-vent fill method was identified as one way to perform this transfer. In order to understand how to apply this method, a model of the no-vent fill process is being developed and correlated with experimental data. The verified models can then be used to design and analyze configurations for tankage and subcritical fluid depots. The development of an improved macroscopic thermodynamic model of the no-vent fill process is discussed, and the analytical results from the computer program implementation of the model are correlated with experimental results for two different test tanks.
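For illustration only, a single-node (homogeneous equilibrium) mass and energy balance is the simplest macroscopic starting point for a no-vent fill; the improved model described in the paper contains more detail (for example, wall and interface heat transfer), so treat the sketch below as a hedged outline with hypothetical names.

```python
def no_vent_fill_step(m, u_total, mdot_in, h_in, q_wall, dt):
    """One explicit time step of a single-node tank balance for a no-vent fill:

        dm/dt = mdot_in                     (mass balance)
        dU/dt = mdot_in * h_in + q_wall     (energy balance, open system)

    Tank pressure and temperature then follow from an equation of state
    evaluated at specific internal energy U/m and density m/V, which must be
    supplied separately for the cryogen of interest.
    """
    m_new = m + mdot_in * dt
    u_new = u_total + (mdot_in * h_in + q_wall) * dt
    return m_new, u_new
```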
Improving the representation of photosynthesis in Earth system models
NASA Astrophysics Data System (ADS)
Rogers, A.; Medlyn, B. E.; Dukes, J.; Bonan, G. B.; von Caemmerer, S.; Dietze, M.; Kattge, J.; Leakey, A. D.; Mercado, L. M.; Niinemets, U.; Prentice, I. C. C.; Serbin, S.; Sitch, S.; Way, D. A.; Zaehle, S.
2015-12-01
Continued use of fossil fuel drives an accelerating increase in atmospheric CO2 concentration ([CO2]) and is the principal cause of global climate change. Many of the observed and projected impacts of rising [CO2] portend increasing environmental and economic risk, yet the uncertainty surrounding the projection of our future climate by Earth System Models (ESMs) is unacceptably high. Improving confidence in our estimation of future [CO2] is essential if we seek to project global change with greater confidence. There are critical uncertainties over the long term response of terrestrial CO2 uptake to global change, more specifically, over the size of the terrestrial carbon sink and over its sensitivity to rising [CO2] and temperature. Reducing the uncertainty associated with model representation of the largest CO2 flux on the planet is therefore an essential part of improving confidence in projections of global change. Here we have examined model representation of photosynthesis in seven process models including several global models that underlie the representation of photosynthesis in the land surface model component of ESMs that were part of the recent Fifth Assessment Report from the IPCC. Our approach was to focus on how physiological responses are represented by these models, and to better understand how structural and parametric differences drive variation in model responses to light, CO2, nutrients, temperature, vapor pressure deficit and soil moisture. We challenged each model to produce leaf and canopy responses to these factors to help us identify areas in which current process knowledge and emerging data sets could be used to improve model skill, and also identify knowledge gaps in current understanding that directly impact model outputs. We hope this work will provide a roadmap for the scientific activity that is necessary to advance process representation, parameterization and scaling of photosynthesis in the next generation of Earth System Models.
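Most process models of C3 photosynthesis compared in exercises like this share a Farquhar-type core; the sketch below shows that core (the minimum of Rubisco-limited and electron-transport-limited rates) with illustrative parameter values near 25 °C. It is a generic teaching sketch, not the parameterization of any particular ESM discussed in the abstract.

```python
import numpy as np

def farquhar_a(ci, par, vcmax=60.0, jmax=120.0, rd=1.0,
               kc=404.9, ko=278.4, o2=210.0, gamma_star=42.75,
               alpha=0.3, theta=0.9):
    """Simplified Farquhar-type C3 net assimilation (umol m-2 s-1).

    ci in umol mol-1, par in umol photons m-2 s-1; parameter values are
    illustrative choices near 25 C.
    """
    # Rubisco-limited carboxylation rate
    wc = vcmax * (ci - gamma_star) / (ci + kc * (1.0 + o2 / ko))
    # Electron transport rate from a non-rectangular hyperbola light response
    j = ((alpha * par + jmax) -
         np.sqrt((alpha * par + jmax) ** 2
                 - 4.0 * theta * alpha * par * jmax)) / (2.0 * theta)
    # RuBP-regeneration (electron-transport) limited rate
    wj = j * (ci - gamma_star) / (4.0 * ci + 8.0 * gamma_star)
    return np.minimum(wc, wj) - rd
```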
Song, Zirui; Rose, Sherri; Chernew, Michael E; Safran, Dana Gelb
2017-01-01
As population-based payment models become increasingly common, it is crucial to understand how such payment models affect health disparities. We evaluated health care quality and spending among enrollees in areas with lower versus higher socioeconomic status in Massachusetts before and after providers entered into the Alternative Quality Contract, a two-sided population-based payment model with substantial incentives tied to quality. We compared changes in process measures, outcome measures, and spending between enrollees in areas with lower and higher socioeconomic status from 2006 to 2012 (outcome measures were measured after the intervention only). Quality improved for all enrollees in the Alternative Quality Contract after their provider organizations entered the contract. Process measures improved 1.2 percentage points per year more among enrollees in areas with lower socioeconomic status than among those in areas with higher socioeconomic status. Outcome measure improvement was no different between the subgroups; neither were changes in spending. Larger or comparable improvements in quality among enrollees in areas with lower socioeconomic status suggest a potential narrowing of disparities. Strong pay-for-performance incentives within a population-based payment model could encourage providers to focus on improving quality for more disadvantaged populations. Project HOPE—The People-to-People Health Foundation, Inc.
NASA Astrophysics Data System (ADS)
Papalexiou, Simon Michael
2018-05-01
Hydroclimatic processes come in all "shapes and sizes". They are characterized by different spatiotemporal correlation structures and probability distributions that can be continuous, mixed-type, discrete or even binary. Simulating such processes by reproducing precisely their marginal distribution and linear correlation structure, including features like intermittency, can greatly improve hydrological analysis and design. Traditionally, modelling schemes are case specific and typically attempt to preserve few statistical moments providing inadequate and potentially risky distribution approximations. Here, a single framework is proposed that unifies, extends, and improves a general-purpose modelling strategy, based on the assumption that any process can emerge by transforming a specific "parent" Gaussian process. A novel mathematical representation of this scheme, introducing parametric correlation transformation functions, enables straightforward estimation of the parent-Gaussian process yielding the target process after the marginal back transformation, while it provides a general description that supersedes previous specific parameterizations, offering a simple, fast and efficient simulation procedure for every stationary process at any spatiotemporal scale. This framework, also applicable for cyclostationary and multivariate modelling, is augmented with flexible parametric correlation structures that parsimoniously describe observed correlations. Real-world simulations of various hydroclimatic processes with different correlation structures and marginals, such as precipitation, river discharge, wind speed, humidity, extreme events per year, etc., as well as a multivariate example, highlight the flexibility, advantages, and complete generality of the method.
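A minimal sketch of the parent-Gaussian idea: simulate a Gaussian AR(1) "parent" process, map it through the normal CDF, and apply the inverse CDF of the target marginal. In the actual framework the parent autocorrelation is obtained via parametric correlation transformation functions so that the target correlation structure is preserved after the back transformation; here rho_gauss is specified directly and the gamma marginal is only an example.

```python
import numpy as np
from scipy import stats

def simulate_target_ar1(n, rho_gauss, marginal=stats.gamma(a=2.0, scale=1.0), seed=0):
    """Simulate a stationary series with a given marginal by transforming a
    parent Gaussian AR(1) process (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    z = np.empty(n)
    z[0] = rng.standard_normal()
    for t in range(1, n):
        z[t] = rho_gauss * z[t - 1] + np.sqrt(1.0 - rho_gauss ** 2) * rng.standard_normal()
    u = stats.norm.cdf(z)      # map the parent Gaussian to uniforms
    return marginal.ppf(u)     # back-transform to the target marginal
```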
NASA Astrophysics Data System (ADS)
Ai, Yuewei; Shao, Xinyu; Jiang, Ping; Li, Peigen; Liu, Yang; Yue, Chen
2015-11-01
The welded joints of dissimilar materials have been widely used in automotive, ship and space industries. The joint quality is often evaluated by weld seam geometry, microstructures and mechanical properties. To obtain the desired weld seam geometry and improve the quality of welded joints, this paper proposes a process modeling and parameter optimization method to obtain the weld seam with minimum width and desired depth of penetration for laser butt welding of dissimilar materials. During the process, Taguchi experiments are conducted on the laser welding of the low carbon steel (Q235) and stainless steel (SUS301L-HT). The experimental results are used to develop the radial basis function neural network model, and the process parameters are optimized by genetic algorithm. The proposed method is validated by a confirmation experiment. Simultaneously, the microstructures and mechanical properties of the weld seam generated from optimal process parameters are further studied by optical microscopy and tensile strength test. Compared with the unoptimized weld seam, the welding defects are eliminated in the optimized weld seam and the mechanical properties are improved. The results show that the proposed method is effective and reliable for improving the quality of welded joints in practical production.
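A hedged sketch of the surrogate-plus-search pattern described (a radial basis function model fitted to designed-experiment data, then a genetic algorithm searching the surrogate for the best process parameters). The training arrays are placeholders, and all names, genetic operators and settings are illustrative assumptions rather than the paper's.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Placeholder Taguchi-style data: parameters x = (power, speed, offset) and a
# scalar response y combining seam width and penetration error (to minimize).
x_train = np.random.rand(18, 3)
y_train = np.random.rand(18)
surrogate = RBFInterpolator(x_train, y_train)

def fitness(pop):
    return surrogate(pop)  # lower is better on the surrogate

def genetic_minimize(bounds, pop_size=40, gens=60, pm=0.1, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
    for _ in range(gens):
        f = fitness(pop)
        # Tournament selection: keep the fitter of two random individuals
        idx = rng.integers(0, pop_size, (pop_size, 2))
        parents = pop[np.where(f[idx[:, 0]] < f[idx[:, 1]], idx[:, 0], idx[:, 1])]
        # Uniform crossover with a shuffled set of mates
        mates = parents[rng.permutation(pop_size)]
        mask = rng.random(pop.shape) < 0.5
        children = np.where(mask, parents, mates)
        # Gaussian mutation, clipped to the parameter bounds
        mutate = rng.random(pop.shape) < pm
        children = np.clip(children + mutate * rng.normal(0, 0.05 * (hi - lo), pop.shape),
                           lo, hi)
        pop = children
    f = fitness(pop)
    return pop[np.argmin(f)], f.min()

# Example: best_params, best_val = genetic_minimize([(0.0, 1.0)] * 3)
```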
Analysis of vehicle's safety envelope under car-following model
NASA Astrophysics Data System (ADS)
Tang, Tie-Qiao; Zhang, Jian; Chen, Liang; Shang, Hua-Yan
2017-05-01
In this paper, we propose an improved car-following model to explore the impacts of a vehicle's two safety distances (i.e., the front safety distance and the back safety distance) on traffic safety during the starting process. The numerical results show that our model is markedly safer than the FVD (full velocity difference) model during the starting process, which indicates that each driver should take both safety distances into account while driving.
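For context, the baseline FVD model referenced here takes the form a = kappa[V(dx) - v] + lambda * dv with an optimal-velocity function V. The sketch below uses a commonly cited calibration of that baseline; it does not include the paper's additional front/back safety-distance terms, whose exact form is not given in the abstract.

```python
import numpy as np

def optimal_velocity(dx, v1=6.75, v2=7.91, c1=0.13, c2=1.57, lc=5.0):
    """Typical tanh-type optimal-velocity function (illustrative calibration)."""
    return v1 + v2 * np.tanh(c1 * (dx - lc) - c2)

def fvd_accel(dx, dv, v, kappa=0.41, lam=0.5):
    """Full velocity difference (FVD) acceleration:
    a = kappa * (V(dx) - v) + lam * dv, with dv the velocity difference
    to the preceding car and dx the headway."""
    return kappa * (optimal_velocity(dx) - v) + lam * dv
```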
NASA Astrophysics Data System (ADS)
Wi, S.; Freeman, S.; Brown, C.
2017-12-01
This study presents a general approach to developing computational models of human-hydrologic systems where human modification of hydrologic surface processes is significant or dominant. A river basin system is represented by a network of human-hydrologic response units (HHRUs) identified based on locations where river regulation occurs (e.g., reservoir operation and diversions). Natural and human processes in HHRUs are simulated in a holistic framework that integrates component models representing rainfall-runoff, river routing, reservoir operation, flow diversion and water use processes. We illustrate the approach in a case study of the Cutzamala water system (CWS) in Mexico, a complex inter-basin water transfer system supplying the Mexico City Metropolitan Area (MCMA). The human-hydrologic system model for CWS (CUTZSIM) is evaluated against streamflow and reservoir storages measured across the CWS and against water supplied to the MCMA. The CUTZSIM improves the representation of hydrology and river-operation interaction and, in so doing, advances evaluation of system-wide water management consequences under altered climatic and demand regimes. The integrated modeling framework enables evaluation and simulation of model errors throughout the river basin, including errors in representation of the human component processes. Heretofore, model error evaluation, predictive error intervals and the resultant improved understanding have been limited to hydrologic processes. The general framework represents an initial step towards fuller understanding and prediction of the many and varied processes that determine the hydrologic fluxes and state variables in real river basins.
Voigt, Wieland; Hoellthaler, Josef; Magnani, Tiziana; Corrao, Vito; Valdagni, Riccardo
2014-01-01
Background: Multidisciplinary care of prostate cancer is increasingly offered in specialised cancer centres. It requires the optimisation of medical and operational processes and the integration of the different medical and non-medical stakeholders. Objective: To develop a standardised operational process assessment tool based on capability maturity model integration (CMMI), able to implement multidisciplinary care and improve process quality and efficiency. Design, Setting, and Participants: Information for model development was derived from medical experts, clinical guidelines, best practice elements of renowned cancer centres, and scientific literature. Data were organised in a hierarchically structured model, consisting of 5 categories, 30 key process areas, 172 requirements, and more than 1500 criteria. Compliance with requirements was assessed through structured on-site surveys covering all relevant clinical and management processes. Comparison with best practice standards allowed improvements to be recommended. 'Act On Oncology' (AoO) was applied in a pilot study on a prostate cancer unit in Europe. Results and Limitations: Several best practice elements, such as multidisciplinary clinics or advanced organisational measures for patient scheduling, were observed. Substantial opportunities were found in other areas such as centre management and infrastructure. As first improvements, the evaluated centre administration described and formalised the organisation of the prostate cancer unit with defined personnel assignments and clinical activities, and a formal agreement is being worked on to provide structured access to First-Aid Posts. Conclusions: In the pilot study, the AoO approach was feasible for identifying opportunities for process improvements. Measures were derived that might increase operational process quality and efficiency. PMID:25192213
Thermo-chemical modelling of a village cookstove for design improvement
NASA Astrophysics Data System (ADS)
Honkalaskar, Vijay H.; Sohoni, Milind; Bhandarkar, Upendra V.
2014-05-01
Cookstove operation comprises three basic processes, namely combustion of firewood, natural air draft due to the buoyancy induced by the temperature difference between the hearth and its surroundings, and heat transfer to the pot, stove body and surrounding atmosphere. Owing to the heterogeneous and unsteady burning of solid fuel, there exist nonlinear and dynamic interrelationships among these process parameters. A steady-state analytical model of the cookstove operation is developed for its design improvement by splitting the hearth into three zones to study char combustion, volatile combustion and heat transfer to the pot bottom separately. It comprises a total of seven relations corresponding to a thorough analysis of the three basic processes. A novel method is proposed to model the combustion of wood to mimic reality closely. The combustion space above the fuel bed is split into 1000 discrete parts to study the combustion of volatiles by considering a set of representative volatile gases. Model results are validated by comparing them with a set of water boiling tests carried out on a traditional cookstove in the laboratory. It is found that the major thrust areas to improve the thermal performance are combustion of volatiles and heat transfer to the pot. It is revealed that the existing design dimensions of the traditional cookstove are close to their optimal values. Addition of twisted-tape inserts in the hearth of the cookstove shows an improvement in the thermal performance due to an increase in the heat transfer coefficient to the pot bottom and improved combustion of volatiles.
An Employee-Centered Care Model Responds to the Triple Aim: Improving Employee Health.
Fox, Kelly; McCorkle, Ruth
2018-01-01
Health care expenditures, patient satisfaction, and timely access to care will remain problematic if dramatic changes in health care delivery models are not developed and implemented. To combat this challenge, a Triple Aim approach is essential; innovation in payment and health care delivery models is required. Using the Donabedian framework of structure, process, and outcome, this article describes a nurse-led employee-centered care model designed to improve consumers' health care experiences, improve employee health, and increase access to care while reducing health care costs for employees aged 18 and older in a corporate environment.
Analysis of rocket engine injection combustion processes
NASA Technical Reports Server (NTRS)
Salmon, J. W.; Saltzman, D. H.
1977-01-01
Mixing methodology improvement for the JANNAF DER and CICM injection/combustion analysis computer programs was accomplished. ZOM plane prediction model development was improved for installation into the new standardized DER computer program. An intra-element mixing model development approach was recommended for gas/liquid coaxial injection elements for possible future incorporation into the CICM computer program.
ERIC Educational Resources Information Center
Hunt, Pete; Barrios, Lisa; Telljohann, Susan K.; Mazyck, Donna
2015-01-01
Background: The Whole School, Whole Community, Whole Child (WSCC) model shows the interrelationship between health and learning and the potential for improving educational outcomes by improving health outcomes. However, current descriptions do not explain how to implement the model. Methods: The existing literature, including scientific articles,…
Educational Statistics and School Improvement. Statistics and the Federal Role in Education.
ERIC Educational Resources Information Center
Hawley, Willis D.
This paper focuses on how educational statistics might better serve the quest for educational improvement in elementary and secondary schools. A model for conceptualizing the sources and processes of school productivity is presented. The Learning Productivity Model suggests that school outcomes are the consequence of the interaction of five…
An improved car-following model with two preceding cars' average speed
NASA Astrophysics Data System (ADS)
Yu, Shao-Wei; Shi, Zhong-Ke
2015-01-01
To better describe cooperative car-following behaviors under intelligent transportation circumstances and increase roadway traffic mobility, data on three successive following cars at a signalized intersection in Jinan, China were obtained and used to explore the linkage between the two preceding cars' average speed and car-following behaviors. The results indicate that the two preceding cars' average velocity has significant effects on the following car's motion. An improved car-following model considering the two preceding cars' average velocity was then proposed and calibrated based on the full velocity difference model, and numerical simulations were carried out to study how the two preceding cars' average speed affects the starting process and the traffic flow evolution process with an initial small disturbance. The results indicate that the improved car-following model can qualitatively describe the impacts of the two preceding cars' average velocity on traffic flow, and that taking the two preceding cars' average velocity into account in designing the control strategy for a cooperative adaptive cruise control system can improve the stability of traffic flow, suppress the appearance of traffic jams and increase the capacity of signalized intersections.
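As an illustration only (the paper's exact formulation is not reproduced in the abstract), one plausible way to let the two preceding cars' average speed enter an FVD-type model is to drive the relaxation term toward that average; every name and coefficient below is assumed.

```python
import numpy as np

def optimal_velocity(dx, v1=6.75, v2=7.91, c1=0.13, c2=1.57, lc=5.0):
    """Typical tanh-type optimal-velocity function (illustrative calibration)."""
    return v1 + v2 * np.tanh(c1 * (dx - lc) - c2)

def improved_fvd_accel(dx, v, v_lead, v_second, kappa=0.41, lam=0.5):
    """Illustrative variant of the FVD model: the velocity-difference term
    responds to the average speed of the two preceding cars rather than the
    speed of the immediate leader alone."""
    v_avg = 0.5 * (v_lead + v_second)
    return kappa * (optimal_velocity(dx) - v) + lam * (v_avg - v)
```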
MacDonald-Wilson, Kim L; Hutchison, Shari L; Karpov, Irina; Wittman, Paul; Deegan, Patricia E
2017-04-01
Individual involvement in treatment decisions with providers, often through the use of decision support aids, improves quality of care. This study investigates an implementation strategy to bring decision support to community mental health centers (CMHC). Fifty-two CMHCs implemented a decision support toolkit supported by a 12-month learning collaborative using the Breakthrough Series model. Participation in learning collaborative activities was high, indicating feasibility of the implementation model. Progress by staff in meeting process aims around utilization of components of the toolkit improved significantly over time (p < .0001). Survey responses by individuals in service corroborate successful implementation. Community-based providers were able to successfully implement decision support in mental health services as evidenced by improved process outcomes and sustained practices over 1 year through the structure of the learning collaborative model.
NASA Astrophysics Data System (ADS)
Christensen, H. M.; Berner, J.; Sardeshmukh, P. D.
2017-12-01
Stochastic parameterizations have been used for more than a decade in atmospheric models. They provide a way to represent model uncertainty through representing the variability of unresolved sub-grid processes, and have been shown to have a beneficial effect on the spread and mean state for medium- and extended-range forecasts. There is increasing evidence that stochastic parameterization of unresolved processes can improve the bias in mean and variability, e.g. by introducing a noise-induced drift (nonlinear rectification), and by changing the residence time and structure of flow regimes. We present results showing the impact of including the Stochastically Perturbed Parameterization Tendencies scheme (SPPT) in coupled runs of the National Center for Atmospheric Research (NCAR) Community Atmosphere Model, version 4 (CAM4) with historical forcing. SPPT results in a significant improvement in the representation of the El Nino-Southern Oscillation in CAM4, improving the power spectrum, as well as both the inter- and intra-annual variability of tropical pacific sea surface temperatures. We use a Linear Inverse Modelling framework to gain insight into the mechanisms by which SPPT has improved ENSO-variability.
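At its core, SPPT multiplies the net parameterized tendency by (1 + r), where r is a bounded, spatially and temporally correlated random pattern. The sketch below shows that idea with a simple AR(1) update of the pattern; all names and defaults are illustrative rather than CAM4's actual implementation.

```python
import numpy as np

def sppt_perturb(tendencies, r, clip=(-1.0, 1.0)):
    """Stochastically Perturbed Parameterization Tendencies (illustrative):
    the summed physics tendencies are scaled by (1 + r), with r bounded."""
    r = np.clip(r, *clip)
    return tendencies * (1.0 + r)

def ar1_pattern(prev, tau, dt, sigma, rng):
    """First-order autoregressive evolution of the 2-D perturbation pattern."""
    phi = np.exp(-dt / tau)
    return phi * prev + sigma * np.sqrt(1.0 - phi ** 2) * rng.standard_normal(prev.shape)
```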
Improving Mathematics Achievement of Indonesian 5th Grade Students through Guided Discovery Learning
ERIC Educational Resources Information Center
Yurniwati; Hanum, Latipa
2017-01-01
This research aims to find information about the improvement of mathematics achievement of grade five student through guided discovery learning. This research method is classroom action research using Kemmis and Taggart model consists of three cycles. Data used in this study is learning process and learning results. Learning process data is…
Leaders in Future and Current Technology Teaming Up to Improve Ethanol
and NREL expertise to: develop improvements in process throughput and water management for dry mills, and complete an overall process engineering model of the dry mill technology that identifies new ways to improve the design and operation of "dry mill" plants that currently produce ethanol from corn starch.
The future of nearshore processes research
Elko, Nicole A.; Feddersen, Falk; Foster, Diane; Hapke, Cheryl J.; McNinch, Jesse E.; Mulligan, Ryan P.; Tuba Ӧzkan-Haller, H.; Plant, Nathaniel G.; Raubenheimer, Britt
2014-01-01
The nearshore is the transition region between land and the continental shelf including (from onshore to offshore) coastal plains, wetlands, estuaries, coastal cliffs, dunes, beaches, surf zones (regions of wave breaking), and the inner shelf (Figure ES-1). Nearshore regions are vital to the national economy, security, commerce, and recreation. The nearshore is dynamically evolving, is often densely populated, and is under increasing threat from sea level rise, long-term erosion, extreme storms, and anthropogenic influences. Worldwide, almost one billion people live at elevations within 10 m of present sea level. Long-term erosion threatens communities, infrastructure, ecosystems, and habitat. Extreme storms can cause billions of dollars of damage. Degraded water quality impacts ecosystem and human health. Nearshore processes, the complex interactions between water, sediment, biota, and humans, must be understood and predicted to manage this often highly developed yet vulnerable nearshore environment. Over the past three decades, the understanding of nearshore processes has improved. However, societal needs are growing with increased coastal urbanization and threats of future climate change, and significant scientific challenges remain. To address these challenges, members of academia, industry, and federal agencies (USGS, USACE, NPS, NOAA, FEMA, ONR) met at the “The Past and Future of Nearshore Processes Research: Reflections on the Sallenger Years and a New Vision for the Future” workshop to develop a nearshore processes research vision where societal needs and science challenges intersect. The resulting vision comprises three broad research themes: Long-term coastal evolution due to natural and anthropogenic processes: As global climate change alters the rates of sea level rise and potentially storm patterns and coastal urbanization increases over the coming decades, an understanding of coastal evolution is critical. Improved knowledge of long-term morphological, ecological, and societal processes and their interactions will result in an improved ability to simulate coastal change. This will enable proactive solutions for resilient coasts and better guidance for reducing coastal vulnerability. Extreme events: Flooding, erosion, and the subsequent recovery: Hurricane Sandy caused flooding and erosion along hundreds of miles of shoreline, flooded New York City, and impacted communities and infrastructure. Overall U.S. coastal extreme event related economic losses have increased substantially. Furthermore, climate change may cause an increase in coastal extreme events and rising sea levels could increase the occurrence of extreme events. Addressing this research theme will result in an improved understanding of the physical processes during extreme events, leading to improved models of flooding, erosion, and recovery. The resulting societal benefit will be more resilient coastal communities. The physical, biological and chemical processes impacting human and ecosystem health: Nearshore regions are used for recreation, tourism, and human habitation, and provide habitat and valuable ecosystem services. These areas must be sustained for future generations; however, overall coastal water quality is declining due to microbial pathogens, fertilizers, pesticides, and heavy metal contamination, threatening ecosystem and human health.
To ensure sustainable nearshore regions, predictive real-time water- and sediment-based pollutant modeling capabilities must be developed, which requires expanding our knowledge of the physics, chemistry, and biology of the nearshore. The resulting societal benefits will include better beach safety, healthier ecosystems, and improved mitigation and regulatory policies. The scientists and engineers of the U.S. nearshore community are poised to make significant progress on these research themes, which have significant societal impact. The U.S. nearshore community, including academic, government, and industry colleagues, recommends multi-agency investment in the coordinated development of observational and modeling research infrastructure to address these themes, as discussed in the whitepaper. The observational infrastructure should include development of new sensors and methods, focused observational programs, and expanded nearshore observing systems. The modeling infrastructure should include improved process representation, better model coupling, incorporation of data assimilation techniques, and testing of real-time models. The observations will provide test beds to compare and improve models.
The effect of inclusion of inlets in dual drainage modelling
NASA Astrophysics Data System (ADS)
Chang, Tsang-Jung; Wang, Chia-Ho; Chen, Albert S.; Djordjević, Slobodan
2018-04-01
In coupled sewer and surface flood modelling approaches, the flow process in gullies is often ignored although the overland flow is drained to the sewer network via inlets and gullies. As a result, the flow entering inlets is transferred to the sewer network immediately, which may lead to a flood estimate that differs from reality. In this paper, we compared two modelling approaches, with and without consideration of the flow processes in gullies, in coupled sewer and surface modelling. Three historical flood events were adopted for model calibration and validation. The results showed that the inclusion of the flow process in gullies can further improve the accuracy of urban flood modelling.
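In dual drainage models the surface-to-sewer exchange at an inlet is commonly computed with weir or orifice relations depending on the ponded depth; the sketch below shows that common practice, with coefficients and the switching rule as illustrative assumptions rather than the paper's values.

```python
import math

def inlet_exchange(depth, weir_coef=1.6, perimeter=1.0,
                   orifice_coef=0.6, area=0.05, g=9.81):
    """Surface-to-sewer inflow through a gully inlet (m3/s), illustrative.

    Shallow ponding: weir equation    Q = Cw * P * h^1.5
    Deeper ponding:  orifice equation Q = Co * A * sqrt(2 g h)
    Taking the minimum switches from weir to orifice control where the
    two curves intersect.
    """
    if depth <= 0.0:
        return 0.0
    q_weir = weir_coef * perimeter * depth ** 1.5
    q_orifice = orifice_coef * area * math.sqrt(2.0 * g * depth)
    return min(q_weir, q_orifice)
```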
NASA Astrophysics Data System (ADS)
Kryuchkov, D. I.; Zalazinsky, A. G.
2017-12-01
Mathematical models and a hybrid modeling system are developed for the implementation of the experimental-calculation method for the engineering analysis and optimization of the plastic deformation of inhomogeneous materials with the purpose of improving metal-forming processes and machines. The created software solution integrates Abaqus/CAE, a subroutine for mathematical data processing, with the use of Python libraries and the knowledge base. Practical application of the software solution is exemplified by modeling the process of extrusion of a bimetallic billet. The results of the engineering analysis and optimization of the extrusion process are shown, the material damage being monitored.
Architectural Improvements and New Processing Tools for the Open XAL Online Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allen, Christopher K; Pelaia II, Tom; Freed, Jonathan M
The online model is the component of Open XAL providing accelerator modeling, simulation, and dynamic synchronization to live hardware. Significant architectural changes and feature additions have been recently made in two separate areas: 1) the managing and processing of simulation data, and 2) the modeling of RF cavities. Simulation data and data processing have been completely decoupled. A single class manages all simulation data while standard tools were developed for processing the simulation results. RF accelerating cavities are now modeled as composite structures where parameter and dynamics computations are distributed. The beam and hardware models both maintain their relative phase information, which allows for dynamic phase slip and elapsed time computation.
NASA Astrophysics Data System (ADS)
Neill, Aaron; Reaney, Sim
2015-04-01
Fully-distributed, physically-based rainfall-runoff models attempt to capture some of the complexity of the runoff processes that operate within a catchment, and have been used to address a variety of issues including water quality and the effect of climate change on flood frequency. Two key issues are prevalent, however, which call into question the predictive capability of such models. The first is the issue of parameter equifinality which can be responsible for large amounts of uncertainty. The second is whether such models make the right predictions for the right reasons - are the processes operating within a catchment correctly represented, or do the predictive abilities of these models result only from the calibration process? The use of additional data sources, such as environmental tracers, has been shown to help address both of these issues, by allowing for multi-criteria model calibration to be undertaken, and by permitting a greater understanding of the processes operating in a catchment and hence a more thorough evaluation of how well catchment processes are represented in a model. Using discharge and oxygen-18 data sets, the ability of the fully-distributed, physically-based CRUM3 model to represent the runoff processes in three sub-catchments in Cumbria, NW England has been evaluated. These catchments (Morland, Dacre and Pow) are part of the River Eden demonstration test catchment project. The oxygen-18 data set was first used to derive transit-time distributions and mean residence times of water for each of the catchments to gain an integrated overview of the types of processes that were operating. A generalised likelihood uncertainty estimation procedure was then used to calibrate the CRUM3 model for each catchment based on a single discharge data set from each catchment. Transit-time distributions and mean residence times of water obtained from the model using the top 100 behavioural parameter sets for each catchment were then compared to those derived from the oxygen-18 data to see how well the model captured catchment dynamics. The value of incorporating the oxygen-18 data set, as well as discharge data sets from multiple as opposed to single gauging stations in each catchment, in the calibration process to improve the predictive capability of the model was then investigated. This was achieved by assessing how much the identifiability of the model parameters and the ability of the model to represent the runoff processes operating in each catchment improved with the inclusion of the additional data sets with respect to the likely costs that would be incurred in obtaining the data sets themselves.
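A minimal sketch of the generalised likelihood uncertainty estimation (GLUE) screening step described above: score each sampled parameter set against observations, discard non-behavioural sets, and keep a fixed number of the best. The Nash-Sutcliffe efficiency and the 0.5 threshold are common choices assumed here, and run_model is a hypothetical callable standing in for a CRUM3 run; the study's actual likelihood measure and threshold are not given in the abstract.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency, a common likelihood surrogate in GLUE."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def glue_behavioural(param_sets, run_model, obs, threshold=0.5, keep=100):
    """Monte Carlo GLUE screening (illustrative): keep the top `keep`
    behavioural parameter sets according to the chosen likelihood measure."""
    scores = np.array([nse(obs, run_model(p)) for p in param_sets])
    behavioural = np.where(scores >= threshold)[0]
    best = behavioural[np.argsort(scores[behavioural])[::-1][:keep]]
    return best, scores[best]
```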
NASA Astrophysics Data System (ADS)
Kapitan, Loginn
This research created a new model which provides an integrated approach to planning the effective selection and employment of airborne sensor systems in response to accidental or intentional chemical vapor releases. The approach taken was to use systems engineering and decision analysis methods to construct a model architecture which produced a modular structure for integrating both new and existing components into a logical procedure to assess the application of airborne sensor systems to address chemical vapor hazards. The resulting integrated process model includes an internal aggregation model which allowed differentiation among alternative airborne sensor systems. Both models were developed and validated by experts and demonstrated using appropriate hazardous chemical release scenarios. The resultant prototype integrated process model or system fills a current gap in capability allowing improved planning, training and exercise for HAZMAT teams and first responders when considering the selection and employment of airborne sensor systems. Through the research process, insights into the current response structure and how current airborne capability may be most effectively used were generated. Furthermore, the resultant prototype system is tailorable for local, state, and federal application, and can potentially be modified to help evaluate investments in new airborne sensor technology and systems. Better planning, training and preparedness exercising holds the prospect for the effective application of airborne assets for improved response to large scale chemical release incidents. Improved response will result in fewer casualties and lives lost, reduced economic impact, and increased protection of critical infrastructure when faced with accidental and intentional terrorist release of hazardous industrial chemicals. With the prospect of more airborne sensor systems becoming available, this prototype system integrates existing and new tools into an effective process for the selection and employment of airborne sensors to better plan, train and exercise ahead of potential chemical release events.
NASA Technical Reports Server (NTRS)
Reil, Robin L.
2014-01-01
Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the "traditional" document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This paper presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to demonstrate the completeness and consistency of the requirements to the mission concept. Anecdotal information and process-duration metrics are presented for both the MBSE and original DBSE design efforts of SporeSat.
Improved simulation of tropospheric ozone by a global-multi-regional two-way coupling model system
NASA Astrophysics Data System (ADS)
Yan, Y.-Y.; Lin, J.-T.; Chen, J.; Hu, L.
2015-09-01
Small-scale nonlinear chemical and physical processes over pollution source regions affect the global ozone (O3) chemistry, but these processes are not captured by current global chemical transport models (CTMs) and chemistry-climate models that are limited by coarse horizontal resolutions (100-500 km, typically 200 km). These models tend to contain large (and mostly positive) tropospheric O3 biases in the Northern Hemisphere. Here we use a recently built two-way coupling system of the GEOS-Chem CTM to simulate the global tropospheric O3 in 2009. The system couples the global model (at 2.5° long. × 2° lat.) and its three nested models (at 0.667° long. × 0.5° lat.) covering Asia, North America and Europe, respectively. Benefiting from the high resolution, the nested models better capture small-scale processes than the global model alone. In the coupling system, the nested models provide results to modify the global model simulation within respective nested domains while taking the lateral boundary conditions from the global model. Due to the "coupling" effects, the two-way system significantly improves the tropospheric O3 simulation upon the global model alone, as found by comparisons with a suite of ground (1420 sites from WDCGG, GMD, EMEP, and AQS), aircraft (HIPPO and MOZAIC), and satellite measurements (two OMI products). Compared to the global model alone, the two-way coupled simulation enhances the correlation in day-to-day variation of afternoon mean O3 with the ground measurements from 0.53 to 0.68, and it reduces the mean model bias from 10.8 to 6.7 ppb in annual average afternoon O3. Regionally, the coupled model reduces the bias by 4.6 ppb over Europe, 3.9 ppb over North America, and 3.1 ppb over other regions. The two-way coupling brings O3 vertical profiles much closer to the HIPPO (for remote areas) and MOZAIC (for polluted regions) data, reducing the tropospheric (0-9 km) mean bias by 3-10 ppb at most MOZAIC sites and by 5.3 ppb for HIPPO profiles. The two-way coupled simulation also reduces the global tropospheric column ozone by 3.0 DU (9.5 %, annual mean), bringing it closer to the OMI data in all seasons. Simulation improvements are more significant in the Northern Hemisphere, and are primarily a result of improved representation of urban-rural contrast and other small-scale processes. The two-way coupled simulation also reduces the global tropospheric mean hydroxyl radical by 5 % with enhancements by 5 % in the lifetimes of methyl chloroform (from 5.58 to 5.87 yr) and methane (from 9.63 to 10.12 yr), bringing them closer to observation-based estimates. Improving model representations of small-scale processes is a critical step toward understanding global tropospheric chemistry.
NASA Astrophysics Data System (ADS)
Nijzink, R. C.; Samaniego, L.; Mai, J.; Kumar, R.; Thober, S.; Zink, M.; Schäfer, D.; Savenije, H. H. G.; Hrachowitz, M.
2015-12-01
Heterogeneity of landscape features like terrain, soil, and vegetation properties affects the partitioning of water and energy. However, it remains unclear to what extent an explicit representation of this heterogeneity at the sub-grid scale of distributed hydrological models can improve the hydrological consistency and the robustness of such models. In this study, hydrological process complexity arising from sub-grid topography heterogeneity was incorporated into the distributed mesoscale Hydrologic Model (mHM). Seven study catchments across Europe were used to test whether (1) the incorporation of additional sub-grid variability on the basis of landscape-derived response units improves model internal dynamics, (2) the application of semi-quantitative, expert-knowledge-based model constraints reduces model uncertainty, and (3) the combined use of sub-grid response units and model constraints improves the spatial transferability of the model. Unconstrained and constrained versions of both the original mHM and mHMtopo, which allows for topography-based sub-grid heterogeneity, were calibrated for each catchment individually following a multi-objective calibration strategy. In addition, four of the study catchments were simultaneously calibrated and their feasible parameter sets were transferred to the remaining three receiver catchments. In a post-calibration evaluation procedure the probabilities of model and transferability improvement, when accounting for sub-grid variability and/or applying expert-knowledge-based model constraints, were assessed on the basis of a set of hydrological signatures. In terms of the Euclidian distance to the optimal model, used as an overall measure of model performance with respect to the individual signatures, the model improvement achieved by introducing sub-grid heterogeneity to mHM in mHMtopo was on average 13 %. The addition of semi-quantitative constraints to mHM and mHMtopo resulted in improvements of 13 and 19 %, respectively, compared to the base case of the unconstrained mHM. Most significant improvements in signature representations were, in particular, achieved for low flow statistics. The application of prior semi-quantitative constraints further improved the partitioning between runoff and evaporative fluxes. In addition, it was shown that suitable semi-quantitative prior constraints in combination with the transfer-function-based regularization approach of mHM can be beneficial for spatial model transferability as the Euclidian distances for the signatures improved on average by 2 %. The effect of semi-quantitative prior constraints combined with topography-guided sub-grid heterogeneity on transferability showed a more variable picture of improvements and deteriorations, but most improvements were observed for low flow statistics.
NASA Astrophysics Data System (ADS)
Nijzink, Remko C.; Samaniego, Luis; Mai, Juliane; Kumar, Rohini; Thober, Stephan; Zink, Matthias; Schäfer, David; Savenije, Hubert H. G.; Hrachowitz, Markus
2016-03-01
Heterogeneity of landscape features like terrain, soil, and vegetation properties affects the partitioning of water and energy. However, it remains unclear to what extent an explicit representation of this heterogeneity at the sub-grid scale of distributed hydrological models can improve the hydrological consistency and the robustness of such models. In this study, hydrological process complexity arising from sub-grid topography heterogeneity was incorporated into the distributed mesoscale Hydrologic Model (mHM). Seven study catchments across Europe were used to test whether (1) the incorporation of additional sub-grid variability on the basis of landscape-derived response units improves model internal dynamics, (2) the application of semi-quantitative, expert-knowledge-based model constraints reduces model uncertainty, and whether (3) the combined use of sub-grid response units and model constraints improves the spatial transferability of the model. Unconstrained and constrained versions of both the original mHM and mHMtopo, which allows for topography-based sub-grid heterogeneity, were calibrated for each catchment individually following a multi-objective calibration strategy. In addition, four of the study catchments were simultaneously calibrated and their feasible parameter sets were transferred to the remaining three receiver catchments. In a post-calibration evaluation procedure the probabilities of model and transferability improvement, when accounting for sub-grid variability and/or applying expert-knowledge-based model constraints, were assessed on the basis of a set of hydrological signatures. In terms of the Euclidian distance to the optimal model, used as an overall measure of model performance with respect to the individual signatures, the model improvement achieved by introducing sub-grid heterogeneity to mHM in mHMtopo was on average 13 %. The addition of semi-quantitative constraints to mHM and mHMtopo resulted in improvements of 13 and 19 %, respectively, compared to the base case of the unconstrained mHM. Most significant improvements in signature representations were, in particular, achieved for low flow statistics. The application of prior semi-quantitative constraints further improved the partitioning between runoff and evaporative fluxes. In addition, it was shown that suitable semi-quantitative prior constraints in combination with the transfer-function-based regularization approach of mHM can be beneficial for spatial model transferability as the Euclidian distances for the signatures improved on average by 2 %. The effect of semi-quantitative prior constraints combined with topography-guided sub-grid heterogeneity on transferability showed a more variable picture of improvements and deteriorations, but most improvements were observed for low flow statistics.
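The "Euclidian distance to the optimal model" used above as the overall performance measure can be illustrated with a short sketch. The signature errors and their normalization to a [0, 1] scale, where 0 denotes a perfect match, are assumptions made for this example rather than the study's exact definition.

    import numpy as np

    def euclidean_distance_to_optimum(signature_errors):
        """signature_errors: normalized errors, one per hydrological signature
        (e.g. low-flow statistics, flow-duration-curve slope, runoff ratio),
        where 0 means a perfect match; the optimal model sits at the origin."""
        e = np.asarray(signature_errors, dtype=float)
        return float(np.sqrt(np.sum(e ** 2)))

    # hypothetical example: two model variants evaluated for one catchment
    base_model     = euclidean_distance_to_optimum([0.30, 0.25, 0.40, 0.10])
    subgrid_model  = euclidean_distance_to_optimum([0.25, 0.22, 0.33, 0.10])
    improvement = 100 * (base_model - subgrid_model) / base_model
    print(f"relative improvement: {improvement:.1f} %")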
Layout design-based research on optimization and assessment method for shipbuilding workshop
NASA Astrophysics Data System (ADS)
Liu, Yang; Meng, Mei; Liu, Shuang
2013-06-01
The study examines a three-dimensional visualization approach, with an emphasis on improving genetic algorithms for the layout optimization of a standard, discrete shipbuilding workshop. Using a steel processing workshop as an example, the principle of minimum logistics cost is applied to derive an ideal equipment layout and a mathematical model whose objective is to minimize the total necessary travel distance between machines. An improved control operator is introduced to increase the iterative efficiency of the genetic algorithm and to yield the relevant parameters. The Computer Aided Tri-Dimensional Interface Application (CATIA) software is used to establish the manufacturing resource base and a parametric model of the steel processing workshop. Based on the optimized planar logistics, a visual parametric model of the workshop is constructed, and qualitative and quantitative adjustments are then applied to the model. A method for evaluating the resulting layout is subsequently established using the analytic hierarchy process (AHP). The optimized discrete production workshop thus provides a practical reference for the optimization and layout of digitalized production workshops.
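A compact sketch of a genetic algorithm for this kind of layout problem is shown below. The flow matrix, candidate locations, elitist selection rule and rectilinear distance metric are illustrative assumptions, not the paper's actual workshop data or its improved control operator.

    import random
    import itertools

    # hypothetical data: material flow between 6 machines and coordinates of 6 candidate locations
    flow = [[0, 4, 2, 0, 1, 0],
            [4, 0, 3, 2, 0, 1],
            [2, 3, 0, 5, 2, 0],
            [0, 2, 5, 0, 4, 3],
            [1, 0, 2, 4, 0, 6],
            [0, 1, 0, 3, 6, 0]]
    locations = [(0, 0), (10, 0), (20, 0), (0, 8), (10, 8), (20, 8)]

    def transport_cost(assignment):
        """Total logistics cost: sum of flow x rectilinear distance over all machine pairs."""
        cost = 0.0
        for i, j in itertools.combinations(range(len(assignment)), 2):
            (x1, y1), (x2, y2) = locations[assignment[i]], locations[assignment[j]]
            cost += flow[i][j] * (abs(x1 - x2) + abs(y1 - y2))
        return cost

    def crossover(p1, p2):
        """Order crossover preserving the permutation property of the layout encoding."""
        a, b = sorted(random.sample(range(len(p1)), 2))
        child = [None] * len(p1)
        child[a:b] = p1[a:b]
        rest = [g for g in p2 if g not in child]
        for k in range(len(child)):
            if child[k] is None:
                child[k] = rest.pop(0)
        return child

    def mutate(ind, rate=0.2):
        """Swap two machine-location assignments with a small probability."""
        if random.random() < rate:
            i, j = random.sample(range(len(ind)), 2)
            ind[i], ind[j] = ind[j], ind[i]
        return ind

    random.seed(1)
    pop = [random.sample(range(6), 6) for _ in range(40)]
    for generation in range(200):
        pop.sort(key=transport_cost)
        elite = pop[:10]                       # keep the best layouts (a simple elitist control)
        offspring = [mutate(crossover(random.choice(elite), random.choice(elite)))
                     for _ in range(30)]
        pop = elite + offspring
    best = min(pop, key=transport_cost)
    print("best machine-to-location assignment:", best, "cost:", transport_cost(best))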
Biagianti, Bruno; Fisher, Melissa; Neilands, Torsten B; Loewy, Rachel; Vinogradov, Sophia
2016-11-01
Individuals with schizophrenia who engage in targeted cognitive training (TCT) of the auditory system show generalized cognitive improvements. The high degree of variability in cognitive gains may be due to individual differences in the level of engagement of the underlying neural system target. A total of 131 individuals with schizophrenia underwent 40 hours of TCT. We identified target engagement of auditory system processing efficiency by modeling subject-specific trajectories of auditory processing speed (APS) over time. Lowess analysis, mixed-model repeated-measures analysis, and latent growth curve modeling were used to examine whether APS trajectories were moderated by age and illness duration, and mediated improvements in cognitive outcome measures. We observed significant improvements in APS from baseline to 20 hours of training (initial change), followed by a flat APS trajectory (plateau) at subsequent time-points. Participants showed interindividual variability in the steepness of the initial APS change and in the APS plateau achieved and sustained between 20 and 40 hours. We found that participants who achieved the fastest APS plateau showed the greatest transfer effects to untrained cognitive domains. There is a significant association between an individual's ability to generate and sustain auditory processing efficiency and their degree of cognitive improvement after TCT, independent of baseline neurocognition. APS plateau may therefore represent a behavioral measure of target engagement mediating treatment response. Future studies should examine the optimal plateau of auditory processing efficiency required to induce significant cognitive improvements, in the context of interindividual differences in neural plasticity and sensory system efficiency that characterize schizophrenia. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
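One simple way to characterize such trajectories, offered here only as an illustrative sketch (the study used Lowess analysis, mixed models and latent growth curves), is to fit a ramp-then-plateau curve to each participant's APS measurements. The data points and parameter names below are hypothetical.

    import numpy as np
    from scipy.optimize import curve_fit

    def ramp_plateau(hours, baseline, gain, t_plateau):
        """Piecewise-linear trajectory: linear improvement until t_plateau, flat afterwards."""
        return baseline + gain * np.minimum(hours, t_plateau)

    # hypothetical auditory processing speed (APS) measurements for one participant
    hours = np.array([0, 5, 10, 15, 20, 25, 30, 35, 40], dtype=float)
    aps   = np.array([120, 112, 104, 99, 95, 94, 95, 93, 94], dtype=float)  # ms, lower is faster

    params, _ = curve_fit(ramp_plateau, hours, aps, p0=[120.0, -1.0, 20.0])
    baseline, gain, t_plateau = params
    plateau_value = baseline + gain * t_plateau
    print(f"estimated plateau reached at ~{t_plateau:.1f} h, APS plateau ~{plateau_value:.1f} ms")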
Henze Bancroft, Leah C; Strigel, Roberta M; Hernando, Diego; Johnson, Kevin M; Kelcz, Frederick; Kijowski, Richard; Block, Walter F
2016-03-01
Chemical shift based fat/water decomposition methods such as IDEAL are frequently used in challenging imaging environments with large B0 inhomogeneity. However, they do not account for the signal modulations introduced by a balanced steady state free precession (bSSFP) acquisition. Here we demonstrate improved performance when the bSSFP frequency response is properly incorporated into the multipeak spectral fat model used in the decomposition process. Balanced SSFP allows for rapid imaging but also introduces a characteristic frequency response featuring periodic nulls and pass bands. Fat spectral components in adjacent pass bands will experience bulk phase offsets and magnitude modulations that change the expected constructive and destructive interference between the fat spectral components. A bSSFP signal model was incorporated into the fat/water decomposition process and used to generate images of a fat phantom, and bilateral breast and knee images in four normal volunteers at 1.5 Tesla. Incorporation of the bSSFP signal model into the decomposition process improved the performance of the fat/water decomposition. Incorporation of this model allows rapid bSSFP imaging sequences to use robust fat/water decomposition methods such as IDEAL. While only one set of imaging parameters was presented, the method is compatible with any field strength or repetition time. © 2015 Wiley Periodicals, Inc.
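The idea can be sketched as follows: each fat spectral peak is given a complex weight representing the bSSFP response in its pass band before water and fat are estimated by linear least squares. The six-peak amplitudes and frequencies are approximate, the bSSFP response is reduced to a sign alternation between adjacent pass bands, and the field map is assumed known; none of this reproduces the published IDEAL implementation.

    import numpy as np

    # illustrative six-peak fat spectrum (approximate relative amplitudes and
    # off-resonance frequencies at 1.5 T, in Hz; values are assumptions)
    rel_amp = np.array([0.087, 0.693, 0.128, 0.004, 0.039, 0.048])
    delta_f = np.array([-243., -217., -166., -124., -25., 38.])

    def bssfp_response(freq_hz, tr_s):
        """Simplified bSSFP frequency response: a bulk sign flip between adjacent pass bands.
        A real implementation would use the full complex steady-state expression."""
        band = np.floor(freq_hz * tr_s + 0.5)          # which pass band the peak falls in
        return np.where(band % 2 == 0, 1.0, -1.0)

    def decompose(signal, te_s, psi_hz, tr_s):
        """Linear least-squares water/fat estimate after demodulating a known field map psi."""
        b = bssfp_response(delta_f, tr_s)
        fat_phasor = np.array([np.sum(rel_amp * b * np.exp(2j * np.pi * delta_f * te))
                               for te in te_s])
        demod = signal * np.exp(-2j * np.pi * psi_hz * te_s)
        A = np.column_stack([np.ones_like(fat_phasor), fat_phasor])
        (water, fat), *_ = np.linalg.lstsq(A, demod, rcond=None)
        return water, fat

    # synthetic test: 30% water, 70% fat, 5 Hz field offset, TR = 4.4 ms, three echoes (hypothetical)
    te = np.array([1.2e-3, 2.4e-3, 3.6e-3])
    tr = 4.4e-3
    b = bssfp_response(delta_f, tr)
    truth = 0.3 + 0.7 * np.array([np.sum(rel_amp * b * np.exp(2j * np.pi * delta_f * t)) for t in te])
    signal = truth * np.exp(2j * np.pi * 5.0 * te)
    print(decompose(signal, te, psi_hz=5.0, tr_s=tr))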
Alvarellos-González, Alberto; Pazos, Alejandro; Porto-Pazos, Ana B.
2012-01-01
The importance of astrocytes, one part of the glial system, for information processing in the brain has recently been demonstrated. Regarding information processing in multilayer connectionist systems, it has been shown that systems which include artificial neurons and astrocytes (Artificial Neuron-Glia Networks) have well-known advantages over identical systems including only artificial neurons. Since the actual impact of astrocytes in neural network function is unknown, we have investigated, using computational models, different astrocyte-neuron interactions for information processing; different neuron-glia algorithms have been implemented for training and validation of multilayer Artificial Neuron-Glia Networks oriented toward classification problem resolution. The results of the tests performed suggest that all the algorithms modelling astrocyte-induced synaptic potentiation improved artificial neural network performance, but their efficacy depended on the complexity of the problem. PMID:22649480
Methodologies, Models and Algorithms for Patients Rehabilitation.
Fardoun, H M; Mashat, A S
2016-01-01
This editorial is part of the Focus Theme of Methods of Information in Medicine on "Methodologies, Models and Algorithms for Patients Rehabilitation". The objective of this focus theme is to present current solutions by means of technologies and human factors related to the use of Information and Communication Technologies (ICT) for improving patient rehabilitation. The focus theme examines distinctive measurements of strengthening methodologies, models and algorithms for disabled people in terms of rehabilitation and health care, and explores the extent to which ICT is a useful tool in this process. The focus theme records a set of solutions for ICT systems developed to improve the rehabilitation process of disabled people and to help them in carrying out their daily lives. The development and subsequent setting up of computers for the patients' rehabilitation process is of continuous interest and growth.
Alvarellos-González, Alberto; Pazos, Alejandro; Porto-Pazos, Ana B
2012-01-01
The importance of astrocytes, one part of the glial system, for information processing in the brain has recently been demonstrated. Regarding information processing in multilayer connectionist systems, it has been shown that systems which include artificial neurons and astrocytes (Artificial Neuron-Glia Networks) have well-known advantages over identical systems including only artificial neurons. Since the actual impact of astrocytes in neural network function is unknown, we have investigated, using computational models, different astrocyte-neuron interactions for information processing; different neuron-glia algorithms have been implemented for training and validation of multilayer Artificial Neuron-Glia Networks oriented toward classification problem resolution. The results of the tests performed suggest that all the algorithms modelling astrocyte-induced synaptic potentiation improved artificial neural network performance, but their efficacy depended on the complexity of the problem.
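As a rough illustration of astrocyte-induced synaptic potentiation in an artificial network, the sketch below adds a simple glial rule to a feedforward layer: a neuron that stays strongly active for several consecutive presentations has its incoming weights boosted. The rule, thresholds and potentiation factor are invented for this example and do not reproduce the neuron-glia algorithms evaluated in the study.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class NeuronGliaLayer:
        """Hidden layer whose weights can be potentiated by a simple astrocyte rule:
        if a neuron stays strongly active for `window` consecutive presentations,
        its incoming weights are multiplied by `boost`. Parameter values are illustrative."""
        def __init__(self, n_in, n_out, window=3, boost=1.25, threshold=0.8):
            self.w = rng.normal(0, 0.5, (n_in, n_out))
            self.window, self.boost, self.threshold = window, boost, threshold
            self.active_count = np.zeros(n_out)

        def forward(self, x):
            a = sigmoid(x @ self.w)
            # astrocyte bookkeeping: count consecutive strong activations per neuron
            strong = a.mean(axis=0) > self.threshold
            self.active_count = np.where(strong, self.active_count + 1, 0)
            potentiate = self.active_count >= self.window
            self.w[:, potentiate] *= self.boost        # astrocyte-induced synaptic potentiation
            self.active_count[potentiate] = 0
            return a

    # toy usage: repeatedly present the same batch and let persistently active units grow
    layer = NeuronGliaLayer(n_in=4, n_out=5)
    x = rng.normal(0, 1, (8, 4))
    for step in range(10):
        out = layer.forward(x)
    print("mean |w| after glial potentiation:", np.abs(layer.w).mean().round(3))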
An intraorganizational model for developing and spreading quality improvement innovations.
Kellogg, Katherine C; Gainer, Lindsay A; Allen, Adrienne S; O'Sullivan, Tatum; Singer, Sara J
Recent policy reforms encourage quality improvement (QI) innovations in primary care, but practitioners lack clear guidance regarding spread inside organizations. We designed this study to identify how large organizations can facilitate intraorganizational spread of QI innovations. We conducted ethnographic observation and interviews in a large, multispecialty, community-based medical group that implemented three QI innovations across 10 primary care sites using a new method for intraorganizational process development and spread. We compared quantitative outcomes achieved through the group's traditional versus new method, created a process model describing the steps in the new method, and identified barriers and facilitators at each step. The medical group achieved substantial improvement using its new method of intraorganizational process development and spread of QI innovations: standard work for rooming and depression screening, vaccine error rates and order compliance, and Pap smear error rates. Our model details nine critical steps for successful intraorganizational process development (set priorities, assess the current state, develop the new process, and measure and refine) and spread (develop support, disseminate information, facilitate peer-to-peer training, reinforce, and learn and adapt). Our results highlight the importance of utilizing preexisting organizational structures such as established communication channels, standardized roles, common workflows, formal authority, and performance measurement and feedback systems when developing and spreading QI processes inside an organization. In particular, we detail how formal process advocate positions in each site for each role can facilitate the spread of new processes. Successful intraorganizational spread is possible and sustainable. Developing and spreading new QI processes across sites inside an organization requires creating a shared understanding of the necessary process steps, considering the barriers that may arise at each step, and leveraging preexisting organizational structures to facilitate intraorganizational process development and spread.
An intraorganizational model for developing and spreading quality improvement innovations
Kellogg, Katherine C.; Gainer, Lindsay A.; Allen, Adrienne S.; O'Sullivan, Tatum; Singer, Sara J.
2017-01-01
Background: Recent policy reforms encourage quality improvement (QI) innovations in primary care, but practitioners lack clear guidance regarding spread inside organizations. Purpose: We designed this study to identify how large organizations can facilitate intraorganizational spread of QI innovations. Methodology/Approach: We conducted ethnographic observation and interviews in a large, multispecialty, community-based medical group that implemented three QI innovations across 10 primary care sites using a new method for intraorganizational process development and spread. We compared quantitative outcomes achieved through the group’s traditional versus new method, created a process model describing the steps in the new method, and identified barriers and facilitators at each step. Findings: The medical group achieved substantial improvement using its new method of intraorganizational process development and spread of QI innovations: standard work for rooming and depression screening, vaccine error rates and order compliance, and Pap smear error rates. Our model details nine critical steps for successful intraorganizational process development (set priorities, assess the current state, develop the new process, and measure and refine) and spread (develop support, disseminate information, facilitate peer-to-peer training, reinforce, and learn and adapt). Our results highlight the importance of utilizing preexisting organizational structures such as established communication channels, standardized roles, common workflows, formal authority, and performance measurement and feedback systems when developing and spreading QI processes inside an organization. In particular, we detail how formal process advocate positions in each site for each role can facilitate the spread of new processes. Practice Implications: Successful intraorganizational spread is possible and sustainable. Developing and spreading new QI processes across sites inside an organization requires creating a shared understanding of the necessary process steps, considering the barriers that may arise at each step, and leveraging preexisting organizational structures to facilitate intraorganizational process development and spread. PMID:27428788
The Use of Modeling-Based Text to Improve Students' Modeling Competencies
ERIC Educational Resources Information Center
Jong, Jing-Ping; Chiu, Mei-Hung; Chung, Shiao-Lan
2015-01-01
This study investigated the effects of a modeling-based text on 10th graders' modeling competencies. Fifteen 10th graders read a researcher-developed modeling-based science text on the ideal gas law that included explicit descriptions and representations of modeling processes (i.e., model selection, model construction, model validation, model…
Synergies Between Grace and Regional Atmospheric Modeling Efforts
NASA Astrophysics Data System (ADS)
Kusche, J.; Springer, A.; Ohlwein, C.; Hartung, K.; Longuevergne, L.; Kollet, S. J.; Keune, J.; Dobslaw, H.; Forootan, E.; Eicker, A.
2014-12-01
In the meteorological community, efforts converge towards implementation of high-resolution (<12 km) data-assimilating regional climate modelling/monitoring systems based on numerical weather prediction (NWP) cores. This is driven by requirements of improving process understanding, better representation of land surface interactions, atmospheric convection, orographic effects, and better forecasting on shorter timescales. This is relevant for the GRACE community since (1) these models may provide improved atmospheric mass separation / de-aliasing and smaller topography-induced errors, compared to global (ECMWF-Op, ERA-Interim) data, (2) they inherit high temporal resolution from NWP models, (3) parallel efforts towards improving the land surface component and coupling groundwater models may provide realistic hydrological mass estimates with sub-diurnal resolution, and (4) parallel efforts towards re-analyses aim at providing consistent time series. (5) On the other hand, GRACE can help validate models and aids in the identification of processes needing improvement. A coupled atmosphere - land surface - groundwater modelling system is currently being implemented for the European CORDEX region at 12.5 km resolution, based on the TerrSysMP platform (COSMO-EU NWP, CLM land surface and ParFlow groundwater models). We report results from Springer et al. (J. Hydromet., accepted) on validating the water cycle in COSMO-EU using GRACE and precipitation, evapotranspiration and runoff data, confirming that the model performs favorably in representing observations. We show that after GRACE-derived bias correction, basin-average hydrological conditions prior to 2002 can be reconstructed better than before. Next, comparing GRACE with CLM forced by EURO-CORDEX simulations allows identifying processes needing improvement in the model. Finally, we compare COSMO-EU atmospheric pressure, a proxy for mass corrections in satellite gravimetry, with ERA-Interim over Europe at timescales shorter/longer than 1 month, and spatial scales below/above ERA resolution. We find differences between the regional and global models to be more pronounced at high frequencies, with magnitudes at sub-grid and larger scales corresponding to 1-3 hPa (1-3 cm EWH); this is relevant for the assessment of post-GRACE concepts.
How can model comparison help improving species distribution models?
Gritti, Emmanuel Stephan; Gaucherel, Cédric; Crespo-Perez, Maria-Veronica; Chuine, Isabelle
2013-01-01
Today, more than ever, robust projections of potential species range shifts are needed to anticipate and mitigate the impacts of climate change on biodiversity and ecosystem services. Such projections are so far provided almost exclusively by correlative species distribution models (correlative SDMs). However, concerns regarding the reliability of their predictive power are growing and several authors call for the development of process-based SDMs. Still, each of these methods presents strengths and weaknesses which have to be estimated if they are to be reliably used by decision makers. In this study we compare projections of three different SDMs (STASH, LPJ and PHENOFIT) that lie in the continuum between correlative models and process-based models for the current distribution of three major European tree species, Fagus sylvatica L., Quercus robur L. and Pinus sylvestris L. We compare the consistency of the model simulations using an innovative comparison map profile method, integrating local and multi-scale comparisons. The three models simulate relatively accurately the current distribution of the three species. The process-based model performs almost as well as the correlative model, although parameters of the former are not fitted to the observed species distributions. According to our simulations, species range limits are triggered, at the European scale, by establishment and survival through processes primarily related to phenology and resistance to abiotic stress rather than to growth efficiency. The accuracy of projections of the hybrid and process-based model could however be improved by integrating a more realistic representation of the species resistance to water stress for instance, advocating for pursuing efforts to understand and formulate explicitly the impact of climatic conditions and variations on these processes.
How Can Model Comparison Help Improving Species Distribution Models?
Gritti, Emmanuel Stephan; Gaucherel, Cédric; Crespo-Perez, Maria-Veronica; Chuine, Isabelle
2013-01-01
Today, more than ever, robust projections of potential species range shifts are needed to anticipate and mitigate the impacts of climate change on biodiversity and ecosystem services. Such projections are so far provided almost exclusively by correlative species distribution models (correlative SDMs). However, concerns regarding the reliability of their predictive power are growing and several authors call for the development of process-based SDMs. Still, each of these methods presents strengths and weaknesses which have to be estimated if they are to be reliably used by decision makers. In this study we compare projections of three different SDMs (STASH, LPJ and PHENOFIT) that lie in the continuum between correlative models and process-based models for the current distribution of three major European tree species, Fagus sylvatica L., Quercus robur L. and Pinus sylvestris L. We compare the consistency of the model simulations using an innovative comparison map profile method, integrating local and multi-scale comparisons. The three models simulate relatively accurately the current distribution of the three species. The process-based model performs almost as well as the correlative model, although parameters of the former are not fitted to the observed species distributions. According to our simulations, species range limits are triggered, at the European scale, by establishment and survival through processes primarily related to phenology and resistance to abiotic stress rather than to growth efficiency. The accuracy of projections of the hybrid and process-based model could however be improved by integrating a more realistic representation of the species resistance to water stress for instance, advocating for pursuing efforts to understand and formulate explicitly the impact of climatic conditions and variations on these processes. PMID:23874779
Experimental Errors in QSAR Modeling Sets: What We Can Do and What We Cannot Do.
Zhao, Linlin; Wang, Wenyi; Sedykh, Alexander; Zhu, Hao
2017-06-30
Numerous chemical data sets have become available for quantitative structure-activity relationship (QSAR) modeling studies. However, the quality of different data sources may be different based on the nature of experimental protocols. Therefore, potential experimental errors in the modeling sets may lead to the development of poor QSAR models and further affect the predictions of new compounds. In this study, we explored the relationship between the ratio of questionable data in the modeling sets, which was obtained by simulating experimental errors, and the QSAR modeling performance. To this end, we used eight data sets (four continuous endpoints and four categorical endpoints) that have been extensively curated both in-house and by our collaborators to create over 1800 various QSAR models. Each data set was duplicated to create several new modeling sets with different ratios of simulated experimental errors (i.e., randomizing the activities of part of the compounds) in the modeling process. A fivefold cross-validation process was used to evaluate the modeling performance, which deteriorates when the ratio of experimental errors increases. All of the resulting models were also used to predict external sets of new compounds, which were excluded at the beginning of the modeling process. The modeling results showed that the compounds with relatively large prediction errors in cross-validation processes are likely to be those with simulated experimental errors. However, after removing a certain number of compounds with large prediction errors in the cross-validation process, the external predictions of new compounds did not show improvement. Our conclusion is that the QSAR predictions, especially consensus predictions, can identify compounds with potential experimental errors. But removing those compounds by the cross-validation procedure is not a reasonable means to improve model predictivity due to overfitting.
Experimental Errors in QSAR Modeling Sets: What We Can Do and What We Cannot Do
2017-01-01
Numerous chemical data sets have become available for quantitative structure–activity relationship (QSAR) modeling studies. However, the quality of different data sources may be different based on the nature of experimental protocols. Therefore, potential experimental errors in the modeling sets may lead to the development of poor QSAR models and further affect the predictions of new compounds. In this study, we explored the relationship between the ratio of questionable data in the modeling sets, which was obtained by simulating experimental errors, and the QSAR modeling performance. To this end, we used eight data sets (four continuous endpoints and four categorical endpoints) that have been extensively curated both in-house and by our collaborators to create over 1800 various QSAR models. Each data set was duplicated to create several new modeling sets with different ratios of simulated experimental errors (i.e., randomizing the activities of part of the compounds) in the modeling process. A fivefold cross-validation process was used to evaluate the modeling performance, which deteriorates when the ratio of experimental errors increases. All of the resulting models were also used to predict external sets of new compounds, which were excluded at the beginning of the modeling process. The modeling results showed that the compounds with relatively large prediction errors in cross-validation processes are likely to be those with simulated experimental errors. However, after removing a certain number of compounds with large prediction errors in the cross-validation process, the external predictions of new compounds did not show improvement. Our conclusion is that the QSAR predictions, especially consensus predictions, can identify compounds with potential experimental errors. But removing those compounds by the cross-validation procedure is not a reasonable means to improve model predictivity due to overfitting. PMID:28691113
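A small, hypothetical re-creation of the error-simulation experiment is sketched below using scikit-learn. The synthetic descriptors, the random-forest learner and the simple flagging rule are illustrative stand-ins for the curated data sets and the 1800+ models used in the study.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_predict, KFold
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(0)

    # synthetic stand-in for a curated modeling set (descriptors X, continuous activities y)
    X = rng.normal(size=(500, 20))
    y = X[:, :5].sum(axis=1) + rng.normal(scale=0.3, size=500)

    def simulate_errors(y, ratio, rng):
        """Randomize the activities of a fraction of compounds to mimic experimental errors."""
        y_noisy = y.copy()
        n_bad = int(ratio * len(y))
        idx = rng.choice(len(y), size=n_bad, replace=False)
        y_noisy[idx] = rng.permutation(y[idx])          # shuffle activities among the "bad" compounds
        return y_noisy, idx

    for ratio in (0.0, 0.1, 0.2, 0.4):
        y_mod, bad_idx = simulate_errors(y, ratio, rng)
        cv = KFold(n_splits=5, shuffle=True, random_state=1)
        pred = cross_val_predict(RandomForestRegressor(n_estimators=200, random_state=1),
                                 X, y_mod, cv=cv)
        err = np.abs(pred - y_mod)
        # are the compounds with large cross-validation errors the ones we corrupted?
        flagged = np.argsort(err)[-len(bad_idx):] if len(bad_idx) else np.array([], dtype=int)
        hit_rate = np.isin(flagged, bad_idx).mean() if len(bad_idx) else float("nan")
        print(f"error ratio {ratio:.0%}: 5-fold CV R2 = {r2_score(y_mod, pred):.2f}, "
              f"fraction of flagged compounds that were corrupted = {hit_rate:.2f}")

As the abstract cautions, a high hit rate in such an experiment supports identifying suspect compounds, but removing them based on cross-validation errors alone risks overfitting rather than genuinely improving external predictivity.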
[Process management in the hospital pharmacy for the improvement of patient safety].
Govindarajan, R; Perelló-Juncá, A; Parès-Marimòn, R M; Serrais-Benavente, J; Ferrandez-Martí, D; Sala-Robinat, R; Camacho-Calvente, A; Campabanal-Prats, C; Solà-Anderiu, I; Sanchez-Caparrós, S; Gonzalez-Estrada, J; Martinez-Olalla, P; Colomer-Palomo, J; Perez-Mañosas, R; Rodríguez-Gallego, D
2013-01-01
To define a process management model for a hospital pharmacy in order to measure, analyse and make continuous improvements in patient safety and healthcare quality. In order to implement process management, Igualada Hospital was divided into different processes, one of which was the Hospital Pharmacy. A multidisciplinary management team was given responsibility for each process. For each sub-process one person was identified to be responsible, and a working group was formed under his/her leadership. With the help of each working group, a risk analysis using failure modes and effects analysis (FMEA) was performed, and the corresponding improvement actions were implemented. Sub-process indicators were also identified, and different process management mechanisms were introduced. The first risk analysis with FMEA produced more than thirty preventive actions to improve patient safety. Later, the weekly analysis of errors, as well as the monthly analysis of key process indicators, permitted us to monitor process results and, as each sub-process manager participated in these meetings, also to assume accountability and responsibility, thus consolidating the culture of excellence. The introduction of different process management mechanisms, with the participation of people responsible for each sub-process, introduces a participative management tool for the continuous improvement of patient safety and healthcare quality. Copyright © 2012 SECA. Published by Elsevier Espana. All rights reserved.
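The FMEA step described above ranks failure modes by a risk priority number (severity x occurrence x detectability) so that improvement actions target the highest-risk items. The short sketch below illustrates the calculation with invented failure modes and scores, not the hospital's actual analysis.

    # Minimal FMEA sketch (hypothetical failure modes and 1-10 scores for severity,
    # occurrence and detectability); improvement actions target the highest RPNs.
    failure_modes = [
        {"mode": "wrong dose transcribed during prescription validation",   "S": 9,  "O": 4, "D": 5},
        {"mode": "look-alike/sound-alike drug selected during dispensing",  "S": 8,  "O": 3, "D": 4},
        {"mode": "IV preparation labelled with the wrong patient",          "S": 10, "O": 2, "D": 3},
        {"mode": "stock-out of a critical medicine",                        "S": 6,  "O": 5, "D": 2},
    ]

    for fm in failure_modes:
        fm["RPN"] = fm["S"] * fm["O"] * fm["D"]        # risk priority number

    for fm in sorted(failure_modes, key=lambda f: f["RPN"], reverse=True):
        print(f'RPN {fm["RPN"]:4d}  {fm["mode"]}')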
NASA Astrophysics Data System (ADS)
He, G.; Zhu, H.; Xu, J.; Gao, K.; Zhu, D.
2017-09-01
Shape bionics is an important aspect of research on bionic robots, and its implementation cannot be separated from the shape modeling and numerical simulation of the bionic object, which are tedious and time-consuming. In order to improve the efficiency of bionic shape design, the feet of animals living in soft soil and swamp environments are taken as bionic objects, and characteristic skeleton curves, section curves, joint rotation variables, positions and other parameters are used to describe the shape and position information of the bionic object's sole, toes and flipper. The geometric model of the bionic object is established using the parameterization of characteristic curves and variables. Based on this, an integration framework of parametric modeling, finite element modeling, dynamic analysis and post-processing of the sinking process in soil is proposed in this paper. Examples of a bionic ostrich foot and a bionic duck foot are also given. The parametric modeling and integration technique enables rapid design improvement based on the bionic object, greatly improves the efficiency and quality of bionic robot foot design, and has practical significance for raising the level of bionic design of the robot foot's shape and structure.
Xue, Lianqing; Yang, Fan; Yang, Changbing; Wei, Guanghui; Li, Wenqian; He, Xinlin
2018-01-11
Understanding the mechanisms of complicated hydrological processes is important for the sustainable management of water resources in an arid area. This paper carried out simulations of water movement for the Manas River Basin (MRB) using an improved semi-distributed topographic hydrologic model (TOPMODEL) with a snowmelt model and a topographic index algorithm. A new algorithm is proposed to calculate the topographic index curve using an internal tangent circle on a conical surface. Building on the traditional model, an improved temperature index that accounts for solar radiation is used to calculate the amount of snowmelt. The uncertainty of the TOPMODEL parameters was analyzed using the generalized likelihood uncertainty estimation (GLUE) method. The proposed model shows that the distribution of the topographic index is concentrated in high mountains, and the accuracy of the runoff simulation is enhanced when radiation is considered. Our results revealed that the performance of the improved TOPMODEL is acceptable and comparable for runoff simulation in the MRB. The uncertainty of the simulations resulted from the model parameters and structure as well as climatic and anthropogenic factors. This study is expected to serve as a valuable complement to the wider application of TOPMODEL and to help identify the mechanisms of hydrological processes in arid areas.
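For orientation, the conventional TOPMODEL topographic index ln(a / tan(beta)) can be computed as in the sketch below. The grids are synthetic, and the paper's new internal-tangent-circle algorithm for the index curve is not reproduced here.

    import numpy as np

    def topographic_index(flow_accum_cells, slope_rad, cell_size):
        """Conventional TOPMODEL topographic index ln(a / tan(beta)).
        flow_accum_cells: upslope contributing area in number of cells
        slope_rad: local slope angle in radians
        cell_size: grid resolution in metres"""
        a = (flow_accum_cells + 1) * cell_size            # specific contributing area per unit contour length
        tan_beta = np.maximum(np.tan(slope_rad), 1e-4)    # avoid division by zero on flat cells
        return np.log(a / tan_beta)

    # hypothetical 4x4 flow-accumulation and slope grids
    rng = np.random.default_rng(3)
    acc = rng.integers(0, 500, (4, 4))
    slope = np.deg2rad(rng.uniform(0.5, 30.0, (4, 4)))
    ti = topographic_index(acc, slope, cell_size=30.0)
    print(ti.round(2))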
Ajeani, Judith; Mangwi Ayiasi, Richard; Tetui, Moses; Ekirapa-Kiracho, Elizabeth; Namazzi, Gertrude; Muhumuza Kananura, Ronald; Namusoke Kiwanuka, Suzanne; Beyeza-Kashesya, Jolly
2017-08-01
There is increasing demand for trainers to shift from traditional didactic training to innovative approaches that are more results-oriented. Mentorship is one such approach that could bridge the clinical knowledge gap among health workers. This paper describes the experiences of an attempt to improve health-worker performance in maternal and newborn health in three rural districts through a mentoring process using the cascade model. The paper further highlights achievements and lessons learnt during implementation of the cascade model. The cascade model started with initial training of health workers from three districts of Pallisa, Kibuku and Kamuli from where potential local mentors were selected for further training and mentorship by central mentors. These local mentors then went on to conduct mentorship visits supported by the external mentors. The mentorship process concentrated on partograph use, newborn resuscitation, prevention and management of Post-Partum Haemorrhage (PPH), including active management of third stage of labour, preeclampsia management and management of the sick newborn. Data for this paper was obtained from key informant interviews with district-level managers and local mentors. Mentorship improved several aspects of health-care delivery, ranging from improved competencies and responsiveness to emergencies and health-worker professionalism. In addition, due to better district leadership for Maternal and Newborn Health (MNH), there were improved supplies/medicine availability, team work and innovative local problem-solving approaches. Health workers were ultimately empowered to perform better. The study demonstrated that it is possible to improve the competencies of frontline health workers through performance enhancement for MNH services using locally built capacity in clinical mentorship for Emergency Obstetric and Newborn Care (EmONC). The cascade mentoring process needed strong external mentorship support at the start to ensure improved capacity among local mentors to provide mentorship among local district staff.
Clarity versus complexity: land-use modeling as a practical tool for decision-makers
Sohl, Terry L.; Claggett, Peter
2013-01-01
The last decade has seen a remarkable increase in the number of modeling tools available to examine future land-use and land-cover (LULC) change. Integrated modeling frameworks, agent-based models, cellular automata approaches, and other modeling techniques have substantially improved the representation of complex LULC systems, with each method using a different strategy to address complexity. However, despite the development of new and better modeling tools, the use of these tools is limited for actual planning, decision-making, or policy-making purposes. LULC modelers have become very adept at creating tools for modeling LULC change, but complicated models and lack of transparency limit their utility for decision-makers. The complicated nature of many LULC models also makes it impractical or even impossible to perform a rigorous analysis of modeling uncertainty. This paper provides a review of land-cover modeling approaches and the issues caused by the complicated nature of models, and provides suggestions to facilitate the increased use of LULC models by decision-makers and other stakeholders. The utility of LULC models themselves can be improved by 1) providing model code and documentation, 2) using scenario frameworks to frame overall uncertainties, 3) improving methods for generalizing key LULC processes most important to stakeholders, and 4) adopting more rigorous standards for validating models and quantifying uncertainty. Communication with decision-makers and other stakeholders can be improved by increasing stakeholder participation in all stages of the modeling process, increasing the transparency of model structure and uncertainties, and developing user-friendly decision-support systems to bridge the link between LULC science and policy. By considering these options, LULC science will be better positioned to support decision-makers and increase real-world application of LULC modeling results.
MARKOV: A methodology for the solution of infinite time horizon MARKOV decision processes
Williams, B.K.
1988-01-01
Algorithms are described for determining optimal policies for finite state, finite action, infinite discrete time horizon Markov decision processes. Both value-improvement and policy-improvement techniques are used in the algorithms. Computing procedures are also described. The algorithms are appropriate for processes that are either finite or infinite, deterministic or stochastic, discounted or undiscounted, in any meaningful combination of these features. Computing procedures are described in terms of initial data processing, bound improvements, process reduction, and testing and solution. Application of the methodology is illustrated with an example involving natural resource management. Management implications of certain hypothesized relationships between mallard survival and harvest rates are addressed by applying the optimality procedures to mallard population models.
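To make the value-improvement idea concrete, the sketch below runs standard value iteration on a small, invented three-state, two-action harvest-management process. It is a generic textbook implementation, not the MARKOV package's algorithms or the mallard population models from the example.

    import numpy as np

    def value_iteration(P, R, gamma=0.95, tol=1e-8):
        """P[a][s][s'] transition probabilities, R[a][s] expected rewards.
        Returns the optimal value function and a greedy (optimal) stationary policy."""
        n_actions, n_states = R.shape
        V = np.zeros(n_states)
        while True:
            Q = R + gamma * np.einsum("aij,j->ai", P, V)   # action values for every state
            V_new = Q.max(axis=0)
            if np.max(np.abs(V_new - V)) < tol:
                return V_new, Q.argmax(axis=0)
            V = V_new

    # hypothetical 2-action, 3-state resource-management process
    P = np.array([[[0.7, 0.3, 0.0],     # action 0 ("low harvest")
                   [0.1, 0.7, 0.2],
                   [0.0, 0.3, 0.7]],
                  [[0.9, 0.1, 0.0],     # action 1 ("high harvest")
                   [0.4, 0.5, 0.1],
                   [0.1, 0.5, 0.4]]])
    R = np.array([[0.0, 1.0, 2.0],      # reward of each state under action 0
                  [0.5, 1.5, 2.5]])     # higher immediate reward but worse dynamics
    V, policy = value_iteration(P, R)
    print("optimal policy by state:", policy, "values:", V.round(2))

Policy improvement would instead alternate between evaluating a fixed policy (solving the linear system for its value function) and greedily updating the policy against that value function until it no longer changes.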
NASA Astrophysics Data System (ADS)
Rendón, A.; Posada, J. A.; Salazar, J. F.; Mejia, J.; Villegas, J.
2016-12-01
Precipitation in the complex terrain of the tropical Andes of South America can be strongly reduced during El Niño events, with impacts on numerous societally-relevant services, including hydropower generation, the main electricity source in Colombia. Simulating rainfall patterns and behavior in such areas of complex terrain has remained a challenge for regional climate models. Current data products such as ERA-Interim and other reanalysis and modelling products generally fail to correctly represent these processes at the scales that are relevant for them. Here we assess the added value to ERA-Interim by dynamical downscaling using the WRF regional climate model, including a comparison of different cumulus parameterization schemes. We found that WRF improves the representation of precipitation during the dry season of El Niño (DJF) events using a 1996-2014 observation period. Further, we use this improved capability to simulate an extreme deforestation scenario under El Niño conditions for an area in the central Andes of Colombia, where a large proportion of the country's hydropower is generated. Our results suggest that forests dampen the effects of El Niño on precipitation. In synthesis, our results illustrate the utility of regional modelling to improve data sources, as well as their potential for predicting the local-to-regional effects of global-change-type processes in regions with limited data availability.
Optimizing surface finishing processes through the use of novel solvents and systems
NASA Astrophysics Data System (ADS)
Quillen, M.; Holbrook, P.; Moore, J.
2007-03-01
As the semiconductor industry continues to implement the ITRS (International Technology Roadmap for Semiconductors) node targets that go beyond 45 nm [1], the need for improved cleanliness between repeated process steps continues to grow. Wafer cleaning challenges cover many applications such as Cu/low-K integration, where trade-offs must be made between dielectric damage and residue by plasma etching and CMP or moisture uptake by aqueous cleaning products. [2-5] Some surface-sensitive processes use the Marangoni tool design [6] where a conventional solvent such as IPA (isopropanol) combines with water to provide improved physical properties such as reduced contact angle and surface tension. This paper introduces the use of alternative solvents and their mixtures compared to pure IPA in removing ionics, moisture, and particles using immersion bench-chemistry models of various processes. A novel Eastman proprietary solvent, Eastman methyl acetate, is observed to provide improvements in ionic removal, moisture capture, and particle removal, as compared to conventional IPA. [7] These benefits may be improved relative to pure IPA simply by the addition of various additives. Some physical properties of the mixtures were found to be relatively unchanged even as measured performance improved. This report presents our attempts to cite and optimize these benefits through the use of laboratory models.
Juckem, Paul F.; Clark, Brian R.; Feinstein, Daniel T.
2017-05-04
The U.S. Geological Survey, National Water-Quality Assessment seeks to map estimated intrinsic susceptibility of the glacial aquifer system of the conterminous United States. Improved understanding of the hydrogeologic characteristics that explain spatial patterns of intrinsic susceptibility, commonly inferred from estimates of groundwater age distributions, is sought so that methods used for the estimation process are properly equipped. An important step beyond identifying relevant hydrogeologic datasets, such as glacial geology maps, is to evaluate how incorporation of these resources into process-based models using differing levels of detail could affect resulting simulations of groundwater age distributions and, thus, estimates of intrinsic susceptibility. This report describes the construction and calibration of three groundwater-flow models of northeastern Wisconsin that were developed with differing levels of complexity to provide a framework for subsequent evaluations of the effects of process-based model complexity on estimations of groundwater age distributions for withdrawal wells and streams. Preliminary assessments, which focused on the effects of model complexity on simulated water levels and base flows in the glacial aquifer system, illustrate that simulation of vertical gradients using multiple model layers improves simulated heads more in low-permeability units than in high-permeability units. Moreover, simulation of heterogeneous hydraulic conductivity fields in coarse-grained and some fine-grained glacial materials produced a larger improvement in simulated water levels in the glacial aquifer system compared with simulation of uniform hydraulic conductivity within zones. The relation between base flows and model complexity was less clear; however, the relation generally seemed to follow a similar pattern to water levels. Although increased model complexity resulted in improved calibrations, future application of the models using simulated particle tracking is anticipated to evaluate whether these model design considerations are similarly important for understanding the primary modeling objective - to simulate reasonable groundwater age distributions.
Polarimetric radar and aircraft observations of saggy bright bands during MC3E
Matthew R. Kumjian; Giangrande, Scott E.; Mishra, Subashree; ...
2016-03-19
Polarimetric radar observations are increasingly used to understand cloud microphysical processes, which is critical for improving their representation in cloud and climate models. In particular, there has been recent focus on improving representations of ice collection processes (e.g., aggregation, riming), as these influence precipitation rate, heating profiles, and ultimately cloud life cycles. However, distinguishing these processes using conventional polarimetric radar observations is difficult, as they produce similar fingerprints. This necessitates improved analysis techniques and integration of complementary data sources. The Midlatitude Continental Convective Clouds Experiment (MC3E) provided such an opportunity.
McCormack, M. Luke; Guo, Dali; Iversen, Colleen M.; ...
2017-03-13
Trait-based approaches provide a useful framework to investigate plant strategies for resource acquisition, growth, and competition, as well as plant impacts on ecosystem processes. Despite significant progress capturing trait variation within and among stems and leaves, identification of trait syndromes within fine-root systems and between fine roots and other plant organs is limited. Here we discuss three underappreciated areas where focused measurements of fine-root traits can make significant contributions to ecosystem science. These include assessment of spatiotemporal variation in fine-root traits, integration of mycorrhizal fungi into fine-root-trait frameworks, and the need for improved scaling of traits measured on individual roots to ecosystem-level processes. Progress in each of these areas is providing opportunities to revisit how below-ground processes are represented in terrestrial biosphere models. Targeted measurements of fine-root traits with clear linkages to ecosystem processes and plant responses to environmental change are strongly needed to reduce empirical and model uncertainties. Further identifying how and when suites of root and whole-plant traits are coordinated or decoupled will ultimately provide a powerful tool for modeling plant form and function at local and global scales.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCormack, M. Luke; Guo, Dali; Iversen, Colleen M.
Trait-based approaches provide a useful framework to investigate plant strategies for resource acquisition, growth, and competition, as well as plant impacts on ecosystem processes. Despite significant progress capturing trait variation within and among stems and leaves, identification of trait syndromes within fine-root systems and between fine roots and other plant organs is limited. Here we discuss three underappreciated areas where focused measurements of fine-root traits can make significant contributions to ecosystem science. These include assessment of spatiotemporal variation in fine-root traits, integration of mycorrhizal fungi into fine-root-trait frameworks, and the need for improved scaling of traits measured on individual roots to ecosystem-level processes. Progress in each of these areas is providing opportunities to revisit how below-ground processes are represented in terrestrial biosphere models. Targeted measurements of fine-root traits with clear linkages to ecosystem processes and plant responses to environmental change are strongly needed to reduce empirical and model uncertainties. Further identifying how and when suites of root and whole-plant traits are coordinated or decoupled will ultimately provide a powerful tool for modeling plant form and function at local and global scales.
Constituents Make the Difference: Improving the Value of Rehabilitation Research.
ERIC Educational Resources Information Center
Menz, Fredrick E.
The participatory research model used by the Rehabilitation Research and Training Center at the University of Wisconsin-Stout is discussed, with a focus on the value added to the research process and relevance of research applications when research is rehabilitation-need based and the research-to-applications process model is used. Information is…
NASA Astrophysics Data System (ADS)
Yang, J.; Zammit, C.; McMillan, H. K.
2016-12-01
As in most countries worldwide, water management in lowland areas is a major concern for New Zealand due to its economic importance for water-related human activities. As a result, the estimation of available water resources in these areas (e.g., for irrigation and water supply purposes) is crucial and often requires an understanding of complex hydrological processes, which are often characterized by strong interactions between surface water and groundwater (usually expressed as losing and gaining rivers). These processes are often represented and simulated using integrated physically based hydrological models. However, models with physically based groundwater modules typically require large amounts of geologic and aquifer information that are not readily available, and are computationally intensive. Instead, this paper presents a conceptual groundwater model that is fully integrated into New Zealand's national hydrological model TopNet based on TopModel concepts (Beven, 1992). Within this conceptual framework, the integrated model can simulate not only surface processes, but also groundwater processes and surface water-groundwater interaction processes (including groundwater flow, river-groundwater interaction, and groundwater interaction with external watersheds). The developed model was applied to two New Zealand catchments with different hydro-geological and climate characteristics (Pareora catchment in the Canterbury Plains and Grey catchment on the West Coast). Previous studies have documented strong interactions between the river and groundwater, based on the analysis of a large number of concurrent flow measurements and associated information along the river main stem. Application of the integrated hydrological model indicates that flow simulations during low flow conditions are significantly improved (compared to the original hydrological model conceptualisation) and that further insights into local river dynamics are gained. Due to its conceptual characteristics and low data requirements, the integrated model could be used at local and national scales to improve the simulation of hydrological processes in non-topographically driven areas (where groundwater processes are important), and to assess the impact of climate change on the integrated hydrological cycle in these areas.
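A minimal sketch of the kind of conceptual groundwater store described above is given below: a single linear reservoir that receives recharge and exchanges water with the river in both directions (losing and gaining reaches). The time step, parameters and storage-to-head conversion are invented for illustration and are not the TopNet implementation.

    # Daily water balance of a single conceptual groundwater store (all units arbitrary).
    def simulate(recharge_series, river_stage, k_gw=0.02, k_exchange=0.05,
                 head_per_storage=0.01, s0=100.0):
        storage, baseflow, exchange = s0, [], []
        for recharge, stage in zip(recharge_series, river_stage):
            storage += recharge                                   # recharge from the soil zone
            gw_head = head_per_storage * storage                  # crude storage-to-head conversion
            q_ex = k_exchange * (stage - gw_head)                 # > 0: losing river feeds the aquifer
            q_base = k_gw * storage                               # groundwater discharge (gaining river)
            storage += q_ex - q_base
            baseflow.append(q_base)
            exchange.append(q_ex)
        return baseflow, exchange

    baseflow, exchange = simulate([2.0, 0.0, 0.0, 5.0, 1.0], [1.2, 1.1, 1.0, 1.4, 1.3])
    print([round(q, 2) for q in baseflow])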
Expert models and modeling processes associated with a computer-modeling tool
NASA Astrophysics Data System (ADS)
Zhang, Baohui; Liu, Xiufeng; Krajcik, Joseph S.
2006-07-01
Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using a think-aloud technique and video recording, we captured their on-screen modeling activities and thinking processes. We also interviewed them the day following their modeling sessions to further probe the rationale behind their modeling practices. We analyzed both the audio-video transcripts and the experts' models. We found that the experts' modeling processes followed the linear sequence built into the modeling program, with few instances of moving back and forth. They specified their goals up front and spent a long time thinking through an entire model before acting. They specified relationships with accurate and convincing evidence. Factors (i.e., variables) in expert models were clustered and represented by specialized technical terms. Based on the above findings, we made suggestions for improving model-based science teaching and learning using Model-It.
Statistical prediction with Kanerva's sparse distributed memory
NASA Technical Reports Server (NTRS)
Rogers, David
1989-01-01
A new viewpoint of the processing performed by Kanerva's sparse distributed memory (SDM) is presented. In conditions of near- or over-capacity, where the associative-memory behavior of the model breaks down, the processing performed by the model can be interpreted as that of a statistical predictor. Mathematical results are presented which serve as the framework for a new statistical viewpoint of sparse distributed memory and for which the standard formulation of SDM is a special case. This viewpoint suggests possible enhancements to the SDM model, including a procedure for improving the predictiveness of the system based on Holland's work with genetic algorithms, and a method for improving the capacity of SDM even when used as an associative memory.
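As background to the abstract, a minimal sketch of the associative-memory core of a sparse distributed memory (random hard-location addresses, Hamming-radius activation, counter summation) is given below; the word length, number of locations, and activation radius are illustrative, and the statistical-predictor reinterpretation itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, RADIUS = 256, 2000, 111   # word length, hard locations, activation radius (illustrative)

addresses = rng.integers(0, 2, size=(M, N))  # fixed random hard-location addresses
counters = np.zeros((M, N), dtype=int)       # one counter per bit per location

def activate(addr):
    # Hard locations within the Hamming radius of the address are activated
    dist = np.count_nonzero(addresses != addr, axis=1)
    return dist <= RADIUS

def write(addr, data):
    sel = activate(addr)
    counters[sel] += np.where(data == 1, 1, -1)   # increment for 1s, decrement for 0s

def read(addr):
    sel = activate(addr)
    return (counters[sel].sum(axis=0) >= 0).astype(int)  # majority vote over counters

pattern = rng.integers(0, 2, size=N)
write(pattern, pattern)                          # autoassociative store
noisy = pattern.copy(); noisy[:20] ^= 1          # flip 20 bits of the cue
print(np.count_nonzero(read(noisy) != pattern))  # number of recall errors
```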
Adaptive segmentation of cerebrovascular tree in time-of-flight magnetic resonance angiography.
Hao, J T; Li, M L; Tang, F L
2008-01-01
Accurate segmentation of the human vasculature is an important prerequisite for a number of clinical procedures, such as diagnosis, image-guided neurosurgery and pre-surgical planning. In this paper, an improved statistical approach to extracting the whole cerebrovascular tree in time-of-flight magnetic resonance angiography is proposed. Firstly, in order to obtain a more accurate segmentation result, a localized observation model is proposed instead of defining the observation model over the entire dataset. Secondly, for the binary segmentation, an improved Iterative Conditional Model (ICM) algorithm is presented to accelerate the segmentation process. The experimental results showed that the proposed algorithm obtains more satisfactory segmentation results while requiring less processing time than conventional approaches.
Development of an Improved Simulator for Chemical and Microbial EOR Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, Gary A.; Sepehrnoori, Kamy; Delshad, Mojdeh
2000-09-11
The objective of this research was to extend the capability of an existing simulator (UTCHEM) to improved oil recovery methods that use surfactants, polymers, gels, alkaline chemicals, microorganisms and foam, as well as various combinations of these, in both conventional and naturally fractured oil reservoirs. Task 1 is the addition of a dual-porosity model for chemical improved oil recovery processes in naturally fractured oil reservoirs. Task 2 is the addition of a foam model. Task 3 addresses several numerical and coding enhancements that will greatly improve the versatility and performance of UTCHEM. Task 4 is the enhancement of physical property models.
Parameter estimation for terrain modeling from gradient data. [navigation system for Martian rover
NASA Technical Reports Server (NTRS)
Dangelo, K. R.
1974-01-01
A method is developed for modeling terrain surfaces for use on an unmanned Martian roving vehicle. The modeling procedure employs a two-step process which uses gradient as well as height data in order to improve the accuracy of the model's gradient. Least squares approximation is used to stochastically determine the parameters which describe the modeled surface. A complete error analysis of the modeling procedure is included which determines the effect of instrumental measurement errors on the model's accuracy. Computer simulation is used as a means of testing the entire modeling process, which includes the acquisition of data points, the two-step modeling process and the error analysis. Finally, to illustrate the procedure, a numerical example is included.
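The report's two-step procedure is not reproduced here; the sketch below only illustrates the core idea of fitting a surface to both height and gradient observations by stacking them into a single least squares problem (the quadratic surface form is an assumption for illustration).

```python
import numpy as np

def fit_quadratic_surface(x, y, z, zx, zy):
    """Fit z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2 by least squares,
    using height samples (z) and gradient samples (zx = dz/dx, zy = dz/dy).
    Illustrative only; not the two-step procedure of the original report."""
    A_h = np.column_stack([np.ones_like(x), x, y, x**2, x*y, y**2])        # height rows
    A_gx = np.column_stack([np.zeros_like(x), np.ones_like(x), np.zeros_like(x),
                            2*x, y, np.zeros_like(x)])                     # d/dx rows
    A_gy = np.column_stack([np.zeros_like(x), np.zeros_like(x), np.ones_like(x),
                            np.zeros_like(x), x, 2*y])                     # d/dy rows
    A = np.vstack([A_h, A_gx, A_gy])
    b = np.concatenate([z, zx, zy])
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs

# Synthetic check: true surface z = 1 + 0.5x - 0.2y + 0.1x^2
rng = np.random.default_rng(1)
x, y = rng.uniform(-5, 5, 50), rng.uniform(-5, 5, 50)
z = 1 + 0.5*x - 0.2*y + 0.1*x**2
zx, zy = 0.5 + 0.2*x, np.full_like(x, -0.2)
print(fit_quadratic_surface(x, y, z, zx, zy).round(3))
```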
Global Sensitivity Analysis for Process Identification under Model Uncertainty
NASA Astrophysics Data System (ADS)
Ye, M.; Dai, H.; Walker, A. P.; Shi, L.; Yang, J.
2015-12-01
The environmental system consists of various physical, chemical, and biological processes, and environmental models are always built to simulate these processes and their interactions. For model building, improvement, and validation, it is necessary to identify important processes so that limited resources can be used to better characterize the processes. While global sensitivity analysis has been widely used to identify important processes, the process identification is always based on deterministic process conceptualization that uses a single model for representing a process. However, environmental systems are complex, and it happens often that a single process may be simulated by multiple alternative models. Ignoring the model uncertainty in process identification may lead to biased identification in that identified important processes may not be so in the real world. This study addresses this problem by developing a new method of global sensitivity analysis for process identification. The new method is based on the concept of Sobol sensitivity analysis and model averaging. Similar to the Sobol sensitivity analysis to identify important parameters, our new method evaluates variance change when a process is fixed at its different conceptualizations. The variance considers both parametric and model uncertainty using the method of model averaging. The method is demonstrated using a synthetic study of groundwater modeling that considers recharge process and parameterization process. Each process has two alternative models. Important processes of groundwater flow and transport are evaluated using our new method. The method is mathematically general, and can be applied to a wide range of environmental problems.
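The abstract does not give the index explicitly. One illustrative way to write a first-order sensitivity index for a process P with alternative conceptualizations M_1, ..., M_K (model-averaging weights w_k, parameters theta_k) is

\[
S_P \;=\; \frac{\operatorname{Var}_P\!\big[\mathbb{E}(\Delta \mid P)\big]}{\operatorname{Var}(\Delta)}
\;=\; \frac{\sum_{k=1}^{K} w_k\,\big(\bar{\Delta}_k - \bar{\Delta}\big)^2}{\operatorname{Var}(\Delta)},
\qquad
\bar{\Delta}_k = \int \Delta(M_k,\theta_k)\,p(\theta_k \mid M_k)\,\mathrm{d}\theta_k,
\qquad
\bar{\Delta} = \sum_{k=1}^{K} w_k\,\bar{\Delta}_k,
\]

where \(\Delta\) is the model output and the total variance in the denominator accounts for both parametric and conceptual-model uncertainty. This is a formulation in the spirit of the abstract, not necessarily the exact definition used in the study.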
[Service quality in health care: the application of the results of marketing research].
Verheggen, F W; Harteloh, P P
1993-01-01
This paper deals with quality assurance in health care and its relation to quality assurance in trade and industry. We present the service quality model--a model of quality from marketing research--and discuss how it can be applied to health care. Traditional quality assurance appears to have serious flaws. It lacks a general theory of the sources of hazards in the complex process of patient care and tends to stagnate, for no real improvement takes place. Departing from this criticism, modern quality assurance in health care is marked by: defining quality in a preferential sense as "fitness for use"; the use of theories and models from trade and industry (process control); an emphasis on analyzing the process, instead of merely inspecting it; use of the Deming problem-solving technique (plan, do, check, act); and improvement of the process of care by altering the perceptions of the parties involved. We describe the application of this method at the University Hospital Maastricht, The Netherlands. The successful application of this model requires a favorable corporate culture and motivated health care workers. This model provides a useful framework for improving upon the traditional approach to quality assurance in health care.
Some Reflections on Strategic Planning in Public Libraries.
ERIC Educational Resources Information Center
Palmour, Vernon E.
1985-01-01
Presents the Public Library Association's planning model for strategic planning in public libraries. The development of the model is explained, the basic steps of the planning process are described, and improvements to the model are suggested. (CLB)
SWARM : a scientific workflow for supporting Bayesian approaches to improve metabolic models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, X.; Stevens, R.; Mathematics and Computer Science
2008-01-01
With the exponential growth of complete genome sequences, the analysis of these sequences is becoming a powerful approach to building genome-scale metabolic models. These models can be used to study individual molecular components and their relationships, and eventually to study cells as systems. However, constructing genome-scale metabolic models manually is time-consuming and labor-intensive. As a result, far fewer genome-scale metabolic models are available compared to the hundreds of available genome sequences. To tackle this problem, we design SWARM, a scientific workflow that can be used to improve genome-scale metabolic models in a high-throughput fashion. SWARM deals with a range of issues including the integration of data across distributed resources, data format conversions, data update, and data provenance. Put together, SWARM streamlines the whole modeling process, which includes extracting data from various resources, deriving training datasets to train a set of predictors and applying Bayesian techniques to assemble the predictors, inferring on the ensemble of predictors to insert missing data, and eventually improving draft metabolic networks automatically. By enhancing metabolic model construction, SWARM enables scientists to generate many genome-scale metabolic models within a short period of time and with less effort.
Prospects for improving the representation of coastal and shelf seas in global ocean models
NASA Astrophysics Data System (ADS)
Holt, Jason; Hyder, Patrick; Ashworth, Mike; Harle, James; Hewitt, Helene T.; Liu, Hedong; New, Adrian L.; Pickles, Stephen; Porter, Andrew; Popova, Ekaterina; Icarus Allen, J.; Siddorn, John; Wood, Richard
2017-02-01
Accurately representing coastal and shelf seas in global ocean models represents one of the grand challenges of Earth system science. They are regions of immense societal importance through the goods and services they provide, hazards they pose and their role in global-scale processes and cycles, e.g. carbon fluxes and dense water formation. However, they are poorly represented in the current generation of global ocean models. In this contribution, we aim to briefly characterise the problem, and then to identify the important physical processes, and their scales, needed to address this issue in the context of the options available to resolve these scales globally and the evolving computational landscape. We find barotropic and topographic scales are well resolved by the current state-of-the-art model resolutions, e.g. nominal 1/12°, and still reasonably well resolved at 1/4°; here, the focus is on process representation. We identify tides, vertical coordinates, river inflows and mixing schemes as four areas where modelling approaches can readily be transferred from regional to global modelling with substantial benefit. In terms of finer-scale processes, we find that a 1/12° global model resolves the first baroclinic Rossby radius for only ˜ 8 % of regions < 500 m deep, but this increases to ˜ 70 % for a 1/72° model, so resolving scales globally requires substantially finer resolution than the current state of the art. We quantify the benefit of improved resolution and process representation using 1/12° global- and basin-scale northern North Atlantic nucleus for a European model of the ocean (NEMO) simulations; the latter includes tides and a k-ɛ vertical mixing scheme. These are compared with global stratification observations and 19 models from CMIP5. In terms of correlation and basin-wide rms error, the high-resolution models outperform all these CMIP5 models. The model with tides shows improved seasonal cycles compared to the high-resolution model without tides. The benefits of resolution are particularly apparent in eastern boundary upwelling zones. To explore the balance between the size of a globally refined model and that of multiscale modelling options (e.g. finite element, finite volume or a two-way nesting approach), we consider a simple scale analysis and a conceptual grid refining approach. We put this analysis in the context of evolving computer systems, discussing model turnaround time, scalability and resource costs. Using a simple cost model compared to a reference configuration (taken to be a 1/4° global model in 2011) and the increasing performance of the UK Research Councils' computer facility, we estimate an unstructured mesh multiscale approach, resolving process scales down to 1.5 km, would use a comparable share of the computer resource by 2021, the two-way nested multiscale approach by 2022, and a 1/72° global model by 2026. However, we also note that a 1/12° global model would not have a comparable computational cost to a 1° global model in 2017 until 2027. Hence, we conclude that for computationally expensive models (e.g. for oceanographic research or operational oceanography), resolving scales to ˜ 1.5 km would be routinely practical in about a decade given substantial effort on numerical and computational development. For complex Earth system models, this extends to about 2 decades, suggesting the focus here needs to be on improved process parameterisation to meet these challenges.
Improving Global Health Education: Development of a Global Health Competency Model
Ablah, Elizabeth; Biberman, Dorothy A.; Weist, Elizabeth M.; Buekens, Pierre; Bentley, Margaret E.; Burke, Donald; Finnegan, John R.; Flahault, Antoine; Frenk, Julio; Gotsch, Audrey R.; Klag, Michael J.; Lopez, Mario Henry Rodriguez; Nasca, Philip; Shortell, Stephen; Spencer, Harrison C.
2014-01-01
Although global health is a recommended content area for the future of education in public health, no standardized global health competency model existed for master-level public health students. Without such a competency model, academic institutions are challenged to ensure that students are able to demonstrate the knowledge, skills, and attitudes (KSAs) needed for successful performance in today's global health workforce. The Association of Schools of Public Health (ASPH) sought to address this need by facilitating the development of a global health competency model through a multistage modified-Delphi process. Practitioners and academic global health experts provided leadership and guidance throughout the competency development process. The resulting product, the Global Health Competency Model 1.1, includes seven domains and 36 competencies. The Global Health Competency Model 1.1 provides a platform for engaging educators, students, and global health employers in discussion of the KSAs needed to improve human health on a global scale. PMID:24445206
NASA Astrophysics Data System (ADS)
Arsenault, Richard; Poissant, Dominique; Brissette, François
2015-11-01
This paper evaluated the effects of parametric reduction of a hydrological model on five regionalization methods and 267 catchments in the province of Quebec, Canada. The Sobol' variance-based sensitivity analysis was used to rank the model parameters by their influence on the model results, and sequential parameter fixing was performed. The reduction in parameter correlations improved parameter identifiability; however, this improvement was found to be minimal and did not carry over to the regionalization setting. It was shown that 11 of the HSAMI model's 23 parameters could be fixed with little or no loss in regionalization skill. The main conclusions were that (1) the conceptual lumped models used in this study did not represent physical processes sufficiently well to warrant parameter reduction for physics-based regionalization methods for the Canadian basins examined and (2) catchment descriptors did not adequately represent the relevant hydrological processes, namely snow accumulation and melt.
The Skill Development Processes of Apprenticeship.
ERIC Educational Resources Information Center
Wolek, Francis W.
1999-01-01
Case studies of apprenticeship in the Japanese tea ceremony, traditional crafts, and strategic thinking illustrate novices' growth in internal knowledge through reflective practice of skilled processes. As skilled experts, adult educators are engaged in continually improving the skilled processes they model. (SK)
Applying a Continuous Quality Improvement Model To Assess Institutional Effectiveness.
ERIC Educational Resources Information Center
Roberts, Keith
This handbook outlines techniques and processes for improving institutional effectiveness and ensuring continuous quality improvement, based on strategic planning activities at Wisconsin's Milwaukee Area Technical College (MATC). First, institutional effectiveness is defined and 17 core indicators of effectiveness developed by the Wisconsin…
Theory of constraints for publicly funded health systems.
Sadat, Somayeh; Carter, Michael W; Golden, Brian
2013-03-01
Originally developed in the context of publicly traded for-profit companies, theory of constraints (TOC) improves system performance through leveraging the constraint(s). While the theory seems to be a natural fit for resource-constrained publicly funded health systems, there is a lack of literature addressing the modifications required to adopt TOC and define the goal and performance measures. This paper develops a system dynamics representation of the classical TOC's system-wide goal and performance measures for publicly traded for-profit companies, which forms the basis for developing a similar model for publicly funded health systems. The model is then expanded to include some of the factors that affect system performance, providing a framework to apply TOC's process of ongoing improvement in publicly funded health systems. Future research is required to more accurately define the factors affecting system performance and populate the model with evidence-based estimates for various parameters in order to use the model to guide TOC's process of ongoing improvement.
The uncertainty of crop yield projections is reduced by improved temperature response functions.
Wang, Enli; Martre, Pierre; Zhao, Zhigan; Ewert, Frank; Maiorano, Andrea; Rötter, Reimund P; Kimball, Bruce A; Ottman, Michael J; Wall, Gerard W; White, Jeffrey W; Reynolds, Matthew P; Alderman, Phillip D; Aggarwal, Pramod K; Anothai, Jakarat; Basso, Bruno; Biernath, Christian; Cammarano, Davide; Challinor, Andrew J; De Sanctis, Giacomo; Doltra, Jordi; Fereres, Elias; Garcia-Vila, Margarita; Gayler, Sebastian; Hoogenboom, Gerrit; Hunt, Leslie A; Izaurralde, Roberto C; Jabloun, Mohamed; Jones, Curtis D; Kersebaum, Kurt C; Koehler, Ann-Kristin; Liu, Leilei; Müller, Christoph; Naresh Kumar, Soora; Nendel, Claas; O'Leary, Garry; Olesen, Jørgen E; Palosuo, Taru; Priesack, Eckart; Eyshi Rezaei, Ehsan; Ripoche, Dominique; Ruane, Alex C; Semenov, Mikhail A; Shcherbak, Iurii; Stöckle, Claudio; Stratonovitch, Pierre; Streck, Thilo; Supit, Iwan; Tao, Fulu; Thorburn, Peter; Waha, Katharina; Wallach, Daniel; Wang, Zhimin; Wolf, Joost; Zhu, Yan; Asseng, Senthold
2017-07-17
Increasing the accuracy of crop productivity estimates is a key element in planning adaptation strategies to ensure global food security under climate change. Process-based crop models are effective means to project climate impact on crop yield, but have large uncertainty in yield simulations. Here, we show that variations in the mathematical functions currently used to simulate temperature responses of physiological processes in 29 wheat models account for >50% of uncertainty in simulated grain yields for mean growing season temperatures from 14 °C to 33 °C. We derived a set of new temperature response functions that when substituted in four wheat models reduced the error in grain yield simulations across seven global sites with different temperature regimes by 19% to 50% (42% average). We anticipate the improved temperature responses to be a key step to improve modelling of crops under rising temperature and climate change, leading to higher skill of crop yield projections.
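The abstract does not reproduce the functional forms. As one example of the kind of cardinal-temperature response such wheat models use (a sketch, not necessarily the functions derived in the study), the Wang-Engel beta function scales a process rate by

\[
f(T) = \frac{2\,(T - T_{\min})^{\alpha}\,(T_{\mathrm{opt}} - T_{\min})^{\alpha} - (T - T_{\min})^{2\alpha}}{(T_{\mathrm{opt}} - T_{\min})^{2\alpha}},
\qquad
\alpha = \frac{\ln 2}{\ln\!\big[(T_{\max} - T_{\min}) / (T_{\mathrm{opt}} - T_{\min})\big]},
\]

for \(T_{\min} \le T \le T_{\max}\) (and \(f(T) = 0\) otherwise), where \(T_{\min}\), \(T_{\mathrm{opt}}\), and \(T_{\max}\) are the cardinal temperatures of the process.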
The Uncertainty of Crop Yield Projections Is Reduced by Improved Temperature Response Functions
NASA Technical Reports Server (NTRS)
Wang, Enli; Martre, Pierre; Zhao, Zhigan; Ewert, Frank; Maiorano, Andrea; Rotter, Reimund P.; Kimball, Bruce A.; Ottman, Michael J.; White, Jeffrey W.; Reynolds, Matthew P.;
2017-01-01
Increasing the accuracy of crop productivity estimates is a key element in planning adaptation strategies to ensure global food security under climate change. Process-based crop models are effective means to project climate impact on crop yield, but have large uncertainty in yield simulations. Here, we show that variations in the mathematical functions currently used to simulate temperature responses of physiological processes in 29 wheat models account for more than 50% of uncertainty in simulated grain yields for mean growing season temperatures from 14 °C to 33 °C. We derived a set of new temperature response functions that when substituted in four wheat models reduced the error in grain yield simulations across seven global sites with different temperature regimes by 19% to 50% (42% average). We anticipate the improved temperature responses to be a key step to improve modelling of crops under rising temperature and climate change, leading to higher skill of crop yield projections.
Computational modeling of soot nucleation
NASA Astrophysics Data System (ADS)
Chung, Seung-Hyun
Recent studies indicate that soot is the second most significant driver of climate change---behind CO2, but ahead of methane---and increased levels of soot particles in the air are linked to health hazards such as heart disease and lung cancer. Within the soot formation process, soot nucleation is the least understood step, and current experimental findings are still limited. This thesis presents computational modeling studies of the major pathways of the soot nucleation process. In this study, two regimes of soot nucleation---chemical growth and physical agglomeration---were evaluated and the results demonstrated that combustion conditions determine the relative importance of these two routes. Also, the dimerization process of polycyclic aromatic hydrocarbons, which has been regarded as one of the most important physical agglomeration processes in soot formation, was carefully examined with a new method for obtaining the nucleation rate using molecular dynamics simulation. The results indicate that the role of pyrene dimerization, which is the commonly accepted model, is expected to be highly dependent on various flame temperature conditions and may not be a key step in the soot nucleation process. An additional pathway, coronene dimerization in this case, needed to be included to improve the match with experimental data. The results of this thesis provide insight on the soot nucleation process and can be utilized to improve current soot formation models.
Deng, Yu; Huang, Zhigang; Wang, Wenbing; Chen, Yinghuai; Guo, Zhongning; Chen, Ying
2017-01-01
Aiming to improve the laser-induced forward transfer (LIFT) cell isolation process, a polydimethylsiloxane (PDMS) layer with micro-hole arrays was employed to improve the cell separation precision, and a microchip with a heater was developed to maintain the working area at 100% humidity and 37°C in order to preserve the viability of the isolated cells. A series of experiments was conducted to verify the contributions of the optimization to the LIFT cell isolation process as well as to study the effect of laser pulse energy, laser spot size and titanium thickness on cell isolation. With a 40 µm laser spot size and a 40 nm thick titanium layer, the laser energy threshold for a 100% single-cell isolation success ratio is 7 µJ. According to the staining images and proliferation ratios, the chip helped to improve cell availability, and the cells recovered from their injuries at least a day earlier compared to the samples processed without the chip. With a Lattice Boltzmann model, the cell isolation process is numerically studied, and it turns out that the micro-hole shifts the isolation process toward a micro-syringe injection mode, leading to a lower laser energy threshold for cell separation and fewer injuries.
A Mixed Kijima Model Using the Weibull-Based Generalized Renewal Processes
2015-01-01
Generalized Renewal Processes are useful for approaching the rejuvenation of dynamical systems resulting from planned or unplanned interventions. We present new perspectives for Generalized Renewal Processes in general and for Weibull-based Generalized Renewal Processes in particular. Departing from the existing literature, we present a mixed Generalized Renewal Process approach involving Kijima Type I and II models, allowing one to infer the impact of distinct interventions on the performance of the system under study. The first and second theoretical moments of this model are introduced, as well as its maximum likelihood estimation and random sampling approaches. In order to illustrate the usefulness of the proposed Weibull-based Generalized Renewal Process model, some real data sets involving improving, stable, and deteriorating systems are used. PMID:26197222
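For reference, the virtual-age recursions behind the Kijima models combined in the mixed approach are, in standard notation (with \(X_i\) the \(i\)-th inter-failure time, \(q\) the rejuvenation parameter, and \(V_i\) the virtual age after the \(i\)-th intervention):

\[
\text{Kijima Type I:}\quad V_i = V_{i-1} + q\,X_i,
\qquad
\text{Kijima Type II:}\quad V_i = q\,(V_{i-1} + X_i),
\]

and, for a Weibull-based GRP with scale \(\eta\) and shape \(\beta\), the next inter-failure time conditional on virtual age \(V_{i-1}\) has distribution

\[
F(x \mid V_{i-1}) = 1 - \exp\!\left[\left(\frac{V_{i-1}}{\eta}\right)^{\beta} - \left(\frac{V_{i-1} + x}{\eta}\right)^{\beta}\right].
\]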
Empowering Oncology Nurses to Lead Change Through a Shared Governance Project.
Gordon, Jeanine N
2016-11-01
Nurses at the bed- or chairside are knowledgeable about clinical and operational concerns that need improvement and, consequently, are in the best position to generate and evaluate practical options and potential solutions to improve efficacy and care processes. Implementation of a shared governance model is effective in engaging staff nurses to make meaningful and sustainable change in patient care processes.
ERIC Educational Resources Information Center
Pietz, Victoria Lynn
2014-01-01
Continuous Quality Improvement (CQI) programs are growing in popularity in higher education settings and a key component is the use of work groups, which require active employee involvement. The problem addressed in this research was the lack of employee engagement in the Quality Review Process (QRP), which is a statewide CQI model developed by…
Application of Ensemble Detection and Analysis to Modeling Uncertainty in Non Stationary Process
NASA Technical Reports Server (NTRS)
Racette, Paul
2010-01-01
Characterization of non-stationary and nonlinear processes is a challenge in many engineering and scientific disciplines. Climate change modeling and projection, retrieving information from Doppler measurements of hydrometeors, and modeling calibration architectures and algorithms in microwave radiometers are example applications that can benefit from improvements in the modeling and analysis of non-stationary processes. Analyses of measured signals have traditionally been limited to a single measurement series. Ensemble Detection is a technique whereby mixing calibrated noise produces an ensemble measurement set. The collection of ensemble data sets enables new methods for analyzing random signals and offers powerful new approaches to studying and analyzing non-stationary processes. Derived information contained in the dynamic stochastic moments of a process will enable many novel applications.
Improving a Spectral Bin Microphysical Scheme Using TRMM Satellite Observations
NASA Technical Reports Server (NTRS)
Li, Xiaowen; Tao, Wei-Kuo; Matsui, Toshihisa; Liu, Chuntao; Masunaga, Hirohiko
2010-01-01
Comparisons between cloud model simulations and observations are crucial in validating model performance and improving the physical processes represented in the model. These modeled physical processes are idealized representations and almost always have large room for improvement. In this study, we use data from two different sensors onboard the TRMM (Tropical Rainfall Measurement Mission) satellite to improve the microphysical scheme in the Goddard Cumulus Ensemble (GCE) model. Mature-stage squall lines observed by TRMM during late spring and early summer in the central US over a 9-year period are compiled and compared with a case simulation by the GCE model. A unique aspect of the GCE model is that it has a state-of-the-art spectral bin microphysical scheme, which uses 33 different bins to represent the particle size distribution of each of the seven hydrometeor species. A forward radiative transfer model calculates TRMM Precipitation Radar (PR) reflectivity and TRMM Microwave Imager (TMI) 85 GHz brightness temperatures from simulated particle size distributions. Comparisons between model outputs and observations reveal that the model overestimates the sizes of snow/aggregates in the stratiform region of the squall line. After adjusting temperature-dependent collection coefficients among ice-phase particles, the PR comparisons become good while the TMI comparisons worsen. Further investigation shows that the partitioning between graupel (a high-density form of aggregate) and snow (a low-density form of aggregate) needs to be adjusted in order to obtain good comparisons in both PR reflectivity and TMI brightness temperature. This study shows that long-term satellite observations, especially those with multiple sensors, can be very useful in constraining model microphysics. It is also the first study to validate and improve a sophisticated spectral bin microphysical scheme against long-term satellite observations.
A Collaborative Team Teaching Model for a MSW Capstone Course.
Moore, Rebecca M; Darby, Kathleen H; Blake, Michelle E
2016-01-01
This exploratory study was embedded in a formative process for the purposes of improving content delivery to an evidence-based practice class, and improving students' performance on a comprehensive exam. A learning and teaching model was utilized by faculty from a three-university collaborative graduate social work program to examine the extent to which course texts and assignments explicitly supported the process, application, and evaluation of evidence-based practices. The model was grounded in a collaborative culture, allowing each faculty to share their collective skills and knowledge across a range of practice settings as they revised the course curriculum. As a result, faculty found they had created a unique community that allowed a wider context for learning and professional development that translated into the classroom. Students enrolled in the revised course across all three universities showed improvement on the comprehensive exam. When faculty themselves invest in collaborative learning and teaching, students benefit.
Process Simulation of Aluminium Sheet Metal Deep Drawing at Elevated Temperatures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winklhofer, Johannes; Trattnig, Gernot; Lind, Christoph
Lightweight design is essential for an economic and environmentally friendly vehicle. Aluminium sheet metal is well known for its ability to improve the strength to weight ratio of lightweight structures. One disadvantage of aluminium is that it is less formable than steel. Therefore complex part geometries can only be realized by expensive multi-step production processes. One method for overcoming this disadvantage is deep drawing at elevated temperatures. In this way the formability of aluminium sheet metal can be improved significantly, and the number of necessary production steps can thereby be reduced. This paper introduces deep drawing of aluminium sheet metal at elevated temperatures, a corresponding simulation method, a characteristic process and its optimization. The temperature and strain rate dependent material properties of a 5xxx series alloy and their modelling are discussed. A three dimensional thermomechanically coupled finite element deep drawing simulation model and its validation are presented. Based on the validated simulation model an optimised process strategy regarding formability, time and cost is introduced.
Wang, Wenqiang
2018-01-01
To exploit the adsorption capacity of commercial powdered activated carbon (PAC) and to improve the efficiency of Cr(VI) removal from aqueous solutions, the adsorption of Cr(VI) by commercial PAC and the countercurrent two-stage adsorption (CTA) process was investigated. Different adsorption kinetics models and isotherms were compared, and the pseudo-second-order model and the Langmuir and Freundlich models fit the experimental data well. The Cr(VI) removal efficiency was >80% and was improved by 37% through the CTA process compared with the conventional single-stage adsorption process when the initial Cr(VI) concentration was 50 mg/L with a PAC dose of 1.250 g/L and a pH of 3. A method for calculating the effluent Cr(VI) concentration and the PAC dose was developed for the CTA process, and the validity of the method was confirmed by a deviation of <5%.
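For reference, the standard forms of the models named in the abstract are (with \(q_t\) the amount adsorbed at time \(t\), \(q_e\) the equilibrium amount, \(C_e\) the equilibrium concentration, and \(k_2\), \(q_m\), \(K_L\), \(K_F\), \(n\) fitted constants):

\[
\frac{\mathrm{d}q_t}{\mathrm{d}t} = k_2\,(q_e - q_t)^2
\;\;\Longrightarrow\;\;
\frac{t}{q_t} = \frac{1}{k_2 q_e^{2}} + \frac{t}{q_e}
\quad\text{(pseudo-second-order kinetics),}
\]
\[
q_e = \frac{q_m K_L C_e}{1 + K_L C_e}
\quad\text{(Langmuir isotherm),}
\qquad
q_e = K_F\,C_e^{1/n}
\quad\text{(Freundlich isotherm).}
\]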
Achieving performance breakthroughs in an HMO business process through quality planning.
Hanan, K B
1993-01-01
Kaiser Permanente's Georgia Region commissioned a quality planning team to design a new process to improve payments to its suppliers and vendors. The result of the team's effort was a 73 percent reduction in cycle time. This team's experiences point to the advantages of process redesign as a quality planning model, as well as some general guidelines for its most effective use in teams. If quality planning project teams are carefully configured, sufficiently expert in the existing process, and properly supported by management, organizations can achieve potentially dramatic improvements in process performance using this approach.
Transforming BIM to BEM: Generation of Building Geometry for the NASA Ames Sustainability Base BIM
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Donnell, James T.; Maile, Tobias; Rose, Cody
Typical processes of whole Building Energy simulation Model (BEM) generation are subjective, labor intensive, time intensive and error prone. Essentially, these typical processes reproduce already existing data, i.e. building models already created by the architect. Accordingly, Lawrence Berkeley National Laboratory (LBNL) developed a semi-automated process that enables reproducible conversions of Building Information Model (BIM) representations of building geometry into a format required by building energy modeling (BEM) tools. This is a generic process that may be applied to all building energy modeling tools but to date has only been used for EnergyPlus. This report describes and demonstrates each stage in the semi-automated process for building geometry using the recently constructed NASA Ames Sustainability Base throughout. This example uses ArchiCAD (Graphisoft, 2012) as the originating CAD tool and EnergyPlus as the concluding whole building energy simulation tool. It is important to note that the process is also applicable for professionals that use other CAD tools such as Revit (“Revit Architecture,” 2012) and DProfiler (Beck Technology, 2012) and can be extended to provide geometry definitions for BEM tools other than EnergyPlus. Geometry Simplification Tool (GST) was used during the NASA Ames project and was the enabling software that facilitated semi-automated data transformations. GST has now been superseded by Space Boundary Tool (SBT-1) and will be referred to as SBT-1 throughout this report. The benefits of this semi-automated process are fourfold: 1) reduce the amount of time and cost required to develop a whole building energy simulation model, 2) enable rapid generation of design alternatives, 3) improve the accuracy of BEMs and 4) result in significantly better performing buildings with significantly lower energy consumption than those created using the traditional design process, especially if the simulation model was used as a predictive benchmark during operation. Developing BIM-based criteria to support the semi-automated process should result in significant, reliable improvements and time savings in the development of BEMs. In order to define successful BIMs, CAD export of IFC-based BIMs for BEM must adhere to a standard Model View Definition (MVD) for simulation as provided by the concept design BIM MVD (buildingSMART, 2011). In order to ensure wide scale adoption, companies would also need to develop their own material libraries to support automated activities and undertake a pilot project to improve understanding of modeling conventions and design tool features and limitations.
Noise shaping in populations of coupled model neurons.
Mar, D J; Chow, C C; Gerstner, W; Adams, R W; Collins, J J
1999-08-31
Biological information-processing systems, such as populations of sensory and motor neurons, may use correlations between the firings of individual elements to obtain lower noise levels and a systemwide performance improvement in the dynamic range or the signal-to-noise ratio. Here, we implement such correlations in networks of coupled integrate-and-fire neurons using inhibitory coupling and demonstrate that this can improve the system dynamic range and the signal-to-noise ratio in a population rate code. The improvement can surpass that expected for simple averaging of uncorrelated elements. A theory that predicts the resulting power spectrum is developed in terms of a stochastic point-process model in which the instantaneous population firing rate is modulated by the coupling between elements.
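A minimal sketch of the kind of model the abstract describes (a population of leaky integrate-and-fire neurons receiving a common signal, independent noise, and global inhibitory coupling) is given below; all parameter values and the exact coupling form are hypothetical, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(2)
N, T, dt = 100, 2.0, 1e-3              # neurons, duration [s], time step [s]
tau, v_th, v_reset = 0.02, 1.0, 0.0    # membrane time constant, threshold, reset
g_inh = 0.15                           # all-to-all inhibitory coupling strength (hypothetical)
steps = int(T / dt)

v = rng.uniform(0, 1, N)               # random initial membrane potentials
rate = np.zeros(steps)                 # instantaneous population firing rate [Hz]

for t in range(steps):
    signal = 1.3 + 0.2 * np.sin(2 * np.pi * 5 * t * dt)  # common slow input signal
    noise = 0.5 * np.sqrt(dt) * rng.normal(0, 1, N)      # independent Euler-Maruyama noise
    spikes = v >= v_th                                    # detect and reset spiking neurons
    v[spikes] = v_reset
    rate[t] = spikes.sum() / (N * dt)
    inhibition = g_inh * spikes.sum() / N                 # inhibitory feedback from last step's spikes
    v += dt / tau * (signal - v) + noise - inhibition

print("mean population rate [Hz]:", round(float(rate.mean()), 1))
```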
A Model to Translate Evidence-Based Interventions Into Community Practice
Christiansen, Ann L.; Peterson, Donna J.; Guse, Clare E.; Maurana, Cheryl A.; Brandenburg, Terry
2012-01-01
There is a tension between 2 alternative approaches to implementing community-based interventions. The evidence-based public health movement emphasizes the scientific basis of prevention by disseminating rigorously evaluated interventions from academic and governmental agencies to local communities. Models used by local health departments to incorporate community input into their planning, such as the community health improvement process (CHIP), emphasize community leadership in identifying health problems and developing and implementing health improvement strategies. Each approach has limitations. Modifying CHIP to formally include consideration of evidence-based interventions in both the planning and evaluation phases leads to an evidence-driven community health improvement process that can serve as a useful framework for uniting the different approaches while emphasizing community ownership, priorities, and wisdom. PMID:22397341
NASA Astrophysics Data System (ADS)
Falconi, Stefanie M.; Palmer, Richard N.
2017-02-01
Increased requirements for public involvement in water resources management (WRM) over the past century have stimulated the development of more collaborative decision-making methods. Participatory modeling (PM) uses computer models to inform and engage stakeholders in the planning process in order to influence collaborative decisions in WRM. Past evaluations of participatory models focused on process and final outcomes, yet were hindered by diversity of purpose and inconsistent documentation. This paper presents a two-stage framework for evaluating PM based on mechanisms for improving model effectiveness as participatory tools. Five dimensions characterize the "who, when, how, and why" of each participatory effort (stage 1). Models are evaluated as "boundary objects," a concept used to describe tools that bridge understanding and translate different bodies of knowledge to improve credibility, salience, and legitimacy (stage 2). This evaluation framework is applied to five existing case studies from the literature. Though the goals of participation can be diverse, the novel contribution of the proposed two-stage framework is the flexibility it has to evaluate a wide range of cases that differ in scope, modeling approach, and participatory context. Also, the evaluation criteria provide a structured vocabulary based on clear mechanisms that extend beyond previous process-based and outcome-based evaluations. Effective models are those that take advantage of mechanisms that facilitate dialogue and resolution and improve the accessibility and applicability of technical knowledge. Furthermore, the framework can help build more complete records and systematic documentation of evidence to help standardize the field of PM.
NASA Astrophysics Data System (ADS)
Val Martin, M.; Heald, C. L.; Arnold, S. R.
2014-04-01
Dry deposition is an important removal process controlling surface ozone. We examine the representation of this ozone loss mechanism in the Community Earth System Model. We first correct the dry deposition parameterization by coupling the leaf and stomatal vegetation resistances to the leaf area index, an omission which has adversely impacted over a decade of ozone simulations using both the Model for Ozone and Related chemical Tracers (MOZART) and Community Atmospheric Model-Chem (CAM-Chem) global models. We show that this correction increases O3 dry deposition velocities over vegetated regions and improves the simulated seasonality in this loss process. This enhanced removal reduces the previously reported bias in summertime surface O3 simulated over eastern U.S. and Europe. We further optimize the parameterization by scaling down the stomatal resistance used in the Community Land Model to observed values. This in turn further improves the simulation of dry deposition velocity of O3, particularly over broadleaf forested regions. The summertime surface O3 bias is reduced from 30 ppb to 14 ppb over eastern U.S. and 13 ppb to 5 ppb over Europe from the standard to the optimized scheme, respectively. O3 deposition processes must therefore be accurately coupled to vegetation phenology within 3-D atmospheric models, as a first step toward improving surface O3 and simulating O3 responses to future and past vegetation changes.
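In the resistance-in-series framework such schemes use, the dry deposition velocity and the LAI scaling discussed in the abstract can be written, illustratively (not the exact CLM or Wesely formulation), as

\[
v_d = \frac{1}{r_a + r_b + r_c},
\qquad
\frac{1}{r_c} = \frac{\mathrm{LAI}}{r_s + r_m} + \frac{1}{r_{\mathrm{cut}}} + \frac{1}{r_{\mathrm{ac}} + r_{\mathrm{soil}}},
\]

where \(r_a\) is the aerodynamic resistance, \(r_b\) the quasi-laminar boundary-layer resistance, \(r_c\) the bulk canopy (surface) resistance, \(r_s\) and \(r_m\) the leaf-level stomatal and mesophyll resistances (so coupling \(r_s\) to LAI directly rescales the canopy-level stomatal pathway), \(r_{\mathrm{cut}}\) the cuticular resistance, and \(r_{\mathrm{ac}}\), \(r_{\mathrm{soil}}\) the in-canopy and ground resistances.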
2013-09-01
processes used in space system acquisitions, simply implementing a data exchange specification would not fundamentally improve how information is... and manage the configuration of all critical program models, processes, and tools used throughout the DoD. Second, mandate a data exchange
Meteorological data-processing package
NASA Technical Reports Server (NTRS)
Billingsly, J. B.; Braken, P. A.
1979-01-01
METPAK, a meteorological data-processing package for satellite data used to develop cloud-tracking maps, is described. The data can be used to develop and enhance numerical prediction models for mesoscale phenomena and to improve the ability to detect and predict storms.
Wang, Kefeng; Peng, Changhui; Zhu, Qiuan; ...
2017-09-28
Microbial physiology plays a critical role in the biogeochemical cycles of the Earth system. However, most traditional soil carbon models are lacking in terms of the representation of key microbial processes that control the soil carbon response to global climate change. In this study, the improved process-based model TRIPLEX-GHG was developed by coupling it with the new MEND (Microbial-ENzyme-mediated Decomposition) model to estimate total global soil organic carbon (SOC) and global soil microbial carbon. The new model (TRIPLEX-MICROBE) shows considerable improvement over the previous version (TRIPLEX-GHG) in simulating SOC. We estimated the global soil carbon stock to be approximately 1195 Pg C, with 348 Pg C located in the high northern latitudes, which is in good agreement with the well-regarded Harmonized World Soil Database (HWSD) and the Northern Circumpolar Soil Carbon Database (NCSCD). We also estimated the global soil microbial carbon to be 21 Pg C, similar to the 23 Pg C estimated by Xu et al. (2014). We found that the microbial carbon quantity in the latitudinal direction showed reversions at approximately 30°N, near the equator and at 25°S. A sensitivity analysis suggested that the tundra ecosystem exhibited the highest sensitivity to a 1°C increase or decrease in temperature in terms of dissolved organic carbon (DOC), microbial biomass carbon (MBC) and mineral-associated organic carbon (MOC). Furthermore, our work represents the first step towards a new generation of ecosystem process models capable of integrating key microbial processes into soil carbon cycles.
NASA Astrophysics Data System (ADS)
Wang, Kefeng; Peng, Changhui; Zhu, Qiuan; Zhou, Xiaolu; Wang, Meng; Zhang, Kerou; Wang, Gangsheng
2017-10-01
Microbial physiology plays a critical role in the biogeochemical cycles of the Earth system. However, most traditional soil carbon models are lacking in terms of the representation of key microbial processes that control the soil carbon response to global climate change. In this study, the improved process-based model TRIPLEX-GHG was developed by coupling it with the new MEND (Microbial-ENzyme-mediated Decomposition) model to estimate total global soil organic carbon (SOC) and global soil microbial carbon. The new model (TRIPLEX-MICROBE) shows considerable improvement over the previous version (TRIPLEX-GHG) in simulating SOC. We estimated the global soil carbon stock to be approximately 1195 Pg C, with 348 Pg C located in the high northern latitudes, which is in good agreement with the well-regarded Harmonized World Soil Database (HWSD) and the Northern Circumpolar Soil Carbon Database (NCSCD). We also estimated the global soil microbial carbon to be 21 Pg C, similar to the 23 Pg C estimated by Xu et al. (2014). We found that the microbial carbon quantity in the latitudinal direction showed reversions at approximately 30°N, near the equator and at 25°S. A sensitivity analysis suggested that the tundra ecosystem exhibited the highest sensitivity to a 1°C increase or decrease in temperature in terms of dissolved organic carbon (DOC), microbial biomass carbon (MBC), and mineral-associated organic carbon (MOC). However, our work represents the first step toward a new generation of ecosystem process models capable of integrating key microbial processes into soil carbon cycles.
Simulation models and designs for advanced Fischer-Tropsch technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choi, G.N.; Kramer, S.J.; Tam, S.S.
1995-12-31
Process designs and economics were developed for three grass-roots indirect Fischer-Tropsch coal liquefaction facilities. A baseline and an alternate upgrading design were developed for a mine-mouth plant located in southern Illinois using Illinois No. 6 coal, and one for a mine-mouth plant located in Wyoming using Powder River Basin coal. The alternate design used close-coupled ZSM-5 reactors to upgrade the vapor stream leaving the Fischer-Tropsch reactor. ASPEN process simulation models were developed for all three designs. These results have been reported previously. In this study, the ASPEN process simulation model was enhanced to improve the vapor/liquid equilibrium calculations for the products leaving the slurry bed Fischer-Tropsch reactors. This significantly improved the predictions for the alternate ZSM-5 upgrading design. Another model was developed for the Wyoming coal case using ZSM-5 upgrading of the Fischer-Tropsch reactor vapors. To date, this is the best indirect coal liquefaction case. Sensitivity studies showed that additional cost reductions are possible.
NASA Astrophysics Data System (ADS)
Ghafuri, Mohazabeh; Golfar, Bahareh; Nosrati, Mohsen; Hoseinkhani, Saman
2014-12-01
The process of ATP production is one of the most vital processes in living cells and occurs with high efficiency. Thermodynamic evaluation of this process and of the factors involved in oxidative phosphorylation can provide a valuable guide for increasing the energy production efficiency in research and industry. Although energy transduction has been studied qualitatively in several studies, there are only a few brief reviews based on mathematical models of this subject. In our previous work, we suggested a mathematical model for ATP production based on non-equilibrium thermodynamic principles. In the present study, based on the new discoveries on the respiratory chain of animal mitochondria, Golfar's model has been used to generate improved results for the efficiency of oxidative phosphorylation and the rate of energy loss. The results calculated from the modified coefficients for the proton pumps of the respiratory chain enzymes are closer to the experimental results and validate the model.
NASA Astrophysics Data System (ADS)
Bytheway, Janice L.
Forecast models have seen vast improvements in recent years, via increased spatial and temporal resolution, rapid updating, assimilation of more observational data, and continued development and improvement of the representation of the atmosphere. One such model is the High Resolution Rapid Refresh (HRRR) model, a 3 km, hourly-updated, convection-allowing model that has been in development since 2010 and running operationally over the contiguous US since 2014. In 2013, the HRRR became the only US model to assimilate radar reflectivity via diabatic assimilation, a process in which the observed reflectivity is used to induce a latent heating perturbation in the model initial state in order to produce precipitation in those areas where it is indicated by the radar. In order to support the continued development and improvement of the HRRR model with regard to forecasts of convective precipitation, the concept of an assessment is introduced. The assessment process aims to connect model output with observations by first validating model performance then attempting to connect that performance to model assumptions, parameterizations and processes to identify areas for improvement. Observations from remote sensing platforms such as radar and satellite can provide valuable information about three-dimensional storm structure and microphysical properties for use in the assessment, including estimates of surface rainfall, hydrometeor types and size distributions, and column moisture content. A features-based methodology is used to identify warm season convective precipitating objects in the 2013, 2014, and 2015 versions of HRRR precipitation forecasts, Stage IV multisensor precipitation products, and Global Precipitation Measurement (GPM) core satellite observations. Quantitative precipitation forecasts (QPFs) are evaluated for biases in hourly rainfall intensity, total rainfall, and areal coverage in both the US Central Plains (29-49N, 85-105W) and US Mountain West (29-49N, 105-125W). Features identified in the model and Stage IV were tracked through time in order to evaluate forecasts through several hours of the forecast period. The 2013 version of the model was found to produce significantly stronger convective storms than observed, with a slight southerly displacement from the observed storms during the peak hours of convective activity (17-00 UTC). This version of the model also displayed a strong relationship between atmospheric water vapor content and cloud thickness over the central plains. In the 2014 and 2015 versions of the model, storms in the western US were found to be smaller and weaker than the observed, and satellite products (brightness temperatures and reflectivities) simulated using model output indicated that many of the forecast storms contained too much ice above the freezing level. Model upgrades intended to decrease the biases seen in early versions include changes to the reflectivity assimilation, the addition of sub-grid scale cloud parameterizations, changes to the representation of surface processes and the addition of aerosol processes to the microphysics. The effects of these changes are evident in each successive version of the model, with reduced biases in intensity, elimination of the southerly bias, and improved representation of the onset of convection.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cort, K. A.; Hostick, D. J.; Belzer, D. B.
The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge for future research.
Dynamics Modelling of Biolistic Gene Guns
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, M.; Tao, W.; Pianetta, P.A.
2009-06-04
The gene transfer process using biolistic gene guns is a highly dynamic process. To achieve good performance, the process needs to be well understood and controlled. Unfortunately, no dynamic model is available in the open literature for analysing and controlling the process. This paper proposes such a model. Relationships of the penetration depth with the helium pressure, the penetration depth with the acceleration distance, and the penetration depth with the micro-carrier radius are presented. Simulations have also been conducted. The results agree well with experimental results in the open literature. The contribution of this paper includes a dynamic model for improving and manipulating performance of the biolistic gene gun.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hagopian, C.R.; Lewis, P.J.; McDonald, J.J.
1983-02-01
Improvements and innovations in styrene production since 1966 are outlined; many of these changes are attributed to rigorous process models. Such models are used to evaluate the effects of changing raw material costs, utility costs, and available catalyst choices. The process model can also evaluate the best operating configuration and catalyst choice for a plant. All specified innovations are incorporated in the Mobil/Badger ethylbenzene and the Cosden/Badger styrene processes (both of which are shown schematically). Badger's training programs are reviewed. Badger's Styrenics Business Team converts information into a plant design basis. A reaction model, with input derived from isothermal and adiabatic pilot plant units, is at the heart of the complete computer simulation of the ethylbenzene and styrene processes.
NASA Technical Reports Server (NTRS)
Tian, Jianhui; Porter, Adam; Zelkowitz, Marvin V.
1992-01-01
Identification of high-cost modules has been viewed as one mechanism to improve overall system reliability, since such modules tend to produce more than their share of problems. A decision tree model was used to identify such modules. In the current paper, a previously developed axiomatic model of program complexity is merged with the previously developed decision tree process to improve the ability to identify such modules. This improvement was tested using data from the NASA Software Engineering Laboratory.
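The abstract does not list the metrics or tool used; the sketch below only illustrates how a decision tree classifier can flag high-cost modules from hypothetical module metrics (lines of code, change count, cyclomatic complexity) and a synthetic cost label.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical module metrics: [lines of code, number of changes, cyclomatic complexity]
rng = np.random.default_rng(3)
X = np.column_stack([rng.integers(50, 2000, 300),
                     rng.integers(0, 40, 300),
                     rng.integers(1, 60, 300)])
# Synthetic label: large, frequently changed, complex modules are marked high-cost
y = ((X[:, 0] > 800) & (X[:, 1] > 15) & (X[:, 2] > 20)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

print("holdout accuracy:", round(tree.score(X_te, y_te), 2))
print(export_text(tree, feature_names=["loc", "changes", "complexity"]))
```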
Thomas, Elizabeth Anne
2011-06-01
The occupational health services department for a manufacturing division of a high-technology firm was redesigned from an outsourced model, in which most services were provided by an outside clinic vendor, to an in-house service model, in which services were provided by an on-site nurse practitioner. The redesign and implementation, accomplished by a cross-functional team using Total Quality Management processes, resulted in a comprehensive occupational health services department that realized significant cost reduction, increased compliance with regulatory and company requirements, and improved employee satisfaction. Implications of this project for occupational health nurses are discussed.
Simulation of generation of new ideas for new product development and IT services
NASA Astrophysics Data System (ADS)
Nasiopoulos, Dimitrios K.; Sakas, Damianos P.; Vlachos, D. S.; Mavrogianni, Amanda
2015-02-01
This paper describes a dynamic model of the New Product Development (NPD) process. The model emerged from best practices observed in our research across a range of settings. It helps an IT company identify its NPD activities and place them within the frame of the overall NPD process [1]. It has proven to be a useful tool for organizing data on an IT company's NPD activities without imposing an excessively restrictive research methodology on the NPD model. The framework underpinning the model supports investigation of the methods used within an IT company's NPD process, thus promoting understanding and improvement of the simulation process [2]. IT companies have tested many techniques and practices designed to improve the validity and efficacy of their NPD process [3]. Supported by the model, this research examines how widely accepted the stated tactics are and what impact these tactics have on NPD performance. The main assumption of this study is that simulating the generation of new ideas [4] will lead to greater NPD effectiveness and more successful products in IT companies. In the model implementation, practices concerning NPD implementation strategies (product selection, objectives, leadership, marketing strategy and customer satisfaction) are more widely accepted than practices related to controlling the application of NPD (process control, measurements, results). In linking simulation with impact, our results indicate that product success depends on developing strong products and ensuring organizational emphasis through proper project selection. Project activities strengthen both product and project success. The success of IT products and services also depends on monitoring the NPD procedure through project management and ensuring team consistency with group rewards. Sharing experiences between projects can positively influence the NPD process.
Humbird, David; Trendewicz, Anna; Braun, Robert; ...
2017-01-12
A biomass fast pyrolysis reactor model with detailed reaction kinetics and one-dimensional fluid dynamics was implemented in an equation-oriented modeling environment (Aspen Custom Modeler). Portions of this work were detailed in previous publications; further modifications have been made here to improve stability and reduce execution time of the model to make it compatible for use in large process flowsheets. The detailed reactor model was integrated into a larger process simulation in Aspen Plus and was stable for different feedstocks over a range of reactor temperatures. Sample results are presented that indicate general agreement with experimental results, but with higher gas losses caused by stripping of the bio-oil by the fluidizing gas in the simulated absorber/condenser. Lastly, this integrated modeling approach can be extended to other well-defined, predictive reactor models for fast pyrolysis, catalytic fast pyrolysis, as well as other processes.
Watson, Douglas S; Kerchner, Kristi R; Gant, Sean S; Pedersen, Joseph W; Hamburger, James B; Ortigosa, Allison D; Potgieter, Thomas I
2016-01-01
Tangential flow microfiltration (MF) is a cost-effective and robust bioprocess separation technique, but successful full scale implementation is hindered by the empirical, trial-and-error nature of scale-up. We present an integrated approach leveraging at-line process analytical technology (PAT) and mass balance based modeling to de-risk MF scale-up. Chromatography-based PAT was employed to improve the consistency of an MF step that had been a bottleneck in the process used to manufacture a therapeutic protein. A 10-min reverse phase ultra high performance liquid chromatography (RP-UPLC) assay was developed to provide at-line monitoring of protein concentration. The method was successfully validated and method performance was comparable to previously validated methods. The PAT tool revealed areas of divergence from a mass balance-based model, highlighting specific opportunities for process improvement. Adjustment of appropriate process controls led to improved operability and significantly increased yield, providing a successful example of PAT deployment in the downstream purification of a therapeutic protein. The general approach presented here should be broadly applicable to reduce risk during scale-up of filtration processes and should be suitable for feed-forward and feed-back process control. © 2015 American Institute of Chemical Engineers.
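The mass-balance reasoning mentioned above can be illustrated with a standard batch tangential-flow filtration relation: for a constant observed sieving coefficient S, the retentate concentration follows C_R = C_0·X^(1-S), with X the volumetric concentration factor. The sketch below uses assumed parameter values and is not the model developed for the process described in the abstract.

```python
# Minimal mass-balance sketch for a batch tangential-flow filtration
# concentration step with a constant observed sieving coefficient S.
# C_R = C0 * X**(1 - S) and retained yield = X**(-S), where X = V0/V.
# Parameter values are illustrative only.
import numpy as np

C0 = 2.0        # g/L, assumed initial protein concentration
S = 0.05        # assumed observed sieving coefficient (fraction passing)
X = np.linspace(1.0, 10.0, 10)   # volumetric concentration factor V0/V

C_retentate = C0 * X ** (1.0 - S)
yield_retained = X ** (-S)

for x, c, y in zip(X, C_retentate, yield_retained):
    print(f"X={x:4.1f}  C_R={c:6.2f} g/L  retained yield={y:5.3f}")
```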
Scalable Parameter Estimation for Genome-Scale Biochemical Reaction Networks
Kaltenbacher, Barbara; Hasenauer, Jan
2017-01-01
Mechanistic mathematical modeling of biochemical reaction networks using ordinary differential equation (ODE) models has improved our understanding of small- and medium-scale biological processes. While the same should in principle hold for large- and genome-scale processes, the computational methods for the analysis of ODE models which describe hundreds or thousands of biochemical species and reactions are missing so far. While individual simulations are feasible, the inference of the model parameters from experimental data is computationally too intensive. In this manuscript, we evaluate adjoint sensitivity analysis for parameter estimation in large scale biochemical reaction networks. We present the approach for time-discrete measurement and compare it to state-of-the-art methods used in systems and computational biology. Our comparison reveals a significantly improved computational efficiency and a superior scalability of adjoint sensitivity analysis. The computational complexity is effectively independent of the number of parameters, enabling the analysis of large- and genome-scale models. Our study of a comprehensive kinetic model of ErbB signaling shows that parameter estimation using adjoint sensitivity analysis requires a fraction of the computation time of established methods. The proposed method will facilitate mechanistic modeling of genome-scale cellular processes, as required in the age of omics. PMID:28114351
Knowing the SCOR: using business metrics to gain measurable improvements.
Malin, Jane H
2006-07-01
By using the Supply Chain Operations Reference model, one New York hospital was able to define and measure its supply chains, determine the weak links in its processes, and identify necessary improvements.
Southeast Atmosphere Studies: learning from model-observation syntheses
NASA Astrophysics Data System (ADS)
Mao, Jingqiu; Carlton, Annmarie; Cohen, Ronald C.; Brune, William H.; Brown, Steven S.; Wolfe, Glenn M.; Jimenez, Jose L.; Pye, Havala O. T.; Ng, Nga Lee; Xu, Lu; McNeill, V. Faye; Tsigaridis, Kostas; McDonald, Brian C.; Warneke, Carsten; Guenther, Alex; Alvarado, Matthew J.; de Gouw, Joost; Mickley, Loretta J.; Leibensperger, Eric M.; Mathur, Rohit; Nolte, Christopher G.; Portmann, Robert W.; Unger, Nadine; Tosca, Mika; Horowitz, Larry W.
2018-02-01
Concentrations of atmospheric trace species in the United States have changed dramatically over the past several decades in response to pollution control strategies, shifts in domestic energy policy and economics, and economic development (and resulting emission changes) elsewhere in the world. Reliable projections of the future atmosphere require models to not only accurately describe current atmospheric concentrations, but to do so by representing chemical, physical and biological processes with conceptual and quantitative fidelity. Only through incorporation of the processes controlling emissions and chemical mechanisms that represent the key transformations among reactive molecules can models reliably project the impacts of future policy, energy and climate scenarios. Efforts to properly identify and implement the fundamental and controlling mechanisms in atmospheric models benefit from intensive observation periods, during which collocated measurements of diverse, speciated chemicals in both the gas and condensed phases are obtained. The Southeast Atmosphere Studies (SAS, including SENEX, SOAS, NOMADSS and SEAC4RS) conducted during the summer of 2013 provided an unprecedented opportunity for the atmospheric modeling community to come together to evaluate, diagnose and improve the representation of fundamental climate and air quality processes in models of varying temporal and spatial scales. This paper is aimed at discussing progress in evaluating, diagnosing and improving air quality and climate modeling using comparisons to SAS observations as a guide to thinking about improvements to mechanisms and parameterizations in models. The effort focused primarily on model representation of fundamental atmospheric processes that are essential to the formation of ozone, secondary organic aerosol (SOA) and other trace species in the troposphere, with the ultimate goal of understanding the radiative impacts of these species in the southeast and elsewhere. Here we address questions surrounding four key themes: gas-phase chemistry, aerosol chemistry, regional climate and chemistry interactions, and natural and anthropogenic emissions. We expect this review to serve as guidance for future modeling efforts.
NASA Astrophysics Data System (ADS)
Desjardins, R.; Smith, W.; Qi, Z.; Grant, B.; VanderZaag, A.
2017-12-01
Biophysical models are needed for assessing science-based mitigation options to improve the efficiency and sustainability of agricultural cropping systems. In order to account for trade-offs between environmental indicators such as GHG emissions, soil C change, and water quality it is important that models can encapsulate the complex array of interrelated biogeochemical processes controlling water, nutrient and energy flows in the agroecosystem. The Denitrification Decomposition (DNDC) model is one of the most widely used process-based models, and is arguably the most sophisticated for estimating GHG emissions and soil C&N cycling; however, the model simulates only simple cascade water flow. The purpose of this study was to compare the performance of DNDC to a comprehensive water flow model, the Root Zone Water Quality Model (RZWQM2), to determine which processes in DNDC may be limiting and recommend improvements. Both models were calibrated and validated for simulating crop biomass, soil hydrology, and nitrogen loss to tile drains using detailed observations from a corn-soybean rotation in Iowa, with and without cover crops. Results indicated that crop yields, biomass and the annual estimates of nitrogen and water loss to tile drains were well simulated by both models (NSE > 0.6 in all cases); however, RZWQM2 performed much better for simulating soil water content, and the dynamics of daily water flow (DNDC: NSE -0.32 to 0.28; RZWQM2: NSE 0.34 to 0.70) to tile drains. DNDC overestimated soil water content near the soil surface and underestimated it deeper in the profile, which was presumably caused by the lack of a root distribution algorithm, the inability to simulate a heterogeneous profile and the lack of a water table. We recommend these improvements along with the inclusion of enhanced water flow and a mechanistic tile drainage sub-model. The accurate temporal simulation of water and N strongly impacts several biogeochemical processes.
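The NSE values quoted above are the Nash-Sutcliffe efficiency, a standard hydrologic skill score. A minimal sketch of its computation, using synthetic flow data rather than the study's observations, follows.

```python
# Minimal sketch of the Nash-Sutcliffe efficiency (NSE) used to score
# simulated daily tile-drain flow against observations. Data are synthetic.
import numpy as np

def nse(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

rng = np.random.default_rng(1)
observed = rng.gamma(2.0, 1.5, size=365)            # synthetic daily flow (mm)
simulated = observed + rng.normal(0.0, 1.0, 365)    # model with random error

print(f"NSE = {nse(observed, simulated):.2f}")      # 1 is perfect, <0 is poor
```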
Performance measurement integrated information framework in e-Manufacturing
NASA Astrophysics Data System (ADS)
Teran, Hilaida; Hernandez, Juan Carlos; Vizán, Antonio; Ríos, José
2014-11-01
The implementation of Internet technologies has led to e-Manufacturing technologies becoming more widely used and to the development of tools for compiling, transforming and synchronising manufacturing data through the Web. In this context, a potential area for development is the extension of virtual manufacturing to performance measurement (PM) processes, a critical area for decision making and implementing improvement actions in manufacturing. This paper proposes a PM information framework to integrate decision support systems in e-Manufacturing. Specifically, the proposed framework offers a homogeneous PM information exchange model that can be applied through decision support in e-Manufacturing environment. Its application improves the necessary interoperability in decision-making data processing tasks. It comprises three sub-systems: a data model, a PM information platform and PM-Web services architecture. A practical example of data exchange for measurement processes in the area of equipment maintenance is shown to demonstrate the utility of the model.
NASA Astrophysics Data System (ADS)
Shu, Hui; Zhou, Xideng
2014-05-01
The single-vendor single-buyer integrated production inventory system has been an object of study for a long time, but little is known about the effect of investing in setup cost reduction and process-quality improvement for an integrated inventory system in which the products are sold with a free minimal repair warranty. The purpose of this article is to minimise the integrated cost by simultaneously optimising the number of shipments, the shipment quantity, the setup cost, and the process quality. An efficient algorithm procedure is proposed for determining the optimal decision variables. A numerical example is presented to illustrate the results of the proposed models graphically. Sensitivity analysis of the model with respect to key parameters of the system is carried out. The paper shows that the proposed integrated model can result in significant savings in the integrated cost.
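To make the joint optimisation concrete, the toy sketch below searches over the number of shipments, shipment quantity, setup cost and defect rate for an assumed illustrative cost function loosely patterned on joint economic lot-size models; it is not the cost function derived in the article.

```python
# Toy sketch of jointly choosing the number of shipments n, shipment quantity Q,
# setup cost A (reducible by investment), and defect rate theta. The cost
# function below is an assumed illustrative form, NOT the model in the article.
import numpy as np

D, h, F = 1000.0, 5.0, 25.0          # demand/yr, holding cost, per-shipment cost
A0, theta0 = 400.0, 0.02             # initial setup cost and defect rate
a, b, w = 80.0, 15.0, 300.0          # investment and warranty cost coefficients

def total_cost(n, Q, A, theta):
    ordering = D / (n * Q) * (A + n * F)                       # setup + shipping
    holding = h * n * Q / 2.0                                  # average inventory
    invest = a * np.log(A0 / A) + b * np.log(theta0 / theta)   # reduction effort
    warranty = w * theta * D                                   # minimal-repair cost
    return ordering + holding + invest + warranty

best = min(
    ((n, Q, A, th, total_cost(n, Q, A, th))
     for n in range(1, 11)
     for Q in np.arange(10, 201, 10)
     for A in np.arange(50, 401, 50)
     for th in (0.002, 0.005, 0.01, 0.02)),
    key=lambda t: t[-1],
)
print("n=%d Q=%.0f A=%.0f theta=%.3f cost=%.1f" % best)
```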
Optimization of the production process using virtual model of a workspace
NASA Astrophysics Data System (ADS)
Monica, Z.
2015-11-01
Optimization of the production process is an element of the design cycle consisting of problem definition, modelling, simulation, optimization and implementation. Without the use of simulation techniques, only a greater or lesser improvement of the process can be achieved, not true optimization (i.e., the best result obtainable for the conditions under which the process operates). Optimization generally comprises management actions that ultimately bring savings in time, resources and raw materials and improve the performance of a specific process, whether a service or a manufacturing process. These savings are generated by improving and increasing the efficiency of the processes. Optimization consists primarily of organizational activities that require very little investment or rely solely on changing the organization of work. Modern companies operating in a market economy show significantly increased interest in modern methods of production and service management. This trend stems from high competitiveness: companies that want to succeed are forced to continually modify the way they manage and to respond flexibly to changing demand. Modern production management methods not only imply a stable position for the company in its sector, but also improve health and safety within the company and contribute to more efficient work standardization. This is why the paper presents the application of an environment such as Siemens NX to create a virtual model of a production system and to simulate and optimize its operation. The analyzed system is a robotized workcell consisting of machine tools, industrial robots, conveyors, auxiliary equipment and buffers. The control program realizing the main task in the virtual workcell can be defined in the software. Using this tool, it is possible to optimize both the object trajectory and the cooperation process.
NASA Technical Reports Server (NTRS)
Hartman, Brian Davis
1995-01-01
A key drawback to estimating geodetic and geodynamic parameters over time based on satellite laser ranging (SLR) observations is the inability to accurately model all the forces acting on the satellite. Errors associated with the observations and the measurement model can detract from the estimates as well. These 'model errors' corrupt the solutions obtained from the satellite orbit determination process. Dynamical models for satellite motion utilize known geophysical parameters to mathematically detail the forces acting on the satellite. However, these parameters, while estimated as constants, vary over time. These temporal variations must be accounted for in some fashion to maintain meaningful solutions. The primary goal of this study is to analyze the feasibility of using a sequential process noise filter for estimating geodynamic parameters over time from the Laser Geodynamics Satellite (LAGEOS) SLR data. This evaluation is achieved by first simulating a sequence of realistic LAGEOS laser ranging observations. These observations are generated using models with known temporal variations in several geodynamic parameters (along-track drag and the J2, J3, J4, and J5 geopotential coefficients). A standard (non-stochastic) filter and a stochastic process noise filter are then utilized to estimate the model parameters from the simulated observations. The standard non-stochastic filter estimates these parameters as constants over consecutive fixed time intervals. Thus, the resulting solutions contain constant estimates of parameters that vary in time, which limits the temporal resolution and accuracy of the solution. The stochastic process noise filter estimates these parameters as correlated process noise variables. As a result, the stochastic process noise filter has the potential to estimate the temporal variations more accurately since the constraint of estimating the parameters as constants is eliminated. A comparison of the temporal resolution of solutions obtained from standard sequential filtering methods and process noise sequential filtering methods shows that the accuracy is significantly improved using process noise. The results show that the positional accuracy of the orbit is improved as well. The temporal resolution of the resulting solutions is detailed, and conclusions are drawn about the results. Benefits and drawbacks of using process noise filtering in this type of scenario are also identified.
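A one-dimensional sketch of the underlying idea follows: a parameter that drifts slowly in time is tracked by a Kalman filter with process noise (random-walk model), while a constant-parameter estimate smears the variation out. The noise levels and signal are assumptions for illustration, not the LAGEOS orbit-determination setup.

```python
# 1-D illustration: a scalar parameter drifts slowly in time; a Kalman filter
# with process noise (random-walk model) tracks it, whereas a constant-parameter
# fit returns only a time-averaged value. Conceptual sketch only.
import numpy as np

rng = np.random.default_rng(2)
n = 200
truth = 1.0 + 0.3 * np.sin(np.linspace(0, 4 * np.pi, n))   # drifting parameter
obs = truth + rng.normal(0.0, 0.2, n)                       # noisy observations

q, r = 1e-3, 0.2 ** 2          # process-noise and measurement-noise variances
x, p = 0.0, 1.0                # initial state estimate and variance
track = np.empty(n)
for k in range(n):
    p += q                     # predict: random-walk parameter model
    gain = p / (p + r)         # update with scalar measurement
    x += gain * (obs[k] - x)
    p *= (1.0 - gain)
    track[k] = x

const_fit = obs.mean()         # "standard" approach: parameter held constant
print("RMS error, constant estimate :", np.sqrt(np.mean((const_fit - truth) ** 2)))
print("RMS error, process-noise KF  :", np.sqrt(np.mean((track - truth) ** 2)))
```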
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahowald, Natalie
Soils in natural and managed ecosystems and wetlands are well known sources of methane, nitrous oxides, and reactive nitrogen gases, but the magnitudes of gas flux to the atmosphere are still poorly constrained. Thus, the reasons for the large increases in atmospheric concentrations of methane and nitrous oxide since the preindustrial time period are not well understood. The low atmospheric concentrations of methane and nitrous oxide, despite being more potent greenhouse gases than carbon dioxide, complicate empirical studies to provide explanations. In addition to climate concerns, the emissions of reactive nitrogen gases from soils are important to the changing nitrogen balance in the earth system, subject to human management, and may change substantially in the future. Thus improved modeling of the emission fluxes of these species from the land surface is important. Currently, there are emission modules for methane and some nitrogen species in the Community Earth System Model's Community Land Model (CLM-ME/N); however, there are large uncertainties and problems in the simulations, resulting in coarse estimates. In this proposal, we seek to improve these emission modules by combining state-of-the-art process modules for emissions, available data, and new optimization methods. In earth science problems, we often have substantial data and knowledge of processes in disparate systems, and thus we need to combine data and a general process level understanding into a model for projections of future climate that are as accurate as possible. The best methodologies for optimization of parameters in earth system models are still being developed. In this proposal we will develop and apply surrogate algorithms that a) were especially developed for computationally expensive simulations like CLM-ME/N models; b) were (in the earlier surrogate optimization Stochastic RBF) demonstrated to perform very well on computationally expensive complex partial differential equations in earth science with limited numbers of simulations; and, c) will be (as part of the proposed research) significantly improved both by adding asynchronous parallelism, early truncation of unsuccessful simulations, and the improvement of both serial and parallel performance by the use of derivative and sensitivity information from global and local surrogate approximations S(x). The algorithm development and testing will be focused on the CLM-ME/N model application, but the methods are general and are expected to also perform well on optimization for parameter estimation of other climate models and other classes of continuous multimodal optimization problems arising from complex simulation models. In addition, this proposal will compile available datasets of emissions of methane, nitrous oxides and reactive nitrogen species and develop protocols for site level comparisons with the CLM-ME/N. Once the model parameters are optimized against site level data, the model will be simulated at the global level and compared to atmospheric concentration measurements for the current climate, and future emissions will be estimated using climate change as simulated by the CESM. This proposal combines experts in earth system modeling, optimization, computer science, and process level understanding of soil gas emissions in an interdisciplinary team in order to improve the modeling of methane and nitrogen gas emissions.
This proposal thus meets the requirements of the SciDAC RFP by integrating state-of-the-art computer science and earth system science to build an improved earth system model.
Preform Characterization in VARTM Process Model Development
NASA Technical Reports Server (NTRS)
Grimsley, Brian W.; Cano, Roberto J.; Hubert, Pascal; Loos, Alfred C.; Kellen, Charles B.; Jensen, Brian J.
2004-01-01
Vacuum-Assisted Resin Transfer Molding (VARTM) is a Liquid Composite Molding (LCM) process where both resin injection and fiber compaction are achieved under pressures of 101.3 kPa or less. Originally developed over a decade ago for marine composite fabrication, VARTM is now considered a viable process for the fabrication of aerospace composites (1,2). In order to optimize and further improve the process, a finite element analysis (FEA) process model is being developed to include the coupled phenomenon of resin flow, preform compaction and resin cure. The model input parameters are obtained from resin and fiber-preform characterization tests. In this study, the compaction behavior and the Darcy permeability of a commercially available carbon fabric are characterized. The resulting empirical model equations are input to the 3- Dimensional Infiltration, version 5 (3DINFILv.5) process model to simulate infiltration of a composite panel.
NASA Astrophysics Data System (ADS)
Jonny, Zagloed, Teuku Yuri M.
2017-11-01
This paper aims to present an integrated health care model for the Indonesian health care industry. Based on previous research, there are two health care models in the industry: disease-centered and patient-centered care models. The patient-centered care model is the more widely applied, owing to its capability to reduce cost and improve quality simultaneously. However, there is still no comprehensive model that delivers cost reduction, quality improvement, patient satisfaction and hospital profitability simultaneously; this research is intended to develop such a model. First, a conceptual model using Kano's Model, Quality Function Deployment (QFD) and the Balanced Scorecard (BSC) is developed to generate the key elements of the model as required by stakeholders. Then, a case study of an Indonesian hospital is presented to evaluate the validity of the model using correlation analysis. As a result, the model is validated, implying several managerial insights among its elements: 1) leadership (r=0.85) and context of the organization (r=0.77) improve operations; 2) planning (r=0.96), support processes (r=0.87) and continual improvement (r=0.95) also improve operations; 3) operations improve customer satisfaction (r=0.89) and financial performance (r=0.93); and 4) customer satisfaction improves financial performance (r=0.98).
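The element-to-element correlations reported above can be computed directly as Pearson coefficients across respondents or reporting periods. The sketch below shows the calculation on synthetic placeholder scores; the variable names and sample size are assumptions, not the study's instrument.

```python
# Sketch of the correlation analysis used to validate the model: Pearson r
# between scored model elements across observations. Data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n = 40                                         # assumed number of observations
leadership = rng.normal(4.0, 0.5, n)
operations = 0.8 * leadership + rng.normal(0.0, 0.3, n)
satisfaction = 0.7 * operations + rng.normal(0.0, 0.3, n)

def pearson_r(x, y):
    return np.corrcoef(x, y)[0, 1]

print("leadership -> operations  : r = %.2f" % pearson_r(leadership, operations))
print("operations -> satisfaction: r = %.2f" % pearson_r(operations, satisfaction))
```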
Modeling Perceptual Decision Processes
2014-09-17
Ratcliff, & Wagenmakers, in press). Previous research suggests that playing action video games improves performance on sensory, perceptual, and...estimate the contribution of several underlying psychological processes. Their analysis indicated that playing action video games leads to faster...third condition in which no video games were played at all. Behavioral data and diffusion model parameters showed similar practice effects for the
A Model for Long Range Planning for Seminole Community College.
ERIC Educational Resources Information Center
Miner, Norris
A model for long-range planning designed to maximize involvement of college personnel, to improve communication among various areas of the college, to provide a process for evaluation of long-range plans and the planning process, to adjust to changing conditions, to utilize data developed at a level useful for actual operations, and to have…
NASA Astrophysics Data System (ADS)
Choirunnisa, N. L.; Prabowo, P.; Suryanti, S.
2018-01-01
The main objective of this study is to describe the effectiveness of 5E instructional model-based learning in improving primary school students' science process skills. Science process skills are important for students as they are the foundation for mastering concepts and developing the thinking skills needed in the 21st century. The design of this study was experimental, involving a one-group pre-test and post-test design. The results show that (1) the percentage of learning implementation in both classes, IVA and IVB, increased, indicating better quality of learning, and (2) the percentage of students' science process skills test results on the aspects of observing, formulating hypotheses, determining variables, interpreting data and communicating increased as well.
NASA Astrophysics Data System (ADS)
Frederick, B. C.; Gooch, B. T.; Richter, T.; Young, D. A.; Blankenship, D. D.; Aitken, A.; Siegert, M. J.
2013-12-01
Topography, sediment distribution and heat flux are all key boundary conditions governing the stability of the East Antarctic ice sheet (EAIS). Recent scientific scrutiny has been focused on several large, deep, interior EAIS basins including the submarine basal topography characterizing the Aurora Subglacial Basin (ASB). Numerical ice sheet models require accurate deformable sediment distribution and lithologic character constraints to estimate overall flow velocities and potential instability. To date, such estimates across the ASB have been derived from low-resolution satellite data or historic aerogeophysical surveys conducted prior to the advent of GPS. These rough basal condition estimates have led to poorly-constrained ice sheet stability models for this remote 200,000 sq km expanse of the ASB. Here we present a significantly improved quantitative model characterizing the subglacial lithology and sediment in the ASB region. The product of comprehensive ICECAP (2008-2013) aerogeophysical data processing, this sedimentary basin model details the expanse and thickness of probable Wilkes Land subglacial sedimentary deposits and density contrast boundaries indicative of distinct subglacial lithologic units. As part of the process, BEDMAP2 subglacial topographic results were improved through the additional incorporation of ice-penetrating radar data collected during ICECAP field seasons 2010-2013. Detailed potential field data pre-processing was completed as well as a comprehensive evaluation of crustal density contrasts based on the gravity power spectrum; a high-pass filter was subsequently applied to remove longer crustal wavelengths from the gravity dataset prior to inversion. Gridded BEDMAP2+ ice and bed radar surfaces were then utilized to establish bounding density models for the 3D gravity inversion process to yield probable sedimentary basin anomalies. Gravity inversion results were iteratively evaluated against radar along-track RMS deviation and gravity and magnetic depth-to-basement results. This geophysical data processing methodology provides a substantial improvement over prior Wilkes Land sedimentary basin estimates, yielding a higher-resolution model based upon concurrent iteration of several aerogeophysical datasets. This more detailed subglacial sedimentary basin model for Wilkes Land, East Antarctica will not only contribute to vast improvements to EAIS ice sheet model constraints, but will also provide significant quantifiable controls for subglacial hydrologic and geothermal flux estimates that are also sizable contributors to the cold-based, deep interior basal ice dynamics dominant in the Wilkes Land region.
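The high-pass filtering step described above can be sketched as a simple FFT wavelength filter applied to a gridded gravity anomaly. The grid spacing, cutoff wavelength, and synthetic field below are assumptions for illustration, not the ICECAP processing parameters.

```python
# Sketch of removing long-wavelength (deep crustal) signal from a gridded
# gravity anomaly with an FFT high-pass filter. All values are assumed.
import numpy as np

dx = 5.0e3                           # grid spacing (m), assumed
cutoff = 150.0e3                     # keep wavelengths shorter than 150 km
ny, nx = 256, 256
rng = np.random.default_rng(4)
grav = rng.normal(0.0, 5.0, (ny, nx))            # synthetic anomaly (mGal)

kx = np.fft.fftfreq(nx, d=dx)                    # cycles per metre
ky = np.fft.fftfreq(ny, d=dx)
kmag = np.sqrt(kx[np.newaxis, :] ** 2 + ky[:, np.newaxis] ** 2)

spectrum = np.fft.fft2(grav)
spectrum[kmag < 1.0 / cutoff] = 0.0              # zero out long wavelengths
grav_highpass = np.real(np.fft.ifft2(spectrum))

print("std before: %.2f mGal, after high-pass: %.2f mGal"
      % (grav.std(), grav_highpass.std()))
```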
NASA Astrophysics Data System (ADS)
Wei, Jiangfeng; Dirmeyer, Paul A.; Yang, Zong-Liang; Chen, Haishan
2017-10-01
Through a series of model simulations with an atmospheric general circulation model coupled to three different land surface models, this study investigates the impacts of land model ensembles and coupled model ensemble on precipitation simulation. It is found that coupling an ensemble of land models to an atmospheric model has a very minor impact on the improvement of precipitation climatology and variability, but a simple ensemble average of the precipitation from three individually coupled land-atmosphere models produces better results, especially for precipitation variability. The generally weak impact of land processes on precipitation should be the main reason that the land model ensembles do not improve precipitation simulation. However, if there are big biases in the land surface model or land surface data set, correcting them could improve the simulated climate, especially for well-constrained regional climate simulations.
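A toy illustration of why the multi-model mean can outperform individual members when their errors are partly independent follows; the synthetic data and error magnitudes are assumptions, not the study's simulations.

```python
# Toy illustration: averaging precipitation from several coupled models often
# reduces RMSE relative to any single member when errors are partly independent.
import numpy as np

rng = np.random.default_rng(5)
truth = rng.gamma(2.0, 2.0, size=1000)                  # "observed" precipitation
members = [truth + rng.normal(0.0, 1.5, truth.size) for _ in range(3)]

rmse = lambda x: float(np.sqrt(np.mean((x - truth) ** 2)))
for i, m in enumerate(members, 1):
    print(f"model {i} RMSE: {rmse(m):.2f}")
print(f"3-model mean RMSE: {rmse(np.mean(members, axis=0)):.2f}")
```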
ERIC Educational Resources Information Center
Zeytun, Aysel Sen; Cetinkaya, Bulent; Erbas, Ayhan Kursat
2017-01-01
This paper investigates how prospective teachers develop mathematical models while they engage in modeling tasks. The study was conducted in an undergraduate elective course aiming to improve prospective teachers' mathematical modeling abilities, while enhancing their pedagogical knowledge for the integrating of modeling tasks into their future…
NASA Technical Reports Server (NTRS)
Vairo, Daniel M.
1998-01-01
The removal and installation of sting-mounted wind tunnel models in the National Transonic Facility (NTF) is a multi-task process having a large impact on the annual throughput of the facility. Approximately ten model removal and installation cycles occur annually at the NTF with each cycle requiring slightly over five days to complete. The various tasks of the model changeover process were modeled in Microsoft Project as a template to provide a planning, tracking, and management tool. The template can also be used as a tool to evaluate improvements to this process. This document describes the development of the template and provides step-by-step instructions on its use and as a planning and tracking tool. A secondary role of this document is to provide an overview of the model changeover process and briefly describe the tasks associated with it.
Climbing the ladder: capability maturity model integration level 3
NASA Astrophysics Data System (ADS)
Day, Bryce; Lutteroth, Christof
2011-02-01
This article details the attempt to form a complete workflow model for an information and communication technologies (ICT) company in order to achieve a Capability Maturity Model Integration (CMMI) maturity level of 3. During this project, business processes across the company's core and auxiliary sectors were documented and extended using modern enterprise modelling tools and The Open Group Architecture Framework (TOGAF) methodology. Different challenges were encountered with regard to process customisation and tool support for enterprise modelling. In particular, there were problems with the reuse of process models, the integration of different project management methodologies and the integration of the Rational Unified Process development process framework that had to be solved. We report on these challenges and the perceived effects of the project on the company. Finally, we point out research directions that could help to improve the situation in the future.
Fuzzy control of burnout of multilayer ceramic actuators
NASA Astrophysics Data System (ADS)
Ling, Alice V.; Voss, David; Christodoulou, Leo
1996-08-01
To improve the yield and repeatability of the burnout process of multilayer ceramic actuators (MCAs), an intelligent processing of materials (IPM) based control system has been developed for the manufacture of MCAs. IPM involves the active (ultimately adaptive) control of a material process using empirical or analytical models and in situ sensing of critical process states (part features and process parameters) to modify the processing conditions in real time to achieve predefined product goals. Thus, the three enabling technologies for the IPM burnout control system are process modeling, in situ sensing and intelligent control. This paper presents the design of an IPM-based control strategy for the burnout process of MCAs.
Omega-3 production by fermentation of Yarrowia lipolytica: From fed-batch to continuous.
Xie, Dongming; Miller, Edward; Sharpe, Pamela; Jackson, Ethel; Zhu, Quinn
2017-04-01
The omega-3 fatty acid, cis-5,8,11,14,17-eicosapentaenoic acid (C20:5; EPA) has wide-ranging benefits in improving heart health, immune function, and mental health. A sustainable source of EPA production through fermentation of metabolically engineered Yarrowia lipolytica has been developed. In this paper, key fed-batch fermentation conditions were identified to achieve 25% EPA in the yeast biomass, which is so far the highest EPA titer reported in the literature. Dynamic models of the EPA fermentation process were established for analyzing, optimizing, and scaling up the fermentation process. In addition, model simulations were used to develop a two-stage continuous process and compare it to single-stage continuous and fed-batch processes. The two-stage continuous process, which is equipped with a smaller growth fermentor (Stage 1) and a larger production fermentor (Stage 2), was found to be a superior process to achieve high titer, rate, and yield of EPA. A two-stage continuous fermentation experiment with Y. lipolytica strain Z7334 was designed using the model simulation and then tested in a 2 L and 5 L fermentation system for 1,008 h. Compared with the standard 2 L fed-batch process, the two-stage continuous fermentation process improved the overall EPA productivity by 80% and EPA concentration in the fermenter by 40% while achieving comparable EPA titer in biomass and similar conversion yield from glucose. During the long-term experiment it was also found that the Y. lipolytica strain evolved to reduce byproduct and increase lipid production. This is one of the few continuous fermentation examples that demonstrated improved productivity and concentration of a final product with similar conversion yield compared with a fed-batch process. This paper suggests the two-stage continuous fermentation could be an effective process to achieve improved production of omega-3 and other fermentation products where non-growth or partially growth associated kinetics characterize the process. Biotechnol. Bioeng. 2017;114: 798-812. © 2016 Wiley Periodicals, Inc.
Automation of energy demand forecasting
NASA Astrophysics Data System (ADS)
Siddique, Sanzad
Automation of energy demand forecasting saves time and effort by searching automatically for an appropriate model in a candidate model space without manual intervention. This thesis introduces a search-based approach that improves the performance of the model searching process for econometrics models. Further improvements in the accuracy of the energy demand forecasting are achieved by integrating nonlinear transformations within the models. This thesis introduces machine learning techniques that are capable of modeling such nonlinearity. Algorithms for learning domain knowledge from time series data using the machine learning methods are also presented. The novel search based approach and the machine learning models are tested with synthetic data as well as with natural gas and electricity demand signals. Experimental results show that the model searching technique is capable of finding an appropriate forecasting model. Further experimental results demonstrate an improved forecasting accuracy achieved by using the novel machine learning techniques introduced in this thesis. This thesis presents an analysis of how the machine learning techniques learn domain knowledge. The learned domain knowledge is used to improve the forecast accuracy.
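A minimal sketch of the candidate-model search idea: fit each candidate regression by least squares and keep the lowest AIC. The candidate space and synthetic data below are assumptions for illustration, not the search algorithm or models developed in the thesis.

```python
# Minimal sketch of an automated model search: fit each candidate regression
# (trend order x optional seasonality) by least squares and keep the lowest AIC.
import itertools
import numpy as np

rng = np.random.default_rng(6)
t = np.arange(120, dtype=float)                       # 10 years of monthly demand
y = 50 + 0.4 * t + 8 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, t.size)

def design(trend_order, seasonal):
    cols = [t ** p for p in range(trend_order + 1)]
    if seasonal:
        cols += [np.sin(2 * np.pi * t / 12), np.cos(2 * np.pi * t / 12)]
    return np.column_stack(cols)

best = None
for trend_order, seasonal in itertools.product([0, 1, 2], [False, True]):
    X = design(trend_order, seasonal)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    aic = y.size * np.log(rss / y.size) + 2 * X.shape[1]
    if best is None or aic < best[0]:
        best = (aic, trend_order, seasonal)

print("selected model: trend order %d, seasonal=%s (AIC=%.1f)"
      % (best[1], best[2], best[0]))
```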
A methodology proposal for collaborative business process elaboration using a model-driven approach
NASA Astrophysics Data System (ADS)
Mu, Wenxin; Bénaben, Frédérick; Pingaud, Hervé
2015-05-01
Business process management (BPM) principles are commonly used to improve processes within an organisation. But they can equally be applied to supporting the design of an Information System (IS). In a collaborative situation involving several partners, this type of BPM approach may be useful to support the design of a Mediation Information System (MIS), which would ensure interoperability between the partners' ISs (which are assumed to be service oriented). To achieve this objective, the first main task is to build a collaborative business process cartography. The aim of this article is to present a method for bringing together collaborative information and elaborating collaborative business processes from the information gathered (by using a collaborative situation framework, an organisational model, an informational model, a functional model and a metamodel and by using model transformation rules).
Model-Based GN and C Simulation and Flight Software Development for Orion Missions beyond LEO
NASA Technical Reports Server (NTRS)
Odegard, Ryan; Milenkovic, Zoran; Henry, Joel; Buttacoli, Michael
2014-01-01
For Orion missions beyond low Earth orbit (LEO), the Guidance, Navigation, and Control (GN&C) system is being developed using a model-based approach for simulation and flight software. Lessons learned from the development of GN&C algorithms and flight software for the Orion Exploration Flight Test One (EFT-1) vehicle have been applied to the development of further capabilities for Orion GN&C beyond EFT-1. Continuing the use of a Model-Based Development (MBD) approach with the Matlab®/Simulink® tool suite, the process for GN&C development and analysis has been largely improved. Furthermore, a model-based simulation environment in Simulink, rather than an external C-based simulation, greatly eases the process for development of flight algorithms. The benefits seen by employing lessons learned from EFT-1 are described, as well as the approach for implementing additional MBD techniques. Also detailed are the key enablers for improvements to the MBD process, including enhanced configuration management techniques for model-based software systems, automated code and artifact generation, and automated testing and integration.
Beta-decay half-lives for short neutron rich nuclei involved into the r-process
NASA Astrophysics Data System (ADS)
Panov, I.; Lutostansky, Yu; Thielemann, F.-K.
2018-01-01
The beta-strength function model based on the Theory of Finite Fermi Systems is applied to calculations of beta-decay half-lives for short-lived neutron-rich nuclei involved in the r-process. It is shown that the accuracy of the calculated beta-decay half-lives of short-lived neutron-rich nuclei improves with increasing neutron excess, so the model can be used for modeling the nucleosynthesis of heavy nuclei in the r-process.
Melt-processing high-Tc superconductors under an elevated magnetic field [Final report no. 2]
DOE Office of Scientific and Technical Information (OSTI.GOV)
John B. Vander Sande
2001-09-05
This report presents models for crystallographic texture development for high temperature superconducting oxides processed in the absence of a magnetic field and in the presence of a high magnetic field. The results of the models are confirmed through critical experiments. Processing thick films and tapes of high temperature superconducting oxides under a high magnetic field (5-10 T) improves the critical current density exhibited.
Computer-Aided Process Model For Carbon/Phenolic Materials
NASA Technical Reports Server (NTRS)
Letson, Mischell A.; Bunker, Robert C.
1996-01-01
Computer program implements thermochemical model of processing of carbon-fiber/phenolic-matrix composite materials into molded parts of various sizes and shapes. Directed toward improving fabrication of rocket-engine-nozzle parts, also used to optimize fabrication of other structural components, and material-property parameters changed to apply to other materials. Reduces costs by reducing amount of laboratory trial and error needed to optimize curing processes and to predict properties of cured parts.
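A thermochemical process model of this kind typically integrates a cure-kinetics sub-model. The sketch below integrates a generic nth-order Arrhenius cure law with assumed parameters and an assumed temperature ramp; it is not the carbon/phenolic material model implemented in the program described above.

```python
# Sketch of an nth-order Arrhenius cure-kinetics sub-model:
# d(alpha)/dt = A * exp(-E / (R*T)) * (1 - alpha)**n.
# Parameter values and the temperature ramp are illustrative assumptions.
import numpy as np

A, E, n_order = 1.0e5, 7.0e4, 1.5       # 1/s, J/mol, reaction order (assumed)
R = 8.314                                # J/(mol K)
ramp = 2.0 / 60.0                        # heating rate: 2 K/min, in K/s
T0, dt, t_end = 300.0, 1.0, 6 * 3600.0   # initial T (K), step (s), end time (s)

alpha, t = 0.0, 0.0
while t < t_end and alpha < 0.99:
    T = T0 + ramp * t
    dadt = A * np.exp(-E / (R * T)) * (1.0 - alpha) ** n_order
    alpha = min(alpha + dadt * dt, 1.0)  # explicit Euler step
    t += dt

print(f"degree of cure after {t/3600:.1f} h: alpha = {alpha:.2f}")
```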
Moreno, Janette V; Girard, Anita S; Foad, Wendy
2018-03-01
In 2012, an academic medical center successfully overhauled a 15-year-old shared governance to align 6 house-wide and 30 unit-based councils with the new Magnet Recognition Program® and the organization's operating system, using the processes of LEAN methodology. The redesign improved cross-council communication structures, facilitated effective shared decision-making processes, increased staff engagement, and improved clinical outcomes. The innovative structural and process elements of the new model are replicable in other health institutions.
People Capability Maturity Model (P-CMM).
1995-09-01
Curtis, Bill; Hefley, William E.; Miller, Sally
The P-CMM adapts the architecture and the maturity framework underlying the CMM for use with people-related improvement issues. The CMM focuses on helping organizations improve their software development processes. By adapting the maturity framework and the CMM architecture, the P-CMM provides a staged path for improving an organization's workforce practices.
Shared ownership: what's the future?
Roth, Karen L
2013-01-01
The status of library consortia has evolved over time in terms of their composition and alternative negotiating models. New purchasing models may allow improved library involvement in the acquisitions process and improved methods for meeting users' future needs. Ever-increasing costs of library resources and the need to reduce expenses make it necessary to continue the exploration of library consortia for group purchases.
ERIC Educational Resources Information Center
Liu, Ran; Koedinger, Kenneth R.
2017-01-01
As the use of educational technology becomes more ubiquitous, an enormous amount of learning process data is being produced. Educational data mining seeks to analyze and model these data, with the ultimate goal of improving learning outcomes. The most firmly grounded and rigorous evaluation of an educational data mining discovery is whether it…
What Drives Teachers to Improve? The Role of Teacher Mindset in Professional Learning
ERIC Educational Resources Information Center
Gero, Greg Philip
2013-01-01
Teacher quality has received increasing focus over the past decade, yet, by some measures, teachers rarely improve after their first few years of teaching, and not all teachers seem driven to improve. Traditional models of professional learning have emphasized the processes that teachers take part in as a facilitator of their improvement. Research…
Assimilating the Future for Better Forecasts and Earlier Warnings
NASA Astrophysics Data System (ADS)
Du, H.; Wheatcroft, E.; Smith, L. A.
2016-12-01
Multi-model ensembles have become popular tools to account for some of the uncertainty due to model inadequacy in weather and climate simulation-based predictions. Current multi-model forecasts focus on combining single-model ensemble forecasts by means of statistical post-processing. Assuming each model is developed independently or with different primary target variables, each is likely to contain different dynamical strengths and weaknesses. Using statistical post-processing, such information is only carried by the simulations under a single model ensemble: no advantage is taken to influence simulations under the other models. A novel methodology, named Multi-model Cross Pollination in Time, is proposed as a multi-model ensemble scheme with the aim of integrating the dynamical information regarding the future from each individual model operationally. The proposed approach generates model states in time by applying data assimilation scheme(s) to yield truly "multi-model trajectories". It is demonstrated to outperform traditional statistical post-processing in the 40-dimensional Lorenz96 flow. Data assimilation approaches are originally designed to improve state estimation from the past to the current time. The aim of this talk is to introduce a framework that uses data assimilation to improve model forecasts at future times (not to argue for any one particular data assimilation scheme). An illustration of applying data assimilation "in the future" to provide early warning of future high-impact events is also presented.
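For reference, a minimal sketch of the 40-variable Lorenz96 system used as the test bed above, integrated with a fixed-step RK4 scheme and the common forcing F = 8; the cross-pollination assimilation step itself is not implemented here.

```python
# Minimal sketch of the 40-variable Lorenz96 system integrated with RK4.
# dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F  (cyclic indices).
import numpy as np

def lorenz96(x, F=8.0):
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt, F=8.0):
    k1 = lorenz96(x, F)
    k2 = lorenz96(x + 0.5 * dt * k1, F)
    k3 = lorenz96(x + 0.5 * dt * k2, F)
    k4 = lorenz96(x + dt * k3, F)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

x = 8.0 * np.ones(40)
x[19] += 0.01                      # small perturbation to trigger chaos
dt = 0.05                          # roughly 6 h of "model time" per step
for _ in range(200):
    x = rk4_step(x, dt)
print("state mean %.2f, std %.2f after 200 steps" % (x.mean(), x.std()))
```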
Biosecurity through Public Health System Design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beyeler, Walter E.; Finley, Patrick D.; Arndt, William
We applied modeling and simulation to examine the real-world tradeoffs between developing-country public-health improvement and the need to improve the identification, tracking, and security of agents with bio-weapons potential. Traditionally, the international community has applied facility-focused strategies for improving biosecurity and biosafety. This work examines how system-level assessments and improvements can foster biosecurity and biosafety. We modeled medical laboratory resources and capabilities to identify scenarios where biosurveillance goals are transparently aligned with public health needs and resources are distributed in a way that maximizes their ability to serve patients while minimizing security and safety risks. Our modeling platform simulates key processes involved in healthcare system operation, such as sample collection, transport, and analysis at medical laboratories. The research reported here extends the prior art by providing two key components for comparative performance assessment: a model of patient interaction dynamics, and the capability to perform uncertainty quantification. In addition, we have outlined a process for incorporating quantitative biosecurity and biosafety risk measures. Two test problems were used to exercise these research products, examining (a) systemic effects of technological innovation and (b) right-sizing of laboratory networks.
Leger, Marianne; Neill, Joanna C
2016-09-01
Sex is often overlooked in animal and human research. Cognitive impairment associated with schizophrenia (CIAS) remains an unmet clinical need, as current antipsychotic medication does not provide clinically meaningful improvements. One explanation could be lack of appreciation of gender differences in CIAS. Animal models play a critical role in drug development and improved translation to the clinic is an on-going process. Our systematic review aims to evaluate how well the animal studies translate into clinical findings. Supporting clinical results, our review highlights a male working memory advantage and a female advantage for visual memory and social cognition in rodent models for schizophrenia. Not investigated in animals, a female advantage for attention and speed of processing has been found in schizophrenia patients. Sex differences in reasoning and problem solving are poorly investigated in both human and animal studies. Overall, our review provides evidence of good translation from the animal models into the clinic when sexual dimorphism is assessed. Enhanced understanding of these sex differences will improve the management of CIAS. Copyright © 2016 Elsevier Ltd. All rights reserved.
Warren, Jeffrey M; Hanson, Paul J; Iversen, Colleen M; Kumar, Jitendra; Walker, Anthony P; Wullschleger, Stan D
2015-01-01
There is wide breadth of root function within ecosystems that should be considered when modeling the terrestrial biosphere. Root structure and function are closely associated with control of plant water and nutrient uptake from the soil, plant carbon (C) assimilation, partitioning and release to the soils, and control of biogeochemical cycles through interactions within the rhizosphere. Root function is extremely dynamic and dependent on internal plant signals, root traits and morphology, and the physical, chemical and biotic soil environment. While plant roots have significant structural and functional plasticity to changing environmental conditions, their dynamics are noticeably absent from the land component of process-based Earth system models used to simulate global biogeochemical cycling. Their dynamic representation in large-scale models should improve model veracity. Here, we describe current root inclusion in models across scales, ranging from mechanistic processes of single roots to parameterized root processes operating at the landscape scale. With this foundation we discuss how existing and future root functional knowledge, new data compilation efforts, and novel modeling platforms can be leveraged to enhance root functionality in large-scale terrestrial biosphere models by improving parameterization within models, and introducing new components such as dynamic root distribution and root functional traits linked to resource extraction. No claim to original US Government works. New Phytologist © 2014 New Phytologist Trust.
A road map for integrating eco-evolutionary processes into biodiversity models.
Thuiller, Wilfried; Münkemüller, Tamara; Lavergne, Sébastien; Mouillot, David; Mouquet, Nicolas; Schiffers, Katja; Gravel, Dominique
2013-05-01
The demand for projections of the future distribution of biodiversity has triggered an upsurge in modelling at the crossroads between ecology and evolution. Despite the enthusiasm around these so-called biodiversity models, most approaches are still criticised for not integrating key processes known to shape species ranges and community structure. Developing an integrative modelling framework for biodiversity distribution promises to improve the reliability of predictions and to give a better understanding of the eco-evolutionary dynamics of species and communities under changing environments. In this article, we briefly review some eco-evolutionary processes and interplays among them, which are essential to provide reliable projections of species distributions and community structure. We identify gaps in theory, quantitative knowledge and data availability hampering the development of an integrated modelling framework. We argue that model development relying on a strong theoretical foundation is essential to inspire new models, manage complexity and maintain tractability. We support our argument with an example of a novel integrated model for species distribution modelling, derived from metapopulation theory, which accounts for abiotic constraints, dispersal, biotic interactions and evolution under changing environmental conditions. We hope such a perspective will motivate exciting and novel research, and challenge others to improve on our proposed approach. © 2013 John Wiley & Sons Ltd/CNRS.
2014-01-01
Background Recruitment is a major challenge for many trials; just over half reach their targets and almost a third resort to grant extensions. The economic and societal implications of this shortcoming are significant. Yet, we have a limited understanding of the processes that increase the probability that recruitment targets will be achieved. Accordingly, there is an urgent need to bring analytical rigour to the task of improving recruitment, thereby increasing the likelihood that trials reach their recruitment targets. This paper presents a conceptual framework that can be used to improve recruitment to clinical trials. Methods Using a case-study approach, we reviewed the range of initiatives that had been undertaken to improve recruitment in the txt2stop trial using qualitative (semi-structured interviews with the principal investigator) and quantitative (recruitment) data analysis. Later, the txt2stop recruitment practices were compared to a previous model of marketing a trial and to key constructs in social marketing theory. Results Post hoc, we developed a recruitment optimisation model to serve as a conceptual framework to improve recruitment to clinical trials. A core premise of the model is that improving recruitment needs to be an iterative, learning process. The model describes three essential activities: i) recruitment phase monitoring, ii) marketing research, and iii) the evaluation of current performance. We describe the initiatives undertaken by the txt2stop trial and the results achieved, as an example of the use of the model. Conclusions Further research should explore the impact of adopting the recruitment optimisation model when applied to other trials. PMID:24886627
Galli, Leandro; Knight, Rosemary; Robertson, Steven; Hoile, Elizabeth; Oladapo, Olubukola; Francis, David; Free, Caroline
2014-05-22
Recruitment is a major challenge for many trials; just over half reach their targets and almost a third resort to grant extensions. The economic and societal implications of this shortcoming are significant. Yet, we have a limited understanding of the processes that increase the probability that recruitment targets will be achieved. Accordingly, there is an urgent need to bring analytical rigour to the task of improving recruitment, thereby increasing the likelihood that trials reach their recruitment targets. This paper presents a conceptual framework that can be used to improve recruitment to clinical trials. Using a case-study approach, we reviewed the range of initiatives that had been undertaken to improve recruitment in the txt2stop trial using qualitative (semi-structured interviews with the principal investigator) and quantitative (recruitment) data analysis. Later, the txt2stop recruitment practices were compared to a previous model of marketing a trial and to key constructs in social marketing theory. Post hoc, we developed a recruitment optimisation model to serve as a conceptual framework to improve recruitment to clinical trials. A core premise of the model is that improving recruitment needs to be an iterative, learning process. The model describes three essential activities: i) recruitment phase monitoring, ii) marketing research, and iii) the evaluation of current performance. We describe the initiatives undertaken by the txt2stop trial and the results achieved, as an example of the use of the model. Further research should explore the impact of adopting the recruitment optimisation model when applied to other trials.
NASA Astrophysics Data System (ADS)
Wang, Li; Zhang, Fan; Zhang, Hongbo; Scott, Christopher A.; Zeng, Chen; Shi, Xiaonan
2018-01-01
Precipitation is one of the most critical inputs for models used to improve understanding of hydrological processes. In high mountain areas, it is challenging to generate a reliable precipitation data set capturing the spatial and temporal heterogeneity due to the harsh climate, extreme terrain and the lack of observations. This study conducts intensive observation of precipitation in the Mabengnong catchment in the southeast of the Tibetan Plateau during July to August 2013. Because precipitation is greatly influenced by altitude, the observed data are used to characterize the precipitation gradient (PG) and hourly distribution (HD), showing that the average PG is 0.10, 0.28 and 0.26 mm/d/100 m and the average duration is around 0.1, 0.8 and 5.2 h for trace, light and moderate rain, respectively. A distributed biosphere hydrological model based on water and energy budgets with improved physical process for snow (WEB-DHM-S) is applied to simulate the hydrological processes with gridded precipitation data derived from a lower altitude meteorological station and the PG and HD characterized for the study area. The observed runoff, MODIS/Terra snow cover area (SCA) data, and MODIS/Terra land surface temperature (LST) data are used for model calibration and validation. Runoff, SCA and LST simulations all show reasonable results. Sensitivity analyses illustrate that runoff is largely underestimated without considering PG, indicating that short-term intensive precipitation observation has the potential to greatly improve hydrological modelling of poorly gauged high mountain catchments.
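As a rough illustration of how such a precipitation gradient can be used to extend a low-altitude station record to higher model grid cells, the following Python sketch applies a linear PG correction by elevation; the function name and example values are illustrative, not taken from the study.

    import numpy as np

    def apply_precip_gradient(p_station_mm_day, z_station_m, z_cell_m, pg_mm_day_per_100m):
        """Extrapolate daily precipitation from a station to a grid cell by altitude.

        Assumes a linear precipitation gradient (PG) in mm/d per 100 m, of the
        magnitude characterized in the study (e.g., 0.10, 0.28 and 0.26 mm/d/100 m
        for trace, light and moderate rain). Illustrative sketch only.
        """
        dz = (z_cell_m - z_station_m) / 100.0           # elevation difference in units of 100 m
        p_cell = p_station_mm_day + pg_mm_day_per_100m * dz
        return max(p_cell, 0.0)                         # precipitation cannot be negative

    # Example: a light-rain day of 4.0 mm/d observed at 3800 m, extrapolated to 4600 m
    print(apply_precip_gradient(4.0, 3800.0, 4600.0, 0.28))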
Online Knowledge-Based Model for Big Data Topic Extraction.
Khan, Muhammad Taimoor; Durrani, Mehr; Khalid, Shehzad; Aziz, Furqan
2016-01-01
Lifelong machine learning (LML) models learn with experience, maintaining a knowledge base without user intervention. Unlike traditional single-domain models, they can easily scale up to explore big data. Existing LML models have high data dependency, consume more resources, and do not support streaming data. This paper proposes an online LML model (OAMC) to support streaming data with reduced data dependency. By engineering the knowledge base and introducing new knowledge features, the learning pattern of the model is improved for data arriving in pieces. OAMC improves accuracy, measured as topic coherence, by 7% for streaming data while halving the processing cost.
Schoppe, Oliver; King, Andrew J.; Schnupp, Jan W.H.; Harper, Nicol S.
2016-01-01
Adaptation to stimulus statistics, such as the mean level and contrast of recently heard sounds, has been demonstrated at various levels of the auditory pathway. It allows the nervous system to operate over the wide range of intensities and contrasts found in the natural world. Yet current standard models of the response properties of auditory neurons do not incorporate such adaptation. Here we present a model of neural responses in the ferret auditory cortex (the IC Adaptation model), which takes into account adaptation to mean sound level at a lower level of processing: the inferior colliculus (IC). The model performs high-pass filtering with frequency-dependent time constants on the sound spectrogram, followed by half-wave rectification, and passes the output to a standard linear–nonlinear (LN) model. We find that the IC Adaptation model consistently predicts cortical responses better than the standard LN model for a range of synthetic and natural stimuli. The IC Adaptation model introduces no extra free parameters, so it improves predictions without sacrificing parsimony. Furthermore, the time constants of adaptation in the IC appear to be matched to the statistics of natural sounds, suggesting that neurons in the auditory midbrain predict the mean level of future sounds and adapt their responses appropriately. SIGNIFICANCE STATEMENT An ability to accurately predict how sensory neurons respond to novel stimuli is critical if we are to fully characterize their response properties. Attempts to model these responses have had a distinguished history, but it has proven difficult to improve their predictive power significantly beyond that of simple, mostly linear receptive field models. Here we show that auditory cortex receptive field models benefit from a nonlinear preprocessing stage that replicates known adaptation properties of the auditory midbrain. This improves their predictive power across a wide range of stimuli but keeps model complexity low as it introduces no new free parameters. Incorporating the adaptive coding properties of neurons will likely improve receptive field models in other sensory modalities too. PMID:26758822
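The adaptation stage described above can be sketched as per-frequency high-pass filtering of the spectrogram with frequency-dependent time constants, followed by half-wave rectification, before the output feeds a standard LN model. The Python below is a minimal illustration under these assumptions; the exponential-average filter form, array shapes and parameter values are illustrative choices, not the authors' implementation.

    import numpy as np

    def ic_adaptation_stage(spectrogram, taus_s, dt_s=0.005):
        """Sketch of an IC-Adaptation-style preprocessing stage: subtract a running
        exponential average (frequency-dependent time constant) from each channel,
        then half-wave rectify. Parameter values are illustrative.

        spectrogram : (n_freq, n_time) array of sound levels
        taus_s      : (n_freq,) array of adaptation time constants (seconds)
        """
        n_freq, n_time = spectrogram.shape
        alphas = dt_s / np.asarray(taus_s)          # per-channel smoothing factors
        mean_level = spectrogram[:, 0].copy()       # running estimate of the mean level
        out = np.zeros_like(spectrogram)
        for t in range(n_time):
            mean_level += alphas * (spectrogram[:, t] - mean_level)
            out[:, t] = np.maximum(spectrogram[:, t] - mean_level, 0.0)  # half-wave rectify
        return out  # this output would then feed a standard linear-nonlinear (LN) model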
NASA Astrophysics Data System (ADS)
Son, J.; Medina-Cetina, Z.
2017-12-01
We discuss the comparison between deterministic and stochastic optimization approaches to the nonlinear geophysical full-waveform inverse problem, based on seismic survey data from Mississippi Canyon in the Northern Gulf of Mexico. Since subsea engineering and offshore construction projects require reliable ground models from various site investigations, the primary goal of this study is to reconstruct accurate subsurface information on the soil and rock material profiles beneath the seafloor. The shallow sediment layers are naturally heterogeneous formations that may cause unwanted marine landslides or foundation failures of underwater infrastructure. We chose quasi-Newton and simulated annealing as the deterministic and stochastic optimization algorithms, respectively. Seismic forward modeling, based on a finite difference method with absorbing boundary conditions, implements the iterative simulations in the inverse modeling. We briefly report on numerical experiments using synthetic data as an offshore ground model containing shallow artificial target profiles of geomaterials beneath the seafloor. We apply seismic migration processing and generate a Voronoi tessellation on the two-dimensional space domain to improve the computational efficiency of the stratigraphic velocity model reconstruction. We then report on the details of a field data implementation, which shows the complex geologic structures in the Northern Gulf of Mexico. Lastly, we compare the new inverted image of subsurface site profiles in the space domain with the previously processed seismic image in the time domain at the same location. Overall, stochastic optimization for seismic inversion with migration and Voronoi tessellation shows significant promise for improving the subsurface imaging of ground models and the computational efficiency required for full waveform inversion. We anticipate that improving the inversion process for shallow layers from geophysical data will better support offshore site investigation.
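For readers unfamiliar with the stochastic branch of this comparison, the following is a generic simulated annealing sketch of the kind of optimization loop involved; `misfit` stands in for the finite-difference waveform misfit, and all parameter values are illustrative assumptions.

    import numpy as np

    def simulated_annealing(misfit, m0, step=0.05, t0=1.0, cooling=0.99, n_iter=5000, seed=0):
        """Generic simulated annealing sketch; `misfit(m)` is a placeholder for the
        data misfit computed from a forward simulation. Illustrative only."""
        rng = np.random.default_rng(seed)
        m = np.asarray(m0, float)
        f = misfit(m)
        best_m, best_f, temp = m.copy(), f, t0
        for _ in range(n_iter):
            cand = m + step * rng.standard_normal(m.shape)   # random perturbation of the model
            fc = misfit(cand)
            # accept downhill moves always, uphill moves with Boltzmann probability
            if fc < f or rng.random() < np.exp(-(fc - f) / temp):
                m, f = cand, fc
                if f < best_f:
                    best_m, best_f = m.copy(), f
            temp *= cooling                                   # geometric cooling schedule
        return best_m, best_f

    # toy usage with a quadratic misfit standing in for the waveform residual
    print(simulated_annealing(lambda m: float(np.sum((m - 2.0) ** 2)), np.zeros(3))[1])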
2007-12-05
yield record-setting carrier lifetime values and very low concentrations of point defects. Epiwafers delivered for fabrication of RF static induction ... boules and on improved furnace uniformity (adding rotation, etc.). Pareto analysis was performed on wafer yield loss at the start of every quarter ... 100 mm PVT process. Work focused on modeling the process for longer (50 mm) boules and on improved furnace uniformity. Pareto analysis was performed.
An expert panel process to evaluate habitat restoration actions in the Columbia River estuary.
Krueger, Kirk L; Bottom, Daniel L; Hood, W Gregory; Johnson, Gary E; Jones, Kim K; Thom, Ronald M
2017-03-01
We describe a process for evaluating proposed ecosystem restoration projects intended to improve survival of juvenile salmon in the Columbia River estuary (CRE). Changes in the Columbia River basin (northwestern USA), including hydropower development, have contributed to the listing of 13 salmon stocks as endangered or threatened under the U.S. Endangered Species Act. Habitat restoration in the CRE, from Bonneville Dam to the ocean, is part of a basin-wide, legally mandated effort to mitigate federal hydropower impacts on salmon survival. An Expert Regional Technical Group (ERTG) was established in 2009 to improve and implement a process for assessing and assigning "survival benefit units" (SBUs) to restoration actions. The SBU concept assumes site-specific restoration projects will increase juvenile salmon survival during migration through the 234 km CRE. Assigned SBUs are used to inform selection of restoration projects and gauge mitigation progress. The ERTG standardized the SBU assessment process to improve its scientific integrity, repeatability, and transparency. In lieu of experimental data to quantify the survival benefits of individual restoration actions, the ERTG adopted a conceptual model composed of three assessment criteria-certainty of success, fish opportunity improvements, and habitat capacity improvements-to evaluate restoration projects. Based on these criteria, an algorithm assigned SBUs by integrating potential fish density as an indicator of salmon performance. Between 2009 and 2014, the ERTG assessed SBUs for 55 proposed projects involving a total of 181 restoration actions located across 8 of 9 reaches of the CRE, largely relying on information provided in a project template based on the conceptual model, presentations, discussions with project sponsors, and site visits. Most projects restored tidal inundation to emergent wetlands, improved riparian function, and removed invasive vegetation. The scientific relationship of geomorphic and salmonid responses to restoration actions remains the foremost concern. Although not designed to establish a broad strategy for estuary restoration, the scoring process has adaptively influenced the types, designs, and locations of restoration proposals. The ERTG process may be a useful model for others who have unique ecosystem restoration goals and share some of our common challenges. Copyright © 2016 Elsevier Ltd. All rights reserved.
Integration of Tuyere, Raceway and Shaft Models for Predicting Blast Furnace Process
NASA Astrophysics Data System (ADS)
Fu, Dong; Tang, Guangwu; Zhao, Yongfu; D'Alessio, John; Zhou, Chenn Q.
2018-06-01
A novel modeling strategy is presented for simulating the blast furnace iron making process. Such physical and chemical phenomena are taking place across a wide range of length and time scales, and three models are developed to simulate different regions of the blast furnace, i.e., the tuyere model, the raceway model and the shaft model. This paper focuses on the integration of the three models to predict the entire blast furnace process. Mapping output and input between models and an iterative scheme are developed to establish communications between models. The effects of tuyere operation and burden distribution on blast furnace fuel efficiency are investigated numerically. The integration of different models provides a way to realistically simulate the blast furnace by improving the modeling resolution on local phenomena and minimizing the model assumptions.
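The integration scheme can be sketched as a fixed-point loop that maps outputs of one sub-model to inputs of the next until the exchanged interface quantities stop changing. The Python below is a conceptual sketch only; the three callables, dictionary keys, tolerance and stand-in sub-models are placeholders rather than the authors' actual interfaces.

    def run_integrated_blast_furnace(tuyere_model, raceway_model, shaft_model,
                                     operating_conditions, tol=1e-3, max_iter=50):
        """Iteratively exchange interface quantities between tuyere, raceway and
        shaft sub-models until convergence. All names are illustrative."""
        shaft_state = {"burden_surface_temp": 1400.0}      # initial guess (K), illustrative
        for iteration in range(max_iter):
            gas_at_tuyere = tuyere_model(operating_conditions)        # tuyere-level gas conditions
            raceway_out = raceway_model(gas_at_tuyere, shaft_state)   # raceway gas state
            new_shaft_state = shaft_model(raceway_out, operating_conditions)
            change = abs(new_shaft_state["burden_surface_temp"]
                         - shaft_state["burden_surface_temp"])
            shaft_state = new_shaft_state
            if change < tol:                               # interface quantities have converged
                break
        return shaft_state

    # toy stand-in sub-models illustrating the data exchange only
    tuyere = lambda cond: {"gas_temp": 2400.0 + 10.0 * cond["blast_rate"]}
    raceway = lambda gas, shaft: {"top_gas_temp": 0.6 * gas["gas_temp"]}
    shaft = lambda rw, cond: {"burden_surface_temp": 0.95 * rw["top_gas_temp"]}
    print(run_integrated_blast_furnace(tuyere, raceway, shaft, {"blast_rate": 5.0}))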
Ma, H. -Y.; Chuang, C. C.; Klein, S. A.; ...
2015-11-06
Here, we present an improved procedure of generating initial conditions (ICs) for climate model hindcast experiments with specified sea surface temperature and sea ice. The motivation is to minimize errors in the ICs and lead to a better evaluation of atmospheric parameterizations' performance in the hindcast mode. We apply state variables (horizontal velocities, temperature and specific humidity) from the operational analysis/reanalysis for the atmospheric initial states. Without a data assimilation system, we apply a two-step process to obtain other necessary variables to initialize both the atmospheric (e.g., aerosols and clouds) and land models (e.g., soil moisture). First, we nudge only the model horizontal velocities towards operational analysis/reanalysis values, given a 6-hour relaxation time scale, to obtain all necessary variables. Compared to the original strategy in which horizontal velocities, temperature and specific humidity are nudged, the revised approach produces a better representation of initial aerosols and cloud fields which are more consistent and closer to observations and model's preferred climatology. Second, we obtain land ICs from an offline land model simulation forced with observed precipitation, winds, and surface fluxes. This approach produces more realistic soil moisture in the land ICs. With this refined procedure, the simulated precipitation, clouds, radiation, and surface air temperature over land are improved in the Day 2 mean hindcasts. Following this procedure, we propose a “Core” integration suite which provides an easily repeatable test allowing model developers to rapidly assess the impacts of various parameterization changes on the fidelity of modelled cloud-associated processes relative to observations.
NASA Astrophysics Data System (ADS)
Ma, H.-Y.; Chuang, C. C.; Klein, S. A.; Lo, M.-H.; Zhang, Y.; Xie, S.; Zheng, X.; Ma, P.-L.; Zhang, Y.; Phillips, T. J.
2015-12-01
We present an improved procedure of generating initial conditions (ICs) for climate model hindcast experiments with specified sea surface temperature and sea ice. The motivation is to minimize errors in the ICs and lead to a better evaluation of atmospheric parameterizations' performance in the hindcast mode. We apply state variables (horizontal velocities, temperature, and specific humidity) from the operational analysis/reanalysis for the atmospheric initial states. Without a data assimilation system, we apply a two-step process to obtain other necessary variables to initialize both the atmospheric (e.g., aerosols and clouds) and land models (e.g., soil moisture). First, we nudge only the model horizontal velocities toward operational analysis/reanalysis values, given a 6 h relaxation time scale, to obtain all necessary variables. Compared to the original strategy in which horizontal velocities, temperature, and specific humidity are nudged, the revised approach produces a better representation of initial aerosols and cloud fields which are more consistent and closer to observations and model's preferred climatology. Second, we obtain land ICs from an off-line land model simulation forced with observed precipitation, winds, and surface fluxes. This approach produces more realistic soil moisture in the land ICs. With this refined procedure, the simulated precipitation, clouds, radiation, and surface air temperature over land are improved in the Day 2 mean hindcasts. Following this procedure, we propose a "Core" integration suite which provides an easily repeatable test allowing model developers to rapidly assess the impacts of various parameterization changes on the fidelity of modeled cloud-associated processes relative to observations.
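The nudging step described above amounts to relaxing only the horizontal wind components toward analysis values with a 6 h time scale, leaving temperature and humidity to evolve freely. A minimal sketch, assuming a simple explicit relaxation update (variable names and the example step length are illustrative):

    import numpy as np

    def nudge_horizontal_winds(u, v, u_ref, v_ref, dt_s, tau_s=6 * 3600.0):
        """Relax model winds toward analysis/reanalysis winds over time scale tau.
        Illustrative sketch of the nudging tendency (u_ref - u) / tau."""
        factor = dt_s / tau_s
        u_new = u + factor * (u_ref - u)
        v_new = v + factor * (v_ref - v)
        return u_new, v_new

    # e.g., a 30-minute model step relaxes roughly 1/12 of the wind error per step
    u, v = np.array([10.0]), np.array([-3.0])
    print(nudge_horizontal_winds(u, v, np.array([12.0]), np.array([-1.0]), dt_s=1800.0))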
Applying PCI in Combination Swivel Head Wrench
NASA Astrophysics Data System (ADS)
Chen, Tsang-Chiang; Yang, Chun-Ming; Hsu, Chang-Hsien; Hung, Hsiang-Wen
2017-09-01
Taiwan’s traditional industries face competition from globalization and environmental change, placing them under economic pressure; to remain sustainable they must continually improve production efficiency and technological quality in order to stabilize and hold their market position. This study uses process capability indices to monitor the quality of a dual-use (combination swivel head) ratchet wrench: the key functional characteristics are identified, actual measurement data are collected, the process capability index Cpk is analysed, and a Process Capability Analysis Chart model is drawn. Finally, the study examines the current situation of this case, identifies shortcomings, and proposes improvement methods to raise overall quality and thereby strengthen the industry as a whole.
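For reference, the Cpk index used in the study is computed from the sample mean, the sample standard deviation and the specification limits. The sketch below uses hypothetical torque measurements and limits purely for illustration; none of the numbers come from the paper.

    import numpy as np

    def cpk(measurements, lsl, usl):
        """Standard process capability index Cpk = min((USL-mu)/(3*sigma), (mu-LSL)/(3*sigma))."""
        x = np.asarray(measurements, float)
        mu, sigma = x.mean(), x.std(ddof=1)
        return min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))

    # hypothetical torque measurements (N·m) against specification limits 48-52 N·m
    sample = [49.8, 50.1, 50.3, 49.9, 50.0, 50.2, 49.7, 50.1]
    print(round(cpk(sample, lsl=48.0, usl=52.0), 2))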
Akbari, Azam; Omidkhah, Mohammadreza; Darian, Jafar Towfighi
2014-03-01
A new heterogeneous sonocatalytic system consisting of a MoO3/Al2O3 catalyst and H2O2 combined with ultrasonication was studied to improve and accelerate the oxidation of model sulfur compounds of diesel, resulting in a significant enhancement in process efficiency. The influence of ultrasound on the properties, activity and stability of the catalyst was studied in detail by means of GC-FID, PSD, SEM and BET techniques. Above 98% conversion of DBT in model diesel containing 1000 μg/g sulfur was obtained by the new ultrasound-assisted desulfurization at an H2O2/sulfur molar ratio of 3, a temperature of 318 K and a catalyst dosage of 30 g/L after 30 min of reaction, in contrast to the 55% conversion obtained during the silent process. This improvement was considerably affected by operating parameters and catalyst properties. The effects of the main process variables were investigated using response surface methodology in the silent process compared to ultrasonication. Ultrasound provided good dispersion of the catalyst and oxidant in the oil phase by breaking hydrogen bonding and deagglomerating them. Deposition of impurities on the catalyst surface caused quick deactivation in the silent experiments, resulting in only 5% DBT oxidation after 6 cycles of silent reaction with recycled catalyst. Above 95% of DBT was oxidized after 6 ultrasound-assisted cycles, showing a great improvement in stability owing to cleaning of the surface during ultrasonication. A considerable particle size reduction was also observed after 3 h of sonication, which could provide better dispersion of the catalyst in the model fuel.
NASA Astrophysics Data System (ADS)
Bouda, Martin; Saiers, James E.
2017-12-01
Root system architecture (RSA) can significantly affect plant access to water, total transpiration, as well as its partitioning by soil depth, with implications for surface heat, water, and carbon budgets. Despite recent advances in land surface model (LSM) descriptions of plant hydraulics, descriptions of RSA have not been included because of their three-dimensional complexity, which makes them generally too computationally costly. Here we demonstrate a new, process-based 1D layered model that captures the dynamic shifts in water potential gradients of 3D RSA under different soil moisture conditions: the RSA stencil. Using root systems calibrated to the rooting profiles of four plant functional types (PFT) of the Community Land Model, we show that the RSA stencil predicts plant water potentials within 2% to the outputs of a full 3D model, under the same assumptions on soil moisture heterogeneity, despite its trivial computational cost, resulting in improved predictions of water uptake and soil moisture compared to a model without RSA in a transient simulation. Our results suggest that LSM predictions of soil moisture dynamics and dependent variables can be improved by the implementation of this model, calibrated for individual PFTs using field observations.
NASA Astrophysics Data System (ADS)
Lahmers, T. M.; Castro, C. L.; Gupta, H. V.; Gochis, D.; Dugger, A. L.; Smith, M.
2016-12-01
The NOAA National Water Model (NWM), which is based on the WRF-Hydro architecture, became operational in June of 2016 to produce streamflow forecasts nationwide. In order to improve the physical process representation of NWM/WRF-Hydro, a parameterized channel infiltration function is added to the Muskingum-Cunge channel routing scheme. Representation of transmission losses along streams was previously not supported by WRF-Hydro, even though most channels in the southwest CONUS have a high depth to groundwater, and are consequently a source for recharge throughout the region. The LSM, routing grid, baseflow bucket model, and channel parameters of the modified version of NWM/WRF-Hydro are calibrated using spatial regularization in selected basins in the Midwest and Southwest CONUS. WRF-Hydro is calibrated and tested in the Verde, San Pedro, Little Sioux, Nishnabotna, and Wapsipinicon basins. The model is forced with NCEP Stage-IV and NLDAS-2 precipitation for calibration, and the effects of the precipitation climatology, including extreme events, on model performance are considered. This work advances the regional performance of WRF-Hydro through process enhancement and calibration that is highly relevant for improving model fidelity in semi-arid climates.
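As a conceptual illustration of adding a transmission-loss term to Muskingum-type channel routing, the sketch below subtracts a simple channel infiltration loss from the routed outflow at each time step. This is not the NWM/WRF-Hydro parameterization; the constant loss term and all parameter values are illustrative stand-ins for the parameterized channel infiltration function described above.

    def route_reach_with_losses(inflow, dt_s=3600.0, k_s=7200.0, x=0.2,
                                loss_m3_s=0.5, outflow0=0.0):
        """Muskingum routing of a reach with a constant transmission loss subtracted.

        inflow : list of upstream inflows (m^3/s) at each time step; values illustrative.
        """
        denom = 2.0 * k_s * (1.0 - x) + dt_s
        c1 = (dt_s - 2.0 * k_s * x) / denom
        c2 = (dt_s + 2.0 * k_s * x) / denom
        c3 = (2.0 * k_s * (1.0 - x) - dt_s) / denom
        out, q_prev = [], outflow0
        for i in range(1, len(inflow)):
            q = c1 * inflow[i] + c2 * inflow[i - 1] + c3 * q_prev
            q = max(q - loss_m3_s, 0.0)      # transmission loss to the streambed (recharge)
            out.append(q)
            q_prev = q
        return out

    print(route_reach_with_losses([0.0, 5.0, 20.0, 15.0, 8.0, 3.0]))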
On improving the communication between models and data.
Dietze, Michael C; Lebauer, David S; Kooper, Rob
2013-09-01
The potential for model-data synthesis is growing in importance as we enter an era of 'big data', greater connectivity and faster computation. Realizing this potential requires that the research community broaden its perspective about how and why they interact with models. Models can be viewed as scaffolds that allow data at different scales to inform each other through our understanding of underlying processes. Perceptions of relevance, accessibility and informatics are presented as the primary barriers to broader adoption of models by the community, while an inability to fully utilize the breadth of expertise and data from the community is a primary barrier to model improvement. Overall, we promote a community-based paradigm to model-data synthesis and highlight some of the tools and techniques that facilitate this approach. Scientific workflows address critical informatics issues in transparency, repeatability and automation, while intuitive, flexible web-based interfaces make running and visualizing models more accessible. Bayesian statistics provides powerful tools for assimilating a diversity of data types and for the analysis of uncertainty. Uncertainty analyses enable new measurements to target those processes most limiting our predictive ability. Moving forward, tools for information management and data assimilation need to be improved and made more accessible. © 2013 John Wiley & Sons Ltd.
Heany, Julia; Torres, Jennifer; Zagar, Cynthia; Kostelec, Tiffany
2018-06-05
Introduction In order to achieve the positive outcomes with parents and children demonstrated by many home visiting models, home visiting services must be well implemented. The Michigan Home Visiting Initiative developed a tool and procedure for monitoring implementation quality across models referred to as Michigan's Home Visiting Quality Assurance System (MHVQAS). This study field tested the MHVQAS. This article focuses on one of the study's evaluation questions: Can the MHVQAS be applied across models? Methods Eight local implementing agencies (LIAs) from four home visiting models (Healthy Families America, Early Head Start-Home Based, Parents as Teachers, Maternal Infant Health Program) and five reviewers participated in the study by completing site visits, tracking their time and costs, and completing surveys about the process. LIAs also submitted their most recent review by their model developer. The researchers conducted participant observation of the review process. Results Ratings on the MHVQAS were not significantly different between models. There were some differences in interrater reliability and perceived reliability between models. There were no significant differences between models in perceived validity, satisfaction with the review process, or cost to participate. Observational data suggested that cross-model applicability could be improved by assisting sites in relating the requirements of the tool to the specifics of their model. Discussion The MHVQAS shows promise as a tool and process to monitor implementation quality of home visiting services across models. The results of the study will be used to make improvements before the MHVQAS is used in practice.
NASA Technical Reports Server (NTRS)
Hoppa, Mary Ann; Wilson, Larry W.
1994-01-01
There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.
Clinical time series prediction: Toward a hierarchical dynamical system framework.
Liu, Zitao; Hauskrecht, Milos
2015-09-01
Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding of the patient condition, the dynamics of a disease, effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. Copyright © 2014 Elsevier B.V. All rights reserved.
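A greatly simplified sketch of the two-level idea, assuming a scalar window-level state: a Gaussian process handles the irregular samples within each window, and a linear (AR(1)-style) fit links successive window states. This is not the authors' model; the window length, kernel and state definition are illustrative simplifications.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    def predict_next_window_mean(times, values, window_h=24.0):
        """Per-window GP smoothing of irregular samples plus a linear transition
        between window-level states. Illustrative sketch only."""
        times, values = np.asarray(times, float), np.asarray(values, float)
        n_win = int(np.ceil(times.max() / window_h))
        states = []
        for w in range(n_win):
            mask = (times >= w * window_h) & (times < (w + 1) * window_h)
            if mask.sum() == 0:
                continue
            gp = GaussianProcessRegressor(kernel=RBF(10.0) + WhiteKernel(0.1),
                                          normalize_y=True)
            gp.fit(times[mask, None], values[mask])
            mid = np.array([[(w + 0.5) * window_h]])      # window-level latent state:
            states.append(gp.predict(mid)[0])             # GP estimate at the window centre
        s = np.array(states)
        if len(s) < 3:
            return s[-1]
        a, b = np.polyfit(s[:-1], s[1:], 1)               # linear transition s_{k+1} = a*s_k + b
        return a * s[-1] + b

    # irregularly sampled toy series (hours, lab value)
    t = [1, 5, 9, 20, 26, 30, 41, 50, 55, 70, 74, 79, 90, 95]
    y = [5.0, 5.1, 5.3, 5.2, 5.6, 5.5, 5.9, 6.0, 6.1, 6.4, 6.5, 6.6, 6.9, 7.0]
    print(predict_next_window_mean(t, y))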
Kornecki, Martin; Strube, Jochen
2018-03-16
Productivity improvements of mammalian cell culture in the production of recombinant proteins have been made by optimizing cell lines, media, and process operation. This led to enhanced titers and process robustness without increasing the cost of the upstream processing (USP); however, a downstream bottleneck remains. In terms of process control improvement, the process analytical technology (PAT) initiative, initiated by the American Food and Drug Administration (FDA), aims to measure, analyze, monitor, and ultimately control all important attributes of a bioprocess. Especially, spectroscopic methods such as Raman or near-infrared spectroscopy enable one to meet these analytical requirements, preferably in-situ. In combination with chemometric techniques like partial least square (PLS) or principal component analysis (PCA), it is possible to generate soft sensors, which estimate process variables based on process and measurement models for the enhanced control of bioprocesses. Macroscopic kinetic models can be used to simulate cell metabolism. These models are able to enhance the process understanding by predicting the dynamic of cells during cultivation. In this article, in-situ turbidity (transmission, 880 nm) and ex-situ Raman spectroscopy (785 nm) measurements are combined with an offline macroscopic Monod kinetic model in order to predict substrate concentrations. Experimental data of Chinese hamster ovary cultivations in bioreactors show a sufficiently linear correlation (R² ≥ 0.97) between turbidity and total cell concentration. PLS regression of Raman spectra generates a prediction model, which was validated via offline viable cell concentration measurement (RMSE ≤ 13.82, R² ≥ 0.92). Based on these measurements, the macroscopic Monod model can be used to determine different process attributes, e.g., glucose concentration. In consequence, it is possible to approximately calculate (R² ≥ 0.96) glucose concentration based on online cell concentration measurements using turbidity or Raman spectroscopy. Future approaches will use these online substrate concentration measurements with turbidity and Raman measurements, in combination with the kinetic model, in order to control the bioprocess in terms of feeding strategies, by employing an open platform communication (OPC) network-either in fed-batch or perfusion mode, integrated into a continuous operation of upstream and downstream.
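A minimal sketch of a macroscopic Monod kinetic model of the kind referred to above, relating cell growth to glucose consumption in batch mode; all parameter values, units and the function name are illustrative assumptions, not values from the study.

    import numpy as np
    from scipy.integrate import solve_ivp

    def monod_batch(t_span, x0_cells_per_ml, s0_g_per_l,
                    mu_max=0.035, ks=0.5, yield_x_per_s=2.5e6):
        """Batch Monod kinetics: dX/dt = mu_max*S/(Ks+S)*X, dS/dt = -dX/dt / Y_X/S.
        yield_x_per_s is in (cells/mL) per (g/L); all values illustrative."""
        def rhs(t, y):
            x, s = y
            mu = mu_max * s / (ks + s)            # specific growth rate (1/h)
            dx = mu * x                           # cell growth (cells/mL per h)
            ds = -dx / yield_x_per_s              # glucose consumption (g/L per h)
            return [dx, ds]
        return solve_ivp(rhs, t_span, [x0_cells_per_ml, s0_g_per_l],
                         dense_output=True, max_step=1.0)

    # illustrative batch culture: 2e5 cells/mL and 6 g/L glucose, simulated for 120 h
    sol = monod_batch((0.0, 120.0), 2e5, 6.0)
    print(sol.y[:, -1])   # final cell and glucose concentrations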
Kornecki, Martin; Strube, Jochen
2018-01-01
Productivity improvements of mammalian cell culture in the production of recombinant proteins have been made by optimizing cell lines, media, and process operation. This led to enhanced titers and process robustness without increasing the cost of the upstream processing (USP); however, a downstream bottleneck remains. In terms of process control improvement, the process analytical technology (PAT) initiative, initiated by the American Food and Drug Administration (FDA), aims to measure, analyze, monitor, and ultimately control all important attributes of a bioprocess. Especially, spectroscopic methods such as Raman or near-infrared spectroscopy enable one to meet these analytical requirements, preferably in-situ. In combination with chemometric techniques like partial least square (PLS) or principal component analysis (PCA), it is possible to generate soft sensors, which estimate process variables based on process and measurement models for the enhanced control of bioprocesses. Macroscopic kinetic models can be used to simulate cell metabolism. These models are able to enhance the process understanding by predicting the dynamic of cells during cultivation. In this article, in-situ turbidity (transmission, 880 nm) and ex-situ Raman spectroscopy (785 nm) measurements are combined with an offline macroscopic Monod kinetic model in order to predict substrate concentrations. Experimental data of Chinese hamster ovary cultivations in bioreactors show a sufficiently linear correlation (R2 ≥ 0.97) between turbidity and total cell concentration. PLS regression of Raman spectra generates a prediction model, which was validated via offline viable cell concentration measurement (RMSE ≤ 13.82, R2 ≥ 0.92). Based on these measurements, the macroscopic Monod model can be used to determine different process attributes, e.g., glucose concentration. In consequence, it is possible to approximately calculate (R2 ≥ 0.96) glucose concentration based on online cell concentration measurements using turbidity or Raman spectroscopy. Future approaches will use these online substrate concentration measurements with turbidity and Raman measurements, in combination with the kinetic model, in order to control the bioprocess in terms of feeding strategies, by employing an open platform communication (OPC) network—either in fed-batch or perfusion mode, integrated into a continuous operation of upstream and downstream. PMID:29547557
Cost Models for MMC Manufacturing Processes
NASA Technical Reports Server (NTRS)
Elzey, Dana M.; Wadley, Haydn N. G.
1996-01-01
The quality cost modeling (QCM) tool is intended to be a relatively simple-to-use device for obtaining a first-order assessment of the quality-cost relationship for a given process-material combination. The QCM curve is a plot of cost versus quality (an index indicating microstructural quality), which is unique for a given process-material combination. The QCM curve indicates the tradeoff between cost and performance, thus enabling one to evaluate affordability. Additionally, the effect of changes in process design, raw materials, and process conditions on the cost-quality relationship can be evaluated. Such results might indicate the most efficient means to obtain improved quality at reduced cost by process design refinements, the implementation of sensors and models for closed loop process control, or improvement in the properties of raw materials being fed into the process. QCM also allows alternative processes for producing the same or similar material to be compared in terms of their potential for producing competitively priced, high quality material. Aside from demonstrating the usefulness of the QCM concept, this is one of the main foci of the present research program, namely to compare processes for making continuous fiber reinforced, metal matrix composites (MMC's). Two processes, low pressure plasma spray deposition and tape casting are considered for QCM development. This document consists of a detailed look at the design of the QCM approach, followed by discussion of the application of QCM to each of the selected MMC manufacturing processes along with results, comparison of processes, and finally, a summary of findings and recommendations.
How to Begin a Quality Improvement Project.
Silver, Samuel A; Harel, Ziv; McQuillan, Rory; Weizman, Adam V; Thomas, Alison; Chertow, Glenn M; Nesrallah, Gihad; Bell, Chaim M; Chan, Christopher T
2016-05-06
Quality improvement involves a combined effort among health care staff and stakeholders to diagnose and treat problems in the health care system. However, health care professionals often lack training in quality improvement methods, which makes it challenging to participate in improvement efforts. This article familiarizes health care professionals with how to begin a quality improvement project. The initial steps involve forming an improvement team that possesses expertise in the quality of care problem, leadership, and change management. Stakeholder mapping and analysis are useful tools at this stage, and these are reviewed to help identify individuals who might have a vested interest in the project. Physician engagement is a particularly important component of project success, and the knowledge that patients/caregivers can offer as members of a quality improvement team should not be overlooked. After a team is formed, an improvement framework helps to organize the scientific process of system change. Common quality improvement frameworks include Six Sigma, Lean, and the Model for Improvement. These models are contrasted, with a focus on the Model for Improvement, because it is widely used and applicable to a variety of quality of care problems without advanced training. It involves three steps: setting aims to focus improvement, choosing a balanced set of measures to determine if improvement occurs, and testing new ideas to change the current process. These new ideas are evaluated using Plan-Do-Study-Act cycles, where knowledge is gained by testing changes and reflecting on their effect. To show the real world utility of the quality improvement methods discussed, they are applied to a hypothetical quality improvement initiative that aims to promote home dialysis (home hemodialysis and peritoneal dialysis). This provides an example that kidney health care professionals can use to begin their own quality improvement projects. Copyright © 2016 by the American Society of Nephrology.
How to Begin a Quality Improvement Project
Harel, Ziv; McQuillan, Rory; Weizman, Adam V.; Thomas, Alison; Chertow, Glenn M.; Nesrallah, Gihad; Bell, Chaim M.; Chan, Christopher T.
2016-01-01
Quality improvement involves a combined effort among health care staff and stakeholders to diagnose and treat problems in the health care system. However, health care professionals often lack training in quality improvement methods, which makes it challenging to participate in improvement efforts. This article familiarizes health care professionals with how to begin a quality improvement project. The initial steps involve forming an improvement team that possesses expertise in the quality of care problem, leadership, and change management. Stakeholder mapping and analysis are useful tools at this stage, and these are reviewed to help identify individuals who might have a vested interest in the project. Physician engagement is a particularly important component of project success, and the knowledge that patients/caregivers can offer as members of a quality improvement team should not be overlooked. After a team is formed, an improvement framework helps to organize the scientific process of system change. Common quality improvement frameworks include Six Sigma, Lean, and the Model for Improvement. These models are contrasted, with a focus on the Model for Improvement, because it is widely used and applicable to a variety of quality of care problems without advanced training. It involves three steps: setting aims to focus improvement, choosing a balanced set of measures to determine if improvement occurs, and testing new ideas to change the current process. These new ideas are evaluated using Plan-Do-Study-Act cycles, where knowledge is gained by testing changes and reflecting on their effect. To show the real world utility of the quality improvement methods discussed, they are applied to a hypothetical quality improvement initiative that aims to promote home dialysis (home hemodialysis and peritoneal dialysis). This provides an example that kidney health care professionals can use to begin their own quality improvement projects. PMID:27016497
2009-09-01
NII)/CIO Assistant Secretary of Defense for Networks and Information Integration/Chief Information Officer CMMI Capability Maturity Model...a Web-based portal to share knowledge about software process-related methodologies, such as the SEI’s Capability Maturity Model Integration ( CMMI ...19 SEI’s IDEALSM model, and Lean Six Sigma.20 For example, the portal features content areas such as software acquisition management, the SEI CMMI
NASA Astrophysics Data System (ADS)
Marsh, C.; Pomeroy, J. W.; Wheater, H. S.
2017-12-01
Accurate management of water resources is necessary for social, economic, and environmental sustainability worldwide. In locations with seasonal snowcovers, the accurate prediction of these water resources is further complicated due to frozen soils, solid-phase precipitation, blowing snow transport, and snowcover-vegetation-atmosphere interactions. Complex process interactions and feedbacks are a key feature of hydrological systems and may result in emergent phenomena, i.e., the arising of novel and unexpected properties within a complex system. One example is the feedback associated with blowing snow redistribution, which can lead to drifts that cause locally-increased soil moisture, thus increasing plant growth that in turn subsequently impacts snow redistribution, creating larger drifts. Attempting to simulate these emergent behaviours is a significant challenge, however, and there is concern that process conceptualizations within current models are too incomplete to represent the needed interactions. An improved understanding of the role of emergence in hydrological systems often requires high resolution distributed numerical hydrological models that incorporate the relevant process dynamics. The Canadian Hydrological Model (CHM) provides a novel tool for examining cold region hydrological systems. Key features include efficient terrain representation, allowing simulations at various spatial scales, reduced computational overhead, and a modular process representation allowing for an alternative-hypothesis framework. Using both physics-based and conceptual process representations sourced from long term process studies and the current cold regions literature allows for comparison of process representations and importantly, their ability to produce emergent behaviours. Examining the system in a holistic, process-based manner can hopefully derive important insights and aid in development of improved process representations.
A Unified Data Assimilation Strategy for Regional Coupled Atmosphere-Ocean Prediction Systems
NASA Astrophysics Data System (ADS)
Xie, Lian; Liu, Bin; Zhang, Fuqing; Weng, Yonghui
2014-05-01
Improving tropical cyclone (TC) forecasts is a top priority in weather forecasting. Assimilating various observational data to produce better initial conditions for numerical models using advanced data assimilation techniques has been shown to benefit TC intensity forecasts, whereas assimilating large-scale environmental circulation into regional models by spectral nudging or Scale-Selective Data Assimilation (SSDA) has been demonstrated to improve TC track forecasts. Meanwhile, taking into account various air-sea interaction processes by high-resolution coupled air-sea modelling systems has also been shown to improve TC intensity forecasts. Despite the advances in data assimilation and air-sea coupled models, large errors in TC intensity and track forecasting remain. For example, Hurricane Nate (2011) posed a considerable challenge to the TC operational forecasting community, with very large intensity forecast errors (27, 25, and 40 kts for 48, 72, and 96 h, respectively) for the official forecasts. Considering the slow-moving nature of Hurricane Nate, it is reasonable to hypothesize that air-sea interaction processes played a critical role in the intensity change of the storm, and accurate representation of the upper ocean dynamics and thermodynamics is necessary to quantitatively describe the air-sea interaction processes. Currently, data assimilation techniques are generally only applied to hurricane forecasting in stand-alone atmospheric or oceanic models. In fact, most regional hurricane forecasting models only include data assimilation techniques for improving the initial condition of the atmospheric model. In such a situation, the benefit of adjustments in one model (atmospheric or oceanic) by assimilating observational data can be compromised by errors from the other model. Thus, unified data assimilation techniques for coupled air-sea modelling systems, which not only assimilate atmospheric and oceanic observations simultaneously into the coupled air-sea modelling system but also nudge the large-scale environmental flow in the regional model towards global model forecasts, are of increasing necessity. In this presentation, we will outline a strategy for an integrated approach in air-sea coupled data assimilation and discuss its benefits and feasibility from incremental results for select historical hurricane cases.
An improved PSO-SVM model for online recognition defects in eddy current testing
NASA Astrophysics Data System (ADS)
Liu, Baoling; Hou, Dibo; Huang, Pingjie; Liu, Banteng; Tang, Huayi; Zhang, Wubo; Chen, Peihua; Zhang, Guangxin
2013-12-01
Accurate and rapid recognition of defects is essential for structural integrity and health monitoring of in-service devices using eddy current (EC) non-destructive testing. This paper introduces a novel model-free method that includes three main modules: a signal pre-processing module, a classifier module and an optimisation module. In the signal pre-processing module, a two-stage differential structure is proposed to suppress the lift-off fluctuation that could contaminate the EC signal. In the classifier module, a multi-class support vector machine (SVM) based on a one-against-one strategy is utilised for its good accuracy. In the optimisation module, the optimal parameters of the classifier are obtained by an improved particle swarm optimisation (IPSO) algorithm. The proposed IPSO technique improves the convergence performance of the basic PSO through the following strategies: nonlinear processing of the inertia weight, and introduction of a black hole model and a simulated annealing model with extremum disturbance. The good generalisation ability of the IPSO-SVM model has been validated by adding additional specimens to the testing set. Experiments show that the proposed algorithm can achieve higher recognition accuracy and efficiency than other well-known classifiers, and its advantages are more obvious with a smaller training set, which supports online application.
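A sketch of a PSO loop with a nonlinearly decreasing inertia weight, one of the IPSO ingredients mentioned above; the black-hole and simulated-annealing perturbations are omitted, the quadratic decay schedule is an assumed example rather than the authors' scheme, and the toy objective stands in for an SVM parameter-selection criterion.

    import numpy as np

    def pso_nonlinear_inertia(objective, dim, n_particles=30, n_iter=200,
                              w_max=0.9, w_min=0.4, c1=2.0, c2=2.0,
                              bounds=(-5.0, 5.0), seed=0):
        """Particle swarm optimisation with a nonlinearly decaying inertia weight.
        Schedule and parameter values are illustrative."""
        rng = np.random.default_rng(seed)
        lo, hi = bounds
        x = rng.uniform(lo, hi, (n_particles, dim))
        v = np.zeros((n_particles, dim))
        pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
        g = pbest[np.argmin(pbest_f)].copy()
        for k in range(n_iter):
            w = w_min + (w_max - w_min) * (1.0 - k / n_iter) ** 2   # nonlinear (quadratic) decay
            r1, r2 = rng.random((2, n_particles, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, lo, hi)
            f = np.array([objective(p) for p in x])
            better = f < pbest_f
            pbest[better], pbest_f[better] = x[better], f[better]
            g = pbest[np.argmin(pbest_f)].copy()
        return g, pbest_f.min()

    # toy usage: minimise the sphere function (stands in for SVM cross-validation error)
    print(pso_nonlinear_inertia(lambda p: float(np.sum(p ** 2)), dim=4))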
Improving operating room productivity via parallel anesthesia processing.
Brown, Michael J; Subramanian, Arun; Curry, Timothy B; Kor, Daryl J; Moran, Steven L; Rohleder, Thomas R
2014-01-01
Parallel processing of regional anesthesia may improve operating room (OR) efficiency in patients undergoing upper extremity surgical procedures. The purpose of this paper is to evaluate whether performing regional anesthesia outside the OR in parallel increases total cases per day and improves efficiency and productivity. Data from all adult patients who underwent regional anesthesia as their primary anesthetic for upper extremity surgery over a one-year period were used to develop a simulation model. The model evaluated pure operating modes of regional anesthesia performed within the OR and outside the OR in a parallel manner. The scenarios were used to evaluate how many surgeries could be completed in a standard work day (555 minutes) and, assuming a standard three cases per day, what the predicted end-of-day overtime would be. Modeling results show that parallel processing of regional anesthesia increases the average cases per day for all surgeons included in the study. The average increase was 0.42 surgeries per day. Where it was assumed that three cases per day would be performed by all surgeons, the number of days going to overtime was reduced by 43 percent with parallel blocks. The overtime with parallel anesthesia was also projected to be 40 minutes less per day per surgeon. Key limitations include the assumption that all cases used regional anesthesia in the comparisons; many days may have both regional and general anesthesia. Also, as a case study, single-center research may limit generalizability. Perioperative care providers should consider parallel administration of regional anesthesia where there is a desire to increase daily upper extremity surgical case capacity. Where there are sufficient resources to do parallel anesthesia processing, efficiency and productivity can be significantly improved. Simulation modeling can be an effective tool to show practice change effects at a system-wide level.
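A back-of-the-envelope Monte Carlo sketch of the sequential-versus-parallel comparison: with parallel processing, the regional block is performed outside the OR while the previous case finishes, so to first order it drops out of in-room time. The duration distributions and all numbers are illustrative assumptions, not the study's simulation model.

    import numpy as np

    def expected_overtime_min(block_min, surgery_min, turnover_min,
                              cases_per_day=3, day_len_min=555, parallel=True,
                              n_sim=10000, seed=0):
        """Average end-of-day overtime under sequential vs. parallel block placement.
        block_min and surgery_min are (mean, sd) tuples for normal durations; illustrative."""
        rng = np.random.default_rng(seed)
        overtime = []
        for _ in range(n_sim):
            total = 0.0
            for _ in range(cases_per_day):
                block = rng.normal(*block_min)
                surgery = rng.normal(*surgery_min)
                in_room = surgery if parallel else block + surgery   # parallel block is off the OR clock
                total += max(in_room, 0.0) + turnover_min
            overtime.append(max(total - day_len_min, 0.0))
        return float(np.mean(overtime))

    # illustrative: block ~N(25,5) min, surgery ~N(150,30) min, 35 min turnover
    for mode in (False, True):
        print("parallel" if mode else "sequential",
              round(expected_overtime_min((25, 5), (150, 30), 35, parallel=mode), 1))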
NASA Astrophysics Data System (ADS)
Brachet, N.; Mialle, P.; Brown, D.; Coyne, J.; Drob, D.; Virieux, J.; Garcés, M.
2009-04-01
The International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty (CTBTO) Preparatory Commission in Vienna is pursuing its automatic processing effort for the return of infrasound data processing into operations in 2009. Concurrently, work is also underway to further improve this process by enhancing the modeling of infrasound propagation in the atmosphere and then by labeling the phases in order to improve event categorization and location. In 2008, the IDC acquired WASP-3D Sph (Windy Atmospheric Sonic Propagation) (Virieux et al., 2004), a 3-D ray-tracing-based long-range propagation software that accounts for the heterogeneity of the atmosphere. Once adapted to the IDC environment, WASP-3D Sph has been used to improve the understanding of infrasound wave propagation and has been compared with the 1-D ray tracing Taupc software (Garcés and Drob, 2007) at the IDC. In addition to performing the infrasound propagation simulation, different atmospheric models are available at the IDC, either real-time (ECMWF, the European Centre for Medium-Range Weather Forecasts) or empirical (HWM93, the Horizontal Wind Model, and HWM07; Drob, 2008), used in their initial format or interpolated into the G2S (Ground to Space) model. The IDC infrasound reference database is used for testing, comparing and validating the various propagation software and atmospheric specifications. Moreover, all the performed simulations provide feedback on the quality of the infrasound reference events and provide useful information to improve their location by refining infrasonic wave propagation characteristics. The results of this study are presented for a selection of reference events, and they will help the IDC design and define short- and mid-term enhancements of the infrasound automatic and interactive processing to take into account the spatial and temporal heterogeneities of the atmosphere.
Improving Permafrost Hydrology Prediction Through Data-Model Integration
NASA Astrophysics Data System (ADS)
Wilson, C. J.; Andresen, C. G.; Atchley, A. L.; Bolton, W. R.; Busey, R.; Coon, E.; Charsley-Groffman, L.
2017-12-01
The CMIP5 Earth System Models were unable to adequately predict the fate of the 16GT of permafrost carbon in a warming climate due to poor representation of Arctic ecosystem processes. The DOE Office of Science Next Generation Ecosystem Experiment, NGEE-Arctic project aims to reduce uncertainty in the Arctic carbon cycle and its impact on the Earth's climate system by improved representation of the coupled physical, chemical and biological processes that drive how much buried carbon will be converted to CO2 and CH4, how fast this will happen, which form will dominate, and the degree to which increased plant productivity will offset increased soil carbon emissions. These processes fundamentally depend on permafrost thaw rate and its influence on surface and subsurface hydrology through thermal erosion, land subsidence and changes to groundwater flow pathways as soil, bedrock and alluvial pore ice and massive ground ice melts. LANL and its NGEE colleagues are co-developing data and models to better understand controls on permafrost degradation and improve prediction of the evolution of permafrost and its impact on Arctic hydrology. The LANL Advanced Terrestrial Simulator was built using a state of the art HPC software framework to enable the first fully coupled 3-dimensional surface-subsurface thermal-hydrology and land surface deformation simulations to simulate the evolution of the physical Arctic environment. Here we show how field data including hydrology, snow, vegetation, geochemistry and soil properties, are informing the development and application of the ATS to improve understanding of controls on permafrost stability and permafrost hydrology. The ATS is being used to inform parameterizations of complex coupled physical, ecological and biogeochemical processes for implementation in the DOE ACME land model, to better predict the role of changing Arctic hydrology on the global climate system. LA-UR-17-26566.
Case Studies in Modelling, Control in Food Processes.
Glassey, J; Barone, A; Montague, G A; Sabou, V
This chapter discusses the importance of modelling and control in increasing food process efficiency and ensuring product quality. Various approaches to both modelling and control in food processing are set in the context of the specific challenges in this industrial sector and latest developments in each area are discussed. Three industrial case studies are used to demonstrate the benefits of advanced measurement, modelling and control in food processes. The first case study illustrates the use of knowledge elicitation from expert operators in the process for the manufacture of potato chips (French fries) and the consequent improvements in process control to increase the consistency of the resulting product. The second case study highlights the economic benefits of tighter control of an important process parameter, moisture content, in potato crisp (chips) manufacture. The final case study describes the use of NIR spectroscopy in ensuring effective mixing of dry multicomponent mixtures and pastes. Practical implementation tips and infrastructure requirements are also discussed.